DWG Sheet Combination failing on Autodesk Forge

We are using Forge to import a STEP file into the modelspace of an output.DWG. A DLL then combines the modelspace geometry of several DWG files into several layouts (paperspace) of a single DWG. This sheet combination was working perfectly until just recently, when the combination process stopped happening entirely.
Has something in Forge changed recently that we're not aware of? Updates/patches, or something like that which could have caused this issue?
This is an issue for a production application; it is considered an outage at this point and is very time-sensitive.
Edit: Here are some differences we noticed between the log files generated by this process. In this first section, the verbiage being written by AutoCAD has changed slightly during an extraction process:
[08/01/2019 17:15:35] End downloading https://.... 1556909 bytes have been unpacked to folder T:\Aces\Jobs\a43e5ca7faaa4db8b5374aaef71b36d3\cadlayouts.
[08/19/2019 17:25:53] End downloading file https://.... 1771363 bytes have been written to T:\Aces\Jobs\d12f3bed13b84d29b31226222e3cf3c9\cadlayouts.
In the log from 8/19, every line logged between:
Start AutoCAD Core Engine standard output dump.
and:
End AutoCAD Core Engine standard output dump.
is written twice, but this did not happen in the log file from August 1st or any of the logs before that date.
Edit 2:
Yesterday we used the .NET DirectoryInfo class to pull all directories into one list and all files into another and write them all to the log. The cadlayouts entity that should be recognized as a directory (because it's a zip that is extracted by Forge) is instead listed as a file. Our process runs a Directory.Exists() check before the work item merges the DWGs into the output, and this call returns false for the cadlayouts folder, bypassing our combination logic. How can the Forge zip extraction process be working correctly if the resulting entity on the file system is not considered a directory?

It sounds like you have an input argument that is a zip and you expect it to be unzipped into a folder. Please look at row 4 in the table below; I suspect that this is what you are experiencing. There WAS a recent change here: we used to look at the downloaded bits and unconditionally uncompress them if we found a zip header (i.e. we acted identically for rows 3 and 4). We now only do this if you ask us to.
EDIT: The first column in the table is the value of the zip attribute of the Activity's parameters, while the second column is the pathInZip attribute of the Workitem's arguments.
| # | Activity   | Workitem        | Arg direction | Comments |
|---|------------|-----------------|---------------|----------|
| 1 | zip==true  | pathInZip!=null | input         | Zip is uncompressed to the folder specified in localName. Any path reference to this argument will expand to the full path of pathInZip. |
| 2 | zip==false | pathInZip!=null | input         | Zip is uncompressed to the folder specified in localName. Any path reference to this argument will expand to the full path of pathInZip. |
| 3 | zip==true  | pathInZip==null | input         | If a zip is provided, it is uncompressed to the folder specified in localName. Any path reference to this argument will expand to the full path of localName. |
| 4 | zip==false | pathInZip==null | input         | If a zip is provided, it is left compressed. Any variable referencing this argument will expand to the full path of localName. |
| 5 | zip==true  | pathInZip!=null | output        | Workitem will be rejected. |
| 6 | zip==false | pathInZip!=null | output        | Workitem will be rejected. |
| 7 | zip==true  | pathInZip==null | output        | Output(s) at localName will be zipped if localName is a folder. |
| 8 | zip==false | pathInZip==null | output        | Output at localName will not be zipped. |
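If you relied on the old unconditional unzipping, the fix is to opt in explicitly via the zip attribute (row 3). As a rough illustration of where the attributes from the table live (the attribute names zip, pathInZip, and localName come from the table; the resource field and the overall payload shape are placeholders that depend on your Design Automation API version):
# Illustrative fragments only -- not a complete Activity/Workitem payload.
activity_parameter = {
    "localName": "cadlayouts",
    "zip": True,  # row 3: explicitly ask for the zip to be uncompressed to localName
}

workitem_argument = {
    "resource": "https://.../cadlayouts.zip",  # placeholder signed URL to the zip
    "pathInZip": None,  # null -> whole zip extracted; references expand to localName
}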

Related

Insert Object Array or CSV file content into Kusto Table

My goal is to build a pipeline in Azure DevOps which reads data using PowerShell and writes it into a Kusto table.
I am able to write the data I read from PowerShell to an object array or a CSV file, but I cannot figure out how to insert that data into a Kusto table.
Could anyone suggest the best way to write the data into Kusto?
One option would be to write your CSV payload to blob storage, then ingest that blob into your target table, either by:
using a "queued ingestion" client in one of the client libraries: https://learn.microsoft.com/en-us/azure/kusto/api/
(note that the .NET ingestion client library also provides IngestFromStream and IngestFromDataReader methods, which handle writing the data to intermediate blob storage so that you don't have to; a Python sketch of this route follows the example below)
or by
issuing an .ingest command: https://learn.microsoft.com/en-us/azure/kusto/management/data-ingestion/ingest-from-storage, though "direct ingestion" is less recommended for production volumes.
Another option (also not recommended for production volumes) would be the .ingest inline (AKA "ingest push") option: https://learn.microsoft.com/en-us/azure/kusto/management/data-ingestion/ingest-inline
for example:
.create table sample_table (a:string, b:int, c:datetime)
.ingest inline into table sample_table <|
hello,17,2019-08-16 00:52:07
world,71,2019-08-16 00:52:08
"isn't, this neat?",-13,2019-08-16 00:52:09
which will append the above records to the table:
| a | b | c |
|-------------------|------|-----------------------------|
| hello | 17 | 2019-08-16 00:52:07.0000000 |
| world | 71 | 2019-08-16 00:52:08.0000000 |
| isn't, this neat? | -13 | 2019-08-16 00:52:09.0000000 |
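For the queued-ingestion route mentioned above, here is a minimal Python sketch (assuming the azure-kusto-ingest package; the cluster URI, credentials, database, and file path are placeholders, and the exact module layout varies somewhat across package versions):
# Queued ingestion of a local CSV file with the azure-kusto-ingest client.
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import QueuedIngestClient, IngestionProperties

# Note the ingest- endpoint, which queued ingestion requires.
kcsb = KustoConnectionStringBuilder.with_aad_application_key_authentication(
    "https://ingest-<cluster>.<region>.kusto.windows.net",
    "<app-id>", "<app-key>", "<tenant-id>")

client = QueuedIngestClient(kcsb)
props = IngestionProperties(
    database="<database>",
    table="sample_table",
    data_format=DataFormat.CSV)

# Queue the CSV produced by the PowerShell step; the service pulls it
# from intermediate blob storage and appends it to the table.
client.ingest_from_file("data.csv", ingestion_properties=props)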

Mkdocs hyperlink not working in static pages

I'm trying to build documentation with mkdocs.
The problem is that the links in the statically generated pages are not working.
Instead of going to [folder]/index.html, I'm presented with the page shown in the image.
The problem doesn't exist, however, when I run mkdocs serve.
Set the use_directory_urls setting to false in your mkdocs.yml config file:
use_directory_urls: false
The documentation explains:
This setting controls the style used for linking to pages within the
documentation.
The following table demonstrates how the URLs used on the site differ
when setting use_directory_urls to true or false.
Source file | Generated HTML | use_directory_urls: true | use_directory_urls: false
------------ | -------------------- | ------------------------ | ------------------------
index.md | index.html | / | /index.html
api-guide.md | api-guide/index.html | /api-guide/ | /api-guide/index.html
about.md | about/index.html | /about/ | /about/index.html
The default style of use_directory_urls: true creates more user
friendly URLs, and is usually what you'll want to use.
The alternate style can occasionally be useful if you want your
documentation to remain properly linked when opening pages directly
from the file system, because it creates links that point directly to
the target file rather than the target directory.
The last paragraph is the key to why this makes a difference.

How to set up system properties per user?

We're upgrading to Play 2.3.5 and it's the first time I've used the Activator.
If I run the activator headless I can still pass in a bunch of command-line flags, but with the new UI I don't know how to pass in overrides for my developer setup (which differs from other developers'). I don't see a way to set unique Java properties in a meta Activator config that we would exclude from version control.
-Dlogger.file=./conf/my-special-logger.xml -Dprop1=special -Dconfig.file=./conf/my-special-file.conf
I can symlink my-special-file.conf to application.conf and get most of what I want. It's not really an ideal solution and if I leave the symlink in place during bundling, the packager blows up.
[error] (*:stage) Duplicate mappings:
[error] ./my-project/target/universal/stage/conf/my-special-file.conf
[error] from
[error] ./my-project/conf/application.conf
[error] ./my-project/conf/my-special-file.conf
Typesafe Activator uses ~/.activator/activatorconfig.txt as a means of setting Java system properties.
With the following ~/.activator/activatorconfig.txt:
-Dhello=world
I could query for the hello property in the shell:
[play-new-app] $ eval sys.props("hello")
[info] ans: String = world
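Applied to the overrides from the question, ~/.activator/activatorconfig.txt could then contain (assuming one JVM option per line, as in the example above):
-Dlogger.file=./conf/my-special-logger.xml
-Dprop1=special
-Dconfig.file=./conf/my-special-file.conf
Since the file lives in your home directory rather than in the project, it naturally stays out of version control.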
As a reference - this is for Play 2.3.5:
[play-new-app] $ dependencies
...
+------------------------------------------------------------+------------------------------------------------------------+--------------------------------------------+
| Module | Required by | Note |
+------------------------------------------------------------+------------------------------------------------------------+--------------------------------------------+
...
+------------------------------------------------------------+------------------------------------------------------------+--------------------------------------------+
| com.typesafe.play:play_2.11:2.3.5 | com.typesafe.play:play-ws_2.11:2.3.5 | As play_2.11-2.3.5.jar |
| | com.typesafe.play:play-jdbc_2.11:2.3.5 | |
| | play-new-app:play-new-app_2.11:1.0-SNAPSHOT | |
| | com.typesafe.play:play-cache_2.11:2.3.5 | |
+------------------------------------------------------------+------------------------------------------------------------+--------------------------------------------+

How to load a json data file into a variable in robot framework?

I am trying to load a JSON data file into a variable directly in Robot Framework. Can anyone please give an example with the exact syntax for how to do it?
Thanks in advance :)
One way would be to use the Get File keyword from the OperatingSystem library, and then use the built-in Evaluate keyword to convert it to a python object.
For example, consider a file named example.json with the following contents:
{
    "firstname": "Inigo",
    "lastname": "Montoya"
}
You can log the name with something like this:
*** Settings ***
| Library | OperatingSystem
*** Test Cases ***
| Example of how to load JSON
| | # read the raw data
| | ${json}= | Get file | example.json
| |
| | # convert the data to a python object
| | ${object}= | Evaluate | json.loads('''${json}''') | json
| |
| | # log the data
| | log | Hello, my name is ${object["firstname"]} ${object["lastname"]} | WARN
Of course, you could also write your own library in python to create a keyword that does the same thing.
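For instance, a minimal sketch of such a library (the file name JsonLibrary.py and the keyword name are arbitrary, not part of Robot Framework itself):
# JsonLibrary.py -- minimal custom keyword library for loading JSON files.
import json

def load_json_file(path):
    """Load a JSON file and return its contents as a Python object.

    Robot Framework exposes this function as the keyword `Load Json File`.
    """
    with open(path) as f:
        return json.load(f)
With | Library | JsonLibrary.py | in the settings table, | ${object}= | Load Json File | example.json | replaces the Get File / Evaluate pair above.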
There is a library available for this: HttpLibrary.HTTP
${json}=    Get File    example.json
${port}=    HttpLibrary.HTTP.Get Json Value    ${json}    /port
log    ${port}
API documentation is available here: http://peritus.github.io/robotframework-httplibrary/HttpLibrary.html
A common use is passing the JSON data to another library, like RequestsLibrary. You could do:
*** Settings ***
Library    OperatingSystem
Library    RequestsLibrary

*** Test Cases ***
Create User
    #...
    ${file_data}=    Get Binary File    ${RESOURCES}${/}normal_user.json
    Post Request    example_session    /user    data=${file_data}
    #...
No direct python involved and no intermediary json object.
Thanks Vinay, that helped. Now we can retrieve data from a JSON file in Robot Framework as well:
*** Settings ***
Library    HttpLibrary.HTTP
Library    OperatingSystem

*** Test Cases ***
Login_to_SalesForce_Json
    ${jsonfile}    Get File    c:/pathtojason/Data/testsuite.json
    ${username}    Get Json Value    ${jsonfile}    /test_case1/username
    log    ${username}
Below is the JSON file structure:
{
    "test_case1": {
        "username": "User1",
        "password": "Pass1"
    },
    "test_case2": {
        "username1": "User2",
        "password1": "Pass2"
    }
}
Prerequisite: pip install --trusted-host pypi.python.org robotframework-httplibrary
I had a similar issue and this works fine for me:
${json}    Get Binary File    ${json_path}nameOfJsonFile.json
It works for me in API testing, reading a .json file and POSTing it, like here:
*** Settings ***
Library    Collections
Library    ExtendedRequestsLibrary
Library    OperatingSystem

*** Variables ***
${uri}    https://blabla.com/service/
${json_path}    C:/home/user/project/src/json/

*** Test Cases ***
Name of Robot Test Case
    Create Session    alias    ${uri}
    &{headers}    Create Dictionary    Content-Type=application/json; charset=utf-8
    ${json}    Get Binary File    ${json_path}nameOfJsonFile.json
    ${resp}    Post Request    alias    data=${json}    headers=${headers}
    Should Be Equal As Strings    ${resp.status_code}    200
There are also cases where you will need to transform the binary file you read (in my case ${json}) into a dictionary, but first try this simple solution.

Is there something like csv or json but more graphical and better to read for humans?

For example, CSV and JSON are human- and machine-readable text formats.
Now I am looking for something similar but more graphical for representing table data.
Instead of:
1,"machines",14.91
3,"mammals",1.92
50,"fruit",4.239
789,"funghi",29.3
which is CSV style or
[
[1,"machines",14.91],
[3,"mammals",1.92],
[50,"fruit",4.239],
[789,"funghi",29.3]
]
which is JSON style (I am not going to give an XML example). Something like this is what I have in mind:
1 | "machines"| 14.91
3 | "mammals" | 1.92
50 | "fruit" | 4.239
789 | "funghi" | 29.3
There should be reader and writer libraries for it in several languages, and it should be something of a standard. Of course I could roll my own, but if there is a standard I'd go with that.
I have seen similar things as part of wiki or markup languages, but that's not exactly what those are for; I want a format that humans can easily edit and that software libraries can both read and write. What I am looking for belongs more to the CSV, JSON, and XML family.
I would check out Textile. It has a table syntax almost exactly like what you described.
For example, the table in your example would be written like this:
| 1 | machines | 14.91 |
| 3 | mammals | 1.92 |
| 50 | fruit | 4.239 |
| 789 | funghi | 29.3 |
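If you do end up rolling your own reader for this kind of pipe table, a minimal Python sketch might look like the following (the delimiter handling and the absence of quoting/typing rules are assumptions a real format would have to pin down):
# Parse a Textile-style pipe table into rows of strings.
def parse_pipe_table(text):
    rows = []
    for line in text.strip().splitlines():
        # Drop the outer pipes, then split on the inner ones.
        cells = [cell.strip() for cell in line.strip().strip("|").split("|")]
        rows.append(cells)
    return rows

sample = """
| 1   | machines | 14.91 |
| 3   | mammals  | 1.92  |
"""
print(parse_pipe_table(sample))
# -> [['1', 'machines', '14.91'], ['3', 'mammals', '1.92']]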
An alternative (albeit not optimized for tabular data), is YAML, which is nice for JSON-ish type data.
Alternatively, you could also look at CSV editors, e.g.:
CsvEd
CsvEasy
ReCsvEditor
Their whole purpose is to display CSV and update the data in a more readable format. ReCsvEditor will display both XML and CSV files in a similar format.
Google "CSV editor" and you will find plenty.