Missing object properties in design metadata SQLite file - autodesk-forge

My model in Autodesk Construction Cloud contains several properties which I need to track. I have downloaded the SQLite file using the Fetch Derivative Download URL endpoint without any errors.
However, when comparing the export with what I see in ACC, or with the Fetch All Properties call, some properties are entirely missing. For instance, the Fetch All Properties call returns the 20 expected property values, whereas only 3 of those values exist in the SQLite download. Is there any explanation for why the SQLite file may be incomplete? There doesn't seem to be any size restriction or filter in the call that returns the SQLite file which would explain partial results.

It is expected that the design metadata returned via SQLite and via JSON may not be the same. For example, the metadata captured in the SQLite database uses "instancing", where multiple design elements may inherit certain properties from another element (and the resolution of the inherited properties is left to whoever is reading the file). The JSON format, on the other hand, does not support any kind of inheritance, so properties are duplicated for each individual design element.
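For illustration, here is a minimal sketch of pulling the properties stored directly on one object out of the downloaded SQLite file. The _objects_* table and column names follow the commonly documented property-database layout and should be verified against your own file; inherited (instanced) properties will not show up here, because they live on the prototype element and must be merged in by the reader.

```python
import sqlite3

# Minimal sketch: list the properties stored directly on one object.
# Table/column names follow the commonly documented layout
# (_objects_eav, _objects_attr, _objects_val); verify against your file.
DB_PATH = "properties.sqlite"   # downloaded derivative (placeholder name)
OBJECT_ID = 1234                # object id you want to inspect (placeholder)

QUERY = """
SELECT attr.category, attr.name, val.value
FROM _objects_eav AS eav
JOIN _objects_attr AS attr ON attr.id = eav.attribute_id
JOIN _objects_val  AS val  ON val.id  = eav.value_id
WHERE eav.entity_id = ?
"""

with sqlite3.connect(DB_PATH) as conn:
    for category, name, value in conn.execute(QUERY, (OBJECT_ID,)):
        print(category, name, value)

# Note: properties the element inherits from another element (the
# "instancing" described above) will NOT appear in this result.
```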

Related

Which is the best way of parsing CSV data in a logic app without using a custom connector?

I have an SFTP trigger in a logic app which fires when a file is added to a certain file area. It is a CSV-formatted file and I want the rows to be parsed and converted into JSON. Which is the best way to convert CSV data into JSON without using any custom connectors?
I cannot find any built-in connectors doing this job. And as far as I know there are no Logic Apps functions doing the job either.
Right now, there is no connector or action in Logic Apps that provides an out-of-the-box solution for this requirement. You could loop through the array and perform the conversion yourself, but I would not suggest using the Loop and Variables actions, as they can be slow and cost you more.
The alternative would be to use the Inline Code action (JavaScript) to do the conversion. Please note that you will need an Integration Account to run inline code.
Please refer to the JavaScript code and modify it as needed. I have used '_' to differentiate the nested objects. For more details you can refer to the previous discussion here.
For complex calculations you can offload this functionality to an Azure Function, write the code in one of the supported languages, and call the function from the logic app.
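To make the transformation itself concrete (independent of whether it runs as Inline Code or in an Azure Function), here is a minimal sketch in Python with placeholder data; the Inline Code action only executes JavaScript, so you would port this logic there.

```python
import csv
import io
import json

def csv_to_json(csv_text: str) -> str:
    """Convert CSV text (first row = field names) into a JSON array of objects."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = [dict(row) for row in reader]
    return json.dumps(rows, indent=2)

# Placeholder data resembling the blob content in the walkthrough below.
sample = "Name,Age\nAlice,30\nBob,25\n"
print(csv_to_json(sample))
```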
1. Created a logic app as shown below:
2. Created a container in a storage account and uploaded a CSV file to the container.
3. Next, used a Compose action to split the contents of the CSV file on every new line into an array.
a. Here is the expression used in the SplitLines Compose action:
split(body('Get_blob_content_(V2)'),decodeUriComponent('%0D%0A'))
b. Follow the below MS Doc to write expressions:
4. Removed the last (empty) line from the previous output using another Compose action as shown below:
take(outputs('SplitLines'),add(length(outputs('SplitLines')),-1))
5. Separated the field names using a Compose action:
split(first(outputs('SplitLines')), ',')
6. Formed JSON using a Select action as shown below:
From: `skip(outputs('RemoveLastLine'), 1)`
Map:
`outputs('SplitFieldName')[0]` → `split(item(), ',')?[0]`
`outputs('SplitFieldName')[1]` → `split(item(), ',')?[1]`
Tested the logic app and it runs successfully.
The content of the CSV file is shown below:
The CSV data is formatted as JSON:
Reference: Use data operations in Power Automate (contains video) — Power Automate | Microsoft Docs
Credit: #Iason Koulas

Is it possible to get the URNs of models which are translated as references via zip translation?

When I upload and translate a zip file with one rootFile and some models which act as references to Autodesk Forge, I can only find one model URN afterwards. Are all models uploaded separately under the hood, and is there a possibility to get the URN of each model?
One use case would be to open a model from the package other than the predefined root, in order to view that model's 2D sheets.
Another use case would be to save data related to elements/referenced models using their dbId/GUID and URN.
I was expecting to get each model's URN by selecting parts from different models and running this.viewer.getAggregateSelection().lastItem.model, as that would do the trick if I had translated them separately and aggregated the views. But this way there is just one URN for all elements.
I also tried inspecting the buckets and objects via the awesome "Autodesk Forge Tools" extension for VSCode, but couldn't get any deeper than the .zip file as an object in the bucket.
Is the only option to upload/translate the same .zip package again for every model I want to open, with a newly defined rootFilename? Is this still the only possibility, as stated in an answer from 2016? (https://stackoverflow.com/a/38720162/19956654)
Appreciate any help with this one, thanks in advance!
Unfortunately, one ZIP will have one URN only. So, you will need to have the ZIP uploaded with different names and request translations with different rootFilenames separately.
However, you don't really need to upload the same file several times. Just call PUT buckets/:bucketKey/objects/:objectKey/copyto/:newObjectKey to duplicate the uploaded ZIP with different names.
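As a rough sketch (not an official sample), duplicating the object could look like this; the bucket key, object names, and token are placeholders, and the token needs data:write scope.

```python
import requests  # pip install requests

# Sketch: duplicate an uploaded ZIP under a new object name via the OSS
# "copyto" endpoint, so it can be translated with a different rootFilename.
# Bucket/object names and the token below are placeholders.
ACCESS_TOKEN = "<2-legged token with data:read data:write scope>"
BUCKET_KEY = "my-bucket"
OBJECT_KEY = "models.zip"
NEW_OBJECT_KEY = "models-copy-for-other-root.zip"

url = (
    "https://developer.api.autodesk.com/oss/v2/"
    f"buckets/{BUCKET_KEY}/objects/{OBJECT_KEY}/copyto/{NEW_OBJECT_KEY}"
)
resp = requests.put(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()
print(resp.json())  # details of the newly created object copy
```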

Model Derivative API object ids don't match PropertyDatabase object ids

I have developed an application that gets the JSON object tree of a BIM 360 Revit model's view using the Forge Model Derivative endpoint, then downloads the SQLite PropertyDatabase to query the properties of several object-tree entities. That was working fine until now. Recently, I have been having trouble with some models where the object-tree derivative object ids don't match the PropertyDatabase object ids.
I have seen the post "Temporary workaround for mapping between SVF1 and SVF2 IDs", but this method is not valid in my case because my app works on the server side and does not use the Viewer API at all.
My question is whether there is a workaround using APIs from the server side, and whether there are plans to resolve this inconsistency between the APIs soon.
Unfortunately, this behaviour is expected with your approach. SVF and SVF2 do not share the same IDs; SVF2 IDs are optimised to process data faster and to stay identical across versions where possible. The article you reference only works in the context of the Viewer; for server-side processing you need to get the dbid.idx file to map the IDs. This utility has a command to help you download the file. Try:
./forge.js version-svf2-idmap project_id version_id output_dbid.idx
This file is a gzip compressed file of a uint32 array.
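A minimal sketch of consuming that file on the server side follows; it assumes the file sits next to your script as output_dbid.idx, and that indexing the array with an id from one space yields the corresponding id in the other (verify the direction against a few known objects).

```python
import gzip
import struct

# Sketch: gunzip dbid.idx and interpret the payload as a little-endian
# uint32 array. Assumption (verify on your data): ids[svf2_dbid] -> svf_dbid.
with gzip.open("output_dbid.idx", "rb") as f:
    payload = f.read()

count = len(payload) // 4
ids = struct.unpack(f"<{count}I", payload)

svf2_dbid = 42  # placeholder id to map
print("mapped id:", ids[svf2_dbid])
```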
What happened is that you downloaded the SVF SQLite db, which uses SVF ids, but when using the MD endpoints you are actually working with SVF2 ids, because the target format is SVF2. If you had requested an SVF target format, the MD endpoint would work with SVF ids. Unfortunately, you do not control the BIM 360 target format, which could be either SVF or SVF2 depending on the source file format. For example, IFC, RVT, NWD, and DWG are SVF2, but others are not. You can determine which format is used by reading the outputType and overrideOutputType. If overrideOutputType says 'svf2', then you should do the mapping.
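A hedged sketch of that check from the server side (the token and version id are placeholders, and the exact placement of overrideOutputType can vary between manifests, so inspect your own):

```python
import base64
import requests  # pip install requests

# Sketch: read the Model Derivative manifest and check whether the
# derivative was produced as SVF or SVF2.
ACCESS_TOKEN = "<token with data:read scope>"
version_id = "urn:adsk.wipprod:fs.file:vf.xxxxxxxx?version=1"  # placeholder
urn = base64.urlsafe_b64encode(version_id.encode()).decode().rstrip("=")

resp = requests.get(
    f"https://developer.api.autodesk.com/modelderivative/v2/designdata/{urn}/manifest",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
manifest = resp.json()

for derivative in manifest.get("derivatives", []):
    output_type = derivative.get("outputType")
    override = derivative.get("overrideOutputType")
    print(output_type, override)
    if override == "svf2":
        print("SVF2 ids are in play -> run the dbid.idx mapping")
```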
The relevant code for the version-svf2-idmap is here
On the other hand, if you have the SQLite database, why do you need to call the MD endpoints at all? You have everything you need and can extract the information much faster from there. See my example here. It has functions for property extraction and/or building the hierarchy tree.
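To sketch what such an extraction can look like directly against the SQLite file: the special '__parent__' category used below is an assumption based on commonly described property-database layouts, so verify it (and the _objects_* table names) in your own file.

```python
import sqlite3
from collections import defaultdict

# Sketch: build a child -> parent map straight from the property database.
# The '__parent__' category and table names are assumptions; verify them.
QUERY = """
SELECT eav.entity_id, val.value
FROM _objects_eav AS eav
JOIN _objects_attr AS attr ON attr.id = eav.attribute_id
JOIN _objects_val  AS val  ON val.id  = eav.value_id
WHERE attr.category = '__parent__'
"""

parents = {}
children = defaultdict(list)
with sqlite3.connect("properties.sqlite") as conn:  # placeholder path
    for child_id, parent_value in conn.execute(QUERY):
        parent_id = int(parent_value)
        parents[child_id] = parent_id
        children[parent_id].append(child_id)

# Roots are ids that act as parents but are never listed as children.
print("roots:", [p for p in children if p not in parents][:10])
```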

Duplicate object in SVF converted from linked Revit

I am trying to produce an SVF from linked Revit files. I uploaded a zip containing all the Revit files to the Model Derivative API.
After conversion, the Revit files are combined into one single SVF, but I discovered that at least some of the objects might be duplicated in a weird way.
For example, I have an object which has a unique attribute defined. When I select that object, I can see its position inside the model browser, as well as its properties. (Screenshot 1)
When I search the whole model using that unique attribute, I find that another object has the same attribute and properties. When I try to select it using its dbId, I find that the object is invisible in the model, and the model browser does not show where it belongs. (Screenshot 2)
Any idea why there is a duplicated object inside the SVF? Thanks.

Configuring Object Instances from JSON in conf Files

So I want to be able to instantiate a class from JSON definitions in a conf file. In looking through the docs, I found that there are ways to reference things that are defined in JSON structures, but I was wondering how best to instantiate objects from such definitions. For instance, suppose I had a class called RemoteRepository with 4 or 5 properties. I'd like to make a bunch of entries in a JSON file, read them in at startup, and get back a collection of RemoteRepository objects. I could do this with a database, of course, including a graph one, but would like to just use JSON if possible.
Assuming a static class definition that represents the JSON structure is acceptable, you can try the JSON C# Class Generator.
Once you've generated your classes, you can simply create a new instance or an array of instances by passing the JSON to the constructor that this tool creates on the generated class(es).
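The suggestion above is C#-specific, but the underlying pattern (a class mirroring the JSON entries plus a small loader that materializes instances at startup) is the same in any language. A minimal sketch in Python, with made-up RemoteRepository fields and a hypothetical repositories.json path:

```python
import json
from dataclasses import dataclass
from typing import List

@dataclass
class RemoteRepository:
    # Field names are made up for illustration; use whatever your class needs.
    name: str
    url: str
    username: str
    enabled: bool = True

def load_repositories(path: str) -> List[RemoteRepository]:
    """Read a JSON array of repository definitions and build instances."""
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    return [RemoteRepository(**entry) for entry in entries]

# repositories.json (hypothetical) might contain:
# [
#   {"name": "central", "url": "https://repo.example.com", "username": "svc"},
#   {"name": "mirror", "url": "https://mirror.example.com", "username": "svc", "enabled": false}
# ]
repos = load_repositories("repositories.json")
```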
So I can make instances, but as usual, once I need a bunch of instances, it's time for a database. In this case, I ended up using some simple XML files to define these instances. As much of a mess as XML is, for things like this it does work best. Namely:
Some nesting of instance information
Not an exact mapping to the target class, e.g. field mappings are part of my config. I am going to load a few fields from the config file but then create instances of a different class, hence the immediate conversion to a Java class that I would get from JSON is not meaningful.
One other thing I figured out in doing this is that processing XML in Java is still kind of a mess. Still, this was the right way to go in this case.