Is there any PyTorch API to generate a graph? - deep-learning

ONNX and TensorFlow provide functions like tf.compat.v1.GraphDef() and onnx_model.graph.
I want to read out the graph of a PyTorch DL model in the same way as in ONNX and TF.
As far as I know, PyTorch provides graph-related APIs such as torch.cuda.Graph and torch.fx.Graph.
However, I'm not sure whether these two APIs are the right ones.
Please let me know whether they are, or whether there is another one.
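For reference, a minimal sketch of how torch.fx can expose a model's graph; the module below is just a toy example:

    import torch
    import torch.fx

    # Toy module purely for illustration.
    class Net(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.linear = torch.nn.Linear(4, 2)

        def forward(self, x):
            return torch.relu(self.linear(x))

    # symbolic_trace returns a GraphModule whose .graph is a torch.fx.Graph.
    traced = torch.fx.symbolic_trace(Net())
    print(traced.graph)  # textual IR of the traced model

    # Each node records an opcode, a target, and its input arguments.
    for node in traced.graph.nodes:
        print(node.op, node.target, node.args)

TorchScript tracing (torch.jit.trace(model, example_input).graph) is another place a graph shows up; which API fits depends on what you need the graph for.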

Related

dbIds from the Forge Viewer and the Model Derivative API are different

I'm working on creating a Forge Viewer-based web system linked with my client's BIM 360 environment. For the model data extraction part, I referred to the examples below.
(1) https://github.com/xiaodongliang/forgeviewer_embed_in_powerbi_report/tree/master/forge-model-properties-excel
(2) https://github.com/Autodesk-Forge/bim360appstore-model.derivative-nodejs-xls.exporter
However, I realized that the dbIds from the Forge Viewer and from the sample codes are different. Is this because of SVF version discrepancies (i.e., SVF1 vs. SVF2)? If so, any suggestion on how to resolve it?
Moreover, some models were not processed correctly when I tried to extract model data using the second example code (i.e., the ForgeXLS.js example). The code stops after calling the "prepareTables" function. It seems to have issues running the "getMetadata", "getHierarchy", and "getProperties" functions. Could you let me know some possible reasons?
SVF version discrepancies
Yes, the dbId in SVF is not consistent across all versions of the model. An SVF2 dbId can stay the same across different versions, but the ideal approach is to take advantage of the external id of the object, i.e., build a map keyed by external id from the first version of the model. When you need a dbId, look it up by external id, as in the sketch below.
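As an illustration only, a rough sketch in the Viewer; the stored external id and the select() call are placeholders for whatever your application actually keeps and does:

    // Resolve a stored external id back to a dbId in the currently loaded version.
    viewer.model.getExternalIdMapping(function (mapping) {
        // 'my-stored-external-id' is a placeholder for an id captured from the first version
        var dbId = mapping['my-stored-external-id'];
        if (dbId !== undefined) {
            viewer.select(dbId); // e.g. highlight the same element in this version
        }
    }, function (error) {
        console.error('Could not load external id mapping', error);
    });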
extract model data using the second example code (i.e., ForgeXLS.js example)
I believe it is just because the metadata/properties of that model version are too large. You may try adding the forceget option when fetching the properties:
https://forge.autodesk.com/en/docs/model-derivative/v2/reference/http/urn-metadata-guid-properties-GET/

How to import a model from a Forge scene into a map (Google, ArcGIS, etc.)?

I use the Forge API in my application. I added an ArcGIS map to display the model using the methods from the post https://forge.autodesk.com/blog/dump-geometries-2d-curve in GEOMETRY_LOADED_EVENT.
When I work with small models, the data is displayed on the ArcGIS map well, but information about layers is lost.
When the models are large, I get a very large data object (100 MB+) and my program does not work. Is it possible to save the model in three.js-friendly formats, or to get a more compact object with model and layer data? Or are there other ways to display the model on the map? I would be very grateful for any advice.
I tried the methods from the post https://forge.autodesk.com/blog/dump-geometries-2d-curve in GEOMETRY_LOADED_EVENT.
Not sure if this fits your use case, but I've been working on a tool for converting Forge models into glTF format (which is pretty three.js-friendly I'd say): https://github.com/petrbroz/forge-extract.

DerivativesApi.GetModelviewProperties for subset of properties

The model viewer has the ability to get properties by passing a filter: viewer.model.getBulkProperties(dbIds, ['externalId', 'Category'], function) where we can limit the results to just the two properties 'externalId' and 'Category'.
It would be a huge benefit for us to have this same filtering capability in the Model Derivative API:
https://developer.autodesk.com/en/docs/model-derivative/v2/reference/http/urn-metadata-guid-properties-GET/
We have Revit files with 40,000+ parts, and it can take over 15 minutes to query for properties, but we are getting far more data than we need.
It is a reasonable enhancement. I have logged it as internal ticket DERI-4610.
If you have used the Extractor to download the whole SVF dataset locally, you could try extracting the properties from properties.db (the other post tells more). This is a lightweight SQLite database which is actually what the Derivative API on the Forge cloud uses. I'd think there are some smart ways to filter for specific properties in that db file, along the lines of the sketch below.
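For example, a rough sketch with Node.js and the sqlite3 module, assuming the commonly documented _objects_id / _objects_attr / _objects_val / _objects_eav layout of properties.db; the category and property names are only placeholders:

    const sqlite3 = require('sqlite3');

    // Open the properties.db extracted alongside the SVF dataset.
    const db = new sqlite3.Database('properties.db', sqlite3.OPEN_READONLY);

    // Pull only the one property we care about instead of the full property set.
    const sql = `
      SELECT ids.external_id, attrs.category, attrs.name, vals.value
      FROM _objects_eav eav
      JOIN _objects_id   ids   ON ids.id   = eav.entity_id
      JOIN _objects_attr attrs ON attrs.id = eav.attribute_id
      JOIN _objects_val  vals  ON vals.id  = eav.value_id
      WHERE attrs.category = 'Identity Data' -- placeholder category
        AND attrs.name = 'Type Name'         -- placeholder property name
    `;

    db.all(sql, (err, rows) => {
      if (err) throw err;
      rows.forEach((row) => console.log(row.external_id, row.value));
      db.close();
    });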

What process should I follow to map Ember-Data to a series of non-standard APIs?

I am starting an Ember application from scratch that will connect to many non-standard JSON APIs which I don't control and from which I only need bits and pieces of data. My first attempt was to use jQuery alone but the code quickly became hard to read and maintain.
I want to use the Ember-Data RESTAdapter with some Serializers; I may need multiple Adapters and Serializers for the different APIs (a rough per-model sketch follows the two outlines below).
I am trying to figure out a good way to break down the work into logical steps.
What process should I follow?
For example:
"Start with what I need" approach:
Model ALL my objects using the FixtureAdapter as the ApplicationAdapter
Implement sample app using the models to ensure it's logically correct
Switch the FixtureAdapter for the RESTAdapter
Extend the RESTAdapter for each Model to map to the different APIs
Create a Serializer for each Model Adapter
-or-
"Start with what I can get" approach:
Extend a SINGLE ModelAdapter at a time, mapping it to the necessary API end-point
Create the Model for my ModelAdapter
Create the Serializer for that ModelAdapter
Implement model in the app
Repeat
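For what it's worth, here is a rough sketch of one per-model adapter/serializer pairing in the Ember-Data 1.0-beta globals style the FixtureAdapter implies; App.Post, the host/namespace, and the payload shape are all made up for illustration:

    // Hypothetical model backed by one of the non-standard APIs.
    App.Post = DS.Model.extend({
      title: DS.attr('string'),
      body: DS.attr('string')
    });

    // Per-model adapter pointing at that API's endpoint.
    App.PostAdapter = DS.RESTAdapter.extend({
      host: 'https://api.example.com', // made-up endpoint
      namespace: 'v2'
    });

    // Per-model serializer that reshapes the non-standard payload
    // into the { posts: [...] } shape the RESTSerializer expects.
    App.PostSerializer = DS.RESTSerializer.extend({
      extractArray: function (store, type, payload) {
        var posts = payload.items.map(function (item) {
          return { id: item.id, title: item.headline, body: item.text };
        });
        return this._super(store, type, { posts: posts });
      }
    });

Both outlines end up producing these same three pieces per API; the difference is mainly the order in which you build them.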

Google Realtime API - How to view existing collaboration model?

How do I view an existing Realtime collaboration data model? I call the getRoot method:
var collaborative_model = rtpg.realtimeDoc.getModel().getRoot()
When I view the collaborative_model object in the debugger, I see only cryptic properties. I'm not sure whether or how my model is saved.
Can I do some kind of variable dump of the model?
You can use https://gist.github.com/cowsrule/6348393 as a mostly plug-and-play dumper for the realtime API collaborative objects. As this relies on internals of the realtime API it will need to be updated (read: break) the next time they update the API.
To use it, include it on your webpage and set window.remoteDoc to your Realtime document.
To call it, pass in the ID of the CollaborativeObject you are interested in inspecting.
The root is just a CollaborativeMap, so you can use the standard map methods to explore its values.
The relevant methods there for digging into the model are keys() and values().
A lot of these data model classes have obfuscated methods that are a part of the internal implementation. The best way to see what methods are publicly available is to look at the API reference.
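For a quick shallow dump without the gist, the root map's own public methods are enough (only names already used in this question appear here):

    var root = rtpg.realtimeDoc.getModel().getRoot();

    // Shallow dump of the root CollaborativeMap: one line per top-level key.
    root.keys().forEach(function (key) {
      console.log(key, root.get(key));
    });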