Derivatives API/Viewer fails with some models - autodesk-forge

We are transforming all of our models using the Derivatives API, and after downloading all assets for offline use (we must support offline users), two things happen:
The viewer crashes (the model disappears) when rotating/zooming the model
Some of the texture files are empty (0 bytes)
The first issue also happens on https://viewer.autodesk.com/; you can reproduce it at the following link:
https://viewer.autodesk.com/id/dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6YTM2MHZpZXdlci90MTUzMjUwMzMxOTc4OF8wOTAyMDU4OTE5MjU1OTc0N18xNTMyNTAzMzE5Nzg4Lm53ZA?=&designtype=nwd&sheetId=NTMyMTY1ZDAtNTFlZS00OTNlLTk1NDItMmIzMjM3NjRjYmFh
The Navisworks model itself can be found at the following link:
https://drive.google.com/file/d/1SPjsrOTDHucX35HHqaa_EbkQ5mytC_cu/view?usp=sharing
What can cause these issues?
Thanks,
Amir

Related

Some BIM360 RVT files show references, some don't - isCompositeDesign seems to distinguish both cases

We have several Revit projects hosted on BIM360 and need to extract link information from the files.
The links were created in Revit following the proposed workflow described here.
Querying the references endpoint of the Data Management API, {{FORGE_HOST}}/data/v1/projects/:project/versions/:version/relationships/refs, we sometimes get empty arrays, while at other times everything works as expected.
We tried to find the differences between the files that worked and those that didn't and queried additional version information using {{FORGE_HOST}}/data/v1/projects/:project/versions/:version.
The one (only?) distinguishing factor we found is that all files that do not show links were those that had attributes.extension.data.isCompositeDesign = true. However, we have no idea how to avoid isCompositeDesign being set when creating the files and links in Revit.
We would be grateful for any hints regarding:
whether this flag indeed has a direct effect on the way links are processed, and
what this flag means and what leads to it being true.
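For reference, the refs query described above can be sketched with fetch(); the host, project/version IDs, and token handling are placeholders, and the response shape is assumed to follow the Data Management JSON:API format:

```javascript
// Hedged sketch: query the relationships/refs endpoint for a version.
// All identifiers here are placeholders; substitute your real values.
async function getVersionRefs(host, projectId, versionId, accessToken) {
  const url = `${host}/data/v1/projects/${projectId}/versions/${encodeURIComponent(versionId)}/relationships/refs`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) throw new Error(`refs request failed: ${res.status}`);
  const body = await res.json();
  // An empty array here is the symptom discussed in this question.
  return body.data ?? [];
}
```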
When the main model and linked models are all synced and published to BIM360, isCompositeDesign = false.
When any of the linked models is updated locally but not published to the cloud (BIM360), while the host model has been published with the updated contents of the linked models, isCompositeDesign = true. This help document describes this behavior for cloud worksharing models with linked models:
Downloaded source file from BIM 360 does not contain linked Revit files.
And in this case, downloading the main model yields a zip file.
In either case (isCompositeDesign = true/false), I think /relationships/refs should return linked model info so that developers can re-publish the linked models following the tutorial below.
https://forge.autodesk.com/en/docs/data/v2/reference/http/PublishModel/
However, in my testing, it looks like when isCompositeDesign = true, /relationships/refs returns an empty array, as you have observed. I am checking with the engineering team about this behavior, or anything I may have missed.
Coming up with an answer after some more investigation:
The flag isCompositeDesign does indeed have a direct effect on the way links are processed, see 2
The flag is set if all linked files, including the host file, are bundled into a zip file. Whether a zip file is used depends on the way the links are hosted, see below.
We found the following sources that address this flag:
Stack Overflow discussion: here, Bret Thompson asks in a comment:
What controls if a BIM 360 project publishes "Composite" files or not?
Dion Moult answers:
I believe it is due to two factors: 1) are there links, and 2) are the links at a particular version which is not the latest
He also refers to this discussion, where bogdan.ciobanu goes into detail:
When a model is published, if any linked models are High Trust (directly linked from the source folder) and the latest version of the linked model is not published, Revit Cloud Worksharing will include the linked models as a zip so that the extractor has all the data it needs. If a newer version of the host model is published and the linked model version is already published then no zip is created.
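Putting the findings above together, here is a minimal sketch of a predicate that tells whether /relationships/refs can be expected to return links, assuming the version payload shape quoted in the question (attributes.extension.data.isCompositeDesign):

```javascript
// Hedged sketch: given a Data Management "version" response, decide whether
// the refs endpoint should list links. The payload path is taken from the
// question above; treat the exact shape as an assumption.
function expectsRefs(versionPayload) {
  const extData = versionPayload?.data?.attributes?.extension?.data ?? {};
  // true  -> links tracked individually; /relationships/refs lists them
  // false -> composite design (links bundled in a zip); refs comes back empty
  return extData.isCompositeDesign !== true;
}
```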

Viewer quality settings (for one viewer on the page)

I have two viewers on the same page, and I want to set performance settings for only one viewer with the following code:
this.viewer.setQualityLevel(false, false);
this.viewer.setGroundShadow(false);
this.viewer.setGroundReflection(false);
this.viewer.setProgressiveRendering(true);
BUT (!) settings are applied for both viewers for some reason. Is there any way to apply them only for one viewer on the page?
The viewer settings are kept in localStorage, so changing them using methods like viewer.setQualityLevel probably propagates the updates to other viewer instances as well. Let me discuss this behavior with the dev team as (I think) it could be considered a bug.
In the meantime, if you need to change settings for a single instance of the viewer, consider using "lower level" methods that don't use the local storage. For example, instead of viewer.setQualityLevel(useSAO, useFXAA) you could use viewer.impl.togglePostProcess(useSAO, useFXAA), and instead of viewer.setGroundShadow(bool) you could use viewer.impl.toggleGroundShadow(bool).
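Put together, here is a hedged sketch of a per-instance settings helper based on the impl-level methods mentioned above (the helper name is hypothetical, and toggleGroundReflection is assumed to exist alongside toggleGroundShadow; verify against your viewer version):

```javascript
// Hypothetical helper: applies performance settings to one viewer instance
// only, using impl-level methods that bypass the shared localStorage prefs.
function applyPerformanceSettings(viewer) {
  viewer.impl.togglePostProcess(false, false); // SAO off, FXAA off
  viewer.impl.toggleGroundShadow(false);
  viewer.impl.toggleGroundReflection(false);   // assumed counterpart to toggleGroundShadow
}
```

Call this on the one viewer whose quality you want to lower; the other instance keeps its stored preferences untouched.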
EDIT
Also try the Profile API to persist settings - you can get the current profile with:
viewer.profile
I was unable to replicate the issue, unfortunately... Looking at your code, did you assign two viewers to the same handle? Can you post your code for initializing the viewers? And what version of the viewer are you using, by the way?

Saving Prefab with AR/VR Toolkit doesn't work

I used the Autodesk Forge AR/VR Toolkit Unity plugin to load a model statically, then saved it as a prefab, and then tried to load the prefab in another project. It just doesn't show up. It doesn't show up even in the same project (if you delete the loaded model and load from the saved prefab). I am wondering why?
I noticed that the model loaded from the Forge AR/VR Toolkit doesn't have a mesh component, and I guess that's the reason. But how come it displays correctly when loaded from Forge? How can I solve this problem?
Below are some screenshots:
The model is successfully loaded statically from Forge
The prefab was created successfully but when dragging it into project it is invisible
The reason is probably that you created the prefab from the game engine rather than the editor. You need to understand how Unity works regarding creating assets on disk. To create a static prefab, do not launch the game engine; instead, use the Forge menu and make sure to check the 'Save To Disk' option. After importing the asset, you will see a pivot object in the hierarchy and a folder in the Resources folder of the project. Select the pivot object in the hierarchy tree, open the Forge menu again, and choose 'Create Prefab'; that should work just fine now.

Chrome chokes on TOO many XHR2 requests

I have a "JSFiddle-like" demo of fetching PNG (Binary Blobs) in a tight loop using XHR2. This first demo grabs 341 PNG images, and then saves them in IndexedDB (using PouchDB).
This demo works fine: http://codepen.io/DrYSG/pen/hpqoD
(To operate, first press [Delete DB], Reload Page, wait for Status = Ready (you should see that it plans to fetch 341 tiles), then press [Download tiles]. )
The next demo is the same code (identical JS, CSS, HTML), but it tries to get 6163 PNG files (again from Google Drive). This time you will see many XHR 0 errors in the console log.
http://codepen.io/DrYSG/pen/qCGxi
The Algorithm it uses is as follows:
Test for the presence of XHR2, IndexedDB, and Chrome (which does not support binary blobs, only Base64), and show this status info
Fetch a JSON manifest of PNG tiles from GoogleDrive (I have 171 PNG tiles, each 256x256 in size). The manifest lists their names and sizes.
Store the JSON manifest in the DB
MVVM and UI controls are from KendoUI (This time I did not use their superb grid control, since I wanted to explore CSS3 Grid Styling).
I am using the nightly build of PouchDB
All PNG files are fetched from Google Drive (NASA Blue Marble).
I created the tile pyramid with Safe FME 2013 Desktop.
My guess is that all these XHR2 requests are being fired asynchronously, placed on a thread separate from the JavaScript thread, and that when there are too many pending requests, Chrome gets sick.
FireFox does not have this issue, nor does IE10.
You can fork the code, and try different values for line 10: (max number of tiles to fetch).
I have submitted a bug to Chromium Bugs, but does anyone have experience throttling async XHR2 fetches for large downloads of data to the Chrome browser?
The Chromium folks acknowledge this is something they should fix: https://code.google.com/p/chromium/issues/detail?id=244910, but in the meantime I have implemented throttling using jQuery defer/resolve to keep the number of concurrent requests low.
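For illustration, the same throttling idea can be written with plain Promises instead of jQuery deferreds; the function names and concurrency limit below are illustrative, and `fetchTile` stands in for the XHR2 blob fetch:

```javascript
// Hedged sketch: limit the number of in-flight fetches by running a fixed
// pool of workers, each pulling the next URL until the list is drained.
async function fetchAllThrottled(urls, fetchTile, limit = 6) {
  const results = new Array(urls.length);
  let next = 0;
  async function worker() {
    while (next < urls.length) {
      const i = next++;           // claim the next index (single-threaded JS, so this is safe)
      results[i] = await fetchTile(urls[i]);
    }
  }
  // At most `limit` requests are pending at any moment.
  await Promise.all(Array.from({ length: Math.min(limit, urls.length) }, worker));
  return results;
}
```

Results come back in input order regardless of completion order, which matters when tiles map to grid positions.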
Update: I am going to delete my codepen, since I don't need to show this error anymore.

Where does LocalFileSystem.PERSISTENT point to?

In PhoneGap, I use
window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, onFileSystemSuccess, fail);
to access the file system.
My ASUS tablet has no external SD card (no removable device is inserted), so I think the file system root points to the internal SD card. However, on my HTC Desire HD, the data was written to the external SD card (the data resides on the microSD card).
So what is the truth? I can't see any clues in the W3C document; maybe I missed something...
PS: Both devices run Android ICS (Ice Cream Sandwich).
PhoneGap's FileAPI, while designed to mirror the HTML5 spec, is actually a custom implementation of the W3C document. You can find the docs specific to their API here. While it can mostly be used the same way, there are some subtle differences between how things are implemented on the web and per device. The location of storage is one of these.
To find out how PhoneGap handles persistent storage, I had to dig into the Cordova source code. This file here contains the methods used by the PhoneGap FileAPI. The relevant block of code starts at line 871. Basically, the API will make a call to Environment.getExternalStorageState(). If this returns Environment.MEDIA_MOUNTED, meaning there's either an removable or non-removable SD card for storage, the FileSystem returned by the API is the root directory of the mounted storage, using Environment.getExternalStorageDirectory(). This explains the difference in behavior you saw between devices with internal and external SD cards, both considered mounted external storage by the system. If you encounter a device without any external storage, i.e. !Environment.getExternalStorageState().equals(Environment.MEDIA_MOUNTED), the root of the returned FileSystem will be "data/data/packageName" in internal storage, similar to calling Context.getFilesDir(), which usually returns something like "data/data/packageName/files".
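The decision logic described above can be sketched as follows (illustrative JavaScript, not the actual Cordova source, which is Java; the constant and paths mirror the Android Environment values discussed above):

```javascript
// Hedged sketch of the storage-root decision described above.
// Mirrors Android's Environment.MEDIA_MOUNTED string value.
const MEDIA_MOUNTED = 'mounted';

function persistentRoot(externalStorageState, externalDir, packageName) {
  if (externalStorageState === MEDIA_MOUNTED) {
    // Internal (non-removable) and removable SD cards both count as
    // mounted "external storage", hence the device-to-device difference.
    return externalDir; // e.g. "/mnt/sdcard"
  }
  // No mounted external storage: fall back to app-private internal storage.
  return '/data/data/' + packageName;
}
```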