Forge model derivative job failed. What now?

I ran a Model Derivative job and the status came back: failed. After drilling through the return values, it said that two of the linked DWG files were missing. I added the DWG files, re-zipped, and re-uploaded the ZIP. When I try to run the job, it keeps coming back with the initial failed status. Am I missing something?

Assuming you are working with your own buckets, use the x-ads-force header on the POST Job endpoint; if you pass true, it will translate the file again.
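A minimal sketch of re-submitting the job with that header set, assuming you already have an access token and the base64-encoded URN of the uploaded object (both are placeholders here):
using System.Net.Http;
using System.Text;

string accessToken = "<your access token>"; // placeholder
string urn = "<base64-encoded source URN>"; // placeholder

var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Post,
    "https://developer.api.autodesk.com/modelderivative/v2/designdata/job");
request.Headers.Add("Authorization", "Bearer " + accessToken);
request.Headers.Add("x-ads-force", "true"); // force re-translation of an already-processed URN
request.Content = new StringContent(
    "{\"input\":{\"urn\":\"" + urn + "\"}," +
    "\"output\":{\"formats\":[{\"type\":\"svf\",\"views\":[\"2d\",\"3d\"]}]}}",
    Encoding.UTF8, "application/json");
var response = await client.SendAsync(request);
System.Console.WriteLine(response.StatusCode);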

In hindsight one could say this is obvious, but it isn't spelled out anywhere in the documentation. Essentially, one needs to DELETE the failed manifest and run a new job; there doesn't seem to be any retry mechanism.
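For reference, deleting the manifest is a single call; a sketch with placeholder token and URN:
using System.Net.Http;

string accessToken = "<your access token>"; // placeholder
string urn = "<base64-encoded source URN>"; // placeholder

var client = new HttpClient();
client.DefaultRequestHeaders.Add("Authorization", "Bearer " + accessToken);
// Removes the failed manifest and its derivatives so a fresh job can be posted.
var response = await client.DeleteAsync(
    "https://developer.api.autodesk.com/modelderivative/v2/designdata/" + urn + "/manifest");
System.Console.WriteLine(response.StatusCode);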

Related

Error while executing Work Item "Cannot find the addin file"

I am new to the Design Automation API, so please excuse and correct me if I am using the wrong terms. I am setting up the wiring for my very first Design Automation AppBundle, and I have almost all of it working. I followed the patterns in the "Delete Walls" tutorial.
I have a working add-in DLL that I can test locally and it runs under the "design.automation-csharp-revit.local.debug.tool".
I also have all of the REST API connections set up, and I can successfully submit a WorkItem that downloads a Revit file from BIM 360 and starts processing it in the Design Automation sandbox. But I am getting an error during execution in the sandbox, where it seems it can't find my add-in file. Here is an excerpt from the WorkItem log:
[07/21/2020 18:02:26] Resolving location of Revit/RevitCoreEngine installation...
[07/21/2020 18:02:26] Running user application....
[07/21/2020 18:02:31] Cannot find the addin file:
[07/21/2020 18:02:31] Fail to deploy Addon DLL(s) in AppPackages.
[07/21/2020 18:02:31] RESULT: Failure
I have looked through "bundle" ZIP file many times looking for typos that could cause this, but I can't find anything, it looks identical to the "delete walls" example. So I'm wondering if there is somewhere else that I need to look. Or any other way I could debug this to find out were the connection is missing. I can only assume that the AppBundle and Activity items are setup correctly since I am getting this far and the error is not mentioning either of those items.
Any tips on where to look?
This turned out to be caused by a misspelling of the .bundle folder extension.
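For anyone hitting the same error: the AppBundle ZIP is expected to contain a single folder whose name ends in .bundle, with PackageContents.xml at its root and the DLLs under Contents. A sketch of the layout, using names from the Delete Walls tutorial for illustration:
DeleteWalls.bundle/
  PackageContents.xml
  Contents/
    DeleteWalls.dll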

Logging in Revit Design Automation add-in

I want to send some diagnostic output to the default report.txt file.
In some posts it is shown that exceptions are logged to this report.txt file somehow (automatically or not?).
Also, I see in some samples that people do their logging with
System.Console.WriteLine()
I've tried this, but I still can't see the output in the report file.
Could you tell me how to achieve this?
I understand there is an option to create another log file and send it back with the result, but I think it would be easier to use this existing report.txt.
Thanks!
UPDATE: System.Console.WriteLine() works.
The reason why I didn't see the output was that my add-in failed to load.
So, it simply didn't reach this line of code.
Logging in Design Automation for Revit AppBundles can indeed be done with System.Console.WriteLine; anything sent to standard output will be captured in your WorkItem's report.txt. For example, the following code:
System.Console.WriteLine("Hello World!");
will generate the following line in the report.txt:
[04/23/2020 19:20:59] Hello World!
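Related to the update above: if the add-in loads but throws later, you can catch the exception and print it yourself, so the details land in report.txt rather than just a generic failure. A minimal sketch, where DoWork is a stand-in for your actual add-in logic:
using System;

void DoWork()
{
    // stand-in for your actual add-in logic
    throw new InvalidOperationException("example failure");
}

try
{
    DoWork();
}
catch (Exception ex)
{
    // Goes to standard output, which Design Automation captures in report.txt
    System.Console.WriteLine("Exception: " + ex.Message);
    System.Console.WriteLine(ex.StackTrace);
    throw; // rethrow so the WorkItem still reports failure
}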

TF293000: The data warehouse has detected data conflicts for the following work item fields

Hi, I'm looking for help with the following issue:
In TFS, on our SSRS report server, whenever I run any of the out-of-the-box Sprint Burndown reports, the report seems to run successfully, but I get the TF293000 error from the title in the bottom right-hand corner.
Through some research I found that the issue was due to the field definitions in that particular Collection not matching the other collections that we have in TFS. Simple...
In order to determine which field definition in the collection was the issue I used the witadmin command listfields for all of my collections:
witadmin listfields /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy
This led me to find that the Synchronizes Identity Name Changes definition in the collection mentioned in the TF293000 error was set to a value of true, while it is false in all of my other collections. Issue Found! Should be easy from here...wrong.
The following command should solve my problem:
witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:false
*of course with the proper collection URL substituted for the word Collection
However when run and after I confirm that I want to make the change I get the following error:
TF401327: The operation is not supported. The feature is obsolete.
I looked the error up and it took me to this page: TFS Known Issue. It tells me it's a known issue that was resolved in Update 1 ... we have Update 3.
I then attempted to simply edit the WIT .xml file and update the attribute for that WIT on that collection to false, but when I import the change to the server it tells me it has imported successfully; yet when I export it, I see that the file has not changed.
I have also tried copying the .xml file from the same WIT in another collection and uploading that to the offending collection, but that does not work either. I've never had an issue with uploading a WIT before, as we've made several changes to our TFS workflow. I'm pretty stuck at this point and just wondering if anyone else has experienced this issue before, thanks!
According to the error info, it seems there is a conflict in the TFS data warehouse. This happens because two fields in different collections have different attributes, and there is only one data warehouse. To avoid schema conflicts when you export and process data to the data warehouse databases, you must assign the same values to these attributes across all collections:
Field type (the value of this attribute cannot be changed for an existing field).
Reporting type.
Reporting name.
What you have done is the correct operation: change/update the attribute for the field in one project collection to match the assignments made in the other project collections.
You could try to narrow the issue down: does it only happen with this specific field in this team project collection? Are all other work item fields working correctly? Also give it a try with other collections, for example set syncnamechanges=true and then set it back to syncnamechanges=false, to see if any issue occurs.
Run the command line on the TFS server machine instead of your development machine, and clear the TFS cache. If the field is not used for reporting in those project collections, you could also try to mark it as non-reportable (see the sketch after the links below). For more details, please refer to these links:
Resolve data warehouse schema conflicts
Change a reportable attribute for a work item field
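A sketch of the non-reportable approach, assuming the /reportingtype switch of witadmin changefield is available in your TFS version (again with your collection URL in place of Collection):
witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /reportingtype:none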

Cloudconnect CSV buffer size

When I try to load a big CSV from a ZIP file, the execution log gives me the following error:
----------------------------------------- Error details ------------------------------------------
Component [Clientes:CLIENTES1] finished with status ERROR.
The size of data buffer is only 100663296. Set appropriate parameter in defaultProperties file.
--------------------------------------------------------------------------------------------------
How can I set the appropriate parameter in the defaultProperties file?
I tried this link, but my CloudConnect run configurations page is different from the one shown there.
I've created the parameters file and filled in the additional parameters with the right values as the tutorial said (see below), but the same error appears on the screen.
Name: -config; Value: new_buffer_size.txt
The new_buffer_size.txt content is just this line: DEFAULT_INTERNAL_IO_BUFFER_SIZE = 200000000
How can I solve this problem? I need to solve this before the world explodes.
CloudConnect is designed for developing ETLs that run on GoodData cloud workers, and some lower-level settings are therefore overridden, as in this case. The only legitimate way forward is to modify the ETL so that it can process the data with the current settings. Regarding the docs: the referenced article is outdated; the GoodData docs team is aware of it and is preparing a docs refactoring.
Note: as you have probably noticed, CloudConnect is powered by Javlin's CloverETL, so feel free to check their forums; there you will find how to overcome the issue at a lower level (no UI), but that will only work for data processing on the local machine.

Model derivative: translation stops at 50%, never fails, never completes

I have the following scenario: two Revit files, ModelA.rvt and ModelB.rvt. They are cross-referenced together, zipped, and uploaded twice under different object keys (ModelA.zip, ModelB.zip). The ZIP files are identical, very small (4 MB), and contain both files. They are both uploaded successfully in a loop using:
PUT https://developer.api.autodesk.com/oss/v2/buckets/:bucketKey/objects/:objectName
Files are overwritten with token scope data:write, and the POST job is called with x-ads-force = true in case of a model update. Then I call the POST job twice in a loop, once with ModelA.rvt as rootFilename for ModelA.zip and once with ModelB.rvt for ModelB.zip. Both POST jobs complete successfully.
Right after that I poll the manifest for both ZIP files every 10 seconds. ModelB.zip is translated 100% in a few seconds, but ModelA.zip never finishes (a few hours so far); it just hangs for no reason. On Friday I thought it was just a temporary issue, but it still persists.
I have tried this scenario three times, each time with a different set of files, today and three days ago. Same result. This one is the simplest case, and the files are all already present in the cloud. I still have no idea what is going on.
Another weird thing: when I list the bucket objects, the ZIP files are never present, while other files with non-ZIP extensions are.
Does anyone have a clue what is causing this, or what a possible workaround could be? This is a serious issue, because it undermines the usability and reliability of the whole API.
The linked Revit files need to be in one ZIP file with the new v2 API. See this post for more details: http://adndevblog.typepad.com/cloud_and_mobile/2016/07/translate-referenced-files-by-derivative-api.html
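For reference, a sketch of the POST job body for translating a file inside a ZIP; compressedUrn and rootFilename tell the service which archive entry is the entry point (the urn value is a placeholder for your base64-encoded objectId):
{
  "input": {
    "urn": "<base64-encoded objectId of ModelA.zip>",
    "compressedUrn": true,
    "rootFilename": "ModelA.rvt"
  },
  "output": {
    "formats": [ { "type": "svf", "views": [ "2d", "3d" ] } ]
  }
}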