Rename Bucket or transfer all models - autodesk-forge

I would like to know if it is possible to rename a Bucket.
If not, I would like to know if I can move all my models on the bucket I want to rename to a new bucket without translating each model again.
Thanks.

Unfortunately, it is not possible to rename a bucket, but it is possible to copy files (objects) across buckets with this API.
For the viewables it is a different story - they are not stored in OSS buckets but on the Model Derivative servers. This means you either need to translate the models again if you want to use the new URN, or leave them where they are and keep a mapping between the old and new URNs. Viewables are destroyed only when you delete their manifest.
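As a minimal sketch of the copy idea, the snippet below only builds the OSS object URLs involved in downloading an object from the old bucket and re-uploading it to the new one; the bucket and object names are made up, and a real implementation would send these requests with an access token:

```python
# Hedged sketch: the OSS URLs involved in copying one object from a source
# bucket to a target bucket by downloading and re-uploading it. Bucket and
# object names here are illustrative only.
BASE = "https://developer.api.autodesk.com/oss/v2"

def copy_urls(src_bucket, dst_bucket, object_key):
    """Return the (download, upload) URL pair for moving one object."""
    download = f"{BASE}/buckets/{src_bucket}/objects/{object_key}"
    upload = f"{BASE}/buckets/{dst_bucket}/objects/{object_key}"
    return download, upload

dl, ul = copy_urls("old-bucket", "new-bucket", "house.rvt")
```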

Related

Is it possible to get urns of models which are translated as references via zip translation?

When I upload and translate a zip file with one rootFile and some models which act as references to Autodesk Forge, I can only find one model URN afterwards. Are all models uploaded separately under the hood, and is there a possibility to get the URN of each model?
One use case would be to open any model from the package other than the predefined root, to view the 2D sheets from that model.
Another use case would be to save data related to elements/referenced models with their dbId/guid and URN.
I was expecting to get each model's URN by selecting parts from different models and running this.viewer.getAggregateSelection().lastItem.model, as that would do the trick if I had translated them separately and aggregated the view. But this way there is just one URN for all elements.
I also tried inspecting the buckets and objects via the awesome "Autodesk Forge Tools" extension for VSCode, but couldn't get any deeper than the .zip file as an object in the bucket.
Is the only possibility to upload/translate the same .zip package again for every model I want to open, with a newly defined rootFilename? Is this still the only way, as stated in an answer from 2016? (https://stackoverflow.com/a/38720162/19956654)
Appreciate any help with this one, thanks in advance!
Unfortunately, one ZIP will have one URN only. So you will need to have the ZIP uploaded under different names and request the translations with different rootFilenames separately.
However, you don't really need to upload the same file several times. Just call PUT buckets/:bucketKey/objects/:objectKey/copyto/:newObjectKey to duplicate the uploaded ZIP with different names.
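As a hedged sketch, the copy-and-retranslate flow could look like the following; only the copyto URL and the translation job payload are built here - the bucket, object, and root file names are made up, and a real implementation would send these requests with an access token:

```python
import base64

BASE = "https://developer.api.autodesk.com"

def copyto_url(bucket_key, object_key, new_object_key):
    # PUT this URL to duplicate the uploaded ZIP under a new object name.
    return f"{BASE}/oss/v2/buckets/{bucket_key}/objects/{object_key}/copyto/{new_object_key}"

def translation_job(object_id, root_filename):
    # Model Derivative jobs take the base64-encoded objectId as the URN;
    # compressedUrn + rootFilename pick the entry file inside the ZIP.
    urn = base64.urlsafe_b64encode(object_id.encode()).decode().rstrip("=")
    return {
        "input": {"urn": urn, "compressedUrn": True, "rootFilename": root_filename},
        "output": {"formats": [{"type": "svf", "views": ["2d", "3d"]}]},
    }
```

Each copy of the ZIP gets its own URN, so each one can be translated with a different rootFilename.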

Autodesk-Forge bucket system: New versioning

I am wondering what the best practice is for handling new versions of the same model in the Data Management API bucket system.
Currently I have one bucket per user, and a file with the same name overwrites the existing model when doing an SVF/SVF2 conversion.
To handle model versioning in the best manner, should I:
1. create one bucket per converted file, or
2. continue with one bucket per user?
If 1): is there a limit on the number of buckets which it is possible to create?
If 2): how do I get the translation to accept a bucketKey different from the file name? (As it is now, the uploaded object needs to have the file name for the translation to work.)
In advance, cheers for the assistance.
In order to translate a file, you do not have to keep the original file name, but you do need to keep the file extension (e.g. *.rvt), so that the Model Derivative service knows which translator to use. So you could just create files with different names: perhaps add a suffix like "_v1" etc or generate random names and keep track of which file is what version of what model in a database. Up to you.
There is no limit on the number of buckets, but it might be overkill to have a separate one for each file.
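The suffix idea from the answer above can be sketched in a few lines; the naming scheme itself ("_v1", "_v2", etc.) is just one possible convention:

```python
import os

def versioned_name(filename, version):
    # Keep the original extension so the Model Derivative service still
    # knows which translator to use.
    stem, ext = os.path.splitext(filename)
    return f"{stem}_v{version}{ext}"

print(versioned_name("office.rvt", 3))  # office_v3.rvt
```

Which object name maps to which version of which model would then be tracked in your own database.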

Autodesk Forge - Post Jobs - Must files be in buckets and proper URN

I am working on doing a post job and I am confused about where files need to be to run the job and the proper urn.
The examples all use a file that the user uploads to a bucket. I am trying to run the POST job on a file that a user has created in Fusion 360 and that he has selected through a GUI I created. The URN in question is obtained by letting the user select the hub, project, folder(s), and file. I then use this file URN in the POST job.
I keep getting back the response of :
Failed to download the design description for the input design.
My questions are:
Is it possible to do this from a user's hub, or do all items have to be in buckets?
Where are those translated files stored once created? If I want to get data like volume and mass without storing the translated file, is that possible?
I took the "urn:" prefix off the front of the URN and got a different error, which I believe meant that it couldn't find any file.
Invalid 'design' parameter.
So, it looks like the urn I am using is finding a file but there is an issue somewhere that is preventing that file from being accessed or translated or something.
I keep getting back the response of : Failed to download the design description for the input design.
For Fusion 360 files, make sure the extension of the object is f2d/f3d. BTW, the Forge Viewer supports these two formats directly, so you don't have to translate to SVF for the Viewer to visualize them.
Is it possible to do this from a users hub or do all items have to be in buckets?
For hub project items, use the Data Management API to obtain the object ID - be sure to include the version parameter in your URN. See GET projects/:project_id/folders/:folder_id/contents and use the id of the item as your URN, as well as the tutorial here, to understand how project folder items work.
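For illustration, encoding a version id into a Model Derivative URN might look like this (the version id string below is made up; a real one comes from the folder-contents response):

```python
import base64

def to_derivative_urn(version_id):
    # Model Derivative URNs are the base64-encoded item/version id with the
    # padding stripped; keeping the ?version=... part selects the version.
    return base64.urlsafe_b64encode(version_id.encode()).decode().rstrip("=")

# Hypothetical version id for illustration; a real one comes from
# GET projects/:project_id/folders/:folder_id/contents.
urn = to_derivative_urn("urn:adsk.wipprod:fs.file:vf.abc123?version=2")
```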
Where are those translated files stored once created? If I want to get data like volume and mass without storing the translated file, is that possible?
The translated derivatives are stored separately, and you can access them through the derivative manifest. Use GET :urn/metadata/:guid/properties to query derivative properties, but you will need to translate the model first (any format will do) in order to extract properties - see the tutorial here.
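A small sketch of building that properties request URL - the urn and guid values are placeholders, and the real guid would come from the metadata listing for the translated model:

```python
BASE = "https://developer.api.autodesk.com/modelderivative/v2/designdata"

def properties_url(urn, guid):
    # GET this URL (with a token) to read properties such as volume and
    # mass from the derivative metadata, without downloading derivatives.
    return f"{BASE}/{urn}/metadata/{guid}/properties"
```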

Couchbase: How to copy data from bucket to another?

I have a bucket (Bucket1) that I need to delete, and move its documents to another bucket (Bucket2).
What is the best way of achieving this?
I'm using version 4.5.
The cbtransfer tool should be able to do the data transfer you need.
https://docs.couchbase.com/server/6.0/cli/cbtransfer-tool.html
bucket-delete can be used to delete a bucket once you are finished with it.
https://docs.couchbase.com/server/6.0/cli/cbcli/couchbase-cli-bucket-delete.html
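As a sketch, a cbtransfer invocation for this bucket-to-bucket copy could be assembled like this; the host name and credentials are placeholders, and the command is built as an argv list here rather than actually run:

```python
def cbtransfer_args(host, src_bucket, dst_bucket, user, password):
    # cbtransfer copies data between clusters/buckets; -b names the source
    # bucket and -B the destination bucket. Here the same cluster serves
    # as both source and destination.
    return [
        "cbtransfer",
        f"http://{host}:8091",  # source cluster
        f"http://{host}:8091",  # destination cluster
        "-b", src_bucket,
        "-B", dst_bucket,
        "-u", user,
        "-p", password,
    ]
```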

Add lambda conditions in cloudformation?

I am building my Cloudformation template to create an S3 Bucket.
I wanted to create folders in the bucket at the same time, but I have read that I need to use a Lambda-backed resource.
So I've prepared the Lambda part of my template, but I need to add a condition:
If the Lambda refers to a bucket which already exists - where the bucket has been created in this same file (everything has to reside in one CloudFormation stack) - call the Lambda to create my folders.
I do not want to check whether my bucket exists in S3 or whether my folders already exist as S3 objects in the bucket.
I would like the Lambda-backed resource to be created after the bucket has been created.
First of all - why do you need directories at all? S3 is in fact a key-value store; "paths" are just prefixes. Usually there is no benefit in creating them other than a human-friendly presentation.
Secondly - you can either use DependsOn to enforce the proper order of resource provisioning, or (which would be a good practice, I think) make the Lambda generic and accept the bucket name as a custom resource parameter; then you get the ordering by using the Ref function, which implicitly creates the dependency.
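A minimal template fragment sketching the Ref approach - the resource names (MyBucket, FolderLambda, Custom::S3Folders) are made up for illustration, and FolderLambda stands for the folder-creating Lambda function defined elsewhere in the same template:

```yaml
Resources:
  MyBucket:
    Type: AWS::S3::Bucket

  # Lambda-backed custom resource; the Ref to MyBucket makes
  # CloudFormation create the bucket before invoking the Lambda.
  CreateFolders:
    Type: Custom::S3Folders
    Properties:
      ServiceToken: !GetAtt FolderLambda.Arn
      BucketName: !Ref MyBucket
```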