Is it possible to read the versions of a file in a specific bucket? - autodesk-forge

I need to get the versions of a file that has been uploaded to a specific bucket. If that is not possible, can you please suggest a way to achieve versioning using Forge?

Buckets and objects are part of the low-level Object Storage Service (OSS), which does not provide versioning (see https://forge.autodesk.com/en/docs/data/v2/developers_guide/basics for more details). You can either implement your own versioning logic on top of the storage service, or perhaps use the Items of the Data Management API, which do carry versions.
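If you stay with OSS, one way to roll your own versioning is a simple naming convention for object keys, e.g. design.dwg.v1, design.dwg.v2, and so on. A minimal sketch in TypeScript (the convention and helper name are my own, not part of any Forge API; the existing keys would come from listing the bucket's objects):

    // Hypothetical convention: "<fileName>.v<number>" is used as the OSS object key.
    // "existingKeys" would come from listing the objects in the bucket via the OSS API.

    /** Build the object key for the next version of a file, given the keys already in the bucket. */
    function nextVersionKey(fileName: string, existingKeys: string[]): string {
      const versions = existingKeys
        .filter((key) => key.startsWith(`${fileName}.v`))
        .map((key) => Number(key.slice(fileName.length + 2)))
        .filter((v) => Number.isInteger(v) && v > 0);
      const next = versions.length > 0 ? Math.max(...versions) + 1 : 1;
      return `${fileName}.v${next}`;
    }

    // Example: the bucket already holds two versions of "design.dwg".
    console.log(nextVersionKey("design.dwg", ["design.dwg.v1", "design.dwg.v2"])); // design.dwg.v3

Each new revision is then uploaded under the key returned by nextVersionKey, and reading "the versions of a file" becomes a matter of filtering the bucket's object listing by the file name prefix.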

Related

What's the best way to deploy predefined data via API calls?

I have several JSON files that represent the payloads for different APIs (I can map which API to call based on the file name, but other methods could be applied as well).
What is the best practice for populating the application's data using those JSON files?
My first thought was to use some automation framework (REST Assured, for example) to accomplish the task, but I think it might be overkill for my scenario.
P.S. A DB snapshot or querying the DB directly is not an option because of the nature of the application.
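One lightweight option, instead of a full automation framework, is a small script that walks the payload directory, derives the endpoint from each file name, and POSTs the body. A rough sketch in TypeScript for Node 18+ (the base URL, directory, and the file-name-to-endpoint mapping are hypothetical placeholders):

    import { readdir, readFile } from "node:fs/promises";
    import { basename, extname, join } from "node:path";

    const BASE_URL = "https://example.com/api"; // placeholder
    const PAYLOAD_DIR = "./payloads";           // placeholder

    // Hypothetical mapping: "users.json" -> POST /users, "orders.json" -> POST /orders, ...
    function endpointFor(fileName: string): string {
      return `/${basename(fileName, extname(fileName))}`;
    }

    async function deployPayloads(): Promise<void> {
      const files = (await readdir(PAYLOAD_DIR)).filter((f) => f.endsWith(".json"));
      for (const file of files) {
        const body = await readFile(join(PAYLOAD_DIR, file), "utf8");
        const res = await fetch(`${BASE_URL}${endpointFor(file)}`, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body,
        });
        if (!res.ok) throw new Error(`${file}: HTTP ${res.status}`);
        console.log(`${file} -> ${endpointFor(file)}: ${res.status}`);
      }
    }

    deployPayloads().catch(console.error);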

DWG or other 2D File Versioning in OSS Buckets

How can we maintain multiple versions of the same DWG or other 2D file while uploading and translating the file? What are the preferred options?
I need to maintain the versioning of the files in Private OSS Bucket with Autodesk Forge and not using BIM360 in this case.
Thanks a lot.
OSS is just a simple blob storage service and as such it does not provide any versioning capability. You can introduce your own versioning scheme, though, for example by appending a version suffix to the object key of every uploaded file (and then stripping the suffixes when listing the objects), or by maintaining the document-version relationships (and perhaps even folder-document relationships) in a relational database, with version records pointing to OSS objects.
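As an illustration of the suffix approach, the listing side could strip and group the suffixes like this (a sketch; the ".v<number>" convention is assumed, and the raw keys would come from the bucket's object listing):

    interface DocumentVersions {
      name: string;       // object key without the version suffix
      versions: number[]; // all version numbers found in the bucket
      latestKey: string;  // full OSS object key of the newest version
    }

    // Strip the ".v<number>" suffixes and group the bucket's object keys per document.
    function groupByDocument(objectKeys: string[]): DocumentVersions[] {
      const byName = new Map<string, number[]>();
      for (const key of objectKeys) {
        const match = key.match(/^(.+)\.v(\d+)$/);
        if (!match) continue; // ignore keys outside the convention
        const [, name, version] = match;
        byName.set(name, [...(byName.get(name) ?? []), Number(version)]);
      }
      return [...byName.entries()].map(([name, versions]) => ({
        name,
        versions: versions.sort((a, b) => a - b),
        latestKey: `${name}.v${Math.max(...versions)}`,
      }));
    }

    // Example listing with two documents and three versions in total.
    console.log(groupByDocument(["plan.dwg.v1", "plan.dwg.v2", "site.dwg.v1"]));

The relational-database alternative is the same idea with the grouping moved into tables (documents, versions) whose version rows store the OSS object keys.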

Querying json with RESTful API using Azure

In brief, I have some JSON files that I would like to make publicly available as databases for anyone to query. The querying should happen via a REST API. All of this needs to be done using Azure.
What might be the fastest/simplest way of doing this? I have researched things like json-server, Azure's Cosmos DB, and so on, but everything seems to be either not what I'm asking for or way too much.
Any help or pointers in the right direction would be appreciated.
Thanks!
Have you looked into Azure Blob Storage? Each blob is essentially a file stored in Azure in which you can house JSON data.
https://azure.microsoft.com/en-us/services/storage/blobs/
Then you can layer NoDb (https://github.com/cloudscribe/NoDb/blob/master/README.md), or something similar, as a repository on top and expose it with a Web API. You will need to write a custom storage path resolver to connect to Azure (https://github.com/cloudscribe/cloudscribe.SimpleContent/blob/master/src/cloudscribe.SimpleContent.Storage.NoDb/PostStoragePathResolver.cs). This approach would not scale very well for high-throughput applications, but it could later be switched out for EF and a bigger document store (e.g. Cosmos DB) without having to adjust your API layer.
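The answer above is framed around .NET (NoDb behind a Web API); purely as an illustration of the same shape in Node/TypeScript, a thin Express layer can pull a JSON blob from Blob Storage and answer simple queries. A rough sketch (container and blob names are made up, and it assumes the @azure/storage-blob and express packages plus a connection string in an environment variable):

    import { BlobServiceClient } from "@azure/storage-blob";
    import express from "express";

    const app = express();
    const blobService = BlobServiceClient.fromConnectionString(
      process.env.AZURE_STORAGE_CONNECTION_STRING! // assumed environment variable
    );

    // GET /datasets/products?category=books -> records of products.json where category === "books"
    app.get("/datasets/:name", async (req, res) => {
      try {
        const blob = blobService
          .getContainerClient("datasets")                 // placeholder container name
          .getBlockBlobClient(`${req.params.name}.json`); // one JSON file per dataset
        const buffer = await blob.downloadToBuffer();     // Node.js-only convenience helper
        const records: Record<string, unknown>[] = JSON.parse(buffer.toString("utf8"));

        // Very naive querying: keep records that match every query-string key/value pair.
        const filtered = records.filter((r) =>
          Object.entries(req.query).every(([k, v]) => String(r[k]) === v)
        );
        res.json(filtered);
      } catch {
        res.status(404).json({ error: "dataset not found" });
      }
    });

    app.listen(3000);

As with the NoDb route, only the part that loads the records would need to change if you later moved to Cosmos DB or another document store.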

Transfer files from dropbox/drive to Google cloud storage

We are planning to implement a feature that lets customers browse the images in their own Google Drive or Dropbox accounts (within our app) and select the ones they want to use in our app. We would then save these images into our own Google Cloud Storage (GCS) bucket.
A simple approach is to use OAuth to download the customers' images from Dropbox/Drive to our local disk and then upload them to our GCS bucket.
I am wondering if it is possible to make a direct transfer between Dropbox/Drive and GCS. GCS does provide a transfer service for S3. You can also supply a list of URLs, but each URL needs to be publicly accessible, which in our case does not seem possible.
Is there any better suggestion on how to implement this feature? Thanks!
Well, you could stream the files so that at least you wouldn't have to save them to local disk, but other than that I think your solution is the way to go.
How streaming works exactly depends on your language, but basically you download the file and upload it to GCS right away, without ever writing any bytes to local disk; see the sketch below.
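For example, in Node/TypeScript the streaming approach could look roughly like this (a sketch; the download URL and OAuth token are placeholders for whatever the Dropbox/Drive API gives you, and it assumes the @google-cloud/storage package with application default credentials):

    import { get } from "node:https";
    import { Storage } from "@google-cloud/storage";

    const storage = new Storage(); // uses application default credentials

    /** Pipe an HTTP download straight into a GCS object, never touching local disk. */
    function streamToGcs(sourceUrl: string, accessToken: string, bucket: string, dest: string): Promise<void> {
      return new Promise((resolve, reject) => {
        const gcsWrite = storage.bucket(bucket).file(dest).createWriteStream();
        get(sourceUrl, { headers: { Authorization: `Bearer ${accessToken}` } }, (res) => {
          if (res.statusCode !== 200) {
            reject(new Error(`download failed: HTTP ${res.statusCode}`));
            return;
          }
          res.pipe(gcsWrite).on("finish", () => resolve()).on("error", reject);
        }).on("error", reject);
      });
    }

    // Example: copy one image from a (placeholder) Dropbox/Drive download URL.
    streamToGcs("https://content.example.com/some-image.jpg", "<oauth-token>", "my-gcs-bucket", "images/some-image.jpg")
      .then(() => console.log("done"))
      .catch(console.error);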
I ran into your question while searching for a way to transfer data from Google Drive to Google Cloud Storage. Since they are both under the Google umbrella, I assumed they would be able to upload/download seamlessly. So far in my search, without programming, the answer is no.
I don't have an exact solution for your question, because it really depends on how much data you have and how often you upload/download. Unfortunately, there is no straight shot (even for Google Drive!). But you might consider making RESTful API calls to Google and connecting them with the Dropbox API. Hope this helps.
With this recipe, you can use Google Colab to mount a Google Drive folder and then copy from the mounted folder to the storage bucket using gsutil, avoiding the need to first save the data to local disk.

Fastest way to download/access files in Amazon S3

I'm trying to access files in my Amazon S3 bucket and do some operations on them. I'm currently evaluating the options.
Since I will be doing some operations on the S3 files, I would prefer to use some language to access the files in S3 (I have already tried the copy command).
My S3 bucket contains JSON files ranging between 2 MB and 4 MB, and I would need to parse these JSON files and load them into a database (thinking about using jQuery here, but any other suggestions are welcome).
Given these requirements, which is the most efficient language/platform to use here?
Your options are pretty broad here. AWS has a list of SDKs for you to choose from: https://aws.amazon.com/tools/#sdk
So your comfort level with a particular language should be your largest influencer. Given that you mentioned JSON and jQuery, perhaps you should look at the Node.js SDK and AWS Lambda.
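To illustrate that suggestion, a Lambda triggered by S3 object-created events could fetch and parse each JSON file roughly like this (a sketch using AWS SDK v3; the actual database write is left as a placeholder since it depends on the target store):

    import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
    import type { S3Event } from "aws-lambda";

    const s3 = new S3Client({});

    export const handler = async (event: S3Event): Promise<void> => {
      for (const record of event.Records) {
        const bucket = record.s3.bucket.name;
        const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));

        // Download the 2-4 MB JSON object and parse it in memory.
        const { Body } = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
        const items = JSON.parse(await Body!.transformToString("utf8"));

        // Placeholder: insert `items` into your database of choice
        // (RDS, DynamoDB, etc.) using its own client library.
        console.log(`parsed ${key}: ${Array.isArray(items) ? items.length : 1} record(s)`);
      }
    };

The same GetObjectCommand code also works outside Lambda, e.g. in a plain Node script that loops over a list of keys, if you would rather run the load as a one-off job.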