Transfer BIM360 folders from one Hub to another - autodesk-forge

Is it possible to transfer files/folders from "Hub A" to "Hub B"? Let's say we use Hub A (and C4R) for prototyping purposes, and every month we have to download these prototypes, which is problematic because either the download does not keep the directory structure or BIM 360 just never finishes "compressing your items". After downloading and matching the folder structure, we upload these files again to Hub B every time there is a major update to the prototypes.
There is a way to copy files using the BIM 360 UI, but only within the same hub.
I would appreciate some guidance here. I've just done some step by step tutorials to get the viewer on my website, so I do not have much experience.
Thanks,
Xavier

Since you asked this question with the "Forge Data Management API" tag, I assume you want to achieve this via the API.
With the current Data Management API there is no easy way to do that directly, but if you want to transfer files/folders from "Hub A" to "Hub B", you can always list and download all the files/folders from "Hub A" and then upload them to "Hub B". There are many details you may need to handle; for example, when the file you want to upload is already there, you can either create a new version or simply overwrite it.
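A minimal sketch of the "crawl and download" half, assuming Node 18+ (for the global fetch), a valid 3-legged token in FORGE_TOKEN, and placeholder project/folder IDs; paging, retries and error handling are left out:

```typescript
// Rough sketch only: recursively list a source hub folder and download each
// item's tip version via the Data Management API.
const BASE = "https://developer.api.autodesk.com";
const headers = { Authorization: `Bearer ${process.env.FORGE_TOKEN}` };

async function crawlAndDownload(projectId: string, folderId: string): Promise<void> {
  const res = await fetch(
    `${BASE}/data/v1/projects/${projectId}/folders/${folderId}/contents`,
    { headers }
  );
  const { data } = await res.json();

  for (const entry of data ?? []) {
    if (entry.type === "folders") {
      // Recurse into subfolders so the directory structure can be recreated later
      await crawlAndDownload(projectId, entry.id);
    } else if (entry.type === "items") {
      // The tip (latest) version carries the storage download link
      const tipRes = await fetch(
        `${BASE}/data/v1/projects/${projectId}/items/${entry.id}/tip`,
        { headers }
      );
      const tip = await tipRes.json();
      const downloadUrl = tip.data?.relationships?.storage?.meta?.link?.href;
      if (downloadUrl) {
        const file = await fetch(downloadUrl, { headers });
        // ...stream file.body to disk here, then feed it into the upload step for Hub B
        console.log(`Downloaded ${entry.attributes?.displayName} (HTTP ${file.status})`);
      }
    }
  }
}
```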
We have some similar samples that may help you understand or get started, if you are interested. For example, https://github.com/Autodesk-Forge/data.management-nodejs-integration.box is used to exchange files between an Autodesk hub and Box; you can download it and take a look. Also, https://github.com/Developer-Autodesk/data.management-csharp-a360sync syncs files from a local machine to your A360 account; it's a C# desktop application and is still a work in progress as mentioned in the README, but you can get the idea from the sample code anyway.
Hope it helps.

Related

BIM 360 Project Creation, Template Duplicate Folders (Update Request)

Related to this 2-year-old question: BIM 360 Project creation template
Is there still not a way to replicate the Web UI option "Activate Document Management" -> "Apply a project template" that duplicates folder structure and role-based permissions?
Specifically, I'm looking at requests to create a project
POST projects https://forge.autodesk.com/en/docs/bim360/v1/reference/http/projects-POST/ supplying a template_project_id value
and then activate with
POST users/import v2 https://forge.autodesk.com/en/docs/bim360/v1/reference/http/projects-project_id-users-import-POST/
I don't see any other template input in the API documentation besides the first request, and that does not appear to bring folder structure along.
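For reference, this is roughly how I am calling those two endpoints (the account ID, token, dates and other field values are placeholders, and the body shapes should be checked against the Account Admin docs):

```typescript
// Rough sketch of the two requests above, assuming a 2-legged token with
// account:read account:write scopes.
const BASE = "https://developer.api.autodesk.com";
const TOKEN = process.env.FORGE_TOKEN!;
const ACCOUNT_ID = "<account-id>";

async function createProjectFromTemplate(): Promise<void> {
  // 1) POST projects, supplying template_project_id
  const createRes = await fetch(`${BASE}/hq/v1/accounts/${ACCOUNT_ID}/projects`, {
    method: "POST",
    headers: { Authorization: `Bearer ${TOKEN}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      name: "My New Project",
      project_type: "Office",
      value: 0,
      currency: "USD",
      start_date: "2023-01-01",
      end_date: "2023-12-31",
      template_project_id: "<template-project-id>",
    }),
  });
  const project = await createRes.json();

  // 2) POST users/import (v2) to add a project admin, which activates
  //    Document Management for the new project
  await fetch(
    `${BASE}/hq/v2/accounts/${ACCOUNT_ID}/projects/${project.id}/users/import`,
    {
      method: "POST",
      headers: { Authorization: `Bearer ${TOKEN}`, "Content-Type": "application/json" },
      body: JSON.stringify([
        {
          email: "admin@example.com",
          services: { document_management: { access_level: "admin" } },
        },
      ]),
    }
  );
}
```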
Unfortunately, what you found is correct. Currently, creating a project from a template with folder structure and role permissions is not supported by the API, but you can achieve it yourself, as shown in the sample project at https://github.com/Autodesk-Forge/forge-bim360.project.setup.tool.
This is actually a highly requested wish (HQ-5023), as you can see at https://fieldofviewblog.wordpress.com/2019/06/15/bim-360-acc-api-known-issues-and-wishes/#more-4424 . Our engineering team is actively working on it, and we will post an update on our blog when there is progress.
Based on reading the documentation and running some tests, this still appears to be unavailable. I would very much like to be proven incorrect by someone at Autodesk! Thanks.

transfer BIM 360 project between hubs

Is there an easy way to transfer a BIM 360 project between hubs, either manually or via Forge?
By transferring a project I mean everything, not just the files.
If I understand correctly, you mean cloning a project from one hub to another hub.
There is no manual or direct API way to do that yet.
For the general configuration (members, service activation, folder permissions, etc.), the setup tool might be of help:
https://github.com/Autodesk-Forge/forge-bim360.project.setup.tool
As for transferring files, you may check whether this third-party Forge app could help:
https://apps.autodesk.com/BIM360/en/Detail/Index?id=6253626418897534668
Or you may have to build a script to do the transfer. Two samples might be useful:
https://github.com/Autodesk-Forge/bim360appstore-data.management-nodejs-transfer.storage
This sample transfers files between cloud storage providers (Google Drive, Box, etc.) and BIM 360. You could follow a similar pattern to read files from the BIM 360 source hub and take advantage of the code to upload them to the other hub's folder.
https://github.com/Autodesk-Forge/forge-upgradefiles-revit
This sample upgrades the Revit model version, but its workflow is to get the source model from one folder and upload the output model to the target folder. You could borrow that workflow to copy a model from a source hub folder to a target hub folder.
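A minimal sketch of the upload half of that workflow, assuming Node 18+, a valid token in FORGE_TOKEN and placeholder project/folder IDs; the raw OSS PUT in step 2 is a simplification (newer applications should use the signed-S3-upload endpoints), and the request bodies should be verified against the current Data Management docs:

```typescript
// Rough sketch: create a storage location in the target project, upload the
// bytes, then create the item's first version in the target folder.
const BASE = "https://developer.api.autodesk.com";
const TOKEN = process.env.FORGE_TOKEN!;
const authJson = {
  Authorization: `Bearer ${TOKEN}`,
  "Content-Type": "application/vnd.api+json",
};

async function uploadToFolder(
  projectId: string, folderId: string, fileName: string, bytes: Buffer
): Promise<void> {
  // 1) Create a storage location in the target project
  const storageRes = await fetch(`${BASE}/data/v1/projects/${projectId}/storage`, {
    method: "POST",
    headers: authJson,
    body: JSON.stringify({
      jsonapi: { version: "1.0" },
      data: {
        type: "objects",
        attributes: { name: fileName },
        relationships: { target: { data: { type: "folders", id: folderId } } },
      },
    }),
  });
  const storage = await storageRes.json();
  const objectId: string = storage.data.id; // urn:adsk.objects:os.object:<bucket>/<object>

  // 2) Upload the file bytes to that storage object (simplified)
  const [bucketKey, objectKey] = objectId
    .replace("urn:adsk.objects:os.object:", "")
    .split("/");
  await fetch(`${BASE}/oss/v2/buckets/${bucketKey}/objects/${objectKey}`, {
    method: "PUT",
    headers: { Authorization: `Bearer ${TOKEN}` },
    body: bytes,
  });

  // 3) Create the item (first version) in the target folder, pointing at the storage
  await fetch(`${BASE}/data/v1/projects/${projectId}/items`, {
    method: "POST",
    headers: authJson,
    body: JSON.stringify({
      jsonapi: { version: "1.0" },
      data: {
        type: "items",
        attributes: {
          displayName: fileName,
          extension: { type: "items:autodesk.bim360:File", version: "1.0" },
        },
        relationships: {
          tip: { data: { type: "versions", id: "1" } },
          parent: { data: { type: "folders", id: folderId } },
        },
      },
      included: [
        {
          type: "versions",
          id: "1",
          attributes: {
            name: fileName,
            extension: { type: "versions:autodesk.bim360:File", version: "1.0" },
          },
          relationships: { storage: { data: { type: "objects", id: objectId } } },
        },
      ],
    }),
  });
}
```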

Is there any way to get Document Management activity logs using the Forge API

We are currently looking into the monitoring of documents and activities in BIM360. Webhooks for version added or changed work to a certain extent but only capture so much.
There is a way to manually export the logs into a CSV file, and ideally we would want to automate this process so our BI department has data to work with.
Furthermore, if it's not possible, is this something you guys are planning to add to the API in the near future, or is it a low-priority kind of thing?
Cheers and thanks in advance
Sorry, we don't have a direct API to extract that information; there is only the built-in UI feature.
Via the API you can get each file's version history, but that would require a recursive function to crawl all folders and subfolders.
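A minimal sketch of such a crawl, assuming Node 18+, a token with data:read in FORGE_TOKEN, and placeholder project/folder IDs; paging and error handling are omitted:

```typescript
// Rough sketch: walk every folder with the Data Management API and collect the
// version history of each item, e.g. to export later as a CSV for the BI team.
const BASE = "https://developer.api.autodesk.com";
const headers = { Authorization: `Bearer ${process.env.FORGE_TOKEN}` };

interface VersionRecord { item: string; version: number; createTime: string; }

async function collectVersionHistory(
  projectId: string, folderId: string, out: VersionRecord[] = []
): Promise<VersionRecord[]> {
  const res = await fetch(
    `${BASE}/data/v1/projects/${projectId}/folders/${folderId}/contents`,
    { headers }
  );
  const { data } = await res.json();

  for (const entry of data ?? []) {
    if (entry.type === "folders") {
      // Recurse into each subfolder
      await collectVersionHistory(projectId, entry.id, out);
    } else if (entry.type === "items") {
      // All versions of this item, newest first
      const vRes = await fetch(
        `${BASE}/data/v1/projects/${projectId}/items/${entry.id}/versions`,
        { headers }
      );
      const versions = await vRes.json();
      for (const v of versions.data ?? []) {
        out.push({
          item: entry.attributes?.displayName,
          version: v.attributes?.versionNumber,
          createTime: v.attributes?.createTime,
        });
      }
    }
  }
  return out;
}
```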

Monitor and automatically upload local files to Google Cloud Bucket

My goal is to make a website (hosted on Google's App Engine through a Bucket) that includes an upload button, much like
<p>Directory: <input type="file" webkitdirectory mozdirectory /></p>
that prompts users to select a main directory.
The main directory will first generate a subfolder and have discrete files being written every few seconds, up to ~4000 per subfolder, upon which the machine software will create another subfolder and continue, and so on.
I want Google Bucket to automatically create a Bucket folder based on metadata (e.g. user login ID and time) in the background, and the website should monitor the main directory and subfolders, and automatically upload every file, sequentially from the time they are finished being written locally, into the Cloud Bucket folder. Each 'session' is expected to run for ~2-5 days.
Creating separate Cloud folders is meant to separate user data in case of multiple parallel users.
Does anyone know how this can be achieved? Would be good if there's sample code to adapt into existing HTML.
Thanks in advance!
As per #JohnHanely, this is not really feasible using an application. I also do not entirely understand the use case, but I can provide some insight into monitoring Cloud Storage buckets.
GCP provides Cloud Functions:
Respond to change notifications emerging from Google Cloud Storage. These notifications can be configured to trigger in response to various events inside a bucket—object creation, deletion, archiving and metadata updates.
The Cloud Storage Triggers will help you avoid having to monitor the buckets yourself and can instead leave that to GCF.
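As a minimal sketch, assuming the Node.js runtime and the @google-cloud/functions-framework package, a function triggered on every object finalized in a bucket could look roughly like this (the function name, bucket and downstream processing are placeholders):

```typescript
import * as functions from "@google-cloud/functions-framework";

// Deployed with something like:
//   gcloud functions deploy onUpload --gen2 --runtime=nodejs18 --trigger-bucket=<your-bucket>
interface StorageObjectData {
  bucket: string;
  name: string;
  timeCreated: string;
  metadata?: Record<string, string>;
}

functions.cloudEvent<StorageObjectData>("onUpload", (event) => {
  const obj = event.data;
  if (!obj) return;
  // Runs once per uploaded object, so no client-side monitoring loop is needed.
  console.log(`New object ${obj.name} in bucket ${obj.bucket} at ${obj.timeCreated}`);
  // e.g. record the user/session (taken from the object path or custom metadata)
  // in a database, or kick off further processing here.
});
```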
Maybe you could expand on what you are trying to achieve with that many folders? Are you trying to create ~4,000 sub-folders per user? There may be a better path forward if we know more about the intended use of the data. It seems you want to hold data, and perhaps a database is better suited, e.g.:
- Application
|-- Accounts
|---- User1
|------ Metadata
|---- User2
|------ Metadata

Allowing a third party website to save to my drive (and with no sharing permission)

We're working on an app to sell our music and were wondering if Google Drive can be used as an online storage solution.
The user would complete the transaction on our site, and then authorize us to save the file (or multiple files) to their Google Drive.
The appeal to us is solving download problems via the browser. I believe the Google Drive API returns a successful response when the delivery is complete; if incomplete, we would then either resend or update.
One other requirement is whether we can set permissions to not allow sharing after save (and for that setting to be permanent).
You can do everything you want. As for the last part about not allowing users to reshare: you can do this if you still own the file, but not if you have made the user the owner of the file. I am not sure you could ever fully achieve that anyway - a user can always download a file and share it themselves, whether you are using Drive or your own custom system.
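A minimal sketch of that "keep ownership, share read-only" approach with the Node.js googleapis client; the auth client, file name and buyer's e-mail are placeholders, and the writersCanShare / copyRequiresWriterPermission flags should be verified against the current Drive v3 documentation:

```typescript
import fs from "fs";
import { google } from "googleapis";

// Upload a purchased track into our own Drive (so we stay the owner) and grant
// the buyer read-only access; as owner, we control the sharing settings.
async function deliverTrack(auth: any, buyerEmail: string): Promise<string> {
  const drive = google.drive({ version: "v3", auth });

  // 1) Upload the file, keeping ourselves as the owner
  const file = await drive.files.create({
    requestBody: {
      name: "album-track-01.mp3",
      // Prevent users with edit access from re-sharing the file
      writersCanShare: false,
      // Optionally also restrict download/print/copy for readers and commenters
      copyRequiresWriterPermission: true,
    },
    media: { mimeType: "audio/mpeg", body: fs.createReadStream("album-track-01.mp3") },
    fields: "id",
  });

  // 2) Grant the buyer read-only access; since we remain the owner,
  //    they cannot change the sharing settings themselves
  await drive.permissions.create({
    fileId: file.data.id!,
    requestBody: { type: "user", role: "reader", emailAddress: buyerEmail },
    sendNotificationEmail: false,
  });

  return file.data.id!;
}
```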