Forge App and BIM 360 custom integration correct approach

We are working on a web application that connects to BIM 360 using 3-legged authentication. We have created an app in Autodesk Forge that users can connect to through a BIM 360 custom integration.
We are considering using one Forge application with multiple BIM 360 custom integrations, but we are not sure if this is the best approach. The alternative is to have one Forge app per BIM 360 custom integration.
Our main concern is data security. We think the single-app approach is correct, but we would like to check with you and get your thoughts. Is this a safe and correct way to implement this authentication workflow in this type of scenario?
I would appreciate any guidance based on your experience.

You can use both approaches.
With one app for each BIM 360/ACC hub, the apps would be completely isolated, though it will require extra work to manage all of them individually. Currently, there is no API to handle app creation or activation/deactivation of services. Even with one app per hub, you'll still need to ensure you're not exposing too much data to the wrong user; that is avoided by using 3-legged authentication.
With one app for many hubs, you can ensure users access only their own data by limiting your application to always use 3-legged authentication, as in the sketch below.
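For illustration, here is a minimal sketch of that 3-legged flow against the APS authentication v2 endpoints (Node 18+ with the built-in fetch; the client ID, secret, redirect URI, and environment variable names are placeholders for your own app's values):

```typescript
// Minimal sketch of the 3-legged flow, assuming APS authentication v2 and Node 18+.
const CLIENT_ID = process.env.APS_CLIENT_ID!;
const CLIENT_SECRET = process.env.APS_CLIENT_SECRET!;
const REDIRECT_URI = "http://localhost:3000/callback"; // must match the app configuration

// Step 1: send the user to the authorization page to sign in and consent.
function getAuthorizationUrl(): string {
  const params = new URLSearchParams({
    response_type: "code",
    client_id: CLIENT_ID,
    redirect_uri: REDIRECT_URI,
    scope: "data:read",
  });
  return `https://developer.api.autodesk.com/authentication/v2/authorize?${params}`;
}

// Step 2: exchange the authorization code (received on the callback) for a user token.
async function exchangeCode(code: string): Promise<string> {
  const resp = await fetch("https://developer.api.autodesk.com/authentication/v2/token", {
    method: "POST",
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
      Authorization: "Basic " + Buffer.from(`${CLIENT_ID}:${CLIENT_SECRET}`).toString("base64"),
    },
    body: new URLSearchParams({ grant_type: "authorization_code", code, redirect_uri: REDIRECT_URI }),
  });
  return (await resp.json()).access_token;
}

// Step 3: calls made with this token only see what the signed-in user can see.
async function listHubs(accessToken: string) {
  const resp = await fetch("https://developer.api.autodesk.com/project/v1/hubs", {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  return resp.json();
}
```

Because the token belongs to the signed-in user, the hubs call only returns the BIM 360/ACC hubs that user is a member of, which is what keeps one app safe to share across many custom integrations.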
You can refer to the blog posts below for recommendations:
https://aps.autodesk.com/blog/security-recommendations-bim-360-app-developers
https://aps.autodesk.com/blog/security-recommendations-bim-360-app-developers-select-account

Related

Is there a way to store images in Azure or AWS without hosting the application on their platforms?

I am currently developing a full-stack web application for the first time. It is a store that needs to give users the ability to upload "books", edit, delete, and manage them. As of now, I have a React front-end, that calls an Express API using Axios that queries a MySQL database. This currently can manage the product details, titles, and simple labels and relations.
However, I now need to store images and .json files dynamically as well. From my research, Azure Storage seems like the right fit for storing these images and making them accessible to the end user, and the client would like to use Azure Storage as well.
I have gotten quite overwhelmed looking into the Azure documentation for JavaScript image uploading, and every "tutorial" starts with creating a web application, a storage account, and an app service.
All I would like to do is store the users' images in Azure Storage, so that when I eventually deploy the website, the data is available to be accessed, and my front end or API can call Azure to get the images for the user. I apologize if the question is confusing or vague; I am really just overwhelmed by Azure's documentation, and this seems like such a simple and common problem. Any guidance or references would be greatly appreciated.
Yes, it's quite possible. You simply create an Azure Storage account and upload your files as blobs via that account. Provided the container's access level allows public reads, they will then be available on the Internet, and your web application can reference them from wherever it is hosted.
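A minimal sketch of the server-side upload using the @azure/storage-blob package (the connection-string environment variable and the container name "book-images" are assumptions for illustration):

```typescript
// Minimal sketch, assuming a connection string in an env variable and a
// hypothetical "book-images" container.
import { BlobServiceClient } from "@azure/storage-blob";

async function uploadImage(localPath: string, blobName: string): Promise<string> {
  const serviceClient = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING!
  );
  // Create the container once; the "blob" access level makes individual blobs
  // publicly readable (if the storage account permits public access at all).
  const containerClient = serviceClient.getContainerClient("book-images");
  await containerClient.createIfNotExists({ access: "blob" });

  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  await blockBlobClient.uploadFile(localPath); // Node-only; use uploadData in the browser
  return blockBlobClient.url; // store this URL in MySQL and render it from React
}
```

The returned blob URL is what you would persist alongside the product row in MySQL, so the React front end never needs to talk to Azure directly for reads.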
It is possible; it is quite common practice to use a Content Delivery Network to deliver the images. I am not very familiar with Azure, but I am with AWS, and I can tell you that you can use an AWS S3 bucket to store the images and JSON. It comes with many configuration options that allow content to be protected or opened to the general internet.
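For comparison, the S3 equivalent with the AWS SDK v3 is just as small (the bucket name, region, and content type here are hypothetical):

```typescript
// Minimal S3 sketch using @aws-sdk/client-s3 (v3); bucket and region are placeholders.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { readFile } from "node:fs/promises";

const s3 = new S3Client({ region: "us-east-1" });

async function uploadToS3(localPath: string, key: string): Promise<string> {
  await s3.send(new PutObjectCommand({
    Bucket: "my-book-images", // hypothetical bucket name
    Key: key,
    Body: await readFile(localPath),
    ContentType: "image/png",
  }));
  // Publicly reachable only if the bucket policy (or a CloudFront distribution) allows it.
  return `https://my-book-images.s3.amazonaws.com/${key}`;
}
```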

Is there any way to work with a user's local (non-BIM360) file using Design Automation and still get things like user selection?

I am trying to use the Design Automation API to modify a Revit file in place. From what I am reading, it seems that, because you have to upload a Revit model to the cloud, there is no way to modify the model a single user has open on their local Revit client unless the model they are operating on is already on BIM 360 or another cloud service.
I would like to turn all of my Revit add-ins into web services to avoid having to manage versioning and distribution of installers for my add-ins, but it seems there is no way to make the Design Automation API work on a local file that a user has open and modify it in place.
I would also need the ability to get information about what the user is currently selecting in the model to make some of these add-ins work.
Could anyone shed some light on whether what I'm asking is possible with Forge in any way, or if it's just a pipe dream for now?
Unfortunately, just as you have already noted, the Forge environment is mainly oriented towards viewing CAD files.
The Design Automation API adds the possibility to programmatically modify them in fully automated batch mode.
In all cases, however, the CAD engine is used to manipulate the seed CAD file, in this case the Revit BIM.
For all end user interaction with and modification of the Revit model, you will still be making use of a standard Revit end user installation on the Windows desktop.
You can certainly modify your add-ins to run fully automated in the Forge Design Automation environment (see the work item sketch below).
However, they will not be able to execute in place on locally stored Revit models.
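To make the constraint concrete, here is a minimal sketch of a Design Automation v3 work item submission; the activity id and the signed input/output URLs are placeholders for your own setup. Every argument is a URL the service downloads from or uploads to, which is exactly why a model that exists only in the user's open desktop session is out of reach:

```typescript
// Minimal sketch of submitting a Design Automation v3 work item.
// The activity id and signed URLs below are hypothetical placeholders.
async function submitWorkItem(accessToken: string) {
  const resp = await fetch("https://developer.api.autodesk.com/da/us-east/v3/workitems", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`, // 2-legged token with the code:all scope
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      activityId: "MyNickname.MyRevitActivity+prod", // hypothetical activity
      arguments: {
        // The input model must be reachable by URL (OSS, BIM 360, S3, ...);
        // a file that is only open on the user's desktop cannot be referenced here.
        rvtFile: { url: "https://example.com/signed-url-to-input.rvt" },
        result: { verb: "put", url: "https://example.com/signed-url-for-output.rvt" },
      },
    }),
  });
  return resp.json(); // contains the work item id, which you then poll for status
}
```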
I hope this clarifies.

Google Apps Script web app deployment for many users

I programmed a web app and now I need to deploy it for my organization. There are 500 potential users. What factors do I need to consider before deployment? Is there an easy way to deploy it? Are there limitations to consider? What risks are there? Is there a model you can share, or can you give tips or experiences for the deployment?
There are no definable risks or dangers; it all depends on what your application does.
When you choose who is authorized to access the app, you will have to select anyone within your organization; otherwise, only you (or the account with which the script was created) will be able to access it. As for the execution mode, you will have to choose whether the app runs as you or as the user who accesses it. This is important because, if the application accesses an external service such as Google Analytics, it can do so either as the user with whom the application was created or as the user who is accessing it; based on that selection, you will see the Analytics data of one account or the other. The sketch below shows how that choice surfaces in code.
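A minimal sketch of that distinction (written as TypeScript, which is valid if you develop with clasp and @types/google-apps-script; in the Apps Script editor the same code works without the type annotation):

```typescript
// The effective user is the account the script executes as ("execute as" setting);
// the active user is the person visiting the web app (visible in same-domain deployments).
function doGet(): GoogleAppsScript.Content.TextOutput {
  const activeUser = Session.getActiveUser().getEmail();       // the visitor
  const effectiveUser = Session.getEffectiveUser().getEmail(); // the account it runs as
  return ContentService.createTextOutput(
    `Running as ${effectiveUser}, accessed by ${activeUser}`
  );
}
```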
So it all depends on what your application does and how it is to be used.

Azure APIM Data Level Authentication - How do people accomplish this?

Scenario: an externally exposed API that connects to multiple backend DBs. Multiple customers can use the API, and they obviously should only have access to their own data. In the past, this has been done with separate accounts for each customer/user, and consequently each account would need setting up in each of the backend systems with the correct authorities.
Problem: I want to use Azure APIM. I don't want the extra maintenance for each user in both Azure APIM and the backend DBs. I was wondering if anyone has thoughts on, or cases where they accomplished, this in a different way. The API may also be built to access the tables via one account with full access.
I'm sure there are different ways to approach this, but a common way to do it, I believe, would be using Application Roles.
I don't believe this is really dependent on Azure APIM as such, but you can leverage OAuth 2.0 support to pre-authorize requests and in your backend, depending on the claims present in the token passed, you can allow/deny access to the data.
Your backend would usually authenticate to the different DBs as itself, with full access to all data, and your backend would be tasked with making sure only people with the right claims can access the data.
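A minimal Express sketch of that pattern, assuming APIM's validate-jwt policy (or equivalent middleware) has already verified the token's signature, and that the token carries a hypothetical "tenantId" claim identifying the customer:

```typescript
import express, { Request, Response } from "express";

const app = express();

// Extract the tenant from the (already validated) bearer token.
function tenantFromToken(req: Request): string {
  const token = (req.headers.authorization ?? "").replace(/^Bearer\s+/i, "");
  const payload = JSON.parse(Buffer.from(token.split(".")[1], "base64url").toString());
  return payload.tenantId; // hypothetical claim name
}

app.get("/orders", (req: Request, res: Response) => {
  const tenantId = tenantFromToken(req);
  // The backend talks to the DB as itself, with full access, but every query is
  // scoped to the caller's tenant, so customers only ever see their own rows.
  // const rows = await db.query("SELECT * FROM orders WHERE tenant_id = ?", [tenantId]);
  res.json({ tenantId });
});

app.listen(3000);
```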
In order to use an API, the user/customer has to register with the Developer Portal and get a subscription to a given API and the associated key, so you do have to authenticate them. When you publish APIs through Azure API Management, it's easy and common to secure access to those APIs by using subscription keys. Client applications that consume the published APIs must include a valid subscription key in HTTP requests when they call those APIs. https://learn.microsoft.com/en-us/azure/api-management/api-management-subscriptions
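From the client side it looks like this (the APIM hostname, path, and environment variable are placeholders; the subscription key goes in the Ocp-Apim-Subscription-Key header alongside the OAuth token used for the data-level claims):

```typescript
// Minimal sketch of calling an APIM-fronted API with a subscription key.
async function callApi(userAccessToken: string) {
  const resp = await fetch("https://my-apim-instance.azure-api.net/orders", {
    headers: {
      "Ocp-Apim-Subscription-Key": process.env.APIM_SUBSCRIPTION_KEY!,
      Authorization: `Bearer ${userAccessToken}`,
    },
  });
  return resp.json();
}
```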
The Developer Portal supports different authentication mechanisms including Azure AD. So if you plan to use Azure AD for your authentication for both portals you will need to configure it accordingly. https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-aad

Autodesk Forge: How to store 3D models on Box

I am new to Autodesk services. We have 1200+ 3D models averaging 40 MB each. These models were built in .fbx format using Autodesk products. I have a few doubts about how to proceed; please let me know. Thank you in advance.
Can we put our models on Box, convert them into SVF using the Model Derivative API, save the output back to the Box account, and have the Viewer fetch the model files from Box itself?
If yes, can you give some references on how to proceed?
If no, what is the cost of Autodesk's storage? I am aware that the free trial provides 5 GB free, but what is the cost for further storage?
My client is in the US and the customers are from the US only, but later on we may move it worldwide. If needed, can we change the region later on?
For now, as I am from India, can I select the region as India, because the cost is like 0.6?
As we are giving access to models only to authenticated users, is Autodesk fully secure? What storage method/place will be safe?
From my Forge account, how can I see storage usage?
Performance-wise, which is the best solution for storing 3D models?
In order to use the Model Derivative APIs, your models must be uploaded to Forge using the Data Management APIs. After that, you could download the generated viewable files to your own storage; however, this is not an officially supported workflow.
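As a concrete starting point, here is a minimal sketch of the first step of that workflow: getting a 2-legged token and creating an OSS bucket to upload into (the client credentials and bucket key are placeholders; note the x-ads-region header, which relates to the region question below). Files are then uploaded into the bucket through the OSS objects endpoints:

```typescript
// Minimal sketch, assuming APS/Forge credentials in env variables and a
// hypothetical bucket key; bucket keys must be globally unique and lowercase.
async function getTwoLeggedToken(): Promise<string> {
  const resp = await fetch("https://developer.api.autodesk.com/authentication/v2/token", {
    method: "POST",
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
      Authorization: "Basic " + Buffer.from(
        `${process.env.APS_CLIENT_ID}:${process.env.APS_CLIENT_SECRET}`
      ).toString("base64"),
    },
    body: new URLSearchParams({ grant_type: "client_credentials", scope: "bucket:create data:write" }),
  });
  return (await resp.json()).access_token;
}

async function createBucket(token: string) {
  const resp = await fetch("https://developer.api.autodesk.com/oss/v2/buckets", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
      "x-ads-region": "US", // or "EMEA"; the region is fixed at bucket creation time
    },
    body: JSON.stringify({ bucketKey: "my-fbx-models", policyKey: "persistent" }),
  });
  return resp.json();
}
```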
If you still want to serve Forge models from your own storage, this blog post shows how to obtain a list of all files that must be downloaded for a specific model.
With a non-trial Forge subscription, which costs $100 a year, there's no storage limit.
The data in Forge can be accessed from anywhere. In case you need to control the actual data center where your data should live, you can choose a region when creating a bucket for your data; at this point, the two available geos are US and EMEA.
Forge is secure; you can control access to content stored with the Data Management APIs using 2-legged or 3-legged OAuth.
While in the trial, this information is not available. And as I mentioned above, with an active Forge subscription there's no storage limit.
I would recommend storing your CAD data in Forge. It's performant, safe, and again, the Model Derivative APIs can translate models from the Forge storage directly.