Is there any way to work with a user's local (non-BIM360) file using design automation and still get things like user selection? - autodesk-forge

I am trying to use the Design Automation API to modify a Revit file in place. From what I have read, it seems that, because you have to upload a Revit model to the cloud, there is no way to modify a single user's Revit model that they have open in their local Revit client unless the model they are operating on is already on BIM360 or another cloud service.
I would like to turn all of my Revit add-ins into web services to avoid having to manage versioning and distribution of installers, but it seems that there is no way to make the Design Automation API work on a local file that a user has open and modify it in place.
I would also need the ability to get information about what the user is currently selecting in the model for some of these add-ins to work.
Could anyone shed some light on whether what I'm asking is possible with Forge in any way, or if it's just a pipe dream for now?

Unfortunately, just as you have already noted, the Forge environment is mainly oriented towards viewing CAD files.
The Design Automation API adds the possibility to programmatically modify them in fully automated batch mode.
In all cases, however, the CAD engine is used to manipulate the seed CAD file, in this case the Revit BIM.
For all end user interaction with and modification of the Revit model, you will still be making use of a standard Revit end user installation on the Windows desktop.
You can certainly modify your add-ins to run fully automated in the Forge Design Automation environment.
However, they will not be able to execute on locally stored Revit models in place.
I hope this clarifies.
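To make the trade-off concrete, here is a minimal sketch of what the fully automated flow looks like from the outside: the model is uploaded to cloud storage first, a work item is posted against a published activity, and the modified file is written back to a signed URL. The endpoint path follows the Design Automation v3 documentation; the activity id and URLs below are placeholders, not anything from the question.

```python
import json

# Hypothetical endpoint/activity; the real activity id comes from your own
# published Design Automation activity.
DA_ENDPOINT = "https://developer.api.autodesk.com/da/us-east/v3/workitems"

def build_workitem(activity_id, input_url, output_url, token):
    """Build URL, headers, and JSON body for a Design Automation work item.

    The Revit file must already live in cloud storage (an OSS bucket,
    BIM 360, etc.); Design Automation cannot reach a file on the user's
    local disk, which is the limitation described above.
    """
    body = {
        "activityId": activity_id,
        "arguments": {
            # Signed URL the engine downloads the .rvt from:
            "rvtFile": {"url": input_url},
            # Signed URL the modified model is uploaded back to:
            "result": {"verb": "put", "url": output_url},
        },
    }
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return DA_ENDPOINT, headers, json.dumps(body)
```

Any HTTP client can POST the returned body to the returned URL; note that nothing in this flow ever touches the file open on the user's desktop, and user selection is likewise unavailable, because the add-in runs against the uploaded copy in a headless engine.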

Related

Forge App and BIM 360 custom integration correct approach

We are working on a web application that connects to BIM360 using 3-Legged authentication. We have created an app in Autodesk Forge that users can connect through a BIM360 Custom Integration.
We are considering using 1 Forge application with multiple BIM360 Custom Integrations, but we are not sure if this is the best approach. The alternative is to have 1 Forge app per BIM360 custom integration.
Our main concern is data security. We think this approach is correct, but we would like to check with you and get your thoughts. Is this a safe and correct way to implement this authentication workflow in this type of scenario?
I would appreciate any guidance based on your experience.
You can use both approaches.
With one app for each BIM 360/ACC hub, the apps would be completely isolated, though it will require extra work to manage all of them individually. Currently, we don't have an API to handle app creation or activation/deactivation of services. Even with one app per hub, you'll still need to ensure you're not exposing too much data to the wrong user. That can be avoided by using 3-legged authentication.
With one app for many hubs, you can ensure the users are accessing only their data by limiting your application to always use 3-Legged authentication.
You can refer to the blog posts below for recommendations:
https://aps.autodesk.com/blog/security-recommendations-bim-360-app-developers
https://aps.autodesk.com/blog/security-recommendations-bim-360-app-developers-select-account
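In line with those recommendations, the core of the 3-legged flow is simply redirecting the user to Autodesk's authorize endpoint so the resulting token is bound to their identity. A sketch of building that redirect URL (the endpoint path follows the current APS OAuth v2 documentation; the client id and callback are placeholders):

```python
from urllib.parse import urlencode

# Endpoint path per the current APS OAuth documentation.
AUTHORIZE_URL = "https://developer.api.autodesk.com/authentication/v2/authorize"

def build_authorize_url(client_id, callback_url, scopes):
    """Build the URL the user is redirected to for 3-legged consent.

    The token obtained after the user signs in is bound to that user, so
    the APIs only ever return data they can already see in their hub.
    """
    params = {
        "response_type": "code",          # authorization-code grant
        "client_id": client_id,
        "redirect_uri": callback_url,     # must match the app's registered callback
        "scope": " ".join(scopes),        # e.g. "data:read data:write"
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"
```

This is why a single app serving many hubs can still be safe: the server never uses a 2-legged token to read hub data on a user's behalf.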

Is there a way to store images in Azure or AWS without hosting the application on their platforms?

I am currently developing a full-stack web application for the first time. It is a store that needs to give users the ability to upload "books", edit, delete, and manage them. As of now, I have a React front-end, that calls an Express API using Axios that queries a MySQL database. This currently can manage the product details, titles, and simple labels and relations.
However, I now need to store images and .json files dynamically as well. I have done some research and need to use Azure Storage to store these images and make them accessible to the end user; the client would like to use Azure Storage as well.
I have gotten quite overwhelmed looking through the Azure documentation for JavaScript image uploading, and every "tutorial" starts with creating a web application, a storage account, and an App Service.
All I would like to do is store user-uploaded images in Azure Storage, so that when I eventually deploy the website the data is available, and my front end or API can call Azure to fetch the images for the user. I apologize if the question is confusing or vague; I am just overwhelmed by Azure's documentation, and this seems like such a simple and common problem. Any guidance or references would be greatly appreciated.
Yes, it's quite possible. You simply create an Azure Storage account and upload your files as blobs via that account. Then they will be available publicly on the Internet, and your web application can reference them from wherever it is hosted.
It is possible; it is quite common practice to have a Content Delivery Network deliver the images. I am not very familiar with Azure, but I am with AWS, and I can tell you that you can use an AWS S3 bucket to store the images and JSON. It comes with many different configuration options that allow content to be protected or open to the general internet.
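For the Azure route, it may help to see how little is actually required once a storage account exists: a single-shot blob upload is one authenticated PUT. The sketch below only builds the request pieces and assumes a SAS token generated beforehand in the portal or via the SDK; the account, container, and file names are placeholders:

```python
def build_blob_upload_request(account, container, blob_name, sas_token,
                              content_type="image/png"):
    """Build URL and headers for a single-shot blob upload ("Put Blob").

    Assumes a pre-generated SAS token. No App Service is involved: the
    storage account stands alone, and any HTTP client can PUT to it from
    wherever the application is hosted.
    """
    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"
    headers = {
        # Required by the Blob service for a simple one-request upload:
        "x-ms-blob-type": "BlockBlob",
        "Content-Type": content_type,
    }
    return url, headers
```

The `x-ms-blob-type: BlockBlob` header is what the Blob REST API requires for a simple one-request upload; large files would be split into block lists instead, which is what the `@azure/storage-blob` SDK does for you under the hood.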

Amazon MWS and Microsoft Access for a DB Layman?

I have some experience in MS Access, but mostly only as an offline DB tool.
I have begun working with both Seller and Vendor Central at my new company, and am in charge of scrubbing the vast amount of data for trends and whatnot. At the moment our company is solely relying on exporting reports from Seller Central directly, and cross referencing documents. I was hoping to get us started with a rudimentary database hooked into Seller directly. Our company already has a MWS Developer ID, and I see an MWS Access Key and whatnot.
I'm surprised not to find any resources on how I should actually connect MWS to Access. I feel confident that I can find some success by dabbling with the API once it's connected, but I can't find any references on how to actually establish that connection.
Any resources you guys can forward me? Maybe I'm searching for the wrong terms. Everything I search just comes up with data service companies advertising their tools.
Well, the interface to MWS is going to be web service based. And Access, unfortunately, does not have a built-in web services interface.
So, your choices are:
Write some VBA code to hit/use/consume the MWS web services. Web services are just that: a web API (likely REST services; REST is just a fancy term meaning you send requests to a given URL).
So, what are you looking to search for?
How can I consume web-based data in Access?
See this answer on SO:
Making a SOAP request from Access 2007
The main issue is that Access does not have really good tools for consuming web data.
However, most web "store" applications tend to have a user area in which you can export the daily sales or data to, say, CSV. You can then import that data into Access (or Excel).
And they often have a report area: you can generate a report and then download it in some format like XML or CSV (and again, import it into Access or Excel).
If you don't want to have to manually import the data?
Then you have to code out web requests. And that can be painful.
This unfortunately means you can't use, say, a linked table (ODBC) the way you can from Access to some database.
So, you have to start writing web interface code (it will be SOAP or REST).
Believe it or not, there was a SOAP add-in toolkit for Access 2003. But no one used it, so they dropped it. (Of course, now, 17 years later, a truckload of people get it and see the need to consume web data!)
So, your question, and what to learn about?
You are asking how one consumes web services.
Well, using a tool designed to work with web services helps a lot (that's why I suggest Visual Studio and .NET). If they have a WSDL for you? Then you can point Visual Studio at it, and it will crank out a set of "methods" and properties for you (it will create a class). But then again, did you ever write class objects in VBA? (VBA does support creating classes, but the SOAP toolkit, no longer available, would write this code for you.)
So, if you want to go beyond their built-in reporting tools (which let you export and download the data in some format like CSV for use with Access or Excel)?
Then you have to write code to make web calls.
This is not a lot different from the past. If you wanted some data from the accounting system? Well, you could usually do some export from the accounting package to spit out a CSV file of some sort. You then imported it into Access.
However, if you had better skills, you might link up to the database from Access using ODBC and then write some SQL queries against that data. So, it really comes down to skill level here. Some could not be bothered to learn, say, SQL and a query, so they just exported the data out of accounting and then imported it into Access.
The problem is that now you can't link to that web site and run SQL queries against the data. You have to use web service calls (at least if you want to make some of this process automatic).
So, you might be just fine exporting data/files from the MWS reporting tools and then just importing them into Excel or Access. That way you're not writing any code; you just use the Access GUI to import the data.
But some want to just hit a button in Access and see all the orders and sales from today, having Access pull that data from the web site with one click.
For some simple data pulls? You could make a web call from Access. But for complex web interfaces? Then you need to use tools that support web interfacing (say, Visual Studio and .NET).
For a simple data pull? I'll use VBA and MSXML.
But if the parameters and the data call are complex? Then I write it in .NET, and THEN expose that code as a consumable library to MS Access.
So, once you've signed up for MWS and whatever web services? They will supply you with the web calls and documentation. You are then free to use your programming tools of choice to interface. But this can be quite a bit of work. So you might use VBA, but .NET is much better for this type of work (though it is also a lot more difficult to code).
As a developer who has done this, I would write a "sync" program that connects to MWS, pulls back your data, and then inserts that into MS Access. In my case, it was a C# .NET Core app with SQL Server and I used the available MWS SDK that Amazon provides for free to handle all the API calls to MWS. You can create a schedule so your app pulls the data on an interval, or make it manual where you push a button to sync it into your system.
Of course you can use Java or PHP instead of C#, or you can roll your own MWS API calls. Or like you mention there are several third party vendors that have out-of-the-box ready solutions.
I haven't used MS Access in 20 years or so, so I'm not sure about calling MWS directly. I would gather it could be done, but is probably too much work, but I could be wrong. A .NET app can insert into MS Access, no problem, but also handle the HTTP calls to MWS for you.
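Whichever language does the pulling, every MWS call has to be signed the same way, and that is the part the SDKs hide. A rough Python sketch of the canonical-string signing scheme described in the MWS developer guide (the host, path, and parameters here are illustrative, not a working call):

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def sign_mws_request(secret_key, host, path, params):
    """Compute an MWS request signature (SignatureVersion 2, HmacSHA256).

    Parameters are sorted, percent-encoded, and joined into a canonical
    string which is HMAC-SHA256-signed with the secret key; the result is
    sent as one more query parameter on the request.
    """
    canonical_query = "&".join(
        f"{quote(k, safe='')}={quote(str(v), safe='')}"
        for k, v in sorted(params.items())
    )
    # Canonical string: verb, host, path, then the sorted query.
    string_to_sign = f"POST\n{host}\n{path}\n{canonical_query}"
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode()
```

Getting this exactly right (sorting, percent-encoding, line breaks) is fiddly, which is a big part of why the answers above recommend the SDK or a small .NET sync app over hand-rolling the calls in VBA.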

Multiple apps were running in the background, how to close them properly, autodesk forge

This is a Forge app question. The app is created in Visual Studio as a Forge app, and I was debugging. I am working on a Forge application that uses Data Management, Design Automation, and Model Derivative. The Design Automation jobs were closing properly. After integrating Model Derivative, I passed an RFA file by mistake and the app ran endlessly. The model is very small, but initially I made a mistake while passing the model to Model Derivative: I passed an RFA file, and only later realized Model Derivative does not read RFA files. In the meantime I tried many times. After passing an RVT file to Model Derivative, it was very fast and ran perfectly. I deleted those hung apps: when they ran endlessly and did not show the model in the viewer, I closed the app and reran it, or deleted the app, created a new one, and reran. Then when I looked at data usage, it showed multiple apps consuming cloud credits. Is it possible that multiple apps were running simultaneously and the jobs I had deleted did not close properly? The test model was really small.
I had stopped the app abruptly and debugged again. My guess is that multiple apps were running on top of one another. I could be wrong.
Please let me know a few known reasons for an app running in the background, and how to properly end a job on the Forge cloud platform. I simply stopped debugging in Visual Studio, deleted the app in Forge, and created a new one.
The usage graph shows overlapping colors and usage for several apps. My new model for Design Automation is big; my worry is that I will lose all of my cloud credits if I do not close the jobs properly.
How do I stop a project running in the Forge cloud? I am sure I did not stop the apps running in the cloud properly before creating a new Forge app and debugging again. All of them were running. I could be wrong, because I am new to the Forge cloud.
I described above what may have gone wrong. Please let me know what could go wrong. The model was really small.
Maybe ngrok was not closed properly. No idea.
I thought I closed each job before starting a new one.
The Model Derivative API doesn't support the RFA format currently. In my experience, it returns the following message when you submit a translation job on an RFA file with POST Jobs. Therefore, I'm not sure why you're saying the job or the app keeps running endlessly. Could you share more details on that?
{
"diagnostic": "Failed to trigger translation for this file."
}
Besides, I'm confused by your statement below. What kind of app was deleted? The Forge app on your myapp page? It's unclear to me.
I had deleted those hanged apps. When they were running endlessly and
did not show the model in the viewer, I closed the app and rerun the
app or deleted the app and created a new one and rerun.
To cancel or delete translation jobs, you can call DELETE manifest. According to our engineering team, each translation job has a designed expiry time (e.g. 2 hours). If your translation job cannot be completed within that period, the job will be canceled immediately by our service.
Note: To cancel a Design Automation WorkItem, you can call DELETE workitems/:id to stop the job.
Lastly, back to the cloud-credits question: the Model Derivative API charges per translation job when you call POST Jobs, not by processing hour. Currently, only the Design Automation API charges by the processing hours of your work items; see the pricing table here: https://forge.autodesk.com/pricing.
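For reference, both the status poll and the manifest deletion mentioned above go through the same manifest URL, built from the base64-encoded object URN. A small sketch (the URN used in the example is made up):

```python
import base64

BASE = "https://developer.api.autodesk.com/modelderivative/v2/designdata"

def manifest_url(object_urn):
    """Build the Model Derivative manifest URL for a stored object.

    GET on this URL polls translation status; DELETE removes the manifest
    and its derivatives. Per the Model Derivative docs, the URN must be
    base64-encoded, URL-safe and unpadded.
    """
    encoded = base64.urlsafe_b64encode(object_urn.encode()).decode().rstrip("=")
    return f"{BASE}/{encoded}/manifest"
```

Polling this URL until the manifest reports success (instead of restarting the app) is the usual way to tell whether a translation is genuinely stuck or just still running.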
If you have further questions, you can drop me a line to forge[DOT]help[AT]autodesk[DOT]com.
Cheers,

Autodesk Forge: How to store 3D Models on BOX

I am new to Autodesk services. We have 1200+ 3D models, averaging 40MB each, built in .fbx format using Autodesk products. I have a few doubts about how to proceed. Please let me know. Thank you in advance.
Can we put our models on Box, convert them into SVF using the Model Derivative API, and save the output back to the Box account, so that the viewer fetches the model files from Box itself?
If yes, can you give some references for how to proceed?
If no, what is the cost of Autodesk's storage? I am aware that the free trial provides 5 GB free, but what is the cost for storage beyond that?
My client is in the US and the customers are from the US only, but later on we may go worldwide. So, if needed, can we change the region later on?
For now, as I am from India, can I select the region as India, because the cost is like 0.6?
As we are giving access to the models only to authenticated users, is Autodesk fully secure? What storage method/place will be safe?
From my Forge account, how can I see storage usage?
Performance-wise, which is the best solution for storing 3D models?
In order to use Model Derivative APIs, your models must be uploaded to Forge using the Data Management APIs. After that, you could download the generated viewable files to your own storage, however this is not an officially supported workflow.
If you still want to serve Forge models from your own storage, this blog post shows how to obtain a list of all files that must be downloaded for a specific model.
With a non-trial Forge subscription, which costs $100 a year, there's no storage limit.
The data in Forge can be accessed from anywhere. In case you need to control the actual data center where your data should live, you can choose between US and EMEA regions when creating a bucket for your data.
At this point the two available geos are US and EMEA.
Forge is secured; you can control access to content stored with Data Management APIs using 2-legged or 3-legged OAuth.
While in the trial, this information is not available. And as I mentioned above, with an active Forge subscription there's no storage limit.
I would recommend storing your CAD data in Forge. It's performant, safe, and again, the Model Derivative APIs can translate models from the Forge storage directly.
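To tie this together, the translation step itself is a single POST to the Model Derivative job endpoint once a file is in a Forge bucket. A sketch of the request body (the URN value passed in would be the base64-encoded object id returned by the Data Management upload; the one in the usage example below is a placeholder):

```python
import json

def build_translation_job(encoded_urn):
    """JSON body for POST modelderivative/v2/designdata/job.

    `encoded_urn` is the base64-encoded object id of the uploaded file;
    "svf" with 2d/3d views is the viewable format the Forge Viewer
    consumes.
    """
    return json.dumps({
        "input": {"urn": encoded_urn},
        "output": {"formats": [{"type": "svf", "views": ["2d", "3d"]}]},
    })
```

The same body works for .fbx sources; the per-job charge discussed in the earlier answer is incurred when this POST is made, regardless of where the outputs are later copied.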