Get the template of a function app in Azure - azure-template

I'm trying to export the template of a function app, but all I managed to get is the full JSON template of the resource group the function app belongs to.
The Automation script link below generates it. Even when I'm in the function app's settings, it keeps generating the full resource group template.
Thank you for your help

Besides exporting the template from the resource group, you can export the template from a single deployment. Find the function-related deployment name on your resource group page under Settings > Deployments, then click Template and Download.
For more details, you can refer to export an Azure Resource Manager template from existing resources.
Alternatively, you can use the Azure Resource Explorer site to discover the Azure Resource Management APIs, navigate to the resource you are looking for, and just copy/paste its definition.
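If you would rather script the export than click through the portal, here is a minimal sketch (assumptions: Node.js 18+ for the global fetch, a valid ARM bearer token in an AZURE_TOKEN environment variable, and placeholder names) that calls the ARM "export template" endpoint for a single deployment rather than the whole resource group:

```javascript
// Export the template of one deployment via the ARM REST API.
const subscriptionId = "<subscription-id>";   // placeholder
const resourceGroup = "<resource-group>";     // placeholder
const deploymentName = "<deployment-name>";   // the function app's deployment

const url =
  `https://management.azure.com/subscriptions/${subscriptionId}` +
  `/resourcegroups/${resourceGroup}/providers/Microsoft.Resources` +
  `/deployments/${deploymentName}/exportTemplate?api-version=2021-04-01`;

async function exportDeploymentTemplate() {
  const res = await fetch(url, {
    method: "POST",
    headers: { Authorization: `Bearer ${process.env.AZURE_TOKEN}` },
  });
  if (!res.ok) throw new Error(`Export failed with status ${res.status}`);
  const { template } = await res.json(); // the response wraps the template
  console.log(JSON.stringify(template, null, 2));
}

exportDeploymentTemplate().catch(console.error);
```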

Related

Source code is missing from Google Cloud Functions list

I am trying to retrieve the source code of a website/project which was made for me on Fiverr. I've been made the owner of the project.
I followed the advice on previous links (below) to get the source code:
Is it possible to get the source code of a Google Firebase (or Google Cloud Platform) project?
Is it possible to retrieve Firebase Cloud Function source code?
Get code from firebase console which I deployed earlier
For some reason my Google Cloud Functions list does not show "Source" as stated in the previous answers (picture with source missing in functions list).
Is there another way to access the source code from Google Cloud?
Appreciated
J
So it's been over a year, and I have a bit more experience with Firebase now. The correct answer is that you cannot access the full source code: the HTML, CSS, and JavaScript files are deployed to Firebase's servers, and what you can view on the website is only the front-end client-side code as delivered to the browser, which is not the full picture.
Whenever you get a website developed for you, you should always have the developer hand over all the HTML, CSS, and JavaScript files so that you can redeploy the site whenever needed.

How to embed Autodesk Forge model viewer into a website?

I have created a web application for viewing models using the Autodesk Forge Viewer, and I want to be able to add this to a website. I used this tutorial: https://learnforge.autodesk.io/#/tutorials/viewmodels (using Node.js for the language option).
The goal is to have the user access the viewer application from the website. I have been using the VS Code Live Server for testing. However, when I link the page that has the viewer into my own website, the viewer does not load the buckets or allow creation of new buckets. It is just stuck on a loading symbol like below:
[Loading screen][1]
Could I please have the following questions answered:
What is the proper way to embed this application onto a website in the manner I have described above?
What part of the code controls where the buckets are loaded in?
Thank You.
[1]: https://i.stack.imgur.com/4Xlfv.png
The LearnForge tutorial is an example of how to work with the Forge API. As a web app, the user interface, workflow, and data management are up to the developer (you) to design.
For example, you can remove the bucket & object list panel and keep only the viewer in the UI, but you then need to decide how to provide the object id (urn) to load in the viewer (see the sketch below). Normally you would set up your own user management, login process, etc., and define your own user permissions. The user then logs in, the web app lists all the files (objects) they have permission to view, and when one is selected, the app gets its urn and loads the model in the Forge Viewer.
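As a rough illustration of that viewer-only approach, here is a minimal sketch. It assumes the Viewer v7 script and stylesheet are already included on the page, and that /api/forge/oauth/token is a hypothetical endpoint on your own backend that returns { access_token, expires_in }:

```javascript
// Minimal "viewer only" page: no bucket panel, one model loaded by its URN.
const options = {
  env: "AutodeskProduction",
  getAccessToken: async (onTokenReady) => {
    // Hypothetical backend endpoint that returns a viewer access token.
    const res = await fetch("/api/forge/oauth/token");
    const { access_token, expires_in } = await res.json();
    onTokenReady(access_token, expires_in);
  },
};

const urn = "<base64-encoded-object-id>"; // your app decides how to supply this

Autodesk.Viewing.Initializer(options, () => {
  const viewer = new Autodesk.Viewing.GuiViewer3D(
    document.getElementById("forgeViewer") // a <div> on your page
  );
  viewer.start();
  Autodesk.Viewing.Document.load(
    `urn:${urn}`,
    (doc) => viewer.loadDocumentNode(doc, doc.getRoot().getDefaultGeometry()),
    (err) => console.error("Failed to load document:", err)
  );
});
```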
If the end users of your app are BIM 360 users, you could take advantage of the BIM 360 data management workflow, which follows the same permissions specified in BIM 360. In that case the other tutorial will be a good start:
https://learnforge.autodesk.io/#/tutorials/viewhubmodels
In any case, the workflow and UI are defined by you. I hope this explains it. If you have any further questions that need a meeting call, please feel free to check our team's calendar:
https://calendly.com/autodeskforge

How to request an internal API in Apps Script

I have a project in GCP with some internal APIs (in VMs), Databases, and so on.
Now, I'm trying to create a GAS to obtain data from one of those internal APIs (Druid, in my case) to print some data in a Google Spreadsheet.
My point here is that I linked the GAS to my GCP project, expecting to be able to connect to my internal IP (10.1.0.x), which is in a VPC shared with the default one. So, if I start a new VM attached to the default network, I am able to ping and connect to it. Seems reasonable.
But when I execute the GAS function, the following piece of code fails: UrlFetchApp.fetch('http://10.1.0.3:8082/druid/v2/?pretty', options);.
Should I configure something else in the GCP project to be able to connect to internal APIs?
Should I change the way and use another GCP service to do so?
Any help would be more than appreciated!
Thanks
That is not supported in Apps Script: UrlFetchApp runs on Google's servers, not inside your VPC, so it cannot reach private addresses. You could file a feature request in the issue tracker; see https://developers.google.com/apps-script/support#missing_features
Calls to internal APIs therefore have to run client-side and need to be incorporated within JavaScript code served to the browser.
Within Apps Script you can serve client-side JavaScript if you deploy your project as a Web App.
You can then either embed the JS code within an HTML file attached to the project, or insert it directly within <script></script> tags in HtmlService.createHtmlOutput(), as sketched below.
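A minimal sketch of that Web App approach follows. Big assumptions, flagged here and in the comments: the user's browser must itself be able to reach 10.1.0.3 (e.g. over a VPN into the VPC), and the browser must allow the request (an HTTPS-served page calling a plain-HTTP endpoint is typically blocked as mixed content). The query body is just an illustrative Druid timeBoundary query against a hypothetical my_source data source:

```javascript
// Code.gs — deploy as a Web App; the fetch below runs in the user's browser,
// not on Google's servers, so it can reach the VPC only if the browser can.
function doGet() {
  return HtmlService.createHtmlOutput(`
    <pre id="result">Loading…</pre>
    <script>
      fetch('http://10.1.0.3:8082/druid/v2/?pretty', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        // Illustrative native query; "my_source" is a hypothetical data source.
        body: JSON.stringify({ queryType: 'timeBoundary', dataSource: 'my_source' })
      })
        .then((r) => r.json())
        .then((data) => {
          document.getElementById('result').textContent =
            JSON.stringify(data, null, 2);
        });
    </script>
  `);
}
```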

Hosting a JSON file for a 3rd-party app/service to use

We currently use Jive Cloud N which can use the Rest API and allows the use of Custom Apps. Our UI devs have created an app which uses a JS GET to pull data from a JSON file for our "Birthdays and Anniversaries" tile.
At the moment, the JSON file is hosted on our UI dev's Google Cloud Apps account, but we wish to host it internally so we don't have to keep contacting them for changes.
I uploaded the file to our OneDrive for Business storage and created a public URL with full read permissions, but the Jive platform throws an error trying to load the custom app: the file "has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present".
Our dev said that to get it working on his Google Cloud App storage, he had to specify the Access-Control-Allow-Origin header in the server's app.yaml file. I don't know what this is, or whether there is an equivalent for ODfB/SharePoint.
To get to my question: how can I host this JSON file on ODfB, or even somewhere on our Azure tenancy, so that it can be used? Or am I better off setting up a Google Cloud App storage location and replicating our dev's setup? FYI, I'd prefer the former because we're already using M$ for a number of cloud-hosted services.
Thanks in advance
Per my understanding, you could leverage Azure Blob Storage to store your JSON file, and you could use Microsoft Azure Storage Explorer to easily manage/share your files.
Moreover, you can manage anonymous read access to your containers and blobs; refer to this tutorial for more details. You could also leverage SAS to grant other clients limited access to your storage account; you could follow this tutorial for getting started with SAS.
As a simple approach, you could create your storage account and use Microsoft Azure Storage Explorer to manage/share your file as follows:
For cross-domain access, you need to configure the CORS settings; a sketch of doing this in code follows:
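This can be done in Storage Explorer or programmatically. Below is a minimal sketch using the @azure/storage-blob npm package; the connection string environment variable and the Jive origin are placeholders:

```javascript
// Allow cross-origin GETs on the storage account hosting the JSON file.
const { BlobServiceClient } = require("@azure/storage-blob");

async function allowJiveOrigin() {
  const client = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING // assumed to be set
  );
  await client.setProperties({
    cors: [
      {
        allowedOrigins: "https://your-jive-instance.example.com", // your Jive URL
        allowedMethods: "GET", // the tile only needs to read the JSON
        allowedHeaders: "*",
        exposedHeaders: "*",
        maxAgeInSeconds: 3600,
      },
    ],
  });
}

allowJiveOrigin().catch(console.error);
```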
For sharing your file (blob), you could set the container's public access level, or leverage SAS to grant other clients limited access to your file, as follows:
Right-click your container and select "Set Public Access Level":
Sample file for share: https://brucechen.blob.core.windows.net/brucechen/index.json
Also, you could right-click your JSON file and click "Get Shared Access Signature":
Sample file for share: https://brucechen.blob.core.windows.net/brucechen/index.json?st=2017-02-28T08%3A04%3A00Z&se=2017-09-01T08%3A04%3A00Z&sp=r&sv=2015-12-11&sr=b&sig=rVkorHeNOd4j2YhkmmxZ6DfXVLf1FoN2smY6mNRIoWs%3D
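A read-only SAS like the one above can also be generated in code; here is a minimal sketch with the same @azure/storage-blob package, where the account name/key and container/blob names are placeholders:

```javascript
// Generate a read-only SAS URL for the JSON blob.
const {
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
  BlobSASPermissions,
} = require("@azure/storage-blob");

const account = "<account-name>";   // placeholder
const accountKey = "<account-key>"; // placeholder
const credential = new StorageSharedKeyCredential(account, accountKey);

const sas = generateBlobSASQueryParameters(
  {
    containerName: "<container>",   // placeholder
    blobName: "index.json",
    permissions: BlobSASPermissions.parse("r"), // read-only
    expiresOn: new Date(Date.now() + 180 * 24 * 60 * 60 * 1000), // ~6 months out
  },
  credential
).toString();

console.log(
  `https://${account}.blob.core.windows.net/<container>/index.json?${sas}`
);
```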

Google Drive Live API: Server Export of Collaboration Document

I have a requirement to build an application with the following features:
Statistical and Source data is presented on simple HTML pages
Some missing Source data can be added from that HTML page (data will be both exact numerical values and descriptive text)
Some new Source data can be added from those pages
Confirmed and verified data will NOT be editable via the HTML interface
Data is stored and made continuously available via the HTML interface
Periodically the data added/changed from the interface needs to be pulled back into the source data - but in a VERY controlled way. All data changes and submissions will need verification and checking - and some will trigger re-runs of models ( some of which take hours to run ).
In terms of overview architecture I have:
Large DB that stores and manages the data - this is designed for import processes and analysis. It is not ideal for web presentation or an interface
Code servers that manipulate the data for imports and analysis
Frontend server that works as a proxy to add a layer of security to S3
Collection of generated html files on S3 presenting the data required
Before reading about the Google Drive Realtime API my rough plan was to simply serialize data from the HTML interface and post to S3. The import server scripts would then check for new information, grab it, check it, log it and process it into the main data set.
That basic process, however, would mean that once changes were submitted from the web page, they would be lost from the user's view until they had been processed by the backend.
With the Google Drive Realtime API it would appear I could get the best of both worlds.
However for the above to work I would need to be able to access the Collaboration Document in code from the code servers and export the data.
The Realtime API gives JavaScript access to export and hand off to a function; however, in my use case I want to automate the export from the Collaboration Document.
The Google Drive SDK does not, as far as I can see, give any hints on downloading/exporting a file of type "Collaboration File".
What "non-browser-user" triggered methods are there for interfacing with the Collaboration Documents and exporting them?
David
Server-side export is not supported right now. What you can do is save the realtime model to a regular Drive file and read from that using the standard Drive API. See https://developers.google.com/drive/realtime/models-files for some discussion of different ways to set up interactions between realtime models and Drive files.
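As a minimal sketch of the server-side half, assuming the realtime model has already been saved into a regular Drive file, and assuming the googleapis Node client with an authorized credential (e.g. a service account the file is shared with):

```javascript
// Read the saved model's content from Drive with the standard Drive API.
const { google } = require("googleapis");

async function fetchSavedModel(auth, fileId) {
  const drive = google.drive({ version: "v2", auth }); // v2 was current at the time
  const res = await drive.files.get(
    { fileId, alt: "media" }, // alt=media downloads the file content
    { responseType: "text" }
  );
  return res.data;            // the serialized model as a string
}
```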