hosting a JSON file for a 3rd party app/service to use - json

We currently use Jive Cloud N, which can use the REST API and allows the use of custom apps. Our UI devs have created an app that uses a JS GET to pull data from a JSON file for our "Birthdays and Anniversaries" tile.
At the moment, the JSON file is hosted on our UI dev's Google Cloud Apps account, but we wish to host it internally so we don't have to keep contacting them for changes.
I uploaded the file to our OneDrive for Business storage and created a public URL with full read permissions, but the Jive platform throws an error when trying to load the custom app.
The error is that the file has been blocked by CORS policy: no "Access-Control-Allow-Origin" header is present.
Our dev said that to get it working on his Google Cloud App storage, he had to specify the Access-Control-Allow-Origin field in the server's app.yaml file. I don't know what this is, or whether there is an equivalent for ODfB/SharePoint.
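For reference, from what I can tell, the App Engine setting he described is an http_headers entry on a static handler in app.yaml; a rough sketch (the URL and directory are placeholders):

# app.yaml (Google App Engine) - hypothetical static handler serving the JSON
handlers:
- url: /data
  static_dir: data
  http_headers:
    Access-Control-Allow-Origin: "*"   # or the specific Jive origin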
To get to my question: how can I host this JSON file on ODfB, or even somewhere on our Azure tenancy, so that it can be used? Or am I better off trying to set up a Google Cloud App storage location and replicating our dev's setup? FYI - I'd prefer the former because we're already using M$ for a number of cloud-hosted services.
Thanks in advance

Per my understanding, you could leverage Azure Blob Storage to store your JSON file, and you could use Microsoft Azure Storage Explorer to easily manage/share your files.
Moreover, you could manage anonymous read access to your containers and blobs; refer to this tutorial for more details. You could also leverage SAS (shared access signatures) to grant other clients limited access to your storage account; follow this tutorial to get started with SAS.
For a simple approach, you could create your storage account and use Microsoft Azure Storage Explorer to manage and share your file.
For cross-domain access, you need to configure the CORS settings on the storage account so that the Jive origin is allowed to GET the blob.
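As a hedged sketch, here is how such a CORS rule could be set programmatically with the @azure/storage-blob SDK; the account name, key variable, and Jive origin below are placeholder assumptions:

const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");

async function allowJiveOrigin() {
  const client = new BlobServiceClient(
    "https://myaccount.blob.core.windows.net", // placeholder account
    new StorageSharedKeyCredential("myaccount", process.env.AZURE_STORAGE_KEY)
  );
  // Allow the Jive origin to GET blobs from this account
  await client.setProperties({
    cors: [{
      allowedOrigins: "https://yourcompany.jiveon.com", // placeholder Jive origin
      allowedMethods: "GET",
      allowedHeaders: "*",
      exposedHeaders: "*",
      maxAgeInSeconds: 3600
    }]
  });
}

The same rule can also be added through the CORS settings dialog in Storage Explorer.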
To share your file (blob), you could set the container's public access level, or leverage SAS to grant other clients limited access to the file:
Right-click your container and select "Set Public Access Level".
Sample shared file: https://brucechen.blob.core.windows.net/brucechen/index.json
Alternatively, right-click your JSON file and click "Get Shared Access Signature".
Sample shared file: https://brucechen.blob.core.windows.net/brucechen/index.json?st=2017-02-28T08%3A04%3A00Z&se=2017-09-01T08%3A04%3A00Z&sp=r&sv=2015-12-11&sr=b&sig=rVkorHeNOd4j2YhkmmxZ6DfXVLf1FoN2smY6mNRIoWs%3D
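A read-only SAS like the one above can also be generated in code; a minimal sketch with @azure/storage-blob, assuming placeholder account, container, and blob names:

const {
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
  BlobSASPermissions
} = require("@azure/storage-blob");

const credential = new StorageSharedKeyCredential("myaccount", process.env.AZURE_STORAGE_KEY);
const sas = generateBlobSASQueryParameters({
  containerName: "data",                      // placeholder container
  blobName: "index.json",
  permissions: BlobSASPermissions.parse("r"), // read-only, like sp=r above
  startsOn: new Date(),
  expiresOn: new Date(Date.now() + 180 * 24 * 3600 * 1000) // ~6 months out
}, credential);
console.log("https://myaccount.blob.core.windows.net/data/index.json?" + sas.toString());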

Related

Forge Design Automation Revit Workitem Arguments

I'm following the Design Automation API v3 tutorial for Revit.
When doing a workitem POST, I'm a little unclear about the "rvtFile" and "result" arguments. Can the rvtFile URL be in an AWS bucket? Also, what are the restrictions on the result website? It states that it needs to be a signed URL, but can this just be another AWS bucket? Or do I need to create a website? (Note: I've never done any web development; everything I know I learned from this tutorial.)
Since Design Automation for Revit runs in the cloud (and not on your local machine), it needs a way to download your input files. You may put your files on any storage service provider (say, Amazon S3) and provide direct download links to them. For Design Automation to have access, you will either need to make those files public URLs, or keep them private and generate a signed URL for each. When DA4R runs your workitem, the direct download URLs provided in the workitem payload will be called to download your files to the worker machine.
Design Automation also does not store any of your result files. So you will have to generate a signed URL for uploading them to an appropriate cloud location (say, a location in an Amazon S3 bucket).
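For example, with the AWS SDK for JavaScript (v2), pre-signed URLs for both arguments might look like this sketch; the bucket and key names are hypothetical:

const AWS = require("aws-sdk");
const s3 = new AWS.S3({ region: "us-east-1" });

// "rvtFile": a pre-signed GET url DA4R can call to download the input model
const rvtFileUrl = s3.getSignedUrl("getObject", {
  Bucket: "my-da4r-bucket", Key: "input/model.rvt", Expires: 3600
});

// "result": a pre-signed PUT url DA4R can call to upload the output
const resultUrl = s3.getSignedUrl("putObject", {
  Bucket: "my-da4r-bucket", Key: "output/result.json", Expires: 3600
});

So yes, the result destination can simply be another pre-signed URL in the same bucket; no website is needed.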
While Amazon S3 is just an example, there are several other storage providers. I also recommend reading about Autodesk Forge's Data Management APIs:
https://forge.autodesk.com/api/data-management-cover-page/
EDIT:
Useful links
Tutorials: https://learnforge.autodesk.io/
AU Class: https://www.autodesk.com/autodesk-university/class/Revit-Data-Forge-How-Can-Design-Automation-Revit-API-Help-Me-2018

using extract.autodesk.io and automatically download bubbles to our local server

I'm trying to use and modify extract.autodesk.io (thanks to Cyrille Fauvel) but haven't been successful yet. In a nutshell, this is what I want to do:
1. The user drag-drops the design file (I'm OK with this part).
2. I've removed the submit button, so extraction should begin on Autodesk's server right after uploading. (I've added a .done to trigger the auto-extraction: uploadFile(uri).done(function(){ SubmitProjectDirect(); });)
3. No need to load a temp viewer for viewing/testing.
4. Automatically download the bubble as a zip file into our local server folder.
5. Delete the uploaded model right away, as our projects are mostly strictly confidential.
I'm encountering a 405 'Method Not Allowed' on the 'api/file' subfolder, which I believe should be Autodesk's folder on the server.
Can anyone point me to the root URN of api/file?
I seem to be stuck on item 2 above because of the 405 error, and even if I get past that one, I still need to solve items 3, 4 and 5.
Appreciate any help...
In light of the additional comment above, the issue is a bit more complicated than I originally thought. In order to upload a file to Autodesk cloud storage, you need to use specific endpoints with the PUT verb and provide an OAuth access token.
It should be possible to set up Flow.js to use all of the above, but since it is a JavaScript library running on your client, anyone could steal your access token and use it illegitimately, either to access your data or to consume your cloud credits by acting on your behalf.
Another issue is that the minimum OSS chunk size is 5 MB (see this article), so you need to control this as well as provide OSS with the byte-range assembly information.
I would not recommend uploading to OSS directly from the client for security reasons, but if you do not want to use your server as temporary storage, we can either proxy the Flow.js upload to OSS storage or pipe each uploaded chunk to Autodesk cloud storage. Both solutions are secure, with no storage on your server, but traffic will still go via your server. I will create a branch on the GitHub repo in a few days to demonstrate both approaches.
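To illustrate the proxy idea (just a hedged sketch, not the promised branch): an Express route on your server streams each uploaded chunk to OSS with a server-held token, so the token never reaches the browser. The bucket name and the getServerSideToken helper are hypothetical:

const express = require("express");
const fetch = require("node-fetch");
const app = express();

app.put("/api/file/:name", async (req, res) => {
  const token = await getServerSideToken(); // hypothetical 2-legged OAuth helper
  // Stream the incoming request body straight to Autodesk OSS
  const ossRes = await fetch(
    "https://developer.api.autodesk.com/oss/v2/buckets/my-bucket/objects/" + req.params.name,
    { method: "PUT", headers: { Authorization: "Bearer " + token }, body: req }
  );
  res.status(ossRes.status).end();
});

app.listen(3000);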

How to retrieve my Install Statistics from the Google Developer's Console

I am trying to programmatically retrieve my company's app data from the Google Developer Console, specifically the daily installs. I have found that Google recommends the gsutil tool to access the data programmatically through the Google Cloud Storage SDK. However, I believe they charge for this service. I want a free way to retrieve the data programmatically, preferably as a JSON stream, to avoid dealing with file downloads. I have found the "direct reporting" links, but I have problems authenticating when I try to use them, and I would also have to do something with the actual files.
Is there a way to get a JSON version of the data through OAuth2 or something without downloading an Excel file? Has anyone had to do this?
You should look into using the Core Reporting API.
There are client libraries available in a number of languages.
You should work through the Hello Analytics APIs to get started.
JavaScript
PHP
Python
Java
A quick solution for building a dashboard would also be the Embed API.
Using the gsutil tool to access the company's storage buckets provided by Google is a free service. I wrote code that runs gsutil as a process through the command line and parses the downloaded .csv files into a database for storage. OAuth2 was not necessary.
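A minimal sketch of that approach in Node.js (assuming gsutil is installed and authenticated; the bucket ID and package name are placeholders):

const { execFile } = require("child_process");

// Copy one month's installs report from the Play reporting bucket, then parse it.
execFile("gsutil", [
  "cp",
  "gs://pubsite_prod_rev_0123456789/stats/installs/installs_com.example.app_201601_overview.csv",
  "./reports/"
], function (err) {
  if (err) throw err;
  // ...parse the downloaded CSV and insert the rows into the database here
});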

What API to use for Offline Chrome App

I want to develop an offline Chrome application.
Since SQL is not available in an offline app, what API can serve the following purposes?
=> Large storage
=> An efficient method to set and get values
=> Fast
=> Secured (the user cannot tamper with the data)
I'm confused between IndexedDB and the File System API.
I have knowledge of web languages and of how online apps store data on a server, but I don't know much about saving data offline.
It all depends on your needs.
Chrome apps have a couple of limitations. Because they must be very fast, some web APIs are disabled: you can't use localStorage or WebSQL, for example.
However, in apps you have a different set of storage options:
chrome.storage.local - an equivalent of localStorage, but asynchronous; you can also save/read many objects at once (see the sketch after this list)
chrome.storage.sync - same as above, but the data is shared between different app instances (on other browser profiles or machines)
web filesystem API - the well-known web filesystem API, which can keep any kind of file in protected browser storage. Users do not have direct access to these files; only the app does
an extension to the above, chrome.syncFileSystem - works similarly, but files saved using this API are synced between app instances (e.g. different machines) using Google Drive as a back-end. However, users can't see the synced files in the Drive UI because they are hidden
chrome.fileSystem API - another extension to the web filesystem API; it gives you access to the user's sandboxed local filesystem. You can read from and write to locations selected by the user
IndexedDB - quoting the docs: IndexedDB is an API for client-side storage of significant amounts of structured data, which also enables high-performance searches of this data using indexes
other custom solutions - saving data on a server and syncing changes across all instances
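A quick sketch of the chrome.storage.local calls mentioned in the first item above (the keys and values are arbitrary examples):

// Save several values at once, then read them back asynchronously
chrome.storage.local.set({ name: "Ada", score: 42 }, function () {
  chrome.storage.local.get(["name", "score"], function (items) {
    console.log(items.name, items.score); // "Ada" 42
  });
});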
You can choose any of the above. From what you describe, you'll probably want the IndexedDB API. It is not SQL and it is a different approach to saving data, so if you've never used it before, try a sample app first. It is fast and efficient, and combined with the unlimitedStorage permission it can also store large amounts of data.
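If IndexedDB is new to you, here is a minimal sketch of the open/write/read cycle (the database and store names are arbitrary):

var request = indexedDB.open("app-db", 1);

request.onupgradeneeded = function (e) {
  // Runs once per version bump: create the object store and an index
  var db = e.target.result;
  var store = db.createObjectStore("people", { keyPath: "id" });
  store.createIndex("byName", "name");
};

request.onsuccess = function (e) {
  var db = e.target.result;
  var tx = db.transaction("people", "readwrite");
  tx.objectStore("people").put({ id: 1, name: "Ada" });
  tx.oncomplete = function () {
    db.transaction("people").objectStore("people").get(1).onsuccess = function (ev) {
      console.log(ev.target.result.name); // "Ada"
    };
  };
};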
I also suggest reading the Offline First page in the Chrome Apps documentation, which has examples of solutions for making an app work offline.

Google Cloud Storage Client Library - Check Shared Publicly By Default

I am using the Google Cloud Storage client library to upload files into it dynamically and to serve the files from it. The uploading part is working fine; there is no problem with that.
The issue I am facing is that after the upload, if I try to access the file in my application, the serving link is broken.
So I went to the Google Cloud console, and one thing I noticed is that Shared Publicly is unchecked.
If I check the Shared Publicly checkbox, then I can access the file in my application.
I have tried with both public-read and bucket-owner-full-control ACL.
This is the code I am using:
GcsFileOptions options = new GcsFileOptions.Builder()
    .cacheControl("public, max-age=31536000, no-transform")
    .mimeType(mimeType)
    .acl("bucket-owner-full-control")
    .build();
Can you tell me what modification I should make so that the Shared Publicly checkbox gets checked?
Do you want the file to be publicly available? If so, why not set the ACL to public-read instead of bucket-owner-full-control?
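If public availability is the goal, the only change to the builder from the question would be the ACL string (a sketch, assuming the rest of the options stay as they are):

GcsFileOptions options = new GcsFileOptions.Builder()
    .cacheControl("public, max-age=31536000, no-transform")
    .mimeType(mimeType)
    .acl("public-read") // grants allUsers read access, which shows as Shared Publicly in the console
    .build();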