blob.getDataAsString() Size Limit in Google Apps Script? - google-apps-script

What is the file size limit for using blob.getDataAsString()? I can't seem to find any documentation anywhere on this.
I'm currently saving JSON data to a large (102 MB) .json file in Google Drive, which I then attempt to read back and parse into JSON. Before that step, I read the file via file.getBlob() and then use .getDataAsString() to read the blob contents. Here's my code:
var jsonFile = DriveApp
    .getFolderById(fileId)
    .getFilesByName("masterAsset.json")
    .next(); // a 102 MB .json file
var blob = jsonFile.getBlob();
var text = blob.getDataAsString();
Currently, I'm getting this error:
"File masterAsset.json exceeds the maximum file size."
Does anyone know what the maximum file size is for .getDataAsString()?
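For reference, below is a minimal sketch of one possible workaround rather than an answer: read only part of the file through the Drive API with an HTTP Range header instead of materialising the whole blob at once. The fileId value and the 10 MB range are placeholders, and a single slice is not valid JSON on its own.
// Sketch of a possible workaround (untested against a 102 MB file): fetch only a
// byte range of the file through the Drive API instead of loading the whole blob.
// Assumes the script already has Drive authorization (e.g. because it uses DriveApp).
function readFileSlice() {
  var fileId = "YOUR_FILE_ID"; // placeholder
  var url = "https://www.googleapis.com/drive/v3/files/" + fileId + "?alt=media";
  var response = UrlFetchApp.fetch(url, {
    headers: {
      Authorization: "Bearer " + ScriptApp.getOAuthToken(),
      Range: "bytes=0-10485759" // first 10 MB only
    }
  });
  Logger.log(response.getResponseCode()); // 206 = partial content
  return response.getContentText(); // just this slice; not valid JSON on its own
}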

Related

Updating Spreadsheets with the Google Drive API

I want to update an existing spreadsheet in a Drive folder but am having trouble implementing the HTTP request. I followed the documentation and was able to update a spreadsheet, but the request body, which I tried to send as JSON, is always converted to CSV. This results in the JSON being split across individual cells wherever commas appear.
For instance, cell1 = "{key1" and cell2 = "value1", and so on. This prevents me from specifying the style of the sheet and the values within the cells.
I found the option of sending a multipart request, which, however, leads to the same result. Now the first boundary string and the initial content up to the first comma end up in the first cell, and the rest is divided according to the existing commas.
What I want to do is send an HTTP request whose body consists of a JSON file of the specified information for the spreadsheet, as described in Google's Sheets API, but I cannot find my current mistake. Even with the mimetype set to "application/vnd.google-apps.spreadsheet", the JSON is always converted to CSV.
mimetype "application/vnd.google-apps.spreadsheet"
If the file in question is an actuall google sheets file type. For example the mime type is "application/vnd.google-apps.spreadsheet". Then you should go though the google sheets api to update it. Other wise updating it though google drive you will need to load the file itself into a file stream and then upload it that way. You cant pick and choose what parts are uploaded with drive its all or nothing. Drive doesn't have the power to format things like cells and stuch it just uploads the raw file data.
Mimetype "text/plain"
If the file is in fact a csv file so the mime type is "text/plain" then you can update the text directly. by turning the text into a stream.
You have not said what language you are using so here is my sample for C#. The code is ripped from How to upload to Google Drive API from memory with C#
var uploadString = "Test";
var fileName = "ploadFileString.txt";

// Upload file metadata
var fileMetadata = new Google.Apis.Drive.v3.Data.File()
{
    Name = fileName,
    Parents = new List<string>() { "1R_QjyKyvET838G6loFSRu27C-3ASMJJa" } // folder to upload the file to
};

var fsSource = new MemoryStream(Encoding.UTF8.GetBytes(uploadString ?? ""));
string uploadedFileId;

// Create a new file, with metadata and stream.
var request = service.Files.Create(fileMetadata, fsSource, "text/plain");
request.Fields = "*";
var results = await request.UploadAsync(CancellationToken.None);
if (results.Status == UploadStatus.Failed)
{
    Console.WriteLine($"Error uploading file: {results.Exception.Message}");
}

// The file id of the new file we created
uploadedFileId = request.ResponseBody?.Id;
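For the Sheets API route mentioned above, and since the question doesn't name a language either, here is a rough sketch of a spreadsheets.values.update call made as a raw HTTP request from Google Apps Script; the spreadsheet ID, range, and values are placeholders, and the OAuth token must carry a Sheets scope.
// Rough sketch: update cell values in a native Google Sheets file via the
// Sheets API (values.update) instead of the Drive API. Placeholders throughout.
function updateSheetValues() {
  var spreadsheetId = "YOUR_SPREADSHEET_ID";
  var range = "Sheet1!A1:B2";
  var url = "https://sheets.googleapis.com/v4/spreadsheets/" + spreadsheetId +
      "/values/" + encodeURIComponent(range) + "?valueInputOption=USER_ENTERED";
  var body = { values: [["key1", "value1"], ["key2", "value2"]] };
  var response = UrlFetchApp.fetch(url, {
    method: "put",
    contentType: "application/json",
    headers: { Authorization: "Bearer " + ScriptApp.getOAuthToken() },
    payload: JSON.stringify(body)
  });
  Logger.log(response.getContentText());
}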

URLFetch is unable to fetch the URL

I am trying to fetch the Excel file from this URL:
https://www.cmegroup.com/CmeWS/exp/voiProductsViewExport.ctl?media=xls&tradeDate=20220909&assetClassId=3&reportType=P&excluded=CEE,CEU,KCB
But when I try to do this using UrlFetchApp, the script times out.
However, when I open the same URL directly in a browser, the file downloads fine.
The code I am using:
let url = "https://www.cmegroup.com/CmeWS/exp/voiProductsViewExport.ctl?media=xls&tradeDate=20220909&assetClassId=3&reportType=P&excluded=CEE,CEU,KCB"
const excelFile = UrlFetchApp.fetch(url).getBlob();
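As a first diagnostic step (a sketch, not a fix), it may help to mute HTTP exceptions and log the status code and headers, to see whether the server is answering non-browser clients with an error or redirect rather than the script simply running out of time:
// Diagnostic variant of the same fetch: log the response code and headers.
// Note that muteHttpExceptions only affects HTTP error codes, not genuine timeouts.
function fetchCmeReport() {
  var url = "https://www.cmegroup.com/CmeWS/exp/voiProductsViewExport.ctl?media=xls&tradeDate=20220909&assetClassId=3&reportType=P&excluded=CEE,CEU,KCB";
  var response = UrlFetchApp.fetch(url, {
    muteHttpExceptions: true,
    followRedirects: true
  });
  Logger.log(response.getResponseCode());
  Logger.log(JSON.stringify(response.getAllHeaders()));
  return response.getBlob();
}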

How to pass a huge JSON file to Command.cs in Forge API Design Automation for Revit?

I created a project from this tutorial.
How can I send a huge JSON file supplied by the user so that it can be read in Command.cs in Design Automation for Revit on the cloud? I receive the file in DesignAutomationController.cs using a form, but I am unable to pass it on to Command.cs, because with this approach the URL becomes far too large.
XrefTreeArgument inputJsonArgument = new XrefTreeArgument()
{
    Url = "data:application/json, " + ((JObject)inputJson).ToString(Formatting.None).Replace("\"", "'")
};
How huge is the JSON file? The workitem payload limit is only 16 KB.
We recommend embedded JSON only for small contents. For anything bigger, you may upload the JSON content to cloud storage and pass a signed URL to the file as the input argument URL.
Design Automation API limits are defined here:
https://forge.autodesk.com/en/docs/design-automation/v3/developers_guide/quotas/

How can I download a file by name from Google Drive API?

I am trying to download a file from Google Drive API v3, and I have to do this by finding the file by name.
This is the request URL to get general information about a file:
https://www.googleapis.com/drive/v3/files?q=name+%3D+'fileName.json'
and to download a file I have to use the parameter alt=media.
It works, but only when I find the file by its ID, i.e.:
https://www.googleapis.com/drive/v3/files/3Gp-A4t6455kGGGIGX_gg63454354YD?alt=media
Does anyone know how to download a file by name, i.e. using this?
https://www.googleapis.com/drive/v3/files?q=name+%3D+'fileName.json'
Answer:
Unfortunately it isn't possible to download a file from Google Drive using the file name in the URL itself.
Reasoning:
Google Drive supports multiple files having the same name. Each file, instead of being identified exclusively by its name, has a unique ID which tells it apart from other files. As it is possible to have, for example, three files all with the same name, a request that refers only to the file name doesn't give enough information to identify exactly which file you want to download.
Workaround:
You can still use the filename to build the request, but first you need to make a list request to the Drive API so that you can obtain the specific file ID for the file you wish to download.
I'll assume the file you want to download is called fileName.json as in your question.
First, you'll want to make a list request to the server to obtain the file's ID from its name. The scope you will need for this is:
https://www.googleapis.com/auth/drive.readonly
The request itself is as you placed in the question. Once you have obtained your token, you must make a GET request:
GET https://www.googleapis.com/drive/v3/files?q=name+%3D+'fileName.json'&key=[YOUR API KEY]
You must replace [YOUR API KEY] with your actual API key here. You can obtain a temporary one over at the OAuth Playground.
From this request you will get a JSON response listing all the files in your Drive with the requested filename. This is an important point: if you only have one file with this filename, you have nothing to worry about and can continue from here. If more than one file exists, the JSON response will contain all of these files, so extra code will need to be added here to pick out the one you want.
Continuing on - the response you get back is of the following form:
{
  "incompleteSearch": false,
  "files": [
    {
      "mimeType": "application/json",
      "kind": "drive#file",
      "id": "<your-file-ID>",
      "name": "fileName.json"
    }
  ],
  "kind": "drive#fileList"
}
From here, you can start to build your URL.
Building the download URL:
After retrieving the JSON response from the API, you need to extract the File ID to put into a URL. The following example is written in JavaScript/Google Apps Script, but can be built in whichever language suits your needs:
function buildTheUrl() {
  var url = "https://www.googleapis.com/drive/v3/files";
  var fileName = "fileName.json";
  var apiKey = "your-api-key";
  var parameters = "?q=name+%3D+'";
  var requestUrl = url + parameters + fileName + "'" + "&key=" + apiKey; // note the closing quote after the file name
  var response = JSON.parse(UrlFetchApp.fetch(requestUrl).getContentText());
  var fileId = response.files[0].id; // files is an array; take the first (or only) match
  var downloadUrl = "https://www.googleapis.com/drive/v3/files/";
  var urlParams = "?alt=media";
  return downloadUrl + fileId + urlParams + "&key=" + apiKey;
}
This returns a string which is the download URL for the file. This is a Files: get request that can then be used to download the file in question.
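For completeness, a short usage sketch (assuming the same API-key setup as above and a file that the key is allowed to read; a private file would need an OAuth token instead):
// Hypothetical usage of the URL returned by buildTheUrl():
function downloadByName() {
  var downloadUrl = buildTheUrl();
  var response = UrlFetchApp.fetch(downloadUrl);
  var json = JSON.parse(response.getContentText()); // since the file is a .json file
  Logger.log(json);
  return response.getBlob(); // or keep the raw content as a blob
}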
References:
Google OAuth Playground
Google Drive API Files: list
Google Drive API Files: get
Google Apps Script: UrlFetchApp
w3schools: JSON objects

Visualize files from Azure blob container

I have deployed an ASP.NET MVC app where I am trying to display CSV files, stored in Azure Blob storage, as tables.
I am having trouble reading the files in a blob container, and I couldn't find any solution in the Microsoft documentation.
My blob containers are public, so maybe I could access the files through their URLs, but I don't know how to read the CSV files. Any ideas?
To read a CSV file stored in Azure Blob storage, you could refer to the following sample code.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("connection string");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob blockBlobReference = container.GetBlockBlobReference("testdata.csv");

using (var reader = new StreamReader(blockBlobReference.OpenRead()))
{
    string row = "";
    while (!reader.EndOfStream)
    {
        // Read the CSV file line by line
        row = reader.ReadLine();
    }
}
My aim is to visualize real time data that comes into the blob storage.
It seems that you'd like to display CSV data as tables on clients' web pages in real time. ASP.NET SignalR could help you develop real-time web functionality easily: in your WebJob function you could detect a new CSV file under a specified blob container and call a hub method that reads the data from the CSV file and pushes it to the connected clients; you could then update the UI based on the pushed CSV data on the SignalR client side (see the client-side sketch at the end).
Call the hub method inside your WebJob function:
var hub = new HubConnection("http://xxx/signalr/hubs");
var proxy = hub.CreateHubProxy("HubName");
hub.Start().Wait();
//invoke hub method
proxy.Invoke("PushData", "filename");
Hub method to push data to connected clients:
public void PushData(string filename)
{
    // Read data from the CSV file (blob) into a list of rows,
    // then call the JavaScript-side function to populate (or update) the tables.
    var data = new List<string>();
    // ... read the blob as in the sample above and add each line to data ...
    Clients.All.UpdateTables(data);
}