Working around Google Cloud Function memory limits - google-cloud-functions

I am using a Google Cloud Function to send files from Cloud Storage to SharePoint using the SharePoint REST API. This works great except for large files, because the function runs out of memory (set to the 2 GB maximum).
Currently, the Cloud Function is triggered when a new file arrives in the storage bucket. It downloads this file into the /tmp directory and then makes a POST request to the SharePoint endpoint.
Is there a way to bypass downloading the file to /tmp? Can a POST request be made by specifying a remote file location? Any other recommendations?
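One approach worth sketching is to stream the object instead of downloading it: a Cloud Storage blob can be opened as a file-like object and passed straight to requests as the POST body, so the file never lands in /tmp or in memory all at once. A minimal sketch in Python, assuming the google-cloud-storage and requests libraries; the SharePoint endpoint and auth header below are placeholders, not a real configuration:

    # Sketch: stream a GCS object straight into a SharePoint upload request.
    # GCS-triggered background function; the SharePoint URL and bearer token
    # below are placeholders.
    import requests
    from google.cloud import storage

    def upload_to_sharepoint(event, context):
        client = storage.Client()
        bucket = client.bucket(event["bucket"])
        blob = bucket.blob(event["name"])

        # blob.open("rb") returns a file-like reader that fetches the object
        # in chunks over HTTP instead of materializing it in /tmp.
        with blob.open("rb") as stream:
            resp = requests.post(
                "https://example.sharepoint.com/...",        # placeholder endpoint
                headers={"Authorization": "Bearer <token>"},  # placeholder auth
                data=stream,  # requests chunk-encodes file-like bodies
            )
        resp.raise_for_status()

One caveat: when requests cannot determine the body length it falls back to chunked transfer encoding, so this only works if the receiving endpoint accepts chunked uploads.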

Related

Can GoogleSheet write json to local folder in computer automatically by apps script?

I would like to write JSON files to the local computer (D://MyJsons) with Apps Script.
(I've already written the code to get the JSON string; I just need a way to write it to the local folder.)
I've found DriveApp.createFile to write to Google Drive, but no way to save to the local computer.
Is that possible?
Apps Script only works in the cloud; it has no access to your local storage. The best option would probably be to set up a synced folder between your computer/server and your Drive.

How to upload a CSV file in a microservice deployed in cloud foundry

I am new to Cloud Foundry. I am currently working on a requirement where I have to upload a CSV file (via a JSP UI) into a service deployed in Cloud Foundry and persist its data in the service.
The issue is that from the UI I only get a local path to the CSV file, and when I try to parse the CSV via this path, the file is not recognized. I guess the reason is that the service is already deployed in CF, so it does not recognize a path on my local machine.
Can you please let me know how and where I should parse this CSV file?
Thanks in advance!
There is nothing specific to Cloud Foundry about how you would receive an uploaded file in a web application. Since you mentioned using Java, I would suggest checking out this post.
How to upload files to server using JSP/Servlet?
The only thing you need to keep in mind that's specific to Cloud Foundry is that its filesystem is ephemeral. It behaves like a normal filesystem and you can write files to it; however, the lifetime of the filesystem is tied to the lifetime of your application instance. That means restarts, restages, or anything else that causes the application container to be recreated will destroy the filesystem.
In short, you can use the filesystem for caching or temporary files, but anything that you want to retain should be stored elsewhere.
https://docs.cloudfoundry.org/devguide/deploy-apps/prepare-to-deploy.html#filesystem
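The linked post covers the Java/Servlet specifics; as a language-neutral illustration of the same idea (receive the multipart upload server-side and parse it from the request stream, rather than expecting a path on the user's machine), here is a minimal sketch in Python using Flask, with a hypothetical /upload route and "file" form field:

    # Sketch: receive a CSV as a multipart upload and parse it server-side.
    # The /upload route and the form field name "file" are placeholders.
    import csv
    import io
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/upload", methods=["POST"])
    def upload_csv():
        # The browser sends the file's *content* in the request body; the
        # client-side path (e.g. C:\...) never reaches the server.
        uploaded = request.files["file"]
        text = io.TextIOWrapper(uploaded.stream, encoding="utf-8")
        rows = list(csv.reader(text))
        # Persist rows to a bound service (database, object store) here;
        # the local filesystem in Cloud Foundry is ephemeral.
        return {"rows": len(rows)}

The same pattern applies in Java: read the uploaded part from the request, parse it, and write the data to a bound service rather than to disk.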

how copy files from google drive to an ftp server using google app scripts?

I have a project in Google Apps Script connected to my Drive.
I was able to save and organize my files (from mail) and directories in Drive using the script.
Now I want to save the files to an FTP server, and I cannot find any reference for doing so.
How do I access an FTP server, create folders, and upload files to them?
Thanks!
It seems this is not possible. If you check the Class UrlFetchApp documentation, Apps Script only supports HTTP and HTTPS communication over the internet. A supporting SO thread seems to confirm this.
The workaround to go from GAS -> FTP is as follows. I've been using this method in production for a number of years.
1. From Apps Script, save the file to Google Drive.
2. Using Zapier, trigger on new files in the Google Drive folder and copy them to a Dropbox folder.
3. Using Microsoft Flow, trigger on new files in the Dropbox folder and copy them to an FTP or SFTP location.
You can probably do step 3 in Zapier as well; I use Microsoft Flow because I already had a significant number of automated tasks set up there. For step 2, I use Zapier because MS Flow does not yet have a built-in trigger for a new file being saved to Google Drive; you would have to build your own custom trigger using the Google Drive APIs.

Google Drive upload in Service Account

We have a requirement to support uploads of files up to 100 GB in size. The current flow moves the file from the client location/local system to the application server; the application server then pushes the file to a service account in Google Drive. I would like to know if there is a way to push the file from the local system directly to the service account in Google Drive, so we would not have to store such big files on the application server. Does anyone have any suggestions?
Service accounts are not meant to be used like this; they have limited quota. See: Google Drive Service Accounts Additional Storage

How to upload file directly from a server to Google Drive using the Google Drive API?

Suppose I want to write a tool which allows the user to copy a file from Dropbox to Google Drive, but I don't want to download the file to my server first and then upload it to Drive. Is there a way to insert a file into Drive by just providing a URL? I couldn't find anything in the documentation.
No.
Having said that, it isn't necessary to create a file on your server. You can buffer the content in memory.
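For instance, here is a minimal sketch of that in-memory approach in Python, assuming the google-api-python-client library, an already-authorized Drive v3 service object, and a placeholder source URL:

    # Sketch: copy a remote file into Drive without touching the local disk.
    # Assumes an authorized Drive v3 `drive` service object; source_url is
    # a placeholder for a direct-download link.
    import io
    import requests
    from googleapiclient.http import MediaIoBaseUpload

    def copy_url_to_drive(drive, source_url, name):
        # Buffer the remote content in memory (fine for modestly sized files).
        content = requests.get(source_url).content
        media = MediaIoBaseUpload(io.BytesIO(content),
                                  mimetype="application/octet-stream",
                                  resumable=True)
        return drive.files().create(body={"name": name},
                                    media_body=media,
                                    fields="id").execute()

Keep in mind the whole file sits in memory with this approach, so it only suits files that fit comfortably within your server's RAM.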
I suppose you could send the access token to the client via a POST request. You'd have to protect it in transit, of course, preferably over SSL. Then the client could use the Drive SDK to upload the file directly, bypassing your server completely.