Does a query to a ScriptDb consume quotas? - google-apps-script

The official ScriptDb documentation covers its storage capabilities well. However, it is not clear to me whether querying a ScriptDb or saving new objects to a ScriptDb consumes quotas. Is there a limit on the number of queries an app can make to a ScriptDb per day?

ScriptDb is only capped by overall store size, which starts at 50 MB and covers the total of all the databases owned by the script author.
The only thing you'll want to watch with reads is execution time (if a script runs for several minutes, you should be concerned). For example, the total trigger execution time allowed per day is 1 hour. If you run the script as the person accessing it, you are less likely to hit this limit.
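For what it's worth, reads and writes are plain service calls; here is a minimal sketch (the record fields are made up for illustration):

    function scriptDbExample() {
      var db = ScriptDb.getMyDb(); // the database owned by this script
      // Saving counts against the 50 MB store size, not a per-query quota.
      db.save({type: 'visit', page: '/home', when: new Date().getTime()});
      // Query by example; results come back as an iterator.
      var results = db.query({type: 'visit'});
      while (results.hasNext()) {
        Logger.log(results.next().page);
      }
    }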

Related

Support 5000 concurrent users of Apps Script Web App

I am in the process of architecting a small application for a G Suite customer. The app will be used by all of the customer's employees and during peak times could see as many as 5,000 concurrent users.
They do not have App Maker and do not plan to enable App Maker anytime soon.
Will an Apps Script Web App be able to handle this many concurrent users?
Database:
I was originally thinking of using a Google Sheet as the database. There are no documented Apps Script limits around reading or writing data to a Google Sheet, so as long as I stay within the Google Sheets API quota limits I should be good.
With that said, I am also debating using Cloud SQL as the database. App Maker forces you to use Cloud SQL, so I assume it is a superior option to a Google Sheet. The only way I see to connect to Cloud SQL from Apps Script is via the JDBC service. However, there is a hard limit of 50,000 connections per day. https://developers.google.com/apps-script/guides/services/quotas#current_quotas
Does anyone know if this limit is per script or per user?
If it is per script, then 5,000 users would only get 10 calls each per day, which would be unusable for my needs.
Side note: Google Cloud SQL has a maximum of 4,000 concurrent connections. I am banking on reads and writes being fast enough that the number of open connections at any single point in time stays below 4,000.
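For reference, here is roughly how I would be connecting via the JDBC service (the instance name, database, and credentials below are placeholders):

    function queryCloudSql() {
      // Each connection presumably counts toward the 50,000/day JDBC quota.
      var conn = Jdbc.getCloudSqlConnection(
          'jdbc:google:mysql://my-project:us-central1:my-instance/mydb',
          'dbuser', 'dbpassword');
      var stmt = conn.createStatement();
      var rs = stmt.executeQuery('SELECT id, name FROM employees LIMIT 10');
      while (rs.next()) {
        Logger.log(rs.getString(1) + ': ' + rs.getString(2));
      }
      rs.close();
      stmt.close();
      conn.close();
    }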
As TheMaster noted in the above comment, you have a max of 30 simultaneous executions, which limits you to at most 30 concurrent users for a GAS Web App.
As an alternative you might be able to leverage Cloud Functions (basically just a Node.js/Express.js module). It's not a free platform, but it supports Cloud SQL and may be cheaper than getting a Google for Business account (US$11.00 per month per user) as required by App Maker.
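For illustration only, an HTTP function backed by Cloud SQL might look roughly like this (the instance name, credentials, and table are hypothetical, not a tested configuration):

    // index.js - deploy with: gcloud functions deploy getEmployees --trigger-http
    const mysql = require('mysql');

    // Cloud Functions exposes Cloud SQL over a unix socket at /cloudsql/<instance>.
    const pool = mysql.createPool({
      socketPath: '/cloudsql/my-project:us-central1:my-instance', // hypothetical
      user: 'dbuser',
      password: 'dbpassword',
      database: 'mydb',
      connectionLimit: 1 // keep per-instance connections low for the 4,000 cap
    });

    exports.getEmployees = (req, res) => {
      pool.query('SELECT id, name FROM employees LIMIT 10', (err, rows) => {
        if (err) return res.status(500).send(err.message);
        res.json(rows);
      });
    };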

Azure - Trigger which copies multiple blobs

I am currently working on a Service Bus trigger (using C#) which copies and moves related blobs to another blob storage account and Azure Data Lake. After copying, the function has to emit a notification to trigger further processing tasks. Therefore, I need to know when the copy/move task has finished.
My first approach was to use an Azure Function to copy all these files. However, Azure Functions have a processing time limit of 10 minutes (when set manually), so that does not seem to be the right solution. I was considering calling azCopy or StartCopyAsync() to perform an asynchronous copy, but as far as I understand, the function's processing time will be as long as azCopy takes. To get around the time limit I could use WebJobs instead, but there are also other technologies like Logic Apps, Durable Functions, Batch jobs, etc., which leaves me unsure which technology is right for this problem. The function won't be called every second but might copy large amounts of data. Does anybody have an idea?
I just found out that Azure Functions only have a time limit when using the Consumption plan. If there is no better solution for copy-blob tasks, I'll go with Azure Functions.
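For what it's worth, StartCopyAsync() only starts a server-side copy: the bytes move inside the storage service, so the function only needs to poll for completion rather than stream data itself. A rough sketch using the JavaScript storage SDK (names are placeholders; in C# the equivalents are CloudBlob.StartCopyAsync plus checking CopyState):

    // A sketch with @azure/storage-blob; the copy runs inside the storage
    // service, so polling is cheap compared to streaming through the function.
    const { BlobServiceClient } = require('@azure/storage-blob');

    async function copyBlob(sourceUrl) {
      const service = BlobServiceClient.fromConnectionString(
          process.env.AZURE_STORAGE_CONNECTION_STRING);
      const dest = service
          .getContainerClient('archive')      // hypothetical destination container
          .getBlobClient('copied-blob.dat');  // hypothetical destination name
      const poller = await dest.beginCopyFromURL(sourceUrl); // starts the async copy
      const result = await poller.pollUntilDone();           // waits for completion
      return result.copyStatus; // 'success' when done - emit your notification here
    }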

Change queue time of Google spreadsheet app script trigger

When I create a daily time-based trigger for the Google Apps Script associated with my Google spreadsheet, I am prompted to select an execution time within an hour-long window, and it appears that a cron wrapper randomly assigns an exact execution time within that hour-long interval.
Because my application's specific use case has several data dependencies which may not be complete early in the hour, I was forced to divide my application into several stages, with separate triggers each delayed by an hour, to ensure that the required data would be available.
For example, the trigger time initially assigned to my script was 6:03 AM, but the data, which usually arrived by 5:57 AM, occasionally did not arrive until 6:10 AM, and the script had nothing to process that day. As a blunt-force solution, I deleted the 6-7 AM trigger and re-created it to execute in the 7-8 AM time slot to ensure the required data was available. This required moving the second stage of the script to 8-9 AM, resulting in script results that could be delayed by as much as 2-3 hours.
To improve this situation, I am contemplating integrating the two processing stages and creating a more accurate script execution trigger time, say 6:30 AM to be safe. Does anyone know:
Is it possible, other than by observing daily processing, to discover the exact trigger execution time that has been assigned, and
If randomly assigned, can script triggers be created and deleted until an acceptably precise execution time is obtained?
Thanks in advance for any guidance provided.
If accuracy is paramount, you can forgo Apps Script triggers altogether and leverage a third-party tool instead.
I'd recommend cron-job.org. This service can create cron jobs that make POST requests to a URL endpoint you specify, and you can schedule times accurate to the minute. To use it with Apps Script, implement a doPost() function to handle POST requests and deploy your script as a Web App. Then create a cron job in the service and give it the web app's URL as the endpoint.
The cron job will fire at the scheduled time, and you can perform any requisite operations inside doPost() in response to the incoming POST request.
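A minimal doPost() for that setup might look like this (the secret and the job function are placeholders for whatever your trigger used to run):

    function doPost(e) {
      // Optionally check a shared secret so only your cron job can trigger the work.
      var body = e.postData ? JSON.parse(e.postData.contents) : {};
      if (body.secret !== 'my-shared-secret') { // hypothetical secret
        return ContentService.createTextOutput('unauthorized');
      }
      runScheduledJob(); // placeholder for the work the time-based trigger used to do
      return ContentService.createTextOutput('ok');
    }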
Thank you to random parts and Dimu Designs for the guidance. Based upon experimentation, here are the answers to my questions:
Is it possible, other than by observing daily processing, to discover the exact trigger execution time that has been assigned? Answer: No; there is no way except observing the random trigger time assigned within the requested hour window.
If randomly assigned, can script triggers be created and deleted until an acceptably precise execution time is obtained? Answer: Yes. I adjusted my script's execution time by observing a trigger's actual firing time (via the email message timestamp), then deleting and recreating the trigger until the randomly assigned time fell on an acceptable minute within the requested hour window.
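For anyone automating that delete-and-recreate loop, triggers can also be managed from code; a sketch (the handler name is a placeholder):

    function recreateDailyTrigger() {
      // Remove any existing trigger for the handler, then request a new 6-7 AM slot.
      ScriptApp.getProjectTriggers().forEach(function(t) {
        if (t.getHandlerFunction() === 'processData') ScriptApp.deleteTrigger(t);
      });
      ScriptApp.newTrigger('processData').timeBased().everyDays(1).atHour(6).create();
    }

    function processData() {
      // Log the actual firing time so the assigned minute can be observed.
      Logger.log('Fired at: ' + new Date());
    }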

Memory limit exceeded

I am using Google Apps Script to create documents based on a template and some data stored in a spreadsheet. It was previously working well, but recently (without my having changed any code) I started getting the error below during a google.script.run call from a very simple HTML sidebar:
Execution failed: Memory limit exceeded
The error occurs during the copying process, and it seems to occur at a slightly different place each time the script is run.
I don't see any references to memory limits in the Apps Script quotas, and a general Google search doesn't turn up anything.
Can anyone shed some light on this: how can I determine the limit, find what is holding most of the memory, and increase the limit if possible?
Check the methods you're using to get data ranges.
For example, if you use sheet.getRange('A:X') it will fetch every row in the sheet, even the empty ones (possibly tens or even hundreds of thousands of rows). A better option is the getDataRange() method, which returns only the range bounded by the last row and column that contain data, as shown below.
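For example (the sheet name is illustrative):

    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data');

    // Reads every row in columns A-X, including thousands of empty ones:
    var everything = sheet.getRange('A:X').getValues();

    // Reads only the rectangle that actually contains data:
    var data = sheet.getDataRange().getValues();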
Also note the Quotas for Google Services: Apps Script services impose daily quotas and hard limitations on some features. If you exceed a quota or limitation, your script will throw an exception and terminate execution.

Google Apps Script Service Using Too Much Computer Time for One Day

So I got the error message in the subject on the one script I had running yesterday, and I assume I will get a similar message today.
I have since improved the script (which has a trigger to run once per minute) so it functions more along the lines of how it is supposed to; however, the error message got me thinking about what sorts of functions or parts of a program might demand more service time than others.
For example, I have had to use multiple sleep calls in my script to allow the data import to run, and again for the worksheet changes/copy-paste calls to process. Are all those sleep calls counting against me in terms of service time used?
On the community's behalf, I would ask that this be left as an open-ended question not specific to the sleep function: what parts of a script demand service time, and which (if any) do not?
Every call to a service (Spreadsheet, Calendar, or whatever) takes more time than regular JavaScript operations. For example, if you have to modify 10 cells in a spreadsheet, calling range.setValue() 10 times takes far more time than putting all the data in an array and updating the spreadsheet in one go with range.setValues().
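In code, the difference looks like this:

    function writeTenCells() {
      var sheet = SpreadsheetApp.getActiveSheet();

      // Slow: 10 separate round-trips to the Spreadsheet service.
      for (var i = 1; i <= 10; i++) {
        sheet.getRange(i, 1).setValue(i);
      }

      // Fast: build the values locally, then make a single service call.
      var values = [];
      for (var j = 1; j <= 10; j++) {
        values.push([j]);
      }
      sheet.getRange(1, 2, values.length, 1).setValues(values);
    }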
If you can paste pieces of your code, the community will be able to offer more advice on how to improve your script.
The limit is on CPU time used by time-based triggers, and I believe those sleep calls count against your limit. I'd encourage you to find ways to avoid the sleep calls or schedule your script to run less often.
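If the sleeps were added to wait for pending spreadsheet writes to settle, SpreadsheetApp.flush() may be a cheaper substitute; a sketch (this won't help if you're waiting on an external data import to arrive):

    function batchedUpdate() {
      var sheet = SpreadsheetApp.getActiveSheet();
      sheet.getRange('A1:A3').setValues([[1], [2], [3]]);
      // Instead of Utilities.sleep(...) - which still counts as execution time -
      // flush() applies all pending spreadsheet changes immediately.
      SpreadsheetApp.flush();
      Logger.log(sheet.getRange('A1').getValue()); // reads the committed value
    }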