This question already has answers here:
build real time dashboard using google apps script
(3 answers)
Closed 2 years ago.
I'm developing a web app using Google Apps Script with a spreadsheet as storage.
Basically, it's an HTML page showing some tables for the different tabs.
From my app, users can add new tasks, edit tasks, and mark tasks as completed.
Since there are many users using the app, the data shown on each client gets outdated very fast.
I would like to update the app with new records and changes to the existing ones as soon as possible.
I thought about logging the last edited tab+row in a cell and pulling that data from the server every minute, but what if many entries are added or edited during that minute?
I think WebSockets are not possible. Any other ideas?
I'm using jQuery client-side.
To help avoid conflicts, give every task a unique ID, something like creation time + a random string, so you can look it up in the spreadsheet. Also, the Lock Service can temporarily block concurrent edits to avoid conflicts:
https://developers.google.com/apps-script/reference/lock/
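For example, a simple way to build such an ID on the client (just an illustration, not the only way):
var taskId = Date.now() + '-' + Math.random().toString(36).slice(2, 8);
// e.g. "1700000000000-k3x9qa"; store it next to the task row so later edits can find it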
To check for updates, try polling the last edit time of the spreadsheet. If it is later than the value from the previous poll, fetch the updates.
https://developers.google.com/apps-script/reference/drive/file#getLastUpdated()
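A rough sketch of that polling loop (the function name and refresh logic are assumptions, not part of the answer): the server returns the file's last-updated timestamp and the jQuery page asks for it once a minute.
Server side:
function getLastUpdated() {
  var ssId = SpreadsheetApp.getActiveSpreadsheet().getId();
  return DriveApp.getFileById(ssId).getLastUpdated().getTime();
}
Client side (in the HtmlService page):
var lastSeen = 0;
setInterval(function() {
  google.script.run.withSuccessHandler(function(updated) {
    if (updated > lastSeen) {
      lastSeen = updated;
      // re-fetch the changed rows and redraw the tables here
    }
  }).getLastUpdated();
}, 60000); // poll once a minute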
There is no other way besides polling. You can't have sockets or callbacks from the HTML service. You could poll frequently, but that may run you out of quota.
If you really want to poll often and stay within quota, you can log the last edit in a published public spreadsheet and read it with Ajax from the client; however, published spreadsheets only update once a minute.
You could try something like this:
var lock = LockService.getPublicLock();
var success = lock.tryLock(10000); // wait up to 10 seconds for the lock
if (success) {
  // check your spreadsheet for lastUpdated, or PropertiesService.getScriptProperties()
} else {
  // do something else (try again or show an error message)
}
lock.releaseLock();
I have found that it works well on my app and I have around 1000 users.
Related
I developed a number of custom scripts in a project that are supposed to run over the last row of data in the response sheet linked to a Google Form, upon submission of that form. When tested, it took around 45 seconds to finish all the custom scripts.
Now here is where I get worried, and I'm looking for a firm answer which I still cannot find through a Google search. Within those 45 seconds of processing, other form users can submit the form, and I'm thinking every latest submission will be taken as the last row of data in the response sheet.
So, since my custom scripts always refer to and/or getValue(s) on the last row of the response sheet, will some of my scripts jump to the latest row of data before they finish with the previous one, just because the latest submission arrives before the 45 seconds are up? I'm worried that if the answer is yes, the output will become a disaster.
Additional info: I have one main function that calls other subfunctions in order, and some sub-subfunction(s) are called from those subfunctions. Please refer below:
function MAINFUNCTION() {
  var sheet = SpreadsheetApp.getActiveSheet();
  MYfunctionA();
  MYfunctionB();
  MYfunctionC();
}

function MYfunctionB() {
  // something done here..
  mysubfunctionB();
  return;
}

function MYfunctionA() {
  // something done here..
  return;
}

function MYfunctionC() {
  // something done here..
  return;
}
So, if LockService is the solution, must it be applied to every subfunction? Another thing: what will happen if there is a Google service/server error during that lock time? Will the lock be released and processing continue with the next record?
Or will it auto-retry? During testing I sometimes got that service/server error and usually just reran the script, but I wonder what will happen once this project is launched and used by many users almost concurrently.
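For what it's worth, if LockService were used here, one option would be a single lock around the whole MAINFUNCTION body, since the subfunctions all run inside that one execution. This is only a sketch of that idea, not a confirmed fix for this setup:
function MAINFUNCTION() {
  var lock = LockService.getScriptLock();
  lock.waitLock(120000); // wait up to 2 minutes for the previous submission's run to finish
  try {
    var sheet = SpreadsheetApp.getActiveSheet();
    MYfunctionA();
    MYfunctionB();
    MYfunctionC();
  } finally {
    lock.releaseLock(); // released even if a service/server error is thrown inside
  }
}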
My HTML form uploads data to a spreadsheet via a Google Apps Script. I use the following function to prevent conflicts, since many users access the web app page:
function lockservice() {
  var lock = LockService.getScriptLock();
  lock.waitLock(30000); // wait up to 30 seconds for other executions to finish
  lock.releaseLock();
}
I have another script (a separate script) to retrieve the data, which also has many users accessing it. So do I need getScriptLock() for that as well?
In other words, does a conflict between users occur when getting the data, or only when uploading it? Or maybe in both cases?
If you are writing data to the spreadsheet with:
sheet.appendRow(array);
then you don't need Lock Service to write data. If you are using:
sheet.getRange(row, column, numRows, numColumns).setValues(array_2_D);
then you do need the Lock Service.
So, appendRow() is "atomic" and setValues() is not. Atomic means that each call runs completely independently of the others.
https://developers.google.com/apps-script/reference/spreadsheet/sheet?hl=en#appendRow(Object)
Getting values should also use the Lock Service if you have concurrent users. There is also a quota limit for concurrent users: the limit for "Simultaneous Executions" is 30 (at the time of this post).
https://developers.google.com/apps-script/guides/services/quotas#current_limitations
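As a rough sketch of the pattern described above (the sheet name and function names are placeholders, not part of the original answer):
function saveNewTask(rowValues) {
  // appendRow() is a single atomic call, so no lock is used here
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data');
  sheet.appendRow(rowValues);
}

function updateTask(rowIndex, rowValues) {
  // setValues() on a fixed range is not atomic, so wrap it in a lock
  var lock = LockService.getScriptLock();
  lock.waitLock(30000);
  try {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data');
    sheet.getRange(rowIndex, 1, 1, rowValues.length).setValues([rowValues]);
  } finally {
    lock.releaseLock();
  }
}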
I actually have the same understanding, but:
I have a script where the Lock Service is used, and the locked section also includes an appendRow() operation on the sheet.
The script allows up to 30 concurrent executions.
Despite appendRow() being stated to be atomic, I am now seeing appendRow() occasionally overwrite the last used row of the sheet!
I never experienced this behavior without making use of the Lock Service.
Has anybody else observed this unexpected behavior?
In the linked screenshot you see the new Date() value of the previous record being replaced instead of an entire new row being appended below.
Any thoughts & suggestions to overcome this behavior are highly appreciated.
Screenshot: appendRow() replaces the last row of the sheet.
This question already has answers here:
Summary of failures for Google Apps Script: Not found
(2 answers)
Closed 5 years ago.
I am getting the email below on a daily basis. How do I stop it?
Summary of failures for Google Apps Script: Eml Manager
If you are not supposed to be running any scripts, you can open the script editor, click Resources → All your triggers, and then remove the triggers you do not want. You can also simply disable notifications by clicking the Notifications link to the right of the trigger and removing your e-mail address.
Otherwise, in a similar manner, go to the Eml Manager project and fix the code :)
If you are getting 'Summary of failures for Google Apps Script' emails on a daily basis, you should look into fixing your code, because it isn't behaving as you expected.
Here are a few of the reasons that can cause those failure emails:
Too many simultaneous invocations of the Google app to which you have attached your Apps Script
An authorization or authentication issue (a new user of the script has to authorize it the first time)
The script took more than 6 minutes to complete (6 minutes is the limit)
If you have created triggers programmatically and forget to delete them once they have served their purpose, the script will keep creating new triggers, and the total trigger limit per script is 20 (see the cleanup sketch after this list)
In some cases the server can also be too busy to fire your trigger, but that rarely happens.
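A minimal sketch of cleaning up programmatically created triggers (this version removes all of the project's triggers for the current user; filter them if you only want to drop specific ones):
function cleanUpTriggers() {
  // Remove every trigger this project has created for the current user
  var triggers = ScriptApp.getProjectTriggers();
  for (var i = 0; i < triggers.length; i++) {
    ScriptApp.deleteTrigger(triggers[i]);
  }
}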
Also, if your script is taking too long to complete, you should look into reducing looping and reducing write operations to Google apps (every change you make gets saved). Also, use built-in formulas etc. as much as possible.
If your code looks totally fine to you, then you can stop those failure emails by removing your name from the notifications option of your triggers, as @Vytautas mentioned.
This question already exists:
Exceeded maximum execution time in Google Apps Script [duplicate]
Closed 7 years ago.
My Google script, which I'm running from script.google.com, invokes a PHP script on a remote server. This PHP script runs for a few minutes, maybe more.
How long can the Google script wait for a response from the server?
Check this advice from Google.
We do not increase script execution times on a per-domain basis. There are ways for you to accomplish your use-case without any increases, however. Instead of trying to do everything in one script's execution time, use a sequence of script executions (using triggers) to process batches of the overall job. You can use ScriptDB or another persistent datastore to keep track of the documents processed, and each subsequent triggered execution can read from this and pick up processing where the previous execution left off.
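A rough sketch of that batch approach (the sheet name, batch size, and property key are assumptions; PropertiesService is used here in place of the now-retired ScriptDB):
function startBatchJob() {
  // Process the job in chunks via a time-driven trigger instead of one long run
  ScriptApp.newTrigger('processBatch').timeBased().everyMinutes(5).create();
}

function processBatch() {
  var props = PropertiesService.getScriptProperties();
  var lastDone = Number(props.getProperty('lastProcessedRow') || 1);
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data');
  var lastRow = sheet.getLastRow();
  var batchEnd = Math.min(lastDone + 100, lastRow); // 100 rows per execution
  for (var row = lastDone + 1; row <= batchEnd; row++) {
    // ... process one row here, staying well under the 6-minute limit ...
  }
  props.setProperty('lastProcessedRow', String(batchEnd));
  if (batchEnd >= lastRow) {
    // Job finished: remove the trigger(s) that call this function
    ScriptApp.getProjectTriggers().forEach(function(t) {
      if (t.getHandlerFunction() === 'processBatch') ScriptApp.deleteTrigger(t);
    });
  }
}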
Better than "waiting" for the response, is making the PHP call a new function on GAS (sorta like a callback), to do this publish your GAS with the options "As me" and "Anyone, even anonymous". Then you can call the /exec link with Get/Post parameters and continue executing the function.
This is ofcourse if your editor on the PHP page.
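A minimal sketch of that callback idea, assuming the PHP side posts its result in a parameter named "result" (both names are just placeholders):
function doPost(e) {
  // The remote PHP script calls the published /exec URL when it is done
  var result = e.parameter.result;
  // ... continue the work here, e.g. write the result to a spreadsheet ...
  return ContentService.createTextOutput('OK');
}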
I am writing an immediate-response script in Google Apps Script that must monitor my inbox and, when it detects a new e-mail with a given subject, run some code, reply to the e-mail, and finally mark it as read.
The whole thing works fine; however, I want it to start more frequently than once a minute, so I guess I can't use the trigger option from the script's interface. Will it work if I don't specify a trigger? How can I organize all this? I am using this within the free Google quota, for my personal use only.
Thanks in advance.
It's not advisable to do what you want, because you will run out of quota.
But it's possible, just not with triggers, as they run at most once a minute.
If you must, write an HtmlService app with a setInterval. Call a server method to do your work and leave the web app open all day.
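A minimal sketch of that approach (the subject line, 15-second interval, and function names are just examples):
function doGet() {
  // Serve a tiny page whose only job is to poll the server every 15 seconds
  var html =
    '<p>Keep this tab open to poll the inbox.</p>' +
    '<script>setInterval(function() { google.script.run.checkInbox(); }, 15000);</script>';
  return HtmlService.createHtmlOutput(html);
}

function checkInbox() {
  // Search for unread mail with the given subject, reply, and mark it as read
  var threads = GmailApp.search('is:unread subject:"My Subject"');
  threads.forEach(function(thread) {
    thread.reply('Automatic reply');
    thread.markRead();
  });
}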