What is the exact use of the Utilities.sleep() function? Should we use it between function calls or API calls?
I use Utilities.sleep(1000) in between function calls; is that right? Will it slow down the execution time?
Utilities.sleep(milliseconds) creates a 'pause' in program execution, meaning the script does nothing for the number of milliseconds you specify.
It certainly slows down your whole process, and you shouldn't use it between function calls.
There are a few exceptions, though; at least one that I know of: in SpreadsheetApp, when you want to remove a number of sheets, you can add a few hundred milliseconds between each deletion to allow for normal script execution (but this is a workaround for a known issue with that specific method). I also had to use it when creating many sheets in a spreadsheet, to avoid the browser needing to be 'refreshed' after execution.
Here is an example:
function delsheets() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var numbofsheet = ss.getNumSheets(); // check how many sheets are in the spreadsheet
  for (var pa = numbofsheet - 1; pa > 0; --pa) {
    ss.setActiveSheet(ss.getSheets()[pa]);
    ss.deleteActiveSheet(); // delete sheets beginning with the last one
    Utilities.sleep(200); // pause in the loop for 200 milliseconds
  }
  ss.setActiveSheet(ss.getSheets()[0]); // return to the first sheet as the active sheet (useful in a 'list' function)
}
Serge is right - my workaround:
function mySleep(sec) {
  SpreadsheetApp.flush();
  Utilities.sleep(sec * 1000);
  SpreadsheetApp.flush();
}
Some Google services do not like to be used too much. Quite recently my account was locked because of a script that was sending two e-mails per second to the same user; Google considered it spam. So using sleep here is also justified, to prevent such situations.
You can also use it to limit API calls per second. Some websites cap the number of API requests per second to reduce spam and server stress.
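For example, here is a minimal sketch of spacing out requests to an external API with Utilities.sleep(); the URLs and the one-second delay are placeholders for whatever limit the target service enforces:
function fetchWithRateLimit() {
  // Placeholder URLs; substitute the endpoints you actually need to call.
  var urls = ['https://example.com/api/1', 'https://example.com/api/2'];
  var results = [];
  for (var i = 0; i < urls.length; i++) {
    results.push(UrlFetchApp.fetch(urls[i]).getContentText());
    Utilities.sleep(1000); // wait one second between requests to stay under the rate limit
  }
  return results;
}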
I developed a number of custom scripts in a project that are supposed to run over the last row of data in a response sheet linked to a Google Form, upon submission of that form. When tested, it took around 45 seconds to finish all the custom scripts.
Now here is where I get worried, and I'm looking for a firm answer that I still cannot find through a Google search. Within those 45 seconds of processing, other form users can submit the form, and I'm thinking every latest submission will be taken as the last row of data in the response sheet.
So, since my custom scripts always refer to and/or getValue(s) on the last row of the response sheet, will some of my scripts jump to the latest row of data before they have finished with the previous one, just because the latest submission arrives before the 45 seconds are up? I'm worried that if the answer is yes, the output will become a disaster.
Additional info: I have one main function that calls other subfunctions in order, and some sub-subfunctions are called from those subfunctions. Please refer to the structure below:
function MAINFUNCTION() {
  var sheet = SpreadsheetApp.getActiveSheet();
  MYfunctionA();
  MYfunctionB();
  MYfunctionC();
}

function MYfunctionB() {
  // something done here...
  mysubfunctionB();
  return;
}

function MYfunctionA() {
  // something done here...
  return;
}

function MYfunctionC() {
  // something done here...
  return;
}
So, if LockService is the solution, must it be applied to every subfunction? Another thing: what will happen if there's a Google service/server error during that lock time? Will it release the lock and proceed with the next record?
Or will it auto-retry? During testing I sometimes got this service/server error, and usually I just reran the script, but I wonder what will happen once this project is launched and used by many users almost concurrently.
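For what it's worth, a lock is normally taken once, in the top-level function that the trigger calls, so the subfunctions it invokes are covered by the same lock and don't need their own. Here is a minimal sketch of that pattern, reusing the function names from the question; the 30-second wait is just an illustrative value:
function MAINFUNCTION() {
  var lock = LockService.getScriptLock();
  lock.waitLock(30000); // wait up to 30 seconds for any other submission still being processed
  try {
    MYfunctionA();
    MYfunctionB();
    MYfunctionC();
  } finally {
    lock.releaseLock(); // always release, even if a service/server error is thrown midway
  }
}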
I have a sheet that pulls a lot of finance data from a lot of web pages using IMPORTHTML. The problem is that it slows down a lot, and it starts giving errors due to the high number of IMPORTHTML calls. I looked at the following solutions:
Having an Apps Script to make refreshes controlled (not so effective)
Finding a way to automatically copy-paste the old data until new data comes in and updates the old pasted values (also not effective, as I couldn't find an automated method)
Trying to play with Google Sheets auto-calculation to help (also failed at that)
Is there a way to restrict the sheet from auto-refreshing so many times?
In the documentation you can see that:
Functions that pull data from outside the spreadsheet recalculate at the following times:
ImportRange: 30 minutes
ImportHtml, ImportFeed, ImportData, ImportXml: 1 hour
GoogleFinance: may be delayed up to 20 minutes
If you want to recalculate the values from your IMPORTHTML functions at a lower frequency, you should definitely use Apps Script to do the data fetch and then populate your spreadsheet with the information.
You can use the UrlFetchApp class in Apps Script to get the data and define your own timing logic for the updates.
function myAppScriptFunction() {
  var urls = ["your-website-url1", "...", "your-website-urlN"];
  var responses = UrlFetchApp.fetchAll(urls);
  // ... Parse the responses and store the information you need in a table-like data structure.
  // Let's assume the variable parsedResponseData is created.
  var sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange("your-range").setValues(parsedResponseData);
}
You should now wrap your function in your custom time logic to manage the updates.
You can achieve this with a time-driven trigger on Apps Script.
function triggerSetup() {
  ScriptApp.newTrigger('myAppScriptFunction')
      .timeBased()
      .everyHours(6)
      .create();
}
References:
Time-driven triggers
UrlFetchApp
Quota Limits
You may try this add-on, which freezes your spreadsheet on demand:
https://gsuite.google.com/marketplace/app/spreadsheet_freezer/526561533622
My HTML form uploads data to a spreadsheet with a Google Apps Script. I use the following function to prevent conflicts, since many users access the web app page:
function lockservice() {
  var lock = LockService.getScriptLock();
  lock.waitLock(30000);
  lock.releaseLock();
}
I have another script (a separate one) to retrieve the data, which also has many users accessing it. Do I need getScriptLock for that as well?
In other words, does a conflict between users occur when getting the data, or only when uploading it? Or perhaps in both cases?
If you are writing data to the spreadsheet with:
sheet.appendRow(array);
then you don't need Lock Service to write data. If you are using:
sheet.getRange().setValues(array_2_D);
Then you do need Lock Service.
So, appendRow() is "atomic" and setValues() is not. Atomic means that each operation runs completely independently of the others.
https://developers.google.com/apps-script/reference/spreadsheet/sheet?hl=en#appendRow(Object)
Getting values should use Lock Service if you have concurrent users. There is also a quota limit for concurrent users: the limit for "Simultaneous Executions" is 30 (at the time of this post).
https://developers.google.com/apps-script/guides/services/quotas#current_limitations
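As a rough sketch, wrapping a setValues() write in a script lock might look like the following; the sheet name, the 10-second wait, and the helper name are placeholders:
function writeWithLock(rows) {
  var lock = LockService.getScriptLock();
  lock.waitLock(10000); // wait up to 10 seconds for concurrent executions to finish
  try {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data'); // placeholder sheet name
    var startRow = sheet.getLastRow() + 1;
    sheet.getRange(startRow, 1, rows.length, rows[0].length).setValues(rows);
    SpreadsheetApp.flush(); // commit the write before the lock is released
  } finally {
    lock.releaseLock();
  }
}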
I actually have the same understanding, but:
I have a script where Lock Service is used, and the locked section also includes an appendRow() operation on the sheet.
The script allows up to 30 concurrent executions.
Despite appendRow() being stated to be atomic, I am now experiencing that appendRow() may actually overwrite the last used row of a sheet!
I never experienced this behavior without making use of the Lock Service.
Anybody else observed this unexpected behavior?
In the linked screenshot you can see the new Date() value of the previous record being replaced, instead of an entirely new row being appended below.
Any thoughts & suggestions to overcome this behavior are highly appreciated.
Screenshot: appendRow() replaces last row of sheet:
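For context, here is a minimal sketch of the locked appendRow() pattern being described; the sheet name is a placeholder, and the flush before releasing the lock is one thing worth trying rather than a confirmed fix for the overwrite:
function logSubmission(rowValues) {
  var lock = LockService.getScriptLock();
  lock.waitLock(30000);
  try {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Log'); // placeholder sheet name
    sheet.appendRow([new Date()].concat(rowValues));
    SpreadsheetApp.flush(); // force the append to commit before the lock is released
  } finally {
    lock.releaseLock();
  }
}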
I'm building a script that uses a looping ImportHTML command to scrape weather data based on zip code, and I'm currently running into an issue with the execution timing out every time the script is run.
The way the script is currently set up produces a correct result, but given that it pulls data from several hundred sources, it takes a while and will not complete within the current time limit for Google Apps Script.
The sheet running the script uses 3 tabs:
ZIPS, which contains a list of zip code link values pulled from the site that the weather data comes from
Blank, which is simply an intermediary sheet used in the execution of the script
Result, where the final output is to be placed
In order to reduce the amount of reading and writing as much as possible, I changed the code from writing out each result of the ImportHTML command as it was executed to appending everything to an array and writing that array at the end of the script. The code in its current form is as follows:
function getTemps() {
  var googleSheet = SpreadsheetApp.getActive();
  // Read in zip code link values
  var sheet = googleSheet.getSheetByName('ZIPS');
  var zipArray = sheet.getDataRange().getValues();
  var arrayLength = zipArray.length;
  // Set up sheet values
  var blankSyntaxA = 'ImportHtml("https://www.wunderground.com/cgi-bin/findweather/getForecast?query=pz:';
  var blankSyntaxB = '&zip=1", "table", 1)';
  var tempResult = [];
  // Writing section
  var sheet = googleSheet.getSheetByName('Blank');
  for (var i = 0; i < arrayLength; i++) {
    var liveSyntax = blankSyntaxA + zipArray[i][0] + blankSyntaxB;
    sheet.getRange('A1').setFormula(liveSyntax);
    var importedData = sheet.getDataRange().getValues();
    tempResult = tempResult.concat(importedData);
  }
  var sheet = googleSheet.getSheetByName('Result');
  sheet.getRange(1, 1, tempResult.length, 8).setValues(tempResult);
}
I know the run time of the script could be reduced by eliminating the read/write contained within the For loop, but I'm not sure how to obtain the necessary HTML table without running the ImportHTML command within the 'Blank' sheet. Is there a way to run that command to fill the 'importedData' array without writing to a sheet?
Alternatively, I had considered utilizing a check on the runtime of the function and implementing a break as it neared the ~5 minute runtime limit, followed by a recursive call back to the original function, but I wasn't sure if this would actually mitigate the runtime issue, or even be possible given the nature of the recursive call.
Any advice on how this script could be modified to run within the script timeout parameter or modified to produce the complete desired outcome with all the necessary imported data would be appreciated. Thanks!
Is there a way to run that command to fill the 'importedData' array without writing to a sheet?
IMPORTHTML is a built-in Google Sheets spreadsheet function. This kind of function can't be run / evaluated by Google Apps Script.
Related
How to evaluate a spreadsheet formula within a custom function?
Alternatively, I had considered utilizing a check on the runtime of the function and implementing a break as it neared the ~5 minute runtime limit, followed by a recursive call back to the original function, but I wasn't sure if this would actually mitigate the runtime issue, or even be possible given the nature of the recursive call.
Rather than a "mitigator", this is a workaround. There are several techniques, like batch processing and parallel processing (see the sketch after the reference below).
Reference
Exceeded maximum execution time in Google Apps Script
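As a rough illustration of the batch-processing idea, the sketch below stores the current position in PropertiesService and reschedules itself with a one-off trigger before the execution limit is reached; processRow() and the time budget are placeholders, and the sheet name is taken from the question:
function processInBatches() {
  var props = PropertiesService.getScriptProperties();
  var start = Number(props.getProperty('nextRow')) || 2; // resume where the previous run stopped
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('ZIPS');
  var lastRow = sheet.getLastRow();
  var startTime = Date.now();
  for (var row = start; row <= lastRow; row++) {
    processRow(row); // placeholder for the per-row work
    if (Date.now() - startTime > 4.5 * 60 * 1000) { // stop well before the execution limit
      props.setProperty('nextRow', String(row + 1));
      ScriptApp.newTrigger('processInBatches').timeBased().after(60 * 1000).create(); // resume in a minute
      return;
    }
  }
  props.deleteProperty('nextRow'); // all rows processed
}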
From an answer to Threading in Google App Script:
There is a great example from Bruce Mcpherson. His example, Parallel Processing in Apps Script, uses Map Reduce in the exercise. He is utilizing triggers as well, but it may provide a different perspective.
Another alternative is to sign in to the Early Access Program to extend the execution time limit to 30 minutes.
Because Google Sheets does not support iteration, I wrote my own simple Apps Script to adjust an input based on the calculation of the spreadsheet output. However, after I change the input variable, the spreadsheet recalculates, but Apps Script does not seem to wait for that recalculation, so I end up retrieving values such as "Thinking..." or "#NA". Is there a way to pause a script and wait for the calculation to complete before moving to the next line of the script?
Currently, I am just using a loop to watch the cell but I wanted to find out if there was a more elegant way to pause the execution until the sheet was done calculating.
I write a lot of Excel macros, and Excel VBA always waits for the calculation to complete before moving to the next line of code. Apps Script does not seem to do this, so I am hoping there is an easy way to achieve it.
A second question: because this iteration can take some time, how does one interrupt and terminate a running script? I can't seem to find a way to do this.
Here is a very simple way of preventing the next script from starting until the current script completes in Google Apps Script. Just add a call to testWait() after each script you are running in succession. The SpreadsheetApp.flush() also seems to reset the spreadsheet's timeout timer back to the default 5 minutes, so you have more time to process multiple scripts in one go.
// Holds processing of the next script until the last one has completed
function testWait() {
  var lock = LockService.getScriptLock();
  lock.waitLock(300000);
  SpreadsheetApp.flush();
  lock.releaseLock();
}
Scripts time out after about 6 minutes to prevent infinite loops and constantly running programs. I don't think there's a manual way to stop a script.
Edit: Oops, forgot to answer your first question: I thought onEdit ran after values were recalculated, but apparently I don't use enough formulas to see this. If it's not waiting, then the best way is to do something like this (re-reading the value on each pass so the loop can eventually exit):
while (value === "Thinking..." || value === "#NA") {
  Utilities.sleep(10000);
  value = cell.getValue(); // 'cell' stands for the Range you originally read 'value' from
}
It pauses the script for ten seconds and then checks again.
I also write a bit of Excel VBA, where the native Calculation functions come in handy. I've run across the same problem in Google Apps/Docs several times, wanting to execute code whenever calculations are complete. I wonder why there's no native function in Google Apps/Docs to handle this. Anyhow, I wrote this code to solve the problem. Hope it helps.
function onEdit() {
  Refresh();
}

function Refresh() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Sheet1");
  var sheet2 = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Sheet2");
  // Set the range wherever you want to make sure loading is done
  var range = sheet.getRange('A:A');
  var values = range.getValues();
  var string = values.toString();
  var loading = "Loading";
  while (string.search(loading) !== -1) { // keep waiting while any cell still shows "Loading"
    Utilities.sleep(500);
    values = range.getValues(); // re-read so the loop can exit once loading has finished
    string = values.toString();
  }
  range.copyTo(sheet2.getRange('A1'), {contentsOnly: true});
  customMsgBox();
}

function customMsgBox() {
  Browser.msgBox("Data refreshed.");
}
Here's an example in action:
https://docs.google.com/spreadsheet/ccc?key=0AkK50_KKCI_pdHJvQXdnTmpiOWM4Rk5PV2k5OUNudVE#gid=0
Make a copy if you want to play around with it.
I had the same problem. I resolved it by using two scripts, as I needed to be sure the spreadsheet has the data I need.
The first script provides a seed value to populate the spreadsheet through an IMPORTXML function.
The second script processes this data.
I used time-based triggers to run the scripts, allowing sufficient time for the first script to complete.
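A minimal sketch of staggering the two runs with one-off time-based triggers; the function names and the offsets are only placeholders for the two scripts described above:
function scheduleRefresh() {
  // First script: writes the seed value that the IMPORTXML formula depends on.
  ScriptApp.newTrigger('seedImportXml')
      .timeBased()
      .after(1 * 60 * 1000)
      .create();
  // Second script: processes the imported data, scheduled long enough afterwards for IMPORTXML to finish.
  ScriptApp.newTrigger('processImportedData')
      .timeBased()
      .after(10 * 60 * 1000)
      .create();
}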
You could put the recursive structure in the Code.gs file rather than in the spreadsheet; JavaScript handles recursion well. You could then have the script update the spreadsheet on each iteration (or every 100th iteration).
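For example, a minimal sketch of such a loop, assuming the input lives in A1 and the calculated output in B1 (both placeholders, as are the adjustment step and the convergence test); flush() is called so each new input is recalculated before the output is read back:
function iterateUntilConverged() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var input = 1; // placeholder starting guess
  for (var i = 0; i < 100; i++) {
    sheet.getRange('A1').setValue(input);           // write the new input
    SpreadsheetApp.flush();                         // force recalculation before reading
    var output = sheet.getRange('B1').getValue();   // read the recalculated result
    if (Math.abs(output) < 1e-6) break;             // placeholder convergence test
    input = input - output * 0.1;                   // placeholder adjustment step
  }
}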