Google Apps Script execution time issue - google-apps-script

I'm currently using Google Apps Script to implement a viewer for a supply chain database.
To synchronize the viewer with the current database (a Google spreadsheet) I import the values and all the formatting into a new sheet, which means the viewer is basically a copy of the current database.
However, executing the script always takes about a minute. I tried to find the issue by logging some debug messages at various positions in the code.
At first the line with Viewer.setFrozenRows(1); seemed to be the issue (which is strange, since I only freeze the first row), but when that line is commented out, the line after it (Viewer.setFrozenColumns(Database.getFrozenColumns());) seemed to be the issue instead.
Unfortunately I'm not able to share the database sheet with you, but maybe somebody can already spot the issue from the code.
Some additional info: the database sheet has 1300 rows and 100 columns, and I added a picture of the log of the current code below.
function LoadViewer(view) {
  Logger.log("LoadViewer Start");
  if (view == null) {
    view = 0;
  }
  var Database = SpreadsheetApp.openByUrl('[SHEET_URL]').getSheetByName('Database');
  var Viewer = SpreadsheetApp.getActiveSpreadsheet().getSheets()[0];
  var numberOfColms = Database.getLastColumn();
  var numberOfRows = Database.getLastRow();
  var rules = Database.getConditionalFormatRules();
  var headerRowHeight = Database.getRowHeight(1);
  var dataRowHeight = Database.getRowHeight(2);
  var Values = Database.getRange(1, 1, numberOfRows, numberOfColms).getValues();
  Logger.log("Declarations Finished");
  Viewer.getRange(1, 1, numberOfRows, numberOfColms).setValues(Values);
  if (!Viewer.getRange(1, 1, numberOfRows, numberOfColms).getFilter()) {
    Viewer.getRange(1, 1, numberOfRows, numberOfColms).createFilter();
  }
  Viewer.setConditionalFormatRules(rules);
  Viewer.getRange(1, 1, 1, numberOfColms).setFontWeight('bold');
  Viewer.autoResizeColumns(1, numberOfColms);
  Viewer.setRowHeight(1, headerRowHeight);
  Logger.log("1st Half of functions finished");
  Viewer.setRowHeights(2, numberOfRows - 1, dataRowHeight);
  Logger.log("Freeze Rows");
  //Viewer.setFrozenRows(1);
  Logger.log("Freeze Columns");
  Viewer.setFrozenColumns(Database.getFrozenColumns());
  Logger.log("Loop Start");
  for (var i = 1; i <= numberOfColms; i++) {
    Viewer.setColumnWidth(i, Database.getColumnWidth(i));
  }
  Logger.log("Loop End");
  Viewer.getRange(1, 1, 1, numberOfColms).setVerticalAlignment('middle').setWrap(true);
  Logger.log("Load Viewer End");
}

Two optimization points I can see in your code:
Requests to any external service, including SpreadsheetApp, make your code slow - see Best Practices.
Thus, calling a SpreadsheetApp method inside a for loop will slow your code down.
You can accelerate your code by replacing the multiple setColumnWidth() requests inside the loop with setColumnWidths(startColumn, numColumns, width), avoiding the per-column iteration - a sketch follows below.
Log the number of columns and rows in your sheet.
A common problem is that the sheet contains a significant number of empty rows and columns that inflate the detected data range, so the subsequent calls are applied to a bigger range than necessary.
If that is your case, either delete the spare rows and columns manually, or use getNextDataCell() instead of getLastRow() or getLastColumn()
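As a minimal sketch of the first point (my own illustration, not from the original answer): consecutive columns that share the same width in the Database sheet can be written with one setColumnWidths() call per run, so the number of write requests drops from one per column to one per group of equal widths. Variable names follow the question's code; note the reads still cost one getColumnWidth() call per column.
// Replaces the question's per-column loop; one setColumnWidths() write per
// run of consecutive equal widths.
var start = 1;
var width = Database.getColumnWidth(1);
for (var col = 2; col <= numberOfColms + 1; col++) {
  var next = (col <= numberOfColms) ? Database.getColumnWidth(col) : -1;
  if (next !== width) {
    Viewer.setColumnWidths(start, col - start, width); // one write per run
    start = col;
    width = next;
  }
}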

Related

How to reduce the latency between two script calls in Google Apps Script

In a self-developed add-on for Google Sheets, functionality has been added so that a sound file is played by a JavaScript audio player in the sidebar, depending on the selection in the table. For the code itself see here.
When a line is selected in the table, the corresponding sound file is played in the sidebar. Every time the next line is selected it takes around 2 seconds before the script starts to run and loads the sound file into the sidebar. As the basic idea of the script is to quickly listen through long lists of sound files, it is crucial to reduce the waiting time as far as possible.
A reproducible example is accessible here; Add-ons > 'play audio' (a Google account is necessary). To reproduce the error, the sheet has to be opened twice (e.g. in two browsers).
In order to reduce the latency you might try to reduce the interval of your poll function, as suggested by Cooper in a comment on the question, and to change the getRecord function.
poll
At this time the interval is 2 seconds. Please bear in mind that reducing the interval too much might cause an error and might also have a significant impact on the consumption of the daily usage quotas. See https://developers.google.com/apps-script/guides/services/quotas
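As an illustration of a shorter poll interval (my own sketch, assuming the sidebar calls the server with google.script.run; poll and playRecord are illustrative names, not the add-on's actual code):
// Sidebar-side sketch: poll the server every 500 ms instead of 2000 ms.
function poll() {
  google.script.run
    .withSuccessHandler(function (record) {
      if (record !== "unchanged") {
        playRecord(record); // hypothetical handler that loads the audio file
      }
      setTimeout(poll, 500); // shorter interval; watch the daily quotas
    })
    .withFailureHandler(function () {
      setTimeout(poll, 2000); // back off when a call fails
    })
    .getRecord();
}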
getRecord
Every time it runs it makes multiple calls to Google Apps Script, which are slow, so you should look for a way to reduce the number of Google Apps Script calls. To do this you could store the spreadsheet table data in the client-side code and only read it again when the data has changed.
NOTE: The Properties Service has a 50,000 daily usage quota for consumer accounts.
One way to quickly implement the above is to limit the getRecord function to read only the current cell and add a button to reload the data from the table; a trimmed sketch follows the original function below.
Function taken from the script bound to the demo spreadsheet linked in the question.
function getRecord() {
  var scriptProperties = PropertiesService.getScriptProperties();
  var sheet = SpreadsheetApp.getActiveSheet();
  var data = sheet.getDataRange().getValues();
  var headers = data[0];
  var rowNum = sheet.getActiveCell().getRow(); // Get currently selected row
  var oldRowNum = scriptProperties.getProperty("selectedRow"); // Get previously selected row
  if (rowNum == oldRowNum) { // Check if there was a row selection change
    // Function returns the string "unchanged"
    return "unchanged";
  }
  scriptProperties.setProperty("selectedRow", rowNum); // Update row index
  if (rowNum > data.length) return [];
  var record = [];
  for (var col = 0; col < headers.length; col++) {
    var cellval = data[rowNum - 1][col];
    if (typeof cellval == "object") {
      cellval = Utilities.formatDate(cellval, Session.getScriptTimeZone(), "M/d/yyyy");
    }
    record.push({ heading: headers[col], cellval: cellval });
  }
  return record;
}
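A minimal sketch of the trimmed variant described above (my own illustration; getRecordLight is an assumed name, and the headers are assumed to be cached client-side):
// One read for the selected row instead of one for the entire table.
function getRecordLight() {
  var scriptProperties = PropertiesService.getScriptProperties();
  var sheet = SpreadsheetApp.getActiveSheet();
  var rowNum = sheet.getActiveCell().getRow(); // currently selected row
  if (rowNum == scriptProperties.getProperty("selectedRow")) {
    return "unchanged";
  }
  scriptProperties.setProperty("selectedRow", rowNum);
  return sheet.getRange(rowNum, 1, 1, sheet.getLastColumn()).getValues()[0];
}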
Related
Problems when using a Google spreadsheet add-on by multiple users

persistently sporadic out of memory error in Google Apps Scripts when copying data validations from a template

I have hundreds of Spreadsheets that were made from a template sheet. They all have the same number/name of sheets, rows, columns, etc...
I have added some data validations to the template. I want to copy the data validations from the template to each of the Spreadsheets. I have the code, and it works, but it throws a memory error.
It always throws the error -- the only thing that changes is how many destination Spreadsheets it has processed before it throws the error. Sometimes it'll process 4 Spreadsheets before it throws the error, sometimes 50, sometimes more/less. I cannot figure out why.
I trimmed my code down to a working sample. I can't share the source files but they are just normal Spreadsheets with 5 sheets/tabs and various data validations. If it matters, the data validations do use named ranges. For example: =REGEXMATCH(LOWER(google_drive_url) , "^https:\/\/drive\.google\.com\/drive\/folders\/[a-z0-9_-]{33}$").
I have commented the below code but here is a recap:
Get the template Spreadsheet and cache all of the data validations in it
Go through each destination Spreadsheet:
Clear all of the data validations
Apply the data validations from the template
In my real code I have an array of destination file IDs. For testing purposes I am just using one destination file and applying the data validations from the template multiple times.
function myFunction() {
  var sourceFileID = "1rB7Z0C615Kn9ncLykVhVAcjmwkYb5GpYWpzcJRjfcD8";
  var destinationFileID = "1SMrwTuknVa1Xky9NKgqwg16_JNSoHcFTZA6QxzDh7q4";
  // get the source file
  var sourceSpreadsheet = SpreadsheetApp.openById(sourceFileID);
  var sourceDataValidationCache = {};
  // go through each sheet and get a copy of the data validations
  // cache them for quick access later
  sourceSpreadsheet.getSheets().forEach(function(sourceSheet) {
    var sheetName = sourceSheet.getName();
    // save all the data validations for this sheet
    var thisSheetDataValidationCache = [];
    // get the full sheet range
    // start at first row, first column, and end at max rows and max columns
    // get all the data validations in it
    // go through each data validation row
    sourceSheet.getRange(1, 1, sourceSheet.getMaxRows(), sourceSheet.getMaxColumns()).getDataValidations().forEach(function(row, rowIndex) {
      // go through each column
      row.forEach(function(cell, columnIndex) {
        // we only need to save if there is a data validation
        if (cell) {
          // save it
          thisSheetDataValidationCache.push({
            "row": rowIndex + 1,
            "column": columnIndex + 1,
            "dataValidation": cell
          });
        }
      });
    });
    // save to cache for this sheet
    sourceDataValidationCache[sheetName] = thisSheetDataValidationCache;
  });
  // this is just an example
  // so just update the data validations in the same destination numerous times to show the memory leak
  for (var i = 0; i < 100; ++i) {
    // so we can see from the log how many were processed before it threw a memory error
    Logger.log(i);
    // get the destination
    var destinationSpreadsheet = SpreadsheetApp.openById(destinationFileID);
    // go through each sheet
    destinationSpreadsheet.getSheets().forEach(function(destinationSheet) {
      var sheetName = destinationSheet.getName();
      // get the full range and clear existing data validations
      destinationSheet.getRange(1, 1, destinationSheet.getMaxRows(), destinationSheet.getMaxColumns()).clearDataValidations();
      // go through the cached data validations for this sheet
      sourceDataValidationCache[sheetName].forEach(function(dataValidationDetails) {
        // get the cell/column this data validation is for
        // copy it, build it, and set it
        destinationSheet.getRange(dataValidationDetails.row, dataValidationDetails.column).setDataValidation(dataValidationDetails.dataValidation.copy().build());
      });
    });
  }
}
Is there something wrong with the code? Why would it throw an out of memory error? Is there any way to catch/prevent it?
In order to get a better idea of what's failing I suggest you keep a counter of the iterations, to know how many are going through.
I also just noticed the line
sourceSheet.getRange(1, 1, sourceSheet.getMaxRows(), sourceSheet.getMaxColumns()).getDataValidations().forEach(function(row, rowIndex){
This is not a good idea, because getMaxRows() & getMaxColumns() return the total number of rows and columns in the sheet, not just the ones with data. If your sheet is 100x100 and you only have data in the first 20x20 cells, you get a range that covers all 10,000 cells, and calling forEach on it means going through every cell.
A better approach would be to use getDataRange(); it returns a range that covers the entirety of your data (Documentation). That gives you a much smaller range and considerably fewer cells to process, as in the sketch below.
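A minimal sketch of that swap applied to the question's caching loop (behavior unchanged apart from the smaller range):
// Range only over cells that actually contain data
sourceSheet.getDataRange().getDataValidations().forEach(function(row, rowIndex) {
  row.forEach(function(cell, columnIndex) {
    if (cell) {
      thisSheetDataValidationCache.push({
        "row": rowIndex + 1,
        "column": columnIndex + 1,
        "dataValidation": cell
      });
    }
  });
});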

Range#sort fails to sort based on new formula's values

I am extracting data from several Google Sheets into one main sheet using Google Apps Script. To convert dates to week numbers I use the sheet function ISOWEEKNUM() for column 1.
I want to sort by week-numbers and because of headers I use Range#sort.
The problem I am having is that no sorting is performed when I test-run getData(); the Range#sort call has no effect. If I first run getData() and then manually run my sortRange() function, it works.
Is there something about writing cells containing ISOWEEKNUM() and then trying to sort them directly in the same script execution? What can be done to solve this without the user having to sort or run more scripts manually?
function getData() {
  var thisSpreadSheet = SpreadsheetApp.getActiveSpreadsheet();
  var dataSheet = thisSpreadSheet.getSheetByName('Data');
  var deliverySheets = listDeliverySheets();
  var outputWeekAndTotal = [];
  var outputCratesPerStore = [];
  var i;
  for (i = 0; i < deliverySheets.length; i++) {
    outputWeekAndTotal.push(["=ISOWEEKNUM(\"" + deliverySheets[i].getRange("A2").getDisplayValue() + "\")", deliverySheets[i].getRange("L12").getValue()]);
    outputCratesPerStore.push(deliverySheets[i].getRange("L5:L9").getValues());
  }
  dataSheet.getRange(2, 1, outputWeekAndTotal.length, outputWeekAndTotal[0].length)
    .setValues(outputWeekAndTotal);
  dataSheet.getRange(2, 3, outputCratesPerStore.length, outputCratesPerStore[0].length)
    .setValues(outputCratesPerStore);
  sortRange();
}
function sortRange() {
  var thisSpreadSheet = SpreadsheetApp.getActiveSpreadsheet();
  var rangeToSort = thisSpreadSheet.getSheetByName('Data').getRange(2, 1, 7, 7); /* Constants used temporarily */
  rangeToSort.sort({column: 1, ascending: true});
}
The fundamental issue is that Google is free to (and does) optimize their interpretation of your code to avoid abuse of their servers. When you call functions that operate on different objects, Google doesn't always determine that the order is important, and thus may invoke certain API operations out of (your desired) order. You can help Google by chaining methods on the exact same object, but this is not always sufficient. Operations that cause side effects / asynchronous changes, such as writing a formula, or operate over different APIs - such as calling the Drive API/Service after using Form, Docs, or Sheets Service methods - may not be performed in order even if "chained."
To fix this, you must forcibly flush the write cache buffer. For the Spreadsheet Service, this is done via a call to SpreadsheetApp#flush.
...
dataSheet.getRange(...).setValues(...);
SpreadsheetApp.flush();
sortRange();
}
Flushing the write cache will force the written formulas to be calculated prior to executing the next script lines, making their values available to the sorting method.

Using DocumentApp's append functions to 'MailMerge' data into single Google Document

Update: I have updated my code with some of the suggestions, as well as a feature that allows for easy multiple markers, and changed arrayOfData into a 2D array of strings. It has similar runtimes, if not slightly slower - 50 pg avg: 12.499 s, 100 pg avg: 21.688 s, per page avg: 0.233 s.
I am writing a script that takes some data and a template and performs a 'mail merge' type function into another document. The general idea for this is easy and I can do it no problem.
However, I am currently 'mail merging' many rows (150-300+) of ~5 columns of data into predefined fields of a single-page template (certificates), all within one document. The result is a single Google Document with 150-300 pages of certificates. The alternative is to generate many documents and, somehow, combine them.
Is This a Good/Efficient Way of Doing This?
It took me a while to put together this example from the documentation alone, as I couldn't find anything online. I feel like there should be a simpler way to do this, but I can't find anything close to it (i.e. appending a Body to a Body). Is this the best way to get this functionality right now?
Edit: What about using bytes from the Body's Blob? I'm not experienced with this but would it work faster? Though then the issue becomes replacing text without generating many Documents before converting to Blobs?
*Note: I know Code Review exists, but they don't seem to have many users who understand Google Apps Script well enough to offer improvements. There is a much bigger community here! Please excuse it this time.
Here Is My Code (Updated Feb 23, 2018 @ 3:00 PM PST)
Essentially it takes each child element of the Body, replaces some fields, then detects its type and appends it using the appropriate append function.
/* Document Generation Statistics:
 * 50 elements replaced:
 * [12.482 seconds total runtime]
 * [13.272 seconds total runtime]
 * [12.069 seconds total runtime]
 * [12.719 seconds total runtime]
 * [11.951 seconds total runtime]
 *
 * 100 elements replaced:
 * [22.265 seconds total runtime]
 * [21.111 seconds total runtime]
 */
var TEMPLATE_ID = "Document_ID";

function createCerts() {
  createOneDocumentFromTemplate(
    [
      ['John', 'Doe'], ['Jane', 'Doe'], ['Jack', 'Turner'], ['Jordan', 'Bell'], ['Lacy', 'Kim']
    ],
    ["<<First>>", "<<Last>>"]);
}

function createOneDocumentFromTemplate(arrayOfData, arrayOfMarkers) {
  var file = DriveApp.getFileById(TEMPLATE_ID).makeCopy("Certificates");
  var doc = DocumentApp.openById(file.getId());
  var body = doc.getBody();
  var fixed = body.copy();
  body.clear();
  var copy;
  for (var j = 0; j < arrayOfData.length; j++) {
    var item = arrayOfData[j];
    copy = fixed.copy();
    for (var i = 1; i < copy.getNumChildren() - 1; i++) {
      for (var k = 0; k < arrayOfMarkers.length; k++) {
        copy.replaceText(arrayOfMarkers[k], item[k]);
      }
      switch (copy.getChild(i).getType()) {
        case DocumentApp.ElementType.PARAGRAPH:
          body.appendParagraph(copy.getChild(i).asParagraph().copy());
          break;
        case DocumentApp.ElementType.LIST_ITEM:
          body.appendListItem(copy.getChild(i).asListItem().copy());
          break;
        case DocumentApp.ElementType.TABLE:
          body.appendTable(copy.getChild(i).asTable().copy());
          break;
      }
    }
  }
  doc.saveAndClose();
  return doc;
}
Gist
This is more of a Code Review question, but no, as written, I don't see any way to make it more efficient. I run a similar script for creating documents at work, though mine creates separate PDF files to share with the user rather than something we would print. It may save you time and effort to look into an add-on like docAppender (if you're coming from a form) or autoCrat.
A couple of suggestions:
I'm more of a for-loop person because it's easier to log errors on particular rows with the indexing variable. It's also more efficient if you're pulling from a spreadsheet where some rows could be skipped (already merged, let's say). Using forEach gives more readable code and is good if you always want to go over the entire array, but it is less flexible with conditionals. Using a for loop will also allow you to mark a row as merged with a boolean variable in the last column, as in the sketch below.
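A minimal sketch of that pattern (my own illustration; the sheet layout and the mergeRow_ helper are assumptions, not from the question's code):
// Skip rows already flagged as merged; the last column holds a boolean flag.
var data = sheet.getDataRange().getValues();
for (var i = 1; i < data.length; i++) { // start at 1 to skip the header row
  var row = data[i];
  if (row[row.length - 1] === true) continue; // already merged, skip it
  mergeRow_(row); // hypothetical per-row merge routine
  sheet.getRange(i + 1, row.length).setValue(true); // mark the row as merged
}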
The other thing I can suggest would be to use some kind of time-based test to stop execution before you time the script out, especially if you're merging hundreds of rows of data.
// Limit script execution to 4.5 minutes to avoid execution timeouts
// @param {Date} starttime - Date object captured before the loop
// @return {Boolean}
function isTimeUp_(starttime) {
  var now = new Date();
  return now.getTime() - starttime.getTime() > 270000; // 4.5 minutes
}
Then, inside your function:
var starttime = new Date();
// include this check somewhere before you begin writing data
// (a for loop is used here because break cannot exit a forEach callback)
for (var i = 0; i < replace.length; i++) {
  if (isTimeUp_(starttime)) {
    Logger.log("Time up, finished on row " + i);
    break;
  }
  ... // rest of your per-row merge logic
}

Google App Script Service error: Spreadsheets

We're not set up to use a proper SQL database or anything, so we're working with Google Sheets.
I've been trying to avoid IMPORTRANGE, as I have a large amount of data constantly being updated and more rows added to form responses every day. IMPORTRANGE constantly fails with "importrange internal server error".
I found this fantastic code to copy from one source spreadsheet to another (as static text) so I can further manipulate the data:
function CopyTaskSource() {
  var sss = SpreadsheetApp.openById('1OPnw_7vTCFkChy8VUKhAG5QRhcpKnDbmod0ZxjG----'); // replace with source ID
  var ss = sss.getSheetByName('TASK Status'); // replace with source Sheet tab name
  var range = ss.getRange('E:L'); // assign the range you want to copy
  var data = range.getValues();
  var tss = SpreadsheetApp.openById('1T3tqsHvKxuulYxDnaR3uf-wjVdXwLHBcUgI7tgN----'); // replace with destination ID
  var ts = tss.getSheetByName('TaskSource'); // replace with destination Sheet tab name
  ts.getRange(1, 1, data.length, data[0].length).setValues(data); // you will need to define the size of the copied data, see getRange()
}
Now it copies about 15,000 rows of data, and I expect I will end up at 50,000 rows of data (and some other sheets go up to 27 columns).
I started getting this Service error: Spreadsheets at line 9 (the last line of the code).
Can someone please advise me a workaround to get bulk data transferred to multiple Google spreadsheet files?
importrange doesn't work well, and I have a few Google Forms whose responses I need to combine and manipulate into presentable spreadsheets.
Thank you
I am currently working on a script that sends out emails when there is an issue; it then appends an array of three values [type, ID, status] to an existing array, ending up with [[values1],[values2],etc...].
I have gotten the same error when I leave the third parameter of getRange as array.length. I got it to work once yesterday by subtracting 1 from array.length, as I will show below. Maybe you can try this on line 9 and see if that fixes it?
It is important to mention that today, after running the exact same script, it gave me an error stating that the range size was incorrect (due to the same subtraction that seemed to fix the service error).
I think it may be broken on Google's side, but that is not something I can confirm.
This:
ts.getRange(1, 1, data.length, data[0].length).setValues(data);
Becomes This:
ts.getRange(1, 1, data.length - 1, data[0].length).setValues(data);
Hope that fixes it for you, I am truly stumped as to why it decides to work one day but not another...
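One further idea for the bulk-transfer problem itself (my own suggestion, not part of the original answer, so treat it as an assumption): write the data in chunks so each setValues() payload stays small, which should be gentler on the service with 15,000-50,000 rows.
// Sketch of chunked writes; writeInChunks and the chunk size are illustrative.
function writeInChunks(ts, data, chunkSize) {
  chunkSize = chunkSize || 5000;
  for (var start = 0; start < data.length; start += chunkSize) {
    var chunk = data.slice(start, start + chunkSize);
    ts.getRange(start + 1, 1, chunk.length, chunk[0].length).setValues(chunk);
    SpreadsheetApp.flush(); // commit each chunk before starting the next
  }
}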
I also added a waitLock to make sure the script waits for other processes to finish before trying to write, though note that the data I write is much smaller: only 3 columns by 6-10 rows at a time. Here is the code for that, though it inserts the data at the top of the sheet, not the bottom. (From Henrique G. Abreu, Original Post)
function insertRow(sheetI, rowData, optIndex) {
  var lock = LockService.getScriptLock();
  lock.waitLock(30000);
  try {
    var index = optIndex || 1;
    sheetI.insertRowsBefore(index, rowData.length).getRange(index, 1, rowData.length, 3).setValues(rowData);
    SpreadsheetApp.flush();
  } finally {
    lock.releaseLock();
  }
}