Speed up adding Items to Forms with Apps Script - google-apps-script

I'm wondering if there is a more efficient way to add items to Google Forms with Apps Script.
My script is taking a long time, and I'm wondering if there's any way to add items more efficiently, using an array or something, rather than one by one. Here's the code I'm running:
function addFormItems() {
  var startTime = new Date();
  var form = FormApp.create("Test");
  form.setIsQuiz(true);
  for (var i = 1; i < 100; i++) {
    form.addTextItem()
        .setTitle("Question number " + i)
        .setHelpText("Hint: answer should not be more than a couple of words");
  }
  var endTime = new Date();
  // Get runTime in seconds
  var runTime = (endTime - startTime) / 1000;
  Logger.log("runtime is: " + runTime);
}
Currently it takes quite a long time: a minute to a minute and a half. (Odd thing: every time I execute it I get a very different runtime; not sure why that happens.) Any thoughts on how to speed this up are much appreciated.
I searched the documentation and couldn't find anything about adding multiple items with one call.
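One batched alternative worth checking (an assumption on my part, not something the FormApp service itself offers): the Forms REST API, available as an advanced service, has a batchUpdate method that can create many items in a single network call. The sketch below only builds the request payload as plain objects, so it can be tested outside Apps Script; the actual call, shown in the trailing comment, should be verified against the Forms API docs.

```javascript
// Build one createItem request per question as plain objects.
// This mirrors the titles and help text from the original loop.
function buildCreateItemRequests(count) {
  var requests = [];
  for (var i = 1; i <= count; i++) {
    requests.push({
      createItem: {
        item: {
          title: "Question number " + i,
          description: "Hint: answer should not be more than a couple of words",
          questionItem: { question: { textQuestion: {} } }
        },
        location: { index: i - 1 }
      }
    });
  }
  return requests;
}

// In Apps Script, with the Forms advanced service enabled, all items could
// then be sent in one call (hypothetical usage; check the API reference):
// Forms.Forms.batchUpdate({ requests: buildCreateItemRequests(99) }, formId);
```

One round trip instead of ninety-nine is the whole point; the per-call overhead is what dominates the runtime in the original loop.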

Related

Bulk Delete and Update Events on Google Calendar

I'm trying to reflect Zoom schedules from 32 Zoom accounts on Google Calendar. I have shared 32 calendars to 1 master calendar. I get the scheduled meetings through the API and create Google Calendar events.
function createEvent() {
  var lstMeetings = getMeetings(); // Creating Zoom Meetings list
  var lstAcc = SpreadsheetApp.getActiveSpreadsheet().getSheets()[1].getRange(2, 1, 32, 4).getValues(); // Getting Calendar ID
  var calendar, events_delete;
  var now = new Date();
  var end = new Date(now.getTime() + (180 * 24 * 60 * 60000));
  var temp;
  for (var i = 0; i < lstAcc.length; i++) {
    Logger.log(lstMeetings[i].length);
    for (var j = 0; j < lstMeetings[i].length; j++) {
      var events = CalendarApp.getCalendarById(lstAcc[i][0]).createEvent(
        lstMeetings[i][j].topic,
        new Date(lstMeetings[i][j].start_time),
        new Date(lstMeetings[i][j].end_time),
        {description: lstMeetings[i][j].description}
      );
      Logger.log(events.getId());
    }
  }
}
I'm running into the "You have been creating or deleting too many calendars or calendar events in a short time. Please try again later." error.
In addition to the code above, I would like to bulk delete all events and create new events reflecting the schedule on Zoom. How can I achieve that?
I'm thinking of switching to a webhook so that it would be easier to handle, but I'm not sure how to do that yet.
If you have any advice, please let me know.
Thank you!
You have been creating or deleting too many calendars or calendar events in a short time. Please try again later.
means exactly that: you have been doing too much. This is a free API; there are limits on the number of creates and deletes you can make over a period of time. To my knowledge this limit is undocumented. I would start with one per minute and work your way up and down until you find the limit.
The closest thing I have found is Avoid calendar use limits. It doesn't say how many are allowed per minute or per hour.
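One way to act on that advice is to spread the createEvent calls out over time: process the meetings in small batches and pause between them. This is my own sketch, not part of the answer above; chunk() is plain JavaScript and testable anywhere, while Utilities.sleep only exists inside Apps Script, so the batch size and pause length are guesses you would tune against the quota.

```javascript
// Split an array into batches of at most `size` elements.
function chunk(items, size) {
  var batches = [];
  for (var i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Inside your Apps Script function (sketch; batch size and wait are assumptions):
// var batches = chunk(allMeetings, 10);
// batches.forEach(function(batch) {
//   batch.forEach(function(m) { /* calendar.createEvent(...) */ });
//   Utilities.sleep(60 * 1000); // wait a minute between batches
// });
```

Throttling this way trades total runtime for staying under the rate limit; combine it with a time-based trigger if a single run can't finish within the execution limit.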

Using DocumentApp's append functions to 'MailMerge' data into single Google Document

Update: I have updated my code with some of the suggestions, as well as a feature that allows for easy multiple markers, and changed arrayOfData into a 2D array of strings. It has similar runtimes, if not slightly slower - 50-page avg: 12.499s, 100-page avg: 21.688s, per-page avg: 0.233s.
I am writing a script that takes some data and a template and performs a 'mail merge' type function into another document. The general idea for this is easy and I can do it no problem.
However, I am currently 'mail merging' many rows (150-300+) of ~5 columns of data into predefined fields of a single page template (certificates) into a single document. The result is a single Google Document with 150 - 300 pages of the certificate pages. The alternative is to generate many documents and, somehow, combine them.
Is This a Good/Efficient Way of Doing This?
It took me a while to put together this example from the documentation alone, as I couldn't find anything online. I feel like there should be a simpler way to do this but cannot find anything close to it (i.e. appending a Body to a Body). Is this the best way to get this functionality right now?
Edit: What about using bytes from the Body's Blob? I'm not experienced with this but would it work faster? Though then the issue becomes replacing text without generating many Documents before converting to Blobs?
*Note: I know Code Review exists, but they don't seem to have many users who understand Google Apps Script well enough to offer improvement. There is a much bigger community here! Please excuse it this time.
Here Is My Code (Updated Feb 23, 2018 @ 3:00PM PST)
Essentially it takes each child element of the Body, replaces some fields, then detects its type and appends it using the appropriate append function.
/* Document Generation Statistics:
* 50 elements replaced:
* [12.482 seconds total runtime]
* [13.272 seconds total runtime]
* [12.069 seconds total runtime]
* [12.719 seconds total runtime]
* [11.951 seconds total runtime]
*
* 100 elements replaced:
* [22.265 seconds total runtime]
* [21.111 seconds total runtime]
*/
var TEMPLATE_ID = "Document_ID";
function createCerts(){
  createOneDocumentFromTemplate(
      [
        ['John', 'Doe'], ['Jane', 'Doe'], ['Jack', 'Turner'], ['Jordan', 'Bell'], ['Lacy', 'Kim']
      ],
      ["<<First>>", "<<Last>>"]);
}

function createOneDocumentFromTemplate(arrayOfData, arrayOfMarkers) {
  var file = DriveApp.getFileById(TEMPLATE_ID).makeCopy("Certificates");
  var doc = DocumentApp.openById(file.getId());
  var body = doc.getBody();
  var fixed = body.copy();
  body.clear();
  var copy;
  for (var j = 0; j < arrayOfData.length; j++) {
    var item = arrayOfData[j];
    copy = fixed.copy();
    for (var i = 1; i < copy.getNumChildren() - 1; i++) {
      for (var k = 0; k < arrayOfMarkers.length; k++) {
        copy.replaceText(arrayOfMarkers[k], item[k]);
      }
      switch (copy.getChild(i).getType()) {
        case DocumentApp.ElementType.PARAGRAPH:
          body.appendParagraph(copy.getChild(i).asParagraph().copy());
          break;
        case DocumentApp.ElementType.LIST_ITEM:
          body.appendListItem(copy.getChild(i).asListItem().copy());
          break;
        case DocumentApp.ElementType.TABLE:
          body.appendTable(copy.getChild(i).asTable().copy());
          break;
      }
    }
  }
  doc.saveAndClose();
  return doc;
}
Gist
This is more of a Code Review question, but no: as written, I don't see any way to make it more efficient. I run a similar script for creating documents at work, though mine creates separate PDF files to share with the user rather than creating something we would print. It may save you time and effort to look into an add-on like docAppender (if you're coming from a form) or autoCrat.
A couple of suggestions:
I'm more of a for loop person because it's easier to log errors on particular rows with the indexing variable. It's also more efficient if you're pulling from a spreadsheet where some rows could be skipped (already merged, let's say). Using forEach gives more readable code and is good if you always want to go over the entire array, but is less flexible with conditionals. Using a for loop will also allow you to set a row as merged with a boolean variable in the last column.
The other thing I can suggest would be to use some kind of time-based test to stop execution before you time the script out, especially if you're merging hundreds of rows of data.
/**
 * Limit script execution to 4.5 minutes to avoid execution timeouts.
 * @param {Date} starttime - Date object captured before the loop
 * @return {Boolean}
 */
function isTimeUp_(starttime) {
  var now = new Date();
  return now.getTime() - starttime.getTime() > 270000; // 4.5 minutes
}
Then, inside your function:
var starttime = new Date();
replace.forEach(function(row, i) {
  // include this check somewhere before you begin writing data
  if (isTimeUp_(starttime)) {
    Logger.log("Time up, finished on row " + i);
    return; // note: you can't break out of forEach; use a for loop if you need to stop early
  }
  // ... rest of your forEach()
});

Clock Trigger Builder Not calling function when scheduled - Google sheets app Script

I am using the Apps Script provided by Google to access their Prediction API through Sheets. I am trying to predict thousands of rows at once; however, after 6 minutes the maximum execution time is reached and the code stops.
I implemented a solution that I found using the clock trigger builder. Once I run the function it goes for 5 minutes, then it stops and sets a trigger to re-call the function within 2 minutes.
The major problem is that the function is not called when scheduled. I see it in the current triggers list, but it never gets called again. Can you please explain why this is occurring?
My intention is to predict as many lines as possible in 5 minutes, then stop, set a trigger to call the predict function again within a few minutes, start where it left off, and continue until every element has been predicted.
I also need to know how I would store the values in cache so that the function has all the information it needs when it is called again.
//This is the function that is used to predict a selection of data
function predict() {
  try {
    clearOutput();
    var startTime = (new Date()).getTime();
    var sheet = SpreadsheetApp.getActiveSheet();
    var selection = sheet.getActiveSelection();
    var instances = selection.getValues();
    var project_number = getProjectNumber();
    var model_name = getModelName();
    var startRow = stRow;
    var MAX_RUNNING_TIME = 300000;
    var REASONABLE_TIME_TO_WAIT = 60000;
    for (var i = startRow; i < instances.length; ++i) {
      var currTime = (new Date()).getTime();
      if (currTime - startTime >= MAX_RUNNING_TIME) {
        var builder = ScriptApp.newTrigger('predict').timeBased().after(REASONABLE_TIME_TO_WAIT);
        builder.create();
        break;
      } else {
        var result = predictSingleRow(project_number, model_name, instances[i]);
        selection.getCell(i + 1, 1).setValue(result);
      }
    }
  } catch(e) {
    Browser.msgBox('ERROR:' + e, Browser.Buttons.OK);
  }
}
A few things as to why your code is not functioning as intended:
1) Since you mentioned, "I see it in the current triggers list, but it never gets called again," and looking at your code, I am unsure whether you intended to call the function again after its execution has completed. If you do, note that your for loop only runs until the length of instances is reached; nothing in the script suggests that the function needs to run again once it has finished iterating through instances. Refer to this link to see how to Manage Triggers Programmatically.
2) var builder = ScriptApp.newTrigger('predict').timeBased().after(REASONABLE_TIME_TO_WAIT);
This line of your code falls under an if condition that delays execution for 1 minute (the value is 60000), thereby adding 1 minute to the time since execution started. Nowhere are you resetting the startTime counter after the waiting time, so once the value of currTime - startTime has exceeded MAX_RUNNING_TIME, the function will keep entering the if block on every iteration of the for loop after that. Simply put, if startTime was 9:35 and currTime was 9:40, then after waiting for 1 minute currTime is 9:41, which is still more than MAX_RUNNING_TIME (5 minutes) past startTime, because startTime still remains 9:35. Resetting it to 9:41 at this point should resolve your problem.
3) Losing the break in the if block would probably help fix that as well.
EDIT:
Add a function as shown in the link I mentioned above:
function callTrigger(){
  ScriptApp.newTrigger('predict')
      .timeBased()
      .everyMinutes(30)
      .create();
}
Run the function callTrigger once from your editor and you should be good to go. Remember, for everyMinutes you can only pass the values 1, 5, 15, or 30.
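On the asker's other question, storing progress between trigger runs: a common pattern (my own sketch; the helper name and property key are hypothetical) is to persist the last processed row with PropertiesService and compute the resume point from it on the next run. The helper below is pure JavaScript so it can be tested anywhere; the PropertiesService wiring only exists in Apps Script and is shown in comments.

```javascript
// Decide which row to resume from, given the stored value from the last run.
// storedValue is a string (PropertiesService stores strings) or null.
function nextStartRow(storedValue, totalRows) {
  var last = parseInt(storedValue, 10);
  if (isNaN(last) || last < 0) return 0; // nothing stored yet: start from the top
  if (last + 1 >= totalRows) return 0;   // finished on the last run: start over
  return last + 1;                       // resume after the stored row
}

// In Apps Script (sketch):
// var props = PropertiesService.getScriptProperties();
// var start = nextStartRow(props.getProperty('lastRow'), instances.length);
// ... run the loop from `start`, and when stopping early:
// props.setProperty('lastRow', String(i));
```

Script properties survive between executions (unlike local variables), which is exactly what a re-triggered function needs; CacheService would also work but entries there can expire.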

How to speed up searching DriveApp files using GAS

I noticed that just looping through the files stored on Google Drive takes a LOT of time.
var startTime = Date.now();
var count = 0;
var files = DriveApp.getFiles();
while (files.hasNext()) {
  count++;
  var file = files.next();
}
var endTime = Date.now();
Logger.log('Loop through ' + count + ' files takes ' + ((endTime - startTime) / 1000) + ' seconds');
It takes about 1 second to loop through 100 files.
Storing file info in a cache and looping through it after retrieval makes it possible to handle about 20,000 files a second (on my system):
var startTime = Date.now();
var fileIds = getFileIds(); // retrieve info from cache (stored before)
var count = 0;
var numFiles = fileIds.length;
for (var i = 0; i < numFiles; i++) {
  count++;
  var file = fileIds[i];
}
var endTime = Date.now();
Logger.log('Loop through ' + count + ' files takes ' + ((endTime - startTime) / 1000) + ' seconds');
The results above are nothing special, but they make you wonder whether it is possible to speed up certain actions once you have stored file info in a cache.
In my case I notice that specifying several search criteria and performing a search
var files = DriveApp.searchFiles(criteria);
might take a lot of time (over 20 seconds) processing the results.
So I wonder if there is a way to speed up searching for files.
Does anybody have ideas how to speed up and/or avoid looping through all files the way described in the first test?
It is not possible to speed it up. The comparison you make is not very relevant, because the second time you are not making any Drive API calls.
That second measurement is just timing a tight loop with no API calls.
The first time, all the time is consumed calling next(), which does a round trip to the Drive API.
If your data doesn't change often, you can use the cache to avoid making the same searches again. Make sure to deal with stale results and such.
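A minimal sketch of that caching idea (my own illustration; the cache key and expiration are assumptions, and CacheService specifics should be checked against its docs): serialize the collected search results to a JSON string for the cache, and parse them back on the next run. The serialize/parse round trip below is pure JavaScript and testable anywhere.

```javascript
// files: an array of {id, name} objects collected from the Drive iterator.
function serializeFiles(files) {
  return JSON.stringify(files);
}

// cached: the string returned by the cache, or null on a cache miss.
function deserializeFiles(cached) {
  return cached ? JSON.parse(cached) : null;
}

// In Apps Script (sketch):
// var cache = CacheService.getScriptCache();
// var results = deserializeFiles(cache.get('mySearch'));
// if (!results) {
//   results = collectSearchResults(criteria);              // slow Drive calls
//   cache.put('mySearch', serializeFiles(results), 21600); // keep up to 6 h
// }
```

Note that cache entries have size and lifetime limits, so this only helps for modest result sets and, as the answer says, you still have to tolerate stale results.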

Avoiding Exceeded maximum execution time while trying to copy files

Hey everyone, this is my first time posting here; usually I find answers here, but this time I had to ask.
I'm trying to create 12 folders with month names and create 31 files in each one of them; the files are a copy of an existing spreadsheet file.
I've succeeded in writing a script that works, but after creating 3-4 months it stops and throws "Exceeded maximum execution time". I did some reading and understand that there is a time limit of something like 5 minutes, and as you can see in the code below, my way of doing things isn't the most efficient. The only idea I have is to save the original file's data inside a blob and then read from that blob while creating the new files, that way avoiding a large number of calls and making things faster. But when I tried createFile(blob) I got a PDF as output, which isn't my intention.
function create_months(month_name)
{
  var testingfolder = DocsList.getFolder("testing");
  var targetFolder = testingfolder.createFolder(month_name);
  var mainDoc = DocsList.getFileById('original file id');
  for (var i = 1; i < 32; i++)
  {
    mainDoc.makeCopy(i).addToFolder(targetFolder);
    var root = DocsList.getRootFolder();
    var file = root.find(i);
    file[0].removeFromFolder(root);
  }
}
// array of months in Hebrew
year = ['ינואר','פברואר','מרץ','אפריל','מאי','יוני','יולי','אוגוסט','ספטמבר','אוקטובר','נובמבר','דצמבר'];
function create(){
  for (var i = 0; i <= 12; i++)
  {
    create_months(year[i]);
  }
}
Thanks in advance :)
Your code seems fine, aside from the off-by-one error in the loop in your create function (it should be i < 12, not <=). If your script ever got to the end, you would have an undefined month as your month_name in create_months.
Since you said 3-4 months work, I recommend you do only 3 months at a time if you're running this manually. The easiest solution is just to wait, then run again, then again, etc., e.g.:
function create1stQuarter() {
for( var i = 0; i < 3; ++i )
create_months(year[i]);
}
function create2ndQuarter() {
for( var i = 3; i < 6; ++i )
create_months(year[i]);
}
function create3rdQuarter() {
for( var i = 6; i < 9; ++i )
create_months(year[i]);
}
function create4thQuarter() {
for( var i = 9; i < 12; ++i )
create_months(year[i]);
}
Now, if you're not running this from the script editor yourself, or you find it difficult for your users to click 4 items in a spreadsheet menu or elsewhere, you can enhance this by setting up a trigger automatically and removing it automatically when you finish. You'd also have to keep track inside the script of which quarter you have to create next. A good place for something simple like this is ScriptProperties.
If you're using the trigger solution, I also recommend that you do not set the time interval lower than 10 minutes as you'll risk your functions running concurrently. If you do so, it's advised that you use the LockService to guarantee that does not happen.
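The ScriptProperties bookkeeping mentioned above can be sketched like this (helper name and property key are my own; the PropertiesService and trigger wiring only runs inside Apps Script). The pure part, which decides which quarter to run next, is testable anywhere.

```javascript
// Compute the next quarter index (0..3) from the stored value.
// storedQuarter is a string (ScriptProperties stores strings) or null.
// A return value of 4 or more means all quarters are done.
function nextQuarter(storedQuarter) {
  var q = parseInt(storedQuarter, 10);
  if (isNaN(q)) return 0; // nothing stored yet: start with the first quarter
  return q + 1;
}

// In Apps Script (sketch):
// var props = PropertiesService.getScriptProperties();
// var q = nextQuarter(props.getProperty('quarter'));
// if (q > 3) { /* all done: delete this trigger and stop */ return; }
// for (var i = q * 3; i < q * 3 + 3; i++) create_months(year[i]);
// props.setProperty('quarter', String(q));
```

Paired with a 10-minute-plus trigger interval and the LockService advice above, this lets the four quarter runs happen unattended without overlapping.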
Do what you can outside of the loop and get rid of the find since you already have the file.
var root = DocsList.getRootFolder();
for (var i = 1; i < 32; i++)
{
  var daily = mainDoc.makeCopy(i);
  daily.addToFolder(targetFolder);
  daily.removeFromFolder(root);
}
That should speed things up.