Get multiple Threads by ThreadID in Google Apps Script's GmailApp Class - google-apps-script

I have an array of strings where each value represents a Gmail thread ID. It looks something like this:
var threadArray = [threadId1, threadId2, threadId3, threadId4, threadId5];
I want to apply a Gmail label to each of the elements in my array of thread IDs, but my current approach is not efficient and takes too long to execute. See below:
for (var i = 0; i < threadArray.length; i++) {
  var thread = GmailApp.getThreadById(threadArray[i]); // this call takes too long (several seconds) to execute!!
  thread.addLabel(exampleLabel);
}
What's the best approach for applying a Gmail Label to each Gmail Thread in an array of Gmail ThreadIds?
Gmail provides functions like getInboxThreads(), which retrieve many threads at once and return a GmailThread[], but it's unclear to me how I can construct my own GmailThread[] purely from thread IDs rather than from my entire inbox.
Has anyone else experienced this issue or found a workaround? I found a similar open issue here: https://code.google.com/p/google-apps-script-issues/issues/detail?id=2598
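One possible workaround (a sketch, with assumptions: the Gmail advanced service must be enabled for the project, and `labelId` here is the Gmail API label ID such as "Label_123", not the label name) is to add the label by thread ID directly through the advanced service, without fetching each thread via getThreadById:

```javascript
// Sketch: apply a label to many threads by ID via the Gmail advanced
// service (enabled under Advanced Google services in the editor).
// Assumes labelId is a Gmail API label ID, not a label name.
function applyLabelToThreads(threadIds, labelId) {
  threadIds.forEach(function (threadId) {
    Gmail.Users.Threads.modify(
      { addLabelIds: [labelId] },  // labels to add to this thread
      'me',                        // the authenticated user
      threadId
    );
  });
}
```

Each modify call still counts against quota, but it avoids materializing a GmailThread object per ID.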

Taking a long time for such a task is not really an issue; you can easily get it working on (relatively) small batches driven by a timer trigger.
I have a similar script that runs every night and it works nicely: it runs for 4 minutes, waits a bit, and continues until all the threads have been processed.
You only need to store the index to resume from (in ScriptProperties) and measure elapsed time during script execution.
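The bookkeeping described above (stored index plus an elapsed-time check) can be sketched as a plain function; the names here are hypothetical, and in Apps Script the `store` argument would be `PropertiesService.getScriptProperties()` with the function wired to a time-based trigger:

```javascript
// Resumable batch processing: work from a stored index until the time
// budget is spent, then persist the index for the next trigger run.
// `store` mimics ScriptProperties (getProperty/setProperty); `work`
// is called once per item. Returns true once every item is done.
function runBatch(items, store, budgetMs, work) {
  var start = Date.now();
  var i = Number(store.getProperty('nextIndex') || 0);
  while (i < items.length && Date.now() - start < budgetMs) {
    work(items[i]);
    i++;
  }
  store.setProperty('nextIndex', String(i));
  return i >= items.length;
}
```

Each trigger invocation picks up where the previous run stopped, so no single execution has to fit the whole job inside the runtime limit.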

Related

Why is it every time we deploy our custom function users experience #ERROR! for some time?

Each time we deploy our Google Apps Script project, our custom function starts returning #ERROR!. This happens regardless of whether a code change was made.
See photo below:
NOTE: Internal error executing the custom function. is not one of our error strings.
This is very strange because the function does not seem to be executing. I say this because #ERROR! is returned immediately, with 0 processing time. See failures in photo below:
This issue resolves itself after some seemingly arbitrary amount of time, after which the custom function runs normally again.
This has become a very large problem because we have uncontrollable downtime after each deployment, and it does not seem to be an issue with our code considering this happens every time we deploy the code, regardless of whether the code actually changed.
This Google document states: "A custom function call must return within 30 seconds. If it does not, the cell will display an error: Internal error executing the custom function." Our custom function does not take 30 seconds to run; we can't even find an instance where it runs longer than 5 seconds.
NOTE: the only thing that fails is our custom function, our task pane that interacts with the Google Sheet remains functional.
According to the Optimization section of Custom Functions:
Each time a custom function is used in a spreadsheet, Google Sheets makes a separate call to the Apps Script server.
Having multiple custom functions means you are making multiple calls to the server simultaneously. This can lead to slow processing or, sometimes, an error.
Solution:
The workaround is to reduce the use of custom functions. A custom function can accept a range as a parameter, which arrives in Apps Script as a two-dimensional array. Use that array to calculate the values and return a two-dimensional array that overflows into the appropriate cells in your spreadsheet.
Example:
Code:
function multiplicationTable(row, col) {
  var table = [];
  var rowArr = row.flat();
  var colArr = col.flat();
  for (var i = 0; i < rowArr.length; i++) {
    table.push([]);
    for (var j = 0; j < colArr.length; j++) { // note: colArr, not col
      table[i].push(rowArr[i] * colArr[j]);
    }
  }
  return table;
}
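A quick way to sanity-check that function outside Sheets: ranges arrive in Apps Script as two-dimensional arrays, so plain nested arrays stand in for them.

```javascript
// Same function as above, exercised with the 2-D arrays Sheets would
// pass when the formula references a column range and a row range.
function multiplicationTable(row, col) {
  var table = [];
  var rowArr = row.flat();
  var colArr = col.flat();
  for (var i = 0; i < rowArr.length; i++) {
    table.push([]);
    for (var j = 0; j < colArr.length; j++) {
      table[i].push(rowArr[i] * colArr[j]);
    }
  }
  return table;
}

// A 1-column range of rows times a 1-row range of columns:
var result = multiplicationTable([[1], [2]], [[10, 20]]);
// result is [[10, 20], [20, 40]]
```

In the sheet this would be called once, e.g. =multiplicationTable(A2:A3, B1:C1), and the returned array overflows into the neighboring cells.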

How to speed up copying/creating a file many times in Google Apps Script?

So, I'm setting up student surveys in a college. There are lots of forms to be handed out to lots of groups of students. For one particular survey, I have a template google form.
What I do is loop: on every iteration, create a copy of this template and then modify it a little. It takes A LOT of time: copying and modifying 220 forms takes 40-50 minutes. I found my way around the time limit on Google scripts, but it's still too long. Do you see any way to speed this up a little?
Now it looks schematically like this:
for (var i = 0; i < someRange.length; i++) {
  template.makeCopy("template", formsFolder);
  var formFile = formsFolder.getFilesByName("template").next();
  var form = FormApp.openById(formFile.getId());
  // ... do some modifications
}
Thank you!
File#makeCopy already hands you the exact file you want, so you can completely cut out the need to search for the file you create:
for (var f = 0; f < newNames.length; ++f) {
  var formFile = template.makeCopy(newNames[f], formsFolder);
  var form = FormApp.openById(formFile.getId());
  // Do stuff
}
I am surprised that your script actually finishes and does not time out, as the maximum script run time is around 30 minutes or so depending on your edition. I had a very similar obstacle where I needed to run a script over 200-300 different sheets and copy data. There were about 3-5 API calls per loop, plus some JavaScript to filter the dataset before writing to a sheet. It would regularly fail.
I found that my best approach was to leverage triggers: rather than processing 300 sheets at once, process 100 sheets every 5 minutes or so. In my circumstances there was no reason the sheets had to be processed at exactly the same time, as long as my filtering parameters were accurate. I controlled all the parameters through inputs on a sheet.
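Installing the timer is a one-off step. A sketch (untested; `processNextSlice` is a hypothetical function name standing in for whatever function handles one slice of the work) using Apps Script's time-based triggers:

```javascript
// Sketch: create a time-driven trigger that re-invokes the processing
// function every 5 minutes, so each run handles one slice of the job.
// 'processNextSlice' is a hypothetical handler name.
function installTrigger() {
  ScriptApp.newTrigger('processNextSlice')
      .timeBased()
      .everyMinutes(5)
      .create();
}
```

The handler would then read its resume point (for example from ScriptProperties), process its slice, and delete the trigger when everything is done.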

How to get gmail messages received after a particular date using Google Apps Script?

Is it possible to retrieve all Gmail messages received after a particular date directly without going through all messages?
Currently, I am using the GmailApp.getInboxThreads(0, 50) function to retrieve the first 50 threads and then looping through all of them to find the messages satisfying the condition. But what if more than 50 threads satisfy the condition? So fetching emails with GmailApp.getInboxThreads(start, max) isn't a perfect solution.
The getInboxThreads() function seems good, but it can fail when the combined size of all threads is too large for the system to handle.
Also, it should fetch all emails except ones from the spam folder.
Here is the code I use.
var gmailThreads = GmailApp.getInboxThreads(0, 50);
for (var i = 0; i < gmailThreads.length; i++) {
  var messages = gmailThreads[i].getMessages();
  for (var j = 0; j < messages.length && messages[j].getDate().valueOf() > requiredDate.valueOf(); j++) {
    // Loop content
  }
}
Use search, specifying a start date:
https://developers.google.com/apps-script/reference/gmail/gmail-app#search(String)
Though undocumented, you can also search by date and time with a resolution of one second, because date operators like "before" and "after" accept Unix timestamps.
One awkward aspect of that Apps Script API is that it returns threads, not messages, so you have to loop through potentially long threads woken up by a new message; that makes it less robust, since dealing with those old messages can be time-consuming. The Gmail advanced API, available through Apps Script advanced services, has a message-level search API that won't have that issue.
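Building the query string itself is plain string work. A sketch of the Unix-timestamp form mentioned above (the timestamp variant of `after:` is the part the answer notes is undocumented):

```javascript
// Build a Gmail search query matching messages after a given Date,
// using the (undocumented) Unix-timestamp form of the "after:"
// operator for one-second resolution.
function afterQuery(date) {
  return 'after:' + Math.floor(date.getTime() / 1000);
}

// In Apps Script it would be used as:
// var threads = GmailApp.search(afterQuery(new Date(2020, 0, 1)));
```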

How to avoid 'exec maxSimultaneous' limit in Google Spreadsheet?

I'm using Google Spreadsheet for my tax administration. All financial transactions that I have done in a year are in the spreadsheet. Because the tax rules in my country are quite complex, I've developed a number of JavaScript functions to help me with my calculations. There are many rows in my spreadsheet, about 1000. Each row has multiple references to those JavaScript functions.
This system worked beautifully before, but today I found that Google has installed some kind of runtime-limiting system into Google Spreadsheet, causing many of my columns to abort with the following error:
Service invoked too many times in a short time: exec maxSimultaneous.
Try Utilities.sleep(1000) between calls.
After some investigation, it appears this limit is there to protect against scripts that take too long to run. My case is different: my scripts are all short, O(1) functions that do nothing but calculate some numbers. A typical script looks like this:
// Calculates the total amount excluding Value Added Tax, given
// an amount including Value Added Tax, and other sorts of information.
function ex_btw(inc_btw, commentaar, soort, btw_verlegd, btw_pct) {
  Utilities.sleep(1000); // attempted workaround; see below
  soort = soort.toLowerCase();
  if (soort == 'eten en drinken'
      || commentaar.match(/treinkaartje/i)
      || commentaar.match(/treinticket/i)
      || commentaar.match(/taxi/i)
      || commentaar.match(/ boek /i)) {
    return inc_btw / 1.06;
  } else if (soort == 'priveonttrekking'
      || soort == 'boete'
      || soort == 'belasting'
      || commentaar.match(/postzegel/i)
      || btw_verlegd == 'Ja') {
    return inc_btw;
  } else {
    return inc_btw / (1 + btw_pct);
  }
}
The script is then invoked like this from a cell:
=IF(B6<>""; ex_btw(B6;D6;E6;J6;S6); "")
Maybe my problem is that I have too many script calls: every row calls about 6 such scripts, so with 1000 rows I make 6000 calls per spreadsheet.
How do I solve this problem? Is there a way to increase the execution limit, or a way to make the scripts run more slowly so that they don't hit it? As you can see in the example code, I've already tried inserting Utilities.sleep(1000), but that doesn't seem to solve the problem. I don't care whether the scripts run slowly; I just care that they finish without errors.
Can I pay to have the limit increased? I need to hand in my taxes in a few days.
Other alternatives that I've considered, but that are not feasible.
Using non-JavaScript functions. Not feasible because they don't support collaboration the way Google Spreadsheet does. I regularly go over the spreadsheet with a colleague to check whether we've made any mistakes, and it helps that both of us can immediately see any changes the other makes.
Having one huge JavaScript function that iterates over rows and populates cells. Not feasible because:
It's too error-prone; it's very easy to make mistakes compared to my current method.
Does not update cells automatically until I re-run the script. I want to see any calculations immediately after I update other cells, just like a spreadsheet is supposed to do.
Using other spreadsheets like Excel and OpenOffice Calc. Not feasible because: they don't appear to offer the same scripting capabilities.
Writing my own financing app. Not feasible because it takes too much time, it's not my core business, and tax rules change almost every year, so I would have to constantly update the app. I can update a spreadsheet very quickly, but writing a financing app takes too much time.
I solved it by making every function sleep for a random period, like this:
Utilities.sleep(Math.random() * 5000);
It is important that the sleep time is random, not constant; apparently Google limits the maximum number of functions that may be using the CPU simultaneously.
An alternative to the custom function might be to have an onEdit trigger fire and then process either just the entered data or the whole column of numbers, placing the results in the target cell(s) as plain numbers.
That might be quicker.
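The number of server calls can also be cut by making the function range-aware, so one formula computes the whole column and overflows into the cells below. This is a sketch, not the asker's code: the function names are hypothetical, and the tax logic simply mirrors the branches of ex_btw above.

```javascript
// Range-aware wrapper: each argument arrives as a 2-D array (one
// column per range); returns a 2-D array that overflows down the
// result column. Called from a single cell, e.g.
//   =exBtwRange(B6:B1000; D6:D1000; E6:E1000; J6:J1000; S6:S1000)
function exBtwRange(incBtw, commentaar, soort, btwVerlegd, btwPct) {
  var out = [];
  for (var i = 0; i < incBtw.length; i++) {
    out.push([exBtwOne(incBtw[i][0], commentaar[i][0], soort[i][0],
                       btwVerlegd[i][0], btwPct[i][0])]);
  }
  return out;
}

// Per-row calculation, mirroring the branches of ex_btw above.
function exBtwOne(inc_btw, commentaar, soort, btw_verlegd, btw_pct) {
  soort = String(soort).toLowerCase();
  if (soort == 'eten en drinken'
      || /treinkaartje|treinticket|taxi| boek /i.test(commentaar)) {
    return inc_btw / 1.06;
  } else if (soort == 'priveonttrekking' || soort == 'boete'
      || soort == 'belasting' || /postzegel/i.test(commentaar)
      || btw_verlegd == 'Ja') {
    return inc_btw;
  } else {
    return inc_btw / (1 + btw_pct);
  }
}
```

With this shape, 1000 rows cost one Apps Script server call instead of 6000, at the price of restructuring the formulas.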

Exchange Webservices Managed API - using ItemView.Offset for FindItems() hits performance of LoadPropertiesForItems method

I'm doing a little research on a possible application of EWS in our existing project, which is written with heavy use of MAPI, and I found something disturbing about the performance of the LoadPropertiesForItems() method.
Consider such scenario:
we have 10000 (ten thousands) messages in Inbox folder
we want to get approximately 30 properties of every message to see if they satisfy our conditions for further processing
messages are retrieved from server in packs of 100 messages
So, code looks like this:
ItemView itemsView = new ItemView(100);
PropertySet properties = new PropertySet();
properties.Add(EmailMessageSchema.From);
/*
 add all necessary properties...
*/
properties.Add(EmailMessageSchema.Sensitivity);

FindItemsResults<Item> findResults;
List<EmailMessage> list = new List<EmailMessage>();
do
{
    findResults = folder.FindItems(itemsView);
    _service.LoadPropertiesForItems(findResults, properties);
    foreach (Item it in findResults)
    {
        // ... do something with every item
    }
    if (findResults.NextPageOffset.HasValue)
    {
        itemsView.Offset = findResults.NextPageOffset.Value;
    }
} while (findResults.MoreAvailable);
The problem is that every increase of the itemsView.Offset property makes LoadPropertiesForItems take longer to execute. For the first couple of iterations it is not very noticeable, but by around the 30th pass through the loop, the call time increases from under 1 second to 8 or more seconds. And memory allocation hits physical limits, causing an out-of-memory exception.
I'm pretty sure my problems are offset-related, because I changed the code a little, to this:
itemsView = new ItemView(100, offset, OffsetBasePoint.Beginning);
// ...rest of loop
if (findResults.NextPageOffset.HasValue)
{
    offset = findResults.NextPageOffset.Value;
}
I manipulated the offset variable (declared outside the loop) so that it started at 4500, and then, in debug mode after the first iteration, I changed its value to 100. As suspected, the first call to LoadPropertiesForItems took very long to execute, and the second call (with offset = 100) was very quick.
Can anyone confirm that and maybe propose some solution for that?
Of course I can do my work without using an offset, but why should I? :)
Changing the offset is expensive because the server has to iterate through the items from the beginning -- it isn't possible to have an ordinal index for messages because new messages can be inserted into the view in any order (think of a view over name or subject).
Paging through all the items once is the best approach.