Gmail add-on - "Missing access token" when using getMessages() on multiple threads (but fine for the current message's thread)

I have written a Gmail add-on which, when the user opens an email, finds the label of the thread (the thread only has one label). If there are multiple threads with the same label, it fetches the messages in those threads. Otherwise it fetches the messages from the thread with the opened email.
When there is only one thread, it works fine. When there are multiple threads, I get an access token error:
Access denied : Missing access token for authorization. Request: MailboxService.GetThread. [line: xx (the line highlighted in the code below) etc
The add-on uses the access token correctly as far as I can tell. The numbers of emails and threads are very small (three or four threads with a few messages in each).
Below is a simplified version of my code. Does anyone know why I should get the access token error when I try to access messages of other threads?
I have tried using other methods for building the array of messages (search, filter/function etc), but the result is the same: no problem with the single thread containing the trigger email, but access denied when accessing multiple threads.
function getAllMessagesWithTheSameLabel(message) {
  var threads = [];
  var messages = [];
  var thread = message.getThread();
  var label = thread.getLabels()[0];
  if (label.getThreads().length > 1) {
    threads = label.getThreads();
    messages = GmailApp.getMessagesForThreads(threads); // <-- this is the line that throws "Missing access token"
  } else {
    messages = GmailApp.getMessagesForThread(thread);
  }
  return messages;
}
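For context, a minimal sketch of how the token is wired up in the contextual trigger before the function above is called (the trigger function name and the card it returns are illustrative):
function onGmailMessageOpen(e) {
  // Grant GmailApp short-lived access to the message the user has open.
  GmailApp.setCurrentMessageAccessToken(e.gmail.accessToken);
  var message = GmailApp.getMessageById(e.gmail.messageId);
  var messages = getAllMessagesWithTheSameLabel(message); // fails only in the multi-thread branch
  return [CardService.newCardBuilder()
      .setHeader(CardService.newCardHeader().setTitle('Messages with the same label'))
      .build()];
}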

Related

Gmail API in GAS: Refreshing Interface on State Change?

I'm currently using Google Apps Script to change labels on threads and messages. When the script completes, the payloads are moved, but the display in the user's Gmail frontend does not change until the user interacts with Gmail.
Is there a way to push a refresh command to Gmail? How can I gracefully display "job's done" to the user so they know that messages are now appropriately labeled?
I am working directly against the Gmail API, not GmailApp. I discovered ActionResponseBuilder.setStateChanged(), but the problem is I am currently not working with any sort of frontend interface. This is all in the background.
Here is an abbreviated example of some of the code I'm using to grab messages to modify (as requested):
function changeLabel(ids, oldLabel, newLabel) {
  // Uses the Advanced Gmail Service; add the new label and remove the old one.
  Gmail.Users.Messages.batchModify({
    "ids": ids,
    "addLabelIds": [newLabel],
    "removeLabelIds": [oldLabel]
  }, "me");
}
function start() {
  // Labels to work with
  const FOO = "Label_5095729546757874255";
  const BAR = "Label_5102306845672214551";
  // API call to retrieve list of messages by Label
  var msgIdsByLabel = new MessageIndex(FOO);
  // API call to retrieve message contents
  var payloadMessages = new Messages(msgIdsByLabel);
  var manifestMessagesToMove = [];
  for (var i = 0; i < Object.keys(payloadMessages.response).length; i++) {
    // Criteria for selecting messages to move goes here
    manifestMessagesToMove[i] = payloadMessages.response[i].id;
  }
  // Change labels of Message Ids
  changeLabel(manifestMessagesToMove, FOO, BAR);
  // ??? Refresh Gmail Interface ???
}
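(MessageIndex and Messages are helper classes not shown here; a hypothetical stand-in for the label lookup, using the Advanced Gmail Service, might look like the following.)
// Illustrative stand-in for MessageIndex: list the message IDs carrying a label.
// Assumes the Advanced Gmail Service is enabled for the project.
function listMessageIdsByLabel(labelId) {
  var ids = [];
  var pageToken;
  do {
    var resp = Gmail.Users.Messages.list("me", {labelIds: [labelId], pageToken: pageToken});
    (resp.messages || []).forEach(function(m) { ids.push(m.id); });
    pageToken = resp.nextPageToken;
  } while (pageToken);
  return ids;
}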
Unfortunately this isn't possible.
The Gmail UI can't be refreshed from Apps Script: the script runs in the cloud, in a session separate from the one the user is viewing in the browser. The two aren't connected, and the same goes for the Gmail API.
If you don't have a front-end interface (i.e. a Gmail add-on using CardService), then there is no way of displaying a message to the user either. The refresh will have to be done manually.
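For completeness, setStateChanged() only takes effect when an add-on action handler returns an ActionResponse from a CardService UI; a minimal sketch of that pattern (the handler name is illustrative) looks like:
// Only works from a CardService-based add-on action, not a background script.
function onMoveLabelsAction(e) {
  // ... call changeLabel(...) here ...
  return CardService.newActionResponseBuilder()
      .setStateChanged(true) // signals that this action changed data, so Gmail can clear cached state
      .setNotification(CardService.newNotification().setText("Messages relabelled"))
      .build();
}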

GAS userProperties not behaving as expected

In the guide to the Properties Service, it is stated that User Properties are for data specific to the current user of a script. I have a stand-alone web app deployed to execute as the user accessing it. I have used PropertiesService.getUserProperties() to set some values, and I expect those values to be specific to each user; however, every time a user opens the app, that user's values overwrite the ones for other users. Is this a bug or expected behavior?
Minimal Reproducible Example:
function doGet(e) {
  const userProps = PropertiesService.getUserProperties();
  let userData = JSON.parse(userProps.getProperty('userData')) || {}; // first run: nothing stored yet
  if (!('foo' in userData)) {
    userData = {foo: e.parameters.userData};
    userProps.setProperty('userData', JSON.stringify(userData));
  }
  return HtmlService.createHtmlOutput(`<p>Expected data specific to ${Session.getActiveUser().getEmail()}:</p><p>${userData.foo}</p>`);
}
Expected behavior:
When deployed as a stand-alone web app configured to execute as the user accessing it, the displayed data should be specific to that user.
Observed behavior:
When a new user runs the app, previous users see the data set by the new user, as if the new user's setting had overwritten theirs.
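For anyone reproducing this, a small diagnostic sketch (the probe key is illustrative) that logs which account the script is actually executing as, and what was previously stored in that account's user store:
function doGet(e) {
  // If every user sees the owner's email here, the deployment is executing as the owner
  // and all users are sharing one user store.
  console.log('active: ' + Session.getActiveUser().getEmail() +
      ', effective: ' + Session.getEffectiveUser().getEmail());
  const userProps = PropertiesService.getUserProperties();
  const probe = userProps.getProperty('probe'); // illustrative key
  userProps.setProperty('probe', Session.getActiveUser().getEmail());
  return ContentService.createTextOutput('previous probe value: ' + probe);
}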

Can we have documentation for when to use getScriptLock, getUserLock and getDocumentLock

The documentation is not very extensive. I have read it, but I cannot picture the use cases or when to apply the LockService in Apps Script. There are three different locks.
function handleFormSubmit(formData) { // enclosing function added for completeness; name is illustrative
  var lock = LockService.getScriptLock(); // BEGIN - start lock here
  try {
    lock.waitLock(30000); // wait up to 30 seconds for other executions of this code section to release the lock
  } catch (e) {
    console.log('Could not obtain lock after 30 seconds.');
    return HtmlService.createHtmlOutput("<b>Server busy, please try again after some time</b>");
    // If this is server-side code called asynchronously, return an error code instead
    // and display the appropriate message on the client side:
    // return "Error: Server busy, try again later... Sorry :(";
  }
  // Then my stuff here
  // finally
  lock.releaseLock();
}
I am writing form data from my published web app. The web app is accessed with a username and password, which means it is only used by specific users. The form data is being saved in a Google Sheet where a Google Form previously used to save its responses.
My problem is that when two users submit form data, sometimes one user's data is replaced by another's in the same row. To prevent that I wanted to implement getScriptLock(), getDocumentLock() or getUserLock(), but one problem or another remains. Some locks prevent another user from submitting data (the web app says the data was submitted, but nothing is actually logged in the sheet). Very frustrating. Which of the LockService locks do you think would serve my purpose? I want something that:
prevents concurrent access
ensures data inputs are executed on a first-come-first-served basis
(similar to the above) avoids data inputs messing each other up
allows setting a 'queue time'
Are you writing the responses to the sheet with something like getRange(sheet.getLastRow(), 1, 1, cols).setValues(values)?
I think appendRow(values) ensures the rows are inserted one after another.
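A rough sketch of that combination, serializing submissions with a script lock and appending rather than writing at getLastRow() (the sheet name is an assumption):
// Sketch: serialize writes with a script lock and append instead of overwriting a row.
function saveFormData(values) {
  var lock = LockService.getScriptLock();
  lock.waitLock(30000); // throws if the lock can't be obtained within 30 s
  try {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Responses'); // assumed sheet name
    sheet.appendRow(values); // each submission gets its own row at the bottom of the data
  } finally {
    lock.releaseLock(); // always release, even if the write throws
  }
}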

Different IP when making two requests through UrlFetchApp in the same script

Can we rely on the fact that when a Google Apps Script is executed by a time trigger and makes two subsequent requests using UrlFetchApp, both are made by the same server with the same IP?
I need to ensure this, because in one request I ask for an access token from a remote service and in another I use this access token. The remote service I'm querying checks whether the access token was requested by a client with the same IP as the requests that use it.
EDIT
I examined the behavior by time-triggering some simple scripts with just a few consecutive UrlFetchApp requests in them and checking the server logs. I made two clear observations:
The IP may vary between consecutive calls within one trigger run.
There is a clear rotation of IPs: sometimes there is a group of 7 consecutive calls with the same IP, sometimes 6, but in general there are always groups.
Because I wanted to use only Google infrastructure for my script and an occasional failure was not a problem, I came up with an ugly, ugly but working solution:
function batchRequest(userLogin, userPassword, webapiKey, resource, attributes, values) {
  var token = requestToken(userLogin, userPassword, webapiKey); // requestToken uses UrlFetchApp.fetch
  var result = request(resource, token, attributes, values); // request uses UrlFetchApp.fetch with options.muteHttpExceptions set to true so that we can read the response code
  var i = 0;
  while (result.getResponseCode() == 500 && i < 10) {
    token = requestToken(userLogin, userPassword, webapiKey);
    result = request(resource, token, attributes, values);
    i++;
  }
  return result;
}
So I simply retry up to 10 times and hope that the two requests (one for the token and another for the business logic) end up in the same 'IP group'.
I put a more detailed description here: https://medium.com/p/dd0746642d7
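For completeness, requestToken and request are not shown above; a hypothetical sketch of them (the endpoint URLs, payload fields and response shape are all assumptions) could look like:
// Hypothetical helpers for batchRequest(); URLs and payload fields are illustrative.
function requestToken(userLogin, userPassword, webapiKey) {
  var response = UrlFetchApp.fetch('https://example.com/api/token', { // assumed endpoint
    method: 'post',
    payload: {login: userLogin, password: userPassword, key: webapiKey}
  });
  return JSON.parse(response.getContentText()).token; // assumed response shape
}
function request(resource, token, attributes, values) {
  return UrlFetchApp.fetch('https://example.com/api/' + resource, { // assumed endpoint
    method: 'post',
    headers: {Authorization: 'Bearer ' + token}, // assumed auth scheme
    contentType: 'application/json',
    payload: JSON.stringify({attributes: attributes, values: values}),
    muteHttpExceptions: true // lets batchRequest() read the response code on errors
  });
}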
Within the same trigger call, yes. From another trigger, no. This is based on experience, but I haven't seen it documented.

script exceeding driveWriteVolume rateMax

I have a piece of code that is supposed to create a folder for each email message in a thread, and save the body (as a pdf) and all the attachments (as whatever they are) into that folder.
If I run it without the loop for saving the attachments, I have no problem. (Well, I have a different problem for a different thread). If I uncomment the attachments loop, I get
Service invoked too many times in a short time: driveWriteVolume rateMax. Try Utilities.sleep(1000) between calls. (line 156, file "Code")
All lines that create a folder or a file are followed by a Utilities.sleep(sleepTime); and sleepTime is currently set to 1000. Changing it doesn't seem to have any effect.
The offending piece of code is:
// Save attachments
for (var i = 0; i < messageAttachments.length; i++) {
  var attachmentBlob = messageAttachments[i].copyBlob();
  newFolder.createFile(attachmentBlob);
  Utilities.sleep(sleepTime); // wait after creating something on the drive.
} // each attachment
It is the newFolder.createFile(attachmentBlob); line that triggers the error.
I have looked at "What is driveWriteVolume rateMax?" and "Intermittent DriveWriteVolume rateMax exception" for help, and found none.
Note that if I comment out the attachments loop and just save the message bodies as PDFs, I have no problem, regardless of the number of emails I'm saving. When I get the error, the script has died right where it should have saved the first attachment. So I'm thinking something else is wrong, beyond exceeding some sort of limit.
You're hitting a rate limit. The Google API probably has a limit of approximately 20 writes per minute, so you'll need to slow your script down to avoid triggering it. In "What is driveWriteVolume rateMax?" the user solved the problem with a sleep of 3000 ms rather than the suggested 1000 ms.
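A rough sketch of that approach, wrapping the failing call in a retry with an increasing delay (the attempt count and starting delay are arbitrary choices):
// Sketch: retry createFile with an increasing delay instead of a fixed sleep.
function createFileWithBackoff(folder, blob) {
  var delay = 3000; // start at 3 s, as in the linked answer
  for (var attempt = 0; attempt < 5; attempt++) {
    try {
      return folder.createFile(blob);
    } catch (e) {
      Utilities.sleep(delay); // wait, then try again with a longer pause
      delay *= 2;
    }
  }
  throw new Error('createFile still failing after 5 attempts');
}
In the attachments loop above, newFolder.createFile(attachmentBlob) would then become createFileWithBackoff(newFolder, attachmentBlob).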