GMail Addon: Expired Access Token - google-apps-script

My Gmail add-on consists of several cards. For card navigation I make heavy use of setOnClickAction, e.g.
CardService.newAction().setFunctionName('openUserCard').setParameters({userJSON: JSON.stringify(user)})
The Gmail add-on reference says that both keys and values passed to the setParameters method must be strings, so it's impossible to send a complex object from one card to another directly.
Global variables are not supported either. One can use PropertiesService to store some data, but that is also restricted to strings.
I have an initial card and an export card. On the initial card there's a current-email data importer that looks like this:
function buildAddon(e) {
  var accessToken = e.messageMetadata.accessToken;
  GmailApp.setCurrentMessageAccessToken(accessToken);
  var message = GmailApp.getMessageById(e.messageMetadata.messageId);
  var attachments = message.getAttachments();
  // ... we can do anything with attachments here ...
}
The problem is that I have to use the attachments not on the initial card but on the export card, to POST them to an external API. But I cannot send the attachments array directly via setOnClickAction, because it consists of complex objects with methods.
That's why I send the initial e.messageMetadata object to the export card, and there I repeat all the operations above: setCurrentMessageAccessToken, getMessageById, getAttachments, and then for each attachment get its content via attachment.getBytes() and send it to the external API.
If a customer goes to the export card immediately, this all works. But if they browse some other cards for several minutes and then go to export, the call to GmailApp.getMessageById(messageMetadata.messageId) fails with the error Access Denied:: Expired access token.
How to avoid this?

Each action receives just one argument, the event object 'e'.
If we inspect that 'e', we find an object with a parameters property, which holds the parameters sent into the action function via the setParameters() method of the Action.
Inside 'e' there is also a property called messageMetadata with all the proper values.
var myAction = CardService.newAction().setFunctionName("xpto").setParameters({ name: "banana"} );
function xpto(e) {
var name = e.parameters.name;
}
A sample 'e' event has the following JSON inside:
{
  "formInput": {},
  "clientPlatform": "web",
  "messageMetadata": {
    "messageId": "...",
    "accessToken": "..."
  },
  "formInputs": {},
  "parameters": { "name": "Banana" }
}
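Building on that, here is a sketch (not from the original answer) of how the export card's handler could sidestep the expired token: every action event carries its own messageMetadata, so the handler can take a fresh accessToken from the 'e' it receives instead of reusing the one forwarded from the initial card. The function name and the POST step are placeholders.
function exportAttachments(e) {
  // Use the token delivered with *this* event, not one captured minutes earlier.
  GmailApp.setCurrentMessageAccessToken(e.messageMetadata.accessToken);
  var message = GmailApp.getMessageById(e.messageMetadata.messageId);
  var attachments = message.getAttachments();
  attachments.forEach(function (attachment) {
    // POST attachment.getBytes() to the external API here, e.g. with UrlFetchApp.fetch().
  });
}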
Hope this is still useful.

Related

How to pull data from Toggl API with Power Query?

This is my first time connecting to an API. I'm trying to pull data from Toggl using my API token, but I can't get the credentials working. I tried to replicate the method by Chris Webb (https://blog.crossjoin.co.uk/2014/03/26/working-with-web-services-in-power-query/) but couldn't get it working. Here's my M code:
let
Source = Web.Contents(
"https://toggl.com/reports/api/v2/details?workspace_id=xxxxx&client=xxxxxx6&billable=yes&user_agent=xxxxxxx",
[
Query=[ #"filter"="", #"orderBy"=""],
ApiKeyName="api-token"
])
in
Source
After that I enter my API token into the Web API method in the Access Web content window, but I get an error that the credentials could not be authenticated. Here's the Toggl API specification:
https://github.com/toggl/toggl_api_docs/blob/master/reports.md
The Web.Contents function receives two parameters: a URL and an options record.
Inside options you define the headers, the API key, and other query properties, such as:
let
    baseUrl = "https://toggl.com/",
    // the token part can vary depending on the requirements of the API
    accessToken = "Bearer " & "insert api token here",
    options = [
        Headers = [Authorization = accessToken, #"Content-Type" = "application/json"],
        RelativePath = "reports/api/v2/details",
        Query = [workspace_id = "xxxxx", client = "xxxxxx6", billable = "yes", user_agent = "xxxxxxx"]
    ],
    Source = Web.Contents(baseUrl, options),
    // since Web.Contents() doesn't parse the binaries it fetches, you must use another
    // function to see if the data was retrieved, based on the datatype of the data
    parsedData = Json.Document(Source)
in
    parsedData
The baseUrl is the smallest URL that works and never changes.
The RelativePath is the next part of the URL, before the first "?".
The Query record is where you define all the attributes you want to query, as a record.
This is usually the format, but check the documentation of the API you're querying to see if it is similar.

Update Classroom courseState using patch

I am trying to write an Apps Script function to archive a whole bunch of courses in Google Classroom.
function myFunction() {
var response = Classroom.Courses.list();
var optionalArgs = {'courseState': 'ARCHIVED'};
var courses = response.courses;
if (courses && courses.length > 0) {
for (i = 0; i < courses.length; i++) {
var course = courses[i];
Classroom.Courses.update(course.name, course.id, {'updateMask':'courseState'}, body=optionalArgs); // Line 10
//Logger.log('%s (%s) [%s]', course.name, course.id, course.enrollmentCode);
}
}
}
I get the following error when running the above code:
Invalid number of arguments provided. Expected 2-3 only (line 10, file "ArchiveAll")
What is the correct way of doing this with Google Apps Script and the Classroom advanced service?
Based on the code, it looks like you may have previously used the Python client libraries (specifically the body=optionalArgs portion). In JavaScript / Google Apps Script, keyword parameter assignment isn't a thing, at least not like it is in Python.
The format expected by class methods in Google's "Advanced Services" client libraries is derived from the HTTP REST API specification for the associated API. For the Classroom.Courses.update call, this is courses#update (or, per your title, courses#patch).
The REST API spec for update is 1 path parameter (the course id) and a request body with a Course resource. As with all Google APIs, you can additionally pass any of the Standard Query Parameters as an optional argument. This count (2 required, 1 optional) corresponds to the error message you received:
Invalid number of arguments provided. Expected 2-3 only
Thus, your function should be something like:
function updateCourse_(course) {
course.courseState = 'ARCHIVED';
const options = {
fields: "id,name,courseState" // data sent back in the response.
};
return Classroom.Courses.update(course, course.id, options);
}
The patch method has an additional optional argument, the updateMask query parameter. As with other optional parameters (like Standard Query Parameters), this is passed in an object as the last parameter to the class method:
function patchCourse_(courseId) {
const newMetaData = {
courseState: 'ARCHIVED',
// other options, must be valid Course fields per patch documentation:
// https://developers.google.com/classroom/reference/rest/v1/courses/patch#query-parameters
};
const options = {
updateMask: "courseState", // CSV string of things you alter in the metadata object
fields: "id,name,courseState" // data sent back in the response
};
return Classroom.Courses.patch(newMetaData, courseId, options);
}
The updateMask allows you to use some template Course resource and only apply the specified portions of it to a specified course. If you were to use update instead of patch, you would alter all fields to use the template's values:
function patchedViaTemplate_(templateCourse, courseId, fieldsToAlter) {
const options = { updateMask: fieldsToAlter };
return Classroom.Courses.patch(templateCourse, courseId, options);
}
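To archive every course, as in the original question, you can then loop over the course list and call patchCourse_ for each one. A sketch (it ignores paging via pageToken for brevity):
function archiveAllCourses() {
  const response = Classroom.Courses.list();
  const courses = response.courses || [];
  courses.forEach(function (course) {
    // Only touch courses that are not archived yet.
    if (course.courseState !== 'ARCHIVED') {
      const updated = patchCourse_(course.id);
      Logger.log('%s (%s) is now %s', updated.name, updated.id, updated.courseState);
    }
  });
}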

GSuite Admin SDK > Directory API: How do I add values to a custom schema for a user?

We are setting up some automation around SSO into AWS, but I have run into a problem.
There is a custom user attribute called AWSLab, and if a user does not have any IAMRole values populated for this attribute, then I need to add one.
The IAMRole field has Info type set to Text and No. of values set to Multi-value on the GSuite side, so I am putting it into an array for this API request.
Also, when I do a GET on the user and look at other schemas attached, I see a key named type set to work, so I include that too.
Below is my function in Google Apps Script:
function check_user_access(){
var email = 'user@domain.com';
var role = [
'arn:aws:iam::123456789012:role/User',
'arn:aws:iam::123456789012:saml-provider/GoogleAppsProvider'
].join(',')
optArgs = {
projection: "full"
}
var user = AdminDirectory.Users.get(email, optArgs)
var schema = user.customSchemas
Logger.log("typeof(schema): %s", typeof(schema))
if(schema["AWSLab"]) {
Logger.log("schema[\"AWSLab\"] found on user '%s': %s", email, schema["AWSLab"])
} else {
Logger.log("schema[\"AWSLab\"] not found! Updating...")
Logger.log("schema before:\n\n%s\n", JSON.stringify(schema))
schema["AWSLab"] = { "IAMRole": [{ "type": "work", "value": role }] }
Logger.log("schema after:\n\n%s\n", JSON.stringify(schema))
AdminDirectory.Users.update(user, email) // line 35
}
}
When this function runs, I see this error:
Invalid Input: [AWSLab] (line 35, file "Labs")
I have some extra lines in there right now, to output some details for troubleshooting, but it's not helping me see the problem.
As it turns out, the issue was with the name of the custom schema.
At creation, the schema had a different name which was then edited at some point.
The key to figuring this out was populating the schema fields in question on a user with some dummy data, then pulling the user out via the API with a GET and examining the JSON.
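If you need to confirm what a schema and its fields are actually called, a sketch like the following (using the same AdminDirectory advanced service; the email address is a placeholder) prints the names exactly as the API sees them:
function logCustomSchemas() {
  // List the customer's custom schemas with their real API names.
  var response = AdminDirectory.Schemas.list('my_customer');
  (response.schemas || []).forEach(function (s) {
    var fieldNames = s.fields.map(function (f) { return f.fieldName; }).join(', ');
    Logger.log('schema: %s, fields: %s', s.schemaName, fieldNames);
  });
  // Dump one user's populated customSchemas for comparison.
  var user = AdminDirectory.Users.get('user@domain.com', {projection: 'full'});
  Logger.log(JSON.stringify(user.customSchemas, null, 2));
}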

Angular2 HTTP Providers, get a string from JSON for Amcharts

This is a slightly messy question. Although it appears I'm asking about amCharts, I'm really just trying to figure out how to extract an array from an HTTP request, turn it into a variable, and place it into third-party JavaScript.
It all starts here, with this question, which was kindly answered by AmCharts support.
As one can see from the plunker, the chart is working. The data for the chart is hard-coded:
`var chartData = [{date: new Date(2015,2,31,0,0,0, 0),value:372.10,volume:2506100},{date: new Date(2015,3,1,0, 0, 0, 0),value:370.26,volume:2458100},{date: new Date(2015,3,2,0, 0, 0, 0),value:372.25,volume:1875300},{date: new Date(2015,3,6,0, 0, 0, 0),value:377.04,volume:3050700}];`
So we know the amCharts part works. Now the problem is changing the hard-coded data to a JSON request so it can be dynamic. I don't think this should be tremendously difficult, but for the life of me I can't seem to figure it out.
The first issue is that I can't find any documentation on .map, .subscribe, or observables.
So here is a plunker that looks very similar to the first one, except it has an HTTP provider and an injectable service. It's broken, because I can't figure out how to pull the data from the service and place it into the AmCharts function. I know how to pull data from an HTTP provider and display it in a template using NgFor, but I don't need it in the template (view). As you can see, I'm successfully transferring data from the service with the getTitle() function.
this.chart_data =_dataService.getEntries();
console.log('Does this work? '+this.chart_data);
this.title = _dataService.getTitle();
console.log('This works '+this.title);
// Transfer the http request to chartData to it can go into Amcharts
// I think this should be string?
var chartData = this.chart_data;
So the ultimate question is: why can't I use a service to get data, turn that data into a variable, and place it into a chart? I suspect a few clues might be in options.json, as the JSON might not be formatted correctly. Am I declaring the correct variables? Finally, it might have something to do with observable / map?
You have a few things going on here. First, this is a class, so keep it that way: move the functions you have inside your constructor out of it and make them methods of your class.
Second, you have this piece of code
this.chart_data =_dataService.getEntries().subscribe((data) => {
this.chart_data = data;
});
What happens inside subscribe runs asynchronously, so this.chart_data won't exist outside of it. What you're doing here is assigning the object that subscribe returns, not the HTTP response. So you can simply put your library initialization inside the subscribe and that'll work.
_dataService.getEntries().subscribe((data) => {
if (AmCharts.isReady) {
this.createStockChart(data);
} else {
AmCharts.ready(() => this.createStockChart(data));
}
});
Now, finally, there is an interesting issue. In your JSON the date properties contain a string with new Date inside; that's nothing but a string, and the library requires (from what I tested) a Date object, so you need to parse it. The problem here is that you can't parse or stringify a Date object by default. We need to convert that string to a Date object.
Look at this code snippet; I used eval (PLEASE DON'T DO THIS YOURSELF, IT'S FOR DEMONSTRATION PURPOSES ONLY!):
let chartData = [];
for (let i = 0; i < data[0].chart_data.length; i++) {
  chartData.push({
    // FOR DEMONSTRATION PURPOSES ONLY, DON'T TRY IT AT HOME
    // This will parse the string to an actual Date object
    date: eval(data[0].chart_data[i].date),
    value: data[0].chart_data[i].value,
    volume: data[0].chart_data[i].volume
  });
}
What I'm doing here is reconstructing the array so the values are in the form the chart requires.
For the latter case you'll have to construct your JSON using (new Date('YOUR DATE')).toJSON(), and you can parse it back into a Date object using new Date(yourJSON) (reference: Date.prototype.toJSON() - MDN). This is something you should resolve on your server side. Assuming you have already solved that, your code should look as follows:
// The date property in your json file should be stringified using new Date(...).toJSON()
date: new Date(data[0].chart_data[i].date),
Here's a plunker with the evil eval. Remember, you have to send the date as JSON from the server to your client, and on the client you have to parse it into a Date.
I hope this helps you a little bit.
If the getEntries method of DataService returns an observable, you need to subscribe to it to get the data:
_dataService.getEntries().subscribe(
(data) => {
this.chart_data = data;
});
Don't forget that the data is received asynchronously from an HTTP call. The http.get method returns an observable (something "similar" to a promise) that will receive the data in the future. But when the getEntries method returns, the data isn't there yet...
getTitle is a synchronous method, so you can call it the way you did.
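For reference, here is a rough sketch of what that wiring could look like, reusing the names from the question and the createStockChart method from the other answer; it assumes an Angular 2 Http service, an options.json endpoint, and that rxjs's map operator has been imported:
// In DataService: getEntries() returns an Observable of the parsed JSON.
getEntries() {
  // Nothing is fetched until someone subscribes to the returned observable.
  return this._http.get('options.json').map((res) => res.json());
}

// In the component: subscribe, then hand the parsed data to the chart.
ngOnInit() {
  this._dataService.getEntries().subscribe((data) => {
    this.chart_data = data;
    AmCharts.ready(() => this.createStockChart(data));
  });
}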

How to use update function to upload attachment in CouchDB

I would like to know what I can do to upload attachments in CouchDB using an update function.
Here is an example of my update function for adding documents:
function(doc, req){
if (!doc) {
if (!req.form._id) {
req.form._id = req.uuid;
}
req.form['|edited_by'] = req.userCtx.name
req.form['|edited_on'] = new Date();
return [req.form, JSON.stringify(req.form)];
}
else {
return [null, "Use POST to add a document."]
}
}
And an example for removing documents:
function(doc, req){
if (doc) {
for (var i in req.form) {
doc[i] = req.form[i];
}
doc['|edited_by'] = req.userCtx.name
doc['|edited_on'] = new Date();
doc._deleted = true;
return [doc, JSON.stringify(doc)];
}
else {
return [null, "Document does not exist."]
}
}
Thanks for your help.
It is possible to add attachments to a document using an update function by modifying the document's _attachments property. Here's an example of an update function which will add an attachment to an existing document:
function (doc, req) {
// skipping the create document case for simplicity
if (!doc) {
return [null, "update only"];
}
// ensure that the required form parameters are present
if (!req.form || !req.form.name || !req.form.data) {
return [null, "missing required post fields"];
}
// if there isn't an _attachments property on the doc already, create one
if (!doc._attachments) {
doc._attachments = {};
}
// create the attachment using the form data POSTed by the client
doc._attachments[req.form.name] = {
content_type: req.form.content_type || 'application/octet-stream',
data: req.form.data
};
return [doc, "saved attachment"];
}
For each attachment, you need a name, a content type, and body data encoded as base64. The example function above requires that the client sends an HTTP POST in application/x-www-form-urlencoded format with at least two parameters: name and data (a content_type parameter will be used if provided):
name=logo.png&content_type=image/png&data=iVBORw0KGgoA...
To test the update function:
Find a small image and base64 encode it:
$ base64 logo.png | sed 's/+/%2b/g' > post.txt
The sed script encodes + characters so they don't get converted to spaces.
Edit post.txt and add name=logo.png&content_type=image/png&data= to the top of the document.
Create a new document in CouchDB using Futon.
Use curl to call the update function with the post.txt file as the body, substituting in the ID of the document you just created.
curl -X POST -d @post.txt http://127.0.0.1:5984/mydb/_design/myddoc/_update/upload/193ecff8618678f96d83770cea002910
This was tested on CouchDB 1.6.1 running on OSX.
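For a browser client, the same form body can be built programmatically instead of with base64 and sed. A rough sketch (the database and design-document path reuse the placeholder names from the curl example above):
// Hypothetical browser-side helper: POSTs a File to the update handler above.
function uploadAttachment(docId, file) {
  return new Promise(function (resolve, reject) {
    var reader = new FileReader();
    reader.onerror = reject;
    reader.onload = function () {
      // readAsDataURL yields "data:<type>;base64,<data>" -- keep only the base64 part
      var base64 = reader.result.split(',')[1];
      var body = new URLSearchParams({
        name: file.name,
        content_type: file.type || 'application/octet-stream',
        data: base64  // URLSearchParams takes care of encoding '+' and '/'
      });
      resolve(fetch('/mydb/_design/myddoc/_update/upload/' + encodeURIComponent(docId), {
        method: 'POST',
        headers: {'Content-Type': 'application/x-www-form-urlencoded'},
        body: body.toString()
      }));
    };
    reader.readAsDataURL(file);
  });
}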
Update: @janl was kind enough to provide some details on why this answer can lead to performance and scaling issues. Uploading attachments via an upload handler has two main problems:
The upload handlers are written in JavaScript, so the CouchDB server may have to fork() a couchjs process to handle the upload. Even if a couchjs process is already running, the server has to stream the entire HTTP request to the external process over stdin. For large attachments, the transfer of the request can take significant time and system resources. For each concurrent request to an update function like this, CouchDB will have to fork a new couchjs process. Since the process runtime will be rather long because of what is explained next, you can easily run out of RAM, CPU or the ability to handle more concurrent requests.
After the _attachments property is populated by the upload handler and streamed back to the CouchDB server (!), the server must parse the response JSON, decode the base64-encoded attachment body, and write the binary body to disk. The standard method of adding an attachment to a document -- PUT /db/docid/attachmentname -- streams the binary request body directly to disk and does not require the two processing steps.
The function above will work, but there are non-trivial issues to consider before using it in a highly-scalable system.
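If scale matters, the standard attachment PUT that the update above refers to looks roughly like this from a browser (a sketch; docId, rev, and the filename are placeholders):
// Streams the raw bytes straight to CouchDB, no base64 round-trip or couchjs involved.
fetch('/mydb/' + encodeURIComponent(docId) + '/logo.png?rev=' + rev, {
  method: 'PUT',
  headers: {'Content-Type': 'image/png'},
  body: file  // a File/Blob with the binary attachment data
});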