Can a Chrome notification be triggered at a selected time (offline)? - google-chrome

Based on the Service Worker API docs, I see that a Service Worker can "react to a particular time & date". But I can't find any related documentation. So, is there any way to push a notification offline at a particular time & date?
Thank you

The ability to trigger a notification at a specific date/time (regardless of network connection) comes with an experimental feature called Notification Triggers, so unfortunately it's not widely available yet.
To experiment with the Notification Triggers API in Chrome, enable the #enable-experimental-web-platform-features flag in chrome://flags.
const createScheduledNotification = async (tag, title, timestamp) => {
  const registration = await navigator.serviceWorker.getRegistration();
  registration.showNotification(title, {
    tag: tag,
    body: 'This notification was scheduled 30 seconds ago',
    showTrigger: new TimestampTrigger(timestamp + 30 * 1000),
  });
};
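For context, here is a minimal usage sketch (not from the original answer): it assumes a service worker is already registered for the page and that the experimental flag above is enabled.
// Hypothetical usage sketch: schedule a notification roughly 30 seconds from now.
// Assumes a registered service worker and the Notification Triggers flag enabled.
Notification.requestPermission().then((permission) => {
  if (permission === 'granted') {
    createScheduledNotification('reminder-tag', 'Scheduled reminder', Date.now());
  }
});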
Hopefully this will be available in Chrome and other browsers later this year; the linked article will be updated with the status of the API.

Related

Cloud Scheduler + Cloud Functions -> Gmail API watch() - WORKING NOW

This is my first post here. I am sorry if it's a repost, but I've been searching for more than a month for an answer to this problem across websites and forums and until now... no answers!
My goal is to set up a Gmail Pub/Sub watch() so I can take an action whenever I receive a new email.
To do so, according to the developer's website, I need to subscribe to Gmail watch() on a daily basis with this code:
request = {
    'labelIds': ['INBOX'],
    'topicName': 'projects/myproject/topics/mytopic'
}
gmail.users().watch(userId='me', body=request).execute()
So far I have a working scheduled task with a service account with Invoker permissions. This part works just fine.
In my "initial authorization function" I have:
const {google} = require('googleapis');

// Retrieve OAuth2 config
const oauth2Client = new google.auth.OAuth2(
  process.env.CLIENT_ID,
  process.env.CLIENT_SECRET,
  process.env.CALLBACK_URL
);

exports.oauth2init = (req, res) => {
  // Define OAuth2 scopes
  const scopes = [
    'https://www.googleapis.com/auth/gmail.modify'
  ];

  // Generate + redirect to OAuth2 consent form URL
  const authUrl = oauth2Client.generateAuthUrl({
    access_type: 'offline',
    scope: scopes,
    // prompt: 'none' // Required in order to receive a refresh token every time
  });
  return res.redirect(authUrl);
};
My issue now is that the access token is generated via the consent prompt the first time and is never refreshed to a new one (the token expires after 1 hour...), which means this code stops working after that period and a "manual" intervention is required. According to the documentation, I need to use the "offline" access type, and for "prompt" I can either omit it (only requests permissions the first time) or use 'none' (never asks), as is said here.
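As a hedged illustration of the offline flow (not the poster's code; loadSavedRefreshToken and the topic name are placeholders): once the refresh token obtained from the first consent is persisted, a scheduled renewal function can mint fresh access tokens without any prompt.
// Sketch: renew the Gmail watch() using a stored refresh token (offline access).
// Assumes the refresh token was saved after the initial oauth2init/callback flow.
const {google} = require('googleapis');

exports.renewWatch = async (req, res) => {
  const oauth2Client = new google.auth.OAuth2(
    process.env.CLIENT_ID,
    process.env.CLIENT_SECRET,
    process.env.CALLBACK_URL
  );
  // loadSavedRefreshToken() is a placeholder for however you persist the token
  // (Datastore, Secret Manager, etc.).
  oauth2Client.setCredentials({refresh_token: await loadSavedRefreshToken()});

  const gmail = google.gmail({version: 'v1', auth: oauth2Client});
  const result = await gmail.users.watch({
    userId: 'me',
    requestBody: {
      labelIds: ['INBOX'],
      topicName: 'projects/myproject/topics/mytopic'
    }
  });
  res.status(200).send(result.data);
};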
I managed to make it work! Tomorrow I will continue with the process.
Should I post my working code here for reference?
Thanks!
I will rephrase the process you illustrated so that there is no ambiguity.
According to the documentation you linked:
You do not subscribe to watch(), you call watch()
watch() is an API call to the Gmail API that enables automatic publication of events on a Pub/Sub topic you define, given conditions you specify. Who are you watching? On what events?
You subscribe to a Pub/Sub topic that is targeted by your previous watch() call
A process (e.g. a Google Cloud Function) subscribes to the topic and consumes the messages sent by the Gmail API.
The call must be renewed at least every seven days
Because Google needs to be sure you still need to monitor the targeted inbox, it requires a renewal from you. Another watch() call acts as that renewal.
Cloud Scheduler enables this periodic renewal
This service triggers the renewal script you included in your question. To do so it needs to authenticate to the platform that hosts the script. This is easier if your script is hosted on a Google service (Cloud Functions, Cloud Run, ...), and the authentication type depends on the form of the target URL. In all cases you DO NEED an auth token in your request header. The token is generated from a service account you created with the right permission to call your script (e.g. Cloud Run Invoker). By default the scheduler has the right to generate a token from it.
So far so good. Now comes the tricky part, which you don't mention in your question: how is your Gmail API client authenticated? You cannot monitor someone's inbox unless that person gave you permission to, i.e. you call the API with the right OAuth2 token. Indeed, in the video you point to, they authenticate the user using this principle, which is implemented in their code with Express-oauth2-handler.
So you will have a Cloud Function to initialize end-user authentication and watch his/her inbox. The renewal should do the same, but the problem is the user will not be there to accept the end-user consent. Here comes offline access, but it is beyond the scope of your question. Finally, a second function will subscribe to the Pub/Sub topic and consume the messages as you need. See their implementation code, which populates a spreadsheet.
The documentation you shared in the comments does not say that you can remove the token from the headers of the service account; also, the Gmail API documentation you shared says that you only:
need to grant publish privileges to gmail-api-push#system.gserviceaccount.com. You can do this using the Cloud Pub/Sub Developer Console permissions interface following the resource-level access control instructions.
In order to achieve this, basically what you will need is a setup of two Cloud Functions: the first, scheduled, function is responsible for setting up the watch() (you can check this documentation for how to deploy a scheduled function), and the second function is triggered by the Pub/Sub topic that receives the Gmail notifications (you can check this documentation for how to build an event-triggered function). Both processes are similar.
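As a rough sketch of the second, Pub/Sub-triggered function (an assumption based on the standard Gmail push notification payload, not code from the answer): the message data decodes to a JSON object containing emailAddress and historyId.
// Sketch of a Pub/Sub-triggered Cloud Function that consumes Gmail notifications.
// The function name and any downstream processing are placeholders.
exports.onGmailNotification = (message, context) => {
  const payload = message.data
    ? JSON.parse(Buffer.from(message.data, 'base64').toString())
    : {};
  // Gmail push notifications include the mailbox address and a historyId;
  // use users.history.list with this historyId to find out what changed.
  console.log('Change in', payload.emailAddress, 'at history', payload.historyId);
};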
NOTE: I have never used the Gmail API, so I am not sure if any extra steps are necessary, but then again, the documentation implies that setting up the permissions of that service account is enough to make it work.
EDIT:
As per the information you have shared, the issue is likely that you are not properly setting the service account used to authenticate with the Cloud Function. As described in the documentation, you have to grant the service account the Cloud Functions Invoker role in IAM.
Let me know if this fixed the issue.

Is there a way to stop a web-push notification in Google Chrome on one device if it has already been received in Google Chrome on another device?

I have implemented push notifications using web-push that work in Google Chrome.
Is there a way to stop a web-push notification in Google Chrome on one device if it has already been received in Google Chrome on another device?
This would require you to check something in a distributed store before pushing the notification to the end user. For example, your logic could look something like this:
// Only one client can get a lock at a time
var lock = await api.getPushNotificationDistributedLock();

// Get the status
var pushNotificationStatus = await api.getPushNotificationStatus();

// Send the notification if it hasn't already been sent
if (pushNotificationStatus === "pending") {
  // Push notification to user
  await api.setPushNotificationStatus("completed");
}

// Release the lock
lock.release();
The first client to get the lock will see the status as pending and successfully send the push notification. After it updates the status to completed, every subsequent client will see the status as completed and carry on.
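As one possible concretization (a sketch only; the api.* calls above are placeholders, and the key names and helpers here are made up), a Redis SET ... NX with an expiry can serve as both the lock and the "already sent" flag on a Node.js backend:
// Sketch: atomically claim a notification in Redis before sending it.
// Assumes an ioredis client and a per-notification id; the first server that
// manages to set the key sends the push, later attempts see null and skip.
const Redis = require('ioredis');
const redis = new Redis(process.env.REDIS_URL);

async function sendOnce(notificationId, sendPush) {
  // SET key value EX 3600 NX: only succeeds if the key does not exist yet.
  const claimed = await redis.set(`push:sent:${notificationId}`, '1', 'EX', 3600, 'NX');
  if (claimed === 'OK') {
    await sendPush(); // e.g. web-push's sendNotification(...)
  }
}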

Is it possible to watch Directory API changes from Google App Maker/Google Apps Script?

I'm working on a performance evaluation app in Google App Maker. One of the challenges we have with our current tool is that it doesn't sync with our G Suite directory when a person's manager changes or when a person has a name change -- their existing evaluations are linked to the person's old name and we have to change them manually.
In my new app, I have an Employees datasource that includes a relation to the evaluation itself that was initially populated via the Directory API. Reading the documentation here, it seems as though I should be able to set up a watch on the Users resource to look for user updates and parse through them to make the appropriate name and manager changes in my Employees datasource. What I can't figure out, though, is what the receiving URL should be for the watch request.
If anyone has done this successfully within Google App Maker, or even solely within a Google Apps Script, I'd love to know how you did it.
EDITED TO ADD:
I created a silly little GAS test function to see if I can get #dimu-designs' solution below to work. Unfortunately, I just get a Bad Request error. Here's what I have:
function setUserWatch() {
  var optionalArgs = {
    "event": "update"
  };
  var resource = {
    "id": "10ff4786-4363-4681-abc8-28166022425b",
    "type": "web_hook",
    "address": "https://script.google.com/a/.../...hXlw/exec"
  };
  AdminDirectory.Users.watch(resource);
}
Address is the current web app URL.
EDITED TO ADD MORE:
The (in)ability to use GAS to receive web hooks has been an active issue/feature request since Sep 2014 -- https://issuetracker.google.com/issues/36761910 -- which #dimu-designs has been on top of for some time.
This is a more comprehensive answer.
Google supports push notifications across many of their APIs. However, there are many subtle (and not so subtle) differences between them. Some that leverage webhooks send their data payloads primarily as HTTP headers, for example the Drive API and Calendar API. Others split their payloads across HTTP headers and a POST body (ex: AdminDirectory API). And it gets even crazier, with some APIs utilizing different mechanisms altogether (ex: the GMail API leverages Cloud Pub/Sub).
There are nuances to each, but your goal is to leverage AdminDirectory push notifications in a GAS app. To do that you need a GAS Web App whose URL can serve as a web-hook endpoint.
STEP 1 - Deploy A Stand-Alone Script As A Web App
Let's start with the following template script and deploy it as a Web App from the Apps Script Editor menu Publish > Deploy As Web App:
/** HTTP GET request handler */
function doGet(e) {
  return ContentService.createTextOutput("GET message");
}

/** HTTP POST request handler */
function doPost(e) {
  return ContentService.createTextOutput("POST message");
}
STEP 2 - Verify/Validate Domain Ownership And Add/Register Domain
NOTE: As of August 2019, GAS Web App URLs can no longer be verified using this method. Google Cloud Functions may be a viable alternative.
With the web app deployed you now have to verify and register the domain of the receiving url, which in this case is also the web app url. This url takes the following form:
https://script.google.com/macros/s/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/exec
Technically you cannot own a GAS web app url's domain. Thankfully the App Script Gods at Google do provide a mechanism to verify and register a GAS web app url.
From the Apps Script Editor menu select Publish > Register in Chrome Web Store. Registering a published web app with the Chrome Web Store also validates the URL's domain (no need to fiddle with the search console).
Once validated you need to add the "domain" via the Domain verification page in the API Console. The "domain" is everything in the url sans the 'exec', so you'll add a string that looks like this:
https://script.google.com/macros/s/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/
STEP 3 - Make a watch request
For this step the AdminSDK/Directory API service should be enabled both for your App Script project and in the API Console.
Create a function that generates a watch request (this can be retooled for other event types):
function startUpdateWatch() {
  var channel = AdminDirectory.newChannel(),
      receivingURL = "https://script.google.com/macros/s/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/exec",
      gSuiteDomain = "[business-name].com",
      event = "update";

  channel.id = Utilities.getUuid();
  channel.type = "web_hook";
  channel.address = receivingURL + "?domain=" + gSuiteDomain + "&event=" + event;
  channel.expiration = Date.now() + 21600000; // max of 6 hours in the future; note: the watch must be renewed before expiration to keep notifications coming

  AdminDirectory.Users.watch(
    channel,
    {
      "domain": gSuiteDomain,
      "event": event
    }
  );
}
Note that Directory API push notifications have an expiration, the max being 6 hours from starting the watch, so the watch must be renewed periodically to ensure notifications keep reaching the endpoint URL. Typically you can use a time-based trigger to call this function every 5 hours or so.
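For example, a time-driven trigger can be installed once to handle the renewal (a minimal sketch; the function name matches startUpdateWatch above):
// One-time setup: install a time-based trigger that renews the watch every 5 hours.
function installWatchRenewalTrigger() {
  ScriptApp.newTrigger("startUpdateWatch")
      .timeBased()
      .everyHours(5)
      .create();
}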
STEP 4 - Update doPost(e) trigger to handle incoming notifications
Unlike the push mechanisms of other APIs, the Directory API sends a POST body along with its notifications, so the doPost(e) method is guaranteed to be triggered when a notification is sent. Tailor the doPost(e) trigger to handle incoming events and re-deploy the web app:
function doPost(e) {
  switch (e.parameter.event) {
    case "update":
      // do update stuff
      break;
    case "add":
      break;
    case "delete":
      break;
  }
  return ContentService.createTextOutput("POST message");
}
There is one caveat to keep in mind: push notifications for update events only tell you that the user's data was updated; they won't tell you exactly what changed. But that's a problem for another question.
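One hedged way to cope with that caveat (my own sketch, not part of the answer; how you identify which user changed, e.g. from the notification payload or by re-listing users, is assumed) is to simply re-fetch the directory record and overwrite the locally stored name/manager fields:
// Sketch: re-sync a single user's name and manager after an update notification.
// userEmail is assumed to be known; mapping it to your datasource is up to you.
function resyncEmployee(userEmail) {
  var user = AdminDirectory.Users.get(userEmail);
  var fullName = user.name && user.name.fullName;
  // The manager, if set, appears in the relations list with type "manager".
  var managerRelation = (user.relations || []).filter(function(r) {
    return r.type === "manager";
  })[0];
  Logger.log("Current name: %s, manager: %s", fullName,
             managerRelation ? managerRelation.value : "none");
  // ...update the Employees datasource here...
}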
Note that there are a ton of details I left out but this should be enough to get you up and running.
You can do this with GAS and the Admin SDK. The Directory API supports Notifications (note this is scheduled to be deprecated, so I'm not sure what is replacing this functionality). You can then set up a Gmail script to do what you need to do with the notification.
UPDATE: There are also PUSH notifications from the Directory API.
I was able to set up push notifications for a local resource (a spreadsheet) using a Heroku-based Node.js app as an intermediary API. The Node app captures the custom request headers and builds the payload to be consumed by the doPost(e) function of the GAS web app.
The code for constructing a watch request is simple:
// get the unique id
var channelId = Utilities.getUuid();

// build the resource object
var resource = {
  "id": channelId,
  "type": "web_hook",
  "address": "https://yourapp.herokuapp.com/drivesub"
};

// watch the resource
Drive.Files.watch(resource, fileId);
The challenge is to get that domain address verified. There are ways to verify the standalone (not file-bound!) GAS web app; however, as previous posters have mentioned, the Apps Script web app can't access custom headers.
After you've enabled the Pub/Sub API and created the topic & subscription, go to APIs & Services -> Credentials -> Domain verification. It gives you a few options for verifying your domain, including serving an HTML file. Download the file generated by Google. Thankfully, Heroku makes it very easy to deploy a Node app:
https://devcenter.heroku.com/articles/getting-started-with-nodejs
After your domain is verified you can make your subscription push data to the endpoint URL on Heroku.
I simply created a JS file for route handlers and added one specifically for domain verification:
var fs = require('fs');

handlers.verifyDomain = function(callback) {
  // Synchronously read from the static HTML file. The async method fs.readFile()
  // doesn't make the file available to the router callback
  var file = fs.readFileSync('./google/google_file.html');
  callback(200, file);
};
Then include the handler in your router object like so:
var router = {
  "google_file.html": handlers.verifyDomain
};
Finally, in your server-starting function, get the path from the URL (there are multiple ways of doing that) and execute the handler:
var routeHandler = router[path];
routeHandler(function(statusCode, file) {
  // do something
});
Go back to the domain verification tool and load the HTML page to verify the URL. After it's verified, the only remaining step is creating the GAS web app and posting to it.
Back to the Node app. Note that my endpoint is https://yourapp.herokuapp.com/drivesub
// The code for posting data to the GAS web app. Of course, you need to
// update your router with router['driveSub'] = handlers.driveSub
var request = require('request'); // install the request module via NPM

handlers.driveSub = function(data, callback) {
  var headers = data.headers;
  var options = {
    method: "POST",
    uri: your_gas_app_url,
    json: {"headers": headers}
  };
  request.post(options, function(err, httpResponse, body) {
    console.log(err, body);
  });
  callback(200, data);
};
Apps Script app - don't forget to publish it.
function doPost(req) {
  var postData = req.postData["contents"];
  var headers = JSON.parse(postData)["headers"];
  // take action
  return ContentService.createTextOutput("OK");
}
Unfortunately you cannot, at least not solely using Apps Script.
Admin Directory push notifications require a web-hook URL endpoint to receive notifications. You might think deploying a GAS web app and using its URL as an endpoint would be sufficient. But the thing with Admin Directory Push notifications is that its data payload resides in custom HTTP headers which cannot be accessed from a GAS Web App. (This also holds true for push notifications across other APIs including the Drive and Calendar APIs)
You can however leverage Google Cloud Functions (a GCP service) in tandem with GAS, but you'll have to know your way around Node.js.
EDIT
After giving this some thought and reviewing your requirements, I believe there is a way to pull this off using just GAS.
You can setup a unique push notification channel for a given event per user/domain (the 'update' event in your use case) by setting the event parameter when initializing the watch. Thus the GAS web app will only be triggered if an update event occurs; you don't really need to rely on the HTTP header to determine the event type.
If you want to track multiple events, you simply create a unique channel per event and use the same GAS Web app endpoint for each one. You differentiate between events by checking the event parameter sent in the POST request. This should remove the need for middle-man services such as Heroku or Google Cloud Functions.
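To illustrate (a sketch based on the startUpdateWatch pattern shown earlier, not code from the answer), one channel per event against the same Web App URL might look like this:
// Sketch: open one notification channel per event type, all pointing at the same
// GAS Web App endpoint; doPost(e) then branches on e.parameter.event.
function startWatchesForAllEvents() {
  var receivingURL = "https://script.google.com/macros/s/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/exec",
      gSuiteDomain = "[business-name].com",
      events = ["add", "update", "delete"];

  events.forEach(function(event) {
    var channel = AdminDirectory.newChannel();
    channel.id = Utilities.getUuid();
    channel.type = "web_hook";
    channel.address = receivingURL + "?domain=" + gSuiteDomain + "&event=" + event;
    channel.expiration = Date.now() + 21600000; // renew before the 6-hour expiration

    AdminDirectory.Users.watch(channel, {"domain": gSuiteDomain, "event": event});
  });
}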

gapi.auth.authorize() with immediate: false doesn't pop up a window for authorization

Below is a JavaScript snippet that simply requests authorization to access the user's spreadsheets:
var CLIENT_ID = '********';
var SCOPES = [
  'https://www.googleapis.com/auth/spreadsheets'
];

function auth() {
  gapi.auth.authorize({
    client_id: CLIENT_ID,
    immediate: true,
    scope: SCOPES
  }, function(result) {
    console.log('authorize(immediate = true)');
    if (result && !result.error) {
      console.log('authorize [OK]');
    } else {
      console.log('authorize [FAILED]');
      gapi.auth.authorize({
        client_id: CLIENT_ID,
        immediate: false,
        scope: SCOPES
      }, function(result) {
        console.log('authorize(immediate = false)');
        if (result && !result.error) {
          console.log('authorize [OK]');
        } else {
          console.log('authorize [FAILED]');
        }
      });
    }
  });
}
I believe it should do two things:
Pop up a window to log in unless the user is already logged in.
Pop up a window to request authorization to access the user's spreadsheets unless the authorization has already been granted earlier. After the authorization is granted, the app should be listed under Connected apps & sites and no further authorization popup should be shown.
I am testing this script with two distinct Google accounts. One account works as expected and I get the following output on the console:
auth.html:17 authorize(immediate = true)
auth.html:21 authorize [FAILED]
auth.html:27 authorize(immediate = false)
auth.html:29 authorize [OK]
With the other account, the popup for authorization is not shown and authorization is always granted as if I had pressed "Allow" or the app were listed under Connected apps & sites, but it's not there. The console output is exactly the same.
I have done these tests using two browsers:
Version 51.0.2704.79 Built on 8.4, running on Debian 8.5 (64-bit)
Firefox ESR 45.2.0, running on Debian 8.5 (64-bit)
So, basically I have the following questions:
Are my expectations regarding popups correct or the idea behind the gapi.auth.authorize() call with immediate:true or immediate:false is different?
What can be the reason for this "misbehaving"? Is there any "sacred place" where the app is listed as authorized for some scope while the same app is not shown under Connected apps & sites?
Note: The CLIENT_ID is listed in the Google API Console under OAuth 2.0 client IDs, the type is Web application, and the owner is a completely different account not related to the two mentioned above.
Thanks.
To answer your questions:
Are my expectations regarding popups correct or the idea behind the gapi.auth.authorize() call with immediate:true or immediate:false is different?
Yes, your expectations regarding popups are correct. As discussed in a related issue #103 posted on GitHub, when the user triggers 'gapi.auth.authorize' with a button click (immediate: false), the flow is as follows:
A popup window with the permissions authorization is shown
When the user accepts/denies, the popup window is closed
Instead of the callback firing, a TypeError appears in the console (regardless of whether the user authorized the app to handle the requested data or not)
What can be the reason for this "misbehaving"? Is there any "sacred place" where the app is listed as authorized for some scope while the same app not shown under Connected apps & sites?
That related issue #103 posted on GitHub can also be the reason for this "misbehaving"; based on the thread, it has already been fixed, as can be seen in that GitHub post.
I hope that helps.

Chrome Google-Cast sender not receiving media status updates on original media load

I've been working on both the receiver and web sender (Chrome) apps for Chromecast for a while, and since the new API (publicly released yesterday) I've been unable to receive any media status updates after performing a loadMedia request.
Upon refreshing the page, I am able to receive the updates as expected, but both channels receive the media session object in the exact same way and call the exact same methods on it (addUpdateListener).
We're working with an HLS stream on the receiver side, but I can see updates being sent out to the senders (and obviously they are, as a reload allows the web sender to receive them). Also, on the sender side, I can see updates when I perform actions such as play/pause/volume/mute and seek requests, but only one update after each action.
TL;DR: I don't see any regular media status updates for events such as loading->buffering->playing, and I also don't see the play/pause/volume/etc. updates from other senders connected to the session.
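For reference, this is roughly how the listener is wired on the sender side (my own sketch of the standard Chrome sender media API, not the poster's actual code; onError is a placeholder):
// Sketch: register for media status updates on the sender once the media session exists.
function onMediaDiscovered(how, media) {
  media.addUpdateListener(function(isAlive) {
    // Should fire whenever playerState, currentTime, volume, etc. change on the receiver.
    console.log(how, media.playerState, media.currentTime, isAlive);
  });
}

// Used both after loadMedia and after joining an existing session, e.g.:
// session.loadMedia(loadRequest, onMediaDiscovered.bind(this, 'loadMedia'), onError);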
Here's all I get from the sender before it just stops sending any status updates:
[chrome.cast.ApiImpl] Getting message from extension:
{"type":"v2_message","message":{"type":"MEDIA_STATUS","status":[{"mediaSessionId":1,"playbackRate":1,"playerState":"BUFFERING","currentTime":500,"supportedMediaCommands":15,"volume":{"level":1,"muted":false},"media":{removed}","streamType":"buffered","contentType":"contentType","metadata":null,"duration":1450.1558329999993,"customData":null},"sessionId":"FD5AA3F0-F93C-050A-4900-7C6D916525BA"}],"requestId":92278937},"seqNum":"a14772002","clientId":null,"appOrigin":null}
[chrome.cast.ApiImpl] Creating new Media object:
FD5AA3F0-F93C-050A-4900-7C6D916525BA:1 cast_sender.js:3428 New media
session ID: 1 (loadMedia)
But I can tell that the receiver is still sending out updates by looking in the debug console, there's a bunch of these firing off:
[245.485s] [cast.receiver.IpcChannel] IPC message sent:
{"namespace":"urn:x-cast:com.google.cast.media","senderId":":","data":"{\"type\":\"MEDIA_STATUS\",\"status\"
...removed [247.495s] [cast.receiver.MediaManager] Sending broadcast status message
I've also noticed that when I do get messages prior to the loadMedia callback, they're all in the BUFFERING state, never PLAYING.
This bug is confirmed and it has been fixed as of the Feb 10 update of the Cast extension.
Old workaround:
For now there is a workaround for this type of issue, where status updates from receiver apps are routed to the Chrome Cast extension but not to the Chrome sender apps.
The trick for now to see incoming status updates from receiver apps is to open at least 2 tabs that run the same sender app. You initiate a session from one of the tabs and then join that session from the other tab. Now any status update messages from the receiver app will no longer be intercepted by the Cast extension.
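Joining the existing session from the second tab is just the normal sender initialization (a sketch; APPLICATION_ID and the callbacks are placeholders): with ORIGIN_SCOPED auto-join, the already-running session is handed to the sessionListener.
// Sketch: the second tab initializes the Cast API and auto-joins the session
// started by the first tab; status updates then arrive via the joined session.
var sessionRequest = new chrome.cast.SessionRequest(APPLICATION_ID);
var apiConfig = new chrome.cast.ApiConfig(
  sessionRequest,
  function sessionListener(session) {
    // Called with the existing session started by the other tab.
    console.log('Joined session', session.sessionId);
  },
  function receiverListener(availability) {},
  chrome.cast.AutoJoinPolicy.ORIGIN_SCOPED
);
chrome.cast.initialize(apiConfig, function() {}, function(err) { console.error(err); });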
I see this bug has been fixed with the Feb 10 update; however, I am not seeing the currentTime event. From the docs:
Changes to the following properties will trigger the listener: currentTime, volume, metadata, playbackRate, playerState, customData.
Should we expect to get an update when the time changes?