Get ACK about job completion from Google Dataflow - google-compute-engine

I'm trying to call Google Dataflow from pods inside GKE. How can I get a job completion/failure notification from Google Dataflow? Once I submit the job to Dataflow, the call is not blocking.
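Since submitting a Dataflow job is not a blocking call, one common option is to poll the job's state through the Dataflow REST API until it reaches a terminal state. Below is a minimal sketch in Python, assuming google-api-python-client is installed and Application Default Credentials with Dataflow read access are available in the pod; the project, region, and job ID values are placeholders.

    # Minimal sketch: poll a Dataflow job until it reaches a terminal state.
    # Assumes google-api-python-client is installed and Application Default
    # Credentials with Dataflow read access are available in the pod.
    import time
    from googleapiclient.discovery import build

    PROJECT = "my-project"                          # placeholder
    REGION = "us-central1"                          # placeholder
    JOB_ID = "2024-01-01_00_00_00-1234567890"       # returned when the job is submitted

    TERMINAL_STATES = {
        "JOB_STATE_DONE",
        "JOB_STATE_FAILED",
        "JOB_STATE_CANCELLED",
        "JOB_STATE_DRAINED",
    }

    dataflow = build("dataflow", "v1b3")

    while True:
        job = dataflow.projects().locations().jobs().get(
            projectId=PROJECT, location=REGION, jobId=JOB_ID
        ).execute()
        state = job["currentState"]
        if state in TERMINAL_STATES:
            print("Job finished with state:", state)
            break
        time.sleep(30)  # wait before checking the job state again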

Related

How to make Google Apps Script triggers run on script executions or API requests?

I am aware that installable triggers do not execute/run on script executions or updates via API requests. For triggers to run, the change or edit has to be done 'manually'. Is there any workaround or solution to this?
Here is my use case:
I have designed an HTML UI that accepts some data and stores it in a specific spreadsheet
Since I didn't want any users to have access to the backend spreadsheet (to restrict them from viewing others' submissions), I created a REST service using TIBCO Cloud Integration > Google connector that writes data to the Google spreadsheet
When a user submits a response, the UI internally invokes the above-mentioned TIBCO API, which in turn updates the spreadsheet (using my Google credentials) with the response submitted by the user
With this arrangement, I don't have to share my Google spreadsheet with all users and expose all submissions. I could have used a Google Form, but that is too basic and my UI requirements are quite complex.
Up to this point, everything works fine, but as part of some advanced features, I want some functions (written in the script of the backend spreadsheet) to trigger on 'onEdit' when my API call writes to the spreadsheet, and that is where I am stuck. So far, I have tried:
Exposing the Google Script as a REST API
Exposing the Google Script as a library function
but in all cases, I have to share the spreadsheet with the user that invokes the calls, and I don't want that. Thanks.

Do Pub/Sub messages get deleted if a Cloud Function is deleted?

If I delete a Google Cloud Function and redeploy the same function a day later, will I have lost the Pub/Sub messages that trigger it for that day?
It depends on how you have deployed your Cloud Function.
HTTP functions: you deployed your Cloud Function with an HTTP trigger. In this case, you have to manually create a Pub/Sub push subscription that calls the Cloud Function when a message arrives (a sketch of this manual step is shown below).
If you delete your Cloud Function, the Pub/Sub subscription continues to live and to receive messages. The messages are kept until their retention period expires. The push HTTP calls fail continuously because the Cloud Function no longer exists.
If you redeploy the Cloud Function in the same project with the same name, the HTTP URL will exist again and the messages will be delivered without loss.
Background functions: you deployed your function with a trigger topic. In this case, a Pub/Sub subscription is created automatically when the function is created.
If you delete the Cloud Function, the automatically created Pub/Sub subscription is deleted as well. So the existing messages and any new ones will be lost, because there is no longer a subscription to accept and store them.
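For the HTTP-trigger case, here is a minimal sketch of that manual subscription step using the Python Pub/Sub client (google-cloud-pubsub); the project, topic, subscription, and function URL values are placeholders.

    # Sketch: manually create a Pub/Sub push subscription that targets an
    # HTTP-triggered Cloud Function. Assumes google-cloud-pubsub is installed
    # and the caller may create subscriptions in the project.
    from google.cloud import pubsub_v1

    project_id = "my-project"            # placeholder
    topic_id = "my-topic"                # placeholder
    subscription_id = "my-push-sub"      # placeholder
    # placeholder: URL of the deployed HTTP-triggered function
    function_url = "https://REGION-my-project.cloudfunctions.net/myHttpFunction"

    subscriber = pubsub_v1.SubscriberClient()
    topic_path = subscriber.topic_path(project_id, topic_id)
    subscription_path = subscriber.subscription_path(project_id, subscription_id)

    # The subscription keeps accepting and storing messages even if the
    # function behind the push endpoint is deleted later.
    subscriber.create_subscription(
        request={
            "name": subscription_path,
            "topic": topic_path,
            "push_config": {"push_endpoint": function_url},
        }
    )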

How to schedule a REST API call with SSIS?

I have built a RESTful API for my web scraper and was wondering if there is a way to schedule API calls in SSIS, something similar to a cron job scheduler.
You can have an SSIS package with a Script Task inside that calls the API. Then you deploy the package. Then you create a SQL Server Agent Job in which you call the package and schedule it (daily/weekly/monthly/every x minutes, etc.).
Hope this helps.

Can I use Google Apps Script as an Asana webhooks endpoint (doPost)?

I'm trying to connect Google Docs with Asana. I can create tasks from Google Docs and save the connection to a MySQL database so I can display the tasks inside the Google Document.
Now I need those tasks to be synced with Asana all the time, so I wanted to create Asana webhooks. I created a doPost function in Google Apps Script which should serve as an endpoint. But when I initiate the starting handshake, I don't receive a request from Asana to my Google Web App.
To be sure I'm doing everything right, I also created a handshake in PHP, which I'm more familiar with. The only problem I had there was an SSL certificate, but I think that shouldn't be a problem with Google. Also, my Google Web App is public, so there shouldn't be any restrictions (I tested it with Postman and I receive requests from Postman; to be sure a request arrives, I also log it to a Google Document).
What am I doing wrong?
Short answer:
Google Apps Script cannot be used as an Asana webhooks endpoint.
Long answer:
You can receive POST requests in Google Apps Script with a doPost function, so the first two steps of the Asana webhooks handshake can be accomplished. But there is no way to send a proper response for the third step of the handshake, because you can't read the headers of the POST request received from Asana, and you also can't set the headers of the response sent back to Asana. Here is the answer I found about reading and setting headers in Google Apps Script.
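For contrast, here is a minimal sketch (in Python/Flask rather than Apps Script) of what that third handshake step requires: reading the X-Hook-Secret request header and echoing it back in a response header, which is exactly what doPost cannot do. The route path and port are arbitrary.

    # Sketch of an Asana webhook endpoint (Flask, not Apps Script) that can
    # complete the handshake: it reads the X-Hook-Secret request header and
    # echoes it back in a response header, which doPost cannot do.
    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/asana-webhook", methods=["POST"])   # path is arbitrary
    def asana_webhook():
        secret = request.headers.get("X-Hook-Secret")
        if secret:
            # Handshake request: confirm by echoing the secret back.
            response = make_response("", 200)
            response.headers["X-Hook-Secret"] = secret
            return response
        # Normal event delivery: process request.get_json() here.
        return "", 200

    if __name__ == "__main__":
        app.run(port=8080)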

Can the Google Apps Script Execution API be called by a service account?

I'd like to use a service account to access a Google Sheet via the Apps Script Execution API, but it's not clear from the documentation whether this is supported.
The steps I've tried (which result in a 403 status from the Execution API) are:
Create a new (unbound) Apps Script
Visit the linked Developer Console project
Enable the Execution API
Create a new service account within the same project (downloading the generated JSON file)
Create a new Google Sheet and share it with the service account's email address (this is the step I'm least sure about)
Write an Apps Script function that reads from the spreadsheet
Run the script manually from the Script Editor (to set the scopes on the script correctly)
Publish the script ("Deploy as API executable"), making it accessible to 'anyone'
Mint a new OAuth2 token using the service account and the scopes linked to the script (in our case just 'https://www.googleapis.com/auth/spreadsheets')
Attempt to make a call to the Execution API using the token (roughly as sketched below)
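For reference, that last step might look roughly like this with the Python client libraries; this is only a sketch, assuming google-api-python-client and google-auth are installed, and the key file path, script ID, and function name are placeholders.

    # Sketch: call the Apps Script Execution API with a service account token.
    # Assumes google-api-python-client and google-auth are installed; the key
    # file path, script ID, and function name below are placeholders.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/spreadsheets"]
    SCRIPT_ID = "MY_SCRIPT_ID"        # placeholder
    FUNCTION_NAME = "readFromSheet"   # placeholder

    credentials = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES
    )
    service = build("script", "v1", credentials=credentials)

    response = service.scripts().run(
        scriptId=SCRIPT_ID, body={"function": FUNCTION_NAME}
    ).execute()
    print(response)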
This is the response I got:
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
Does this not work because Service Accounts are never able to access the Execution API? Or is there something wrong with the steps above?
Your original 403 error indicates that you have incorrectly set up authentication for your service account. However, even if you get that working, as of now (10 Nov 2015) you cannot execute Apps Script functions via a service account.
It's a known bug, and it is being tracked in the Apps Script Issue Tracker.
Currently (2020), service accounts cannot work with the Apps Script API. As written in the documentation:
Warning: The Apps Script API does not work with service accounts.
Your problem is probably that the script is associated with the wrong project (i.e. its own project, instead of the project associated with your service account). Here is what you need to do:
From the script editor, select the following menu item: Resources > Developer Console Project.
On this screen, enter the project number of your dev console project.
cf. this answer