So the scenario is that there is an ML model running in Google Cloud, and its input needs to be sent in the form of an Azure Service Bus message. Is this even possible?
I need to implement a handler (a Google Cloud Function) to retrieve messages from Azure Service Bus.
You can use the Azure Service Bus SDK and write your own code to retrieve the messages. You can go through these Azure Service Bus client library for Python samples, which may help you write your code.
You will need to trigger the function whenever you want to retrieve messages from the Service Bus. To do that, you can either use Cloud Scheduler to invoke it on a timer, or have the Azure side call the HTTP URL of the Cloud Function.
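For what it's worth, a minimal sketch of such a handler in Python might look like the following. It assumes a queue (rather than a topic/subscription), the azure-servicebus v7 package, and that the connection string and queue name come from environment variables; those variable names and the batch sizes are placeholders, not anything prescribed by either platform.

```python
import os

from azure.servicebus import ServiceBusClient


def pull_service_bus_messages(request):
    """HTTP-triggered Cloud Function that drains a few Service Bus messages."""
    # Placeholder environment variable names; configure them on the function.
    connection_str = os.environ["SERVICEBUS_CONNECTION_STR"]
    queue_name = os.environ["SERVICEBUS_QUEUE_NAME"]

    bodies = []
    client = ServiceBusClient.from_connection_string(connection_str)
    with client:
        receiver = client.get_queue_receiver(queue_name=queue_name)
        with receiver:
            # Pull up to 10 messages, waiting at most 5 seconds.
            for msg in receiver.receive_messages(max_message_count=10,
                                                 max_wait_time=5):
                bodies.append(str(msg))  # str() yields the message body as text
                receiver.complete_message(msg)  # remove it from the queue

    # Feed `bodies` into the ML model here.
    return f"received {len(bodies)} messages", 200
```

Cloud Scheduler (or the Azure side) would then hit this function's HTTP URL on whatever cadence you need, as described above.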
I would like to write a function using a Function App in Microsoft Azure to receive messages from an IoT Hub, convert them from Base64 to string, and store them in a container in Blob Storage. Could you please help me do this?
Thanks in advance.
Br
Masoud
Create an Azure Function with the IoT Hub trigger to receive the messages; this trigger template comes by default when you create the function.
The Connection value should be provided in the local.settings.json file. It is the IoT Hub connection string, which you can get from the Azure Portal > IoT Hub resource > Built-in endpoints under Hub settings.
Run the function and you will see your messages flowing from the IoT Hub to your Azure Function (assuming you have devices or simulators connected that are sending data).
Refer to this article for step-by-step information on receiving IoT Hub messages in an Azure Function.
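If a code sketch helps, here is roughly what that could look like with the Azure Functions Python v2 programming model, decoding Base64 payloads and writing them to a Blob container. The event hub name, connection setting names, and container name are placeholders, and it assumes the devices send Base64-encoded UTF-8 text, so treat it as an outline rather than a finished function.

```python
import base64
import os
import uuid

import azure.functions as func
from azure.storage.blob import BlobServiceClient

app = func.FunctionApp()


@app.event_hub_message_trigger(arg_name="event",
                               event_hub_name="messages/events",
                               connection="IoTHubConnectionString")
def iothub_to_blob(event: func.EventHubEvent) -> None:
    # Raw bytes of the device-to-cloud message.
    raw = event.get_body()

    # Assumes the device sent Base64-encoded UTF-8 text; decode it back to a string.
    text = base64.b64decode(raw).decode("utf-8")

    # Write the decoded message to a blob; the connection string setting and
    # container name are placeholders to match your storage account.
    blob_service = BlobServiceClient.from_connection_string(
        os.environ["StorageConnectionString"])
    container = blob_service.get_container_client("iot-messages")
    container.upload_blob(name=f"{uuid.uuid4()}.txt", data=text)
```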
To save the messages that come from sensors in Azure, you can use an Azure Storage Account, either a Blob container or Table Storage.
Step 1: Go to the IoT Hub resource > Message routing > Custom endpoints > add a new endpoint for the Storage service.
Step 2: Give the endpoint a name, pick a container created for storing the data (messages), and choose the encoding format (JSON or AVRO), the file name format, the authentication type, etc.
Now we have added a Storage Account endpoint to route messages received by the IoT Hub to the Azure Blob container.
The final step is to add the route that sends data to the storage account endpoint.
Please visit these sources for detailed information:
Saving IoT Hub messages to Azure Blob Storage.
Saving IoT Hub Sensor Data to Azure Table Storage
Official Documentation of Azure Functions IoT Hub Trigger
Note: keep an eye on the cost of the storage option you choose; this simple variant is basically recommended only for proof-of-concepts or very small, simple projects. Please refer to this article for more information on which storage account to use with the IoT Hub trigger to optimize cost.
I have created an HTTP Google Cloud Function that does not allow unauthenticated requests.
I have created a service account in the project with one role: Cloud Functions Invoker.
This service account is listed as a principal for my HTTP Cloud Function and is shown to have that role.
I have created a Cloud Scheduler Job to run this function.
In the job, I've specified that I want it to obtain an OIDC token for authenticating requests to the HTTP function.
Whenever I trigger the job, it fails with a message indicating the request is unauthenticated.
Things I've tried:
Recreate the function
Recreate the job
Use a different user (the main service account user - that one doesn't work either)
Do a POST instead of a GET from the scheduler job (I've successfully created scheduled jobs for authenticated HTTP functions before, but this is the first time I've done a GET - just grasping at straws really)
Did I miss something? Any idea why it is coming back with the "Unauthenticated" message?
I revisited this today. My IAP-protected HTTP function expects a query string parameter to be passed to it. The Cloud Platform web UI automatically sets the audience to the same URL (including the parameter) when creating the scheduled job. I figured Google knows what they are doing, so I left it that way originally.
Out of desperation I tried removing this parameter from the audience and that made the authentication work properly.
So, I changed the audience from
https://<myProject>.cloudfunctions.net/myFunction?p=abc
to
https://<myProject>.cloudfunctions.net/myFunction
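For anyone creating the job programmatically rather than through the web UI, the same fix looks roughly like this with the google-cloud-scheduler client; the project, location, schedule, and job name below are placeholders, and only the audience handling is the point.

```python
from google.cloud import scheduler_v1


def create_job(project: str, location: str, service_account_email: str):
    client = scheduler_v1.CloudSchedulerClient()
    parent = f"projects/{project}/locations/{location}"

    job = scheduler_v1.Job(
        name=f"{parent}/jobs/call-my-function",   # placeholder job name
        schedule="*/15 * * * *",                  # placeholder schedule
        time_zone="Etc/UTC",
        http_target=scheduler_v1.HttpTarget(
            # The URI that is actually called can keep the query parameter...
            uri="https://<myProject>.cloudfunctions.net/myFunction?p=abc",
            http_method=scheduler_v1.HttpMethod.GET,
            oidc_token=scheduler_v1.OidcToken(
                service_account_email=service_account_email,
                # ...but the audience must be the bare function URL, with no ?p=abc.
                audience="https://<myProject>.cloudfunctions.net/myFunction",
            ),
        ),
    )
    return client.create_job(parent=parent, job=job)
```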
I've created lots of Google Apps Script scripts for the Classroom API and the Admin Groups and Drive APIs, but I can't seem to get started with AdminReports.
I get the message:
"AdminReports is not defined"
and I can't see it listed in the services that one can add in the scripting environment.
Is it because it's an advanced service? Is there something else that I need to do to get it up and running?
Enable Advanced Google Services:
To use an advanced Google service, follow these instructions:
New editor:
In the Services configuration dialog:
Select the Admin SDK API.
Select reports_v1 as the API version.
(Optional) Replace the default AdminDirectory identifier with AdminReports.
*From https://developers.google.com/admin-sdk/reports/reference/rest?hl=en
Service: admin.googleapis.com
To call this service, we recommend that you use the Google-provided client libraries. If your application needs to use your own libraries to call this service, use the following information when you make the API requests.
Discovery document
A Discovery Document is a machine-readable specification for describing and consuming REST APIs. It is used to build client libraries, IDE plugins, and other tools that interact with Google APIs. One service may provide multiple discovery documents. This service provides the following discovery document:
https://admin.googleapis.com/$discovery/rest?version=reports_v1
Service endpoint
A service endpoint is a base URL that specifies the network address of an API service. One service might have multiple service endpoints. This service has the following service endpoint and all URIs below are relative to this service endpoint:
https://admin.googleapis.com
REST Resource: activities
list: GET /admin/reports/v1/activity/users/{userKey}/applications/{applicationName}
Retrieves a list of activities for a specific customer's account and application such as the Admin console application or the Google Drive application.
watch: POST /admin/reports/v1/activity/users/{userKey}/applications/{applicationName}/watch
Start receiving notifications for account activities.
REST Resource: customerUsageReports
get: GET /admin/reports/v1/usage/dates/{date}
Retrieves a report which is a collection of properties and statistics for a specific customer's account.
REST Resource: entityUsageReports
get: GET /admin/reports/v1/usage/{entityType}/{entityKey}/dates/{date}
Retrieves a report which is a collection of properties and statistics for entities used by users within the account.
REST Resource: userUsageReport
get: GET /admin/reports/v1/usage/users/{userKey}/dates/{date}
Retrieves a report which is a collection of properties and statistics for a set of users with the account.
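Outside of Apps Script, the activities.list call from the listing above can also be made with the google-api-python-client; a rough sketch is below. The scope, default-credential setup, and the choice of the login application are assumptions, and in Apps Script itself the equivalent call is AdminReports.Activities.list once the advanced service is enabled.

```python
import google.auth
from googleapiclient.discovery import build

# The Reports API needs the audit scope; in a real setup you would typically
# use a service account with domain-wide delegation impersonating an admin.
SCOPES = ["https://www.googleapis.com/auth/admin.reports.audit.readonly"]
creds, _ = google.auth.default(scopes=SCOPES)

service = build("admin", "reports_v1", credentials=creds)

# Last few login events across the whole domain.
response = service.activities().list(userKey="all",
                                     applicationName="login",
                                     maxResults=10).execute()
for activity in response.get("items", []):
    print(activity["id"]["time"], activity.get("actor", {}).get("email"))
```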
I have a Google Cloud Function in Java.
Client will invoke the function using HTTP trigger URL.
But that is not secure. I have gone through some docs saying that you should pass a token or client ID and then verify it on the server side.
Can anyone explain that in detail and provide a code example if possible?
My question is how to authenticate the client when they invoke the function via the HTTP trigger.
This page explains quite well all the options you have to authenticate a requester on Cloud Functions.
If you have users, the best way is to use Firebase Auth (or Google Cloud Identity Platform, which is simply a more advanced solution than Firebase Auth with more features).
However, you need to grant every user the cloudfunctions.invoker role to allow them to invoke the Cloud Function, which can be difficult. You can also perform the check on your side, but in that case you remove Google's security (filter) layer and have to check all the traffic yourself (not really safe, in terms of billing and in case of attack).
The last solution, API keys, is not recommended, especially for users. But for machine-to-machine calls it's sometimes the only solution. However, there isn't an out-of-the-box solution, and for this I wrote an article that explains how to create a Cloud Endpoints service (or now an API Gateway, which is the serverless version of Cloud Endpoints with ESPv2) to accept API keys.
With this last solution, if you change your security definition, you can also accept OAuth2 tokens coming from Firebase Auth (or Cloud Identity Platform), but this time you don't need to grant all the users an IAM role on your Cloud Function. The token only needs to be valid, and it's the Cloud Endpoints service account that performs the call (and thus needs to be authorized on the Cloud Function).
In addition, because you can accept OAuth2 tokens, you can also accept non-Google tokens, and thus have your users in any OAuth2-compliant IdP (Keycloak, Okta, ...).
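If you do go the "check on your side" route with Firebase Auth, the verification inside the function is small. The question is about Java, where the firebase-admin SDK exposes the same verifyIdToken call, but here is a Python sketch just to illustrate the flow; the greeting and error handling are placeholders.

```python
import firebase_admin
from firebase_admin import auth

# Initialize once per instance; uses Application Default Credentials on GCP.
firebase_admin.initialize_app()


def secured_function(request):
    """HTTP Cloud Function that only serves callers with a valid Firebase ID token."""
    header = request.headers.get("Authorization", "")
    if not header.startswith("Bearer "):
        return "Missing bearer token", 401
    try:
        decoded = auth.verify_id_token(header.split(" ", 1)[1])
    except Exception:
        return "Invalid token", 401
    return f"Hello {decoded.get('email', decoded['uid'])}", 200
```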
You could use an external OAuth server like Keycloak (https://github.com/keycloak/keycloak), or use something like JSON Web Tokens -- https://jwt.io/ -- available for various languages and suitable for microservices.
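As a sketch of that second approach, verifying a JWT signed with a shared secret can be done with the PyJWT package; the secret handling and the claim read back are placeholders, and for an external server like Keycloak you would verify against its public key (RS256) instead.

```python
import os

import jwt  # PyJWT

# Placeholder: in practice load the shared secret (HS256) or the issuer's
# public key (RS256) from configuration, never hard-code it.
SECRET = os.environ.get("JWT_SHARED_SECRET", "change-me")


def handler(request):
    token = request.headers.get("Authorization", "").removeprefix("Bearer ")
    try:
        claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return "Unauthorized", 401
    return f"Hello {claims.get('sub')}", 200
```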
I am new to Google Cloud Functions and am trying to restrict access to my function to only requests from Dialogflow webhooks. I see two options in the gcloud console: allow unauthenticated requests, or restrict by user accounts. I don't understand how to implement that authentication. Dialogflow webhooks have an option to set HTTP headers that are sent with webhook requests, but the gcloud console has no interface/option to obtain any data that I could put in an HTTP authentication header. So I only see the option of implementing an authentication flow inside the Cloud Function, but in that case, why did Google add an option to restrict access by HTTP authentication? Can anyone give me a step-by-step example of how to obtain the HTTP header names and data needed to implement HTTP authentication on Cloud Functions for Dialogflow webhooks?
There isn't built-in authentication for this; you have to perform it yourself. You have some guidance here in the Google Cloud documentation.
In summary, set your function public (allow unauthenticated) and perform the check in your code.
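As a concrete illustration, one common pattern is to configure a custom header in the Dialogflow fulfillment settings and compare it against a secret inside the function. The header name, environment variable, and response shape below are all assumptions, not anything Dialogflow or Cloud Functions mandates.

```python
import hmac
import os

from flask import jsonify

# Placeholder: set the same value in the Dialogflow webhook "Headers" section
# and in the function's environment variables.
EXPECTED_SECRET = os.environ.get("DIALOGFLOW_SHARED_SECRET", "")


def dialogflow_webhook(request):
    supplied = request.headers.get("X-Dialogflow-Secret", "")
    if not EXPECTED_SECRET or not hmac.compare_digest(supplied, EXPECTED_SECRET):
        return "Forbidden", 403

    body = request.get_json(silent=True) or {}
    intent = body.get("queryResult", {}).get("intent", {}).get("displayName", "")
    return jsonify({"fulfillmentText": f"Handled intent: {intent}"})
```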