Policy to validate API Subscription Key received in Request Body from Google Ads Lead Form Extension using Webhook integration - azure-api-management

Azure API Management checks for the subscription key in either a header or a query parameter, but the Google Ads Lead Form extension sends the key in the request body as the google_key field.
Sample body:
{
  "lead_id": "TeSter-123-ABCDEFGHIJKLMNOPQRSTUVWXYZ-abcdefghijklmnopqrstuvwxyz-0123456789-AaBbCcDdEeFfGgHhIiJjKkLl",
  "api_version": "1.0",
  "form_id": 2,
  "campaign_id": 281492028602095,
  "google_key": "HERE IS THE KEY",
  "is_test": true,
  "gcl_id": "TeSter-123-ABCDEFGHIJKLMNOPQRSTUVWXYZ-abcdefghijklmnopqrstuvwxyz-0123456789-AaBbCcDdEeFfGgHhIiJjKkLl",
  "adgroup_id": 20000000000,
  "creative_id": 30000000000
}
How can we configure a custom policy in Azure API Management to validate the key in the request body?

Azure API Management's built-in subscription validation cannot be customized to read the key from anywhere other than the header or query string.
To still take advantage of it, I created two APIs in Azure API Management: the first has no security, the second is secured by a subscription key and is rate limited.
In the unsecured API, restructure the request by appending the key to a request header and removing it from the request body:
<inbound>
    <base />
    <set-header name="google_key" exists-action="append">
        <value>@{
            var reqBody = context.Request.Body.As<JObject>(preserveContent: true);
            if (reqBody.ContainsKey("google_key"))
            {
                return reqBody.GetValue("google_key").ToString();
            }
            else
            {
                return "";
            }
        }</value>
    </set-header>
    <set-body>@{
        var reqBody = context.Request.Body.As<JObject>(preserveContent: true);
        if (reqBody.ContainsKey("google_key"))
        {
            reqBody.Remove("google_key");
        }
        return JsonConvert.SerializeObject(reqBody);
    }</set-body>
Then send the restructured request to the secured API, which validates the subscription key.
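To check the rewrite end to end, you can replay a lead-form-shaped payload against the unsecured API and watch whether the secured API accepts it. This is only a rough sketch: the endpoint URL is a placeholder, not a value from this setup, and it assumes Node.js 18+ for the global fetch.

// Rough sketch: replay a lead-form-shaped payload against the unsecured API.
// The URL below is a placeholder for your own APIM endpoint.
const payload = {
  api_version: "1.0",
  form_id: 2,
  google_key: "HERE IS THE KEY", // the field the policy promotes to a header
  is_test: true
};

fetch("https://<your-apim-host>/lead-intake", { // placeholder URL
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload)
})
  .then(res => console.log("Status:", res.status))
  .catch(err => console.error(err));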

Related

How to Create a comment using WP Rest API v2?

I'm trying to create a new comment via the WP Rest APi v2 using a POST request to the following url:
https://www.turboweb.online/wp-json/wp/v2/comments?author_email=admin#admin.com&author_name=alex bhati&content=nice post&post=4002
But, this is the response I'm getting every time:
{
"code": "rest_forbidden_param",
"message": "Query parameter not permitted: author_email",
"data": {
"status": 401
}
}
The author_email field can only be set by an authenticated request; maybe this answer can help: WP Rest API not allowing to post anonymous comments through a Laravel request
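As a rough sketch of what an authenticated request could look like, the comment fields can go in the JSON body rather than in query parameters, with credentials (for example a WordPress application password) in a Basic Authorization header. The username, password, and email below are placeholders, and the snippet assumes Node.js 18+.

// Sketch: create the comment as an authenticated user (placeholder credentials).
const user = "admin"; // placeholder username
const appPassword = "xxxx xxxx xxxx xxxx"; // placeholder application password

fetch("https://www.turboweb.online/wp-json/wp/v2/comments", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Basic " + Buffer.from(user + ":" + appPassword).toString("base64")
  },
  body: JSON.stringify({
    post: 4002,
    author_name: "alex bhati",
    author_email: "admin@example.com", // placeholder email
    content: "nice post"
  })
})
  .then(res => res.json())
  .then(data => console.log(data))
  .catch(err => console.error(err));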

how to make a correct HTTP request to BigQuery from google script

I am working in Google Apps Script, trying to get the schema of a table from BigQuery... not sure why it is so troublesome.
I am sending a request like this :
let url = 'https://bigquery.googleapis.com/bigquery/v2/projects/'+ projectId +'/datasets/'+ datasetId +'/tables/' +tableId;
var response = UrlFetchApp.fetch(url)
I am getting this response:
Exception: Request failed for https://bigquery.googleapis.com returned code 401. Truncated server response: { "error": { "code": 401, "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie ... (use muteHttpExceptions option to examine full response) (line 68, file "bigQuery")
I have been able to load data into BigQuery alright... not sure why this does not work. I have looked at the OAuth fields in the manifest and the script does have access to BigQuery...
I also had no success when adding this to the options of the UrlFetchApp request:
var authHeader = 'Basic ' + Utilities.base64Encode(USERNAME + ':' + PASSWORD);
var options = {
  headers: {Authorization: authHeader}
};
Use bearer tokens
The BigQuery API rejects your requests because the endpoint requires the access token to carry one of the following scopes, and your request carries none:
https://www.googleapis.com/auth/bigquery
https://www.googleapis.com/auth/cloud-platform
https://www.googleapis.com/auth/bigquery.readonly
https://www.googleapis.com/auth/cloud-platform.read-only
The actual issue here is that the Basic authorization scheme carries no scope claims at all; it only sends credentials. And since you are calling the endpoint directly with the UrlFetch service, the scopes you declared in the manifest are not attached to the request automatically.
The ScriptApp service provides an easy way to get a valid bearer token without using an OAuth 2.0 library or building the flow from scratch: getOAuthToken. Pass it in an Authorization header as a bearer token and you should be all set:
const token = ScriptApp.getOAuthToken();
const options = {
  headers: {
    Authorization: `Bearer ${token}`
  }
};
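Wired into the original request from the question, that might look like this (a sketch reusing the projectId, datasetId, and tableId variables):

// Sketch: the same UrlFetchApp call as in the question, now sending the bearer token.
const url = 'https://bigquery.googleapis.com/bigquery/v2/projects/' + projectId
    + '/datasets/' + datasetId + '/tables/' + tableId;
const response = UrlFetchApp.fetch(url, options);
Logger.log(response.getContentText()); // the table resource, including its schema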
Use Advanced Service
As an alternative, there is an official advanced service that wraps the BigQuery REST API and manages authentication and response parsing for you.
You must enable the BigQuery advanced service before using it.
Also, note that the advanced service identifier is configurable, so reference the identifier you chose.
In your case, the service can be used as follows (assuming you kept the default BigQuery identifier). There is also a fourth argument, an object of optional parameters, which is not shown here:
BigQuery.Tables.get("projectId", "datasetId", "tableId");
The call above corresponds to the tables.get method of the BigQuery REST API.
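Since the goal in the question is the table's schema, the returned table resource can be inspected directly; a small sketch with the same placeholder IDs:

// Sketch: log each field of the table schema via the advanced service.
var table = BigQuery.Tables.get("projectId", "datasetId", "tableId");
table.schema.fields.forEach(function (field) {
  Logger.log(field.name + ': ' + field.type);
});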

Accessing user token in IBM Cloud Functions serverless app secured with OAuth user authentication

I am creating a serverless app using IBM Cloud Functions. My Cloud Functions API is secured with OAuth user authentication using an IBM Cloud App ID service. When a user logs into my app, an access token is generated by this service. I want to extract user data from that access token so that I can customize the user experience.
How do I access that token from within a Cloud Functions action that is coded for Node.js 10?
Example
The OpenWhisk web actions doc
https://github.com/apache/openwhisk/blob/master/docs/webactions.md
states that the following code
function main(params) {
  return { response: params };
}
generates the following response
{
  "response": {
    "__ow_method": "get",
    "__ow_headers": {
      "accept": "*/*",
      "connection": "close",
      "host": "172.17.0.1",
      "user-agent": "curl/7.43.0"
    },
    "__ow_path": ""
  }
}
From that data I should be able to get HTTP request details. Specifically, I should be able to get the Authorization header value off the "__ow_headers" property of the action argument (params).
However, the same code inside an IBM Cloud Functions web action generates nothing. Nothing exists on the params object.
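For reference, reading the token off params would look something like this (a sketch; note that OpenWhisk lower-cases incoming header names), if the headers were actually populated:

function main(params) {
  // Sketch: pull the Authorization header from the web action arguments.
  var headers = params.__ow_headers || {};
  var authHeader = headers.authorization; // e.g. "Bearer <access token>"
  return { authorization: authHeader };
}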

How to determine why an Azure Function App is not triggered by a webhook

I have:
A JavaScript Azure Function in an HTTP webhook configuration; the Function provides a URL; the Function performs an action
A webhook configured in the software I hope to receive notifications from
An Azure Logic App with an HTTP/webhook step that provides a URL for the webhook notification to go to
My goal is that the Azure Function's URL receives notifications from the software's webhook and performs an action. The Azure Logic App is for testing only.
What works
When the Azure Logic App's URL is used in the software's webhook configuration, the desired action is performed. All works as expected.
The Azure Logic App's logging shows the JSON output from the incoming webhook. I expect (but believe this may be where I am going wrong) that this is the JSON the webhook is sending to the Azure Logic App's URL. When this JSON is used in the Azure Function UI's "Test" tab > "Request body" field, the desired action is performed. All works as expected.
When the Azure Function's URL and the JSON is in a Postman request, the desired action is performed. All works as expected.
What doesn't work
When the Azure Function's URL is used in the software's webhook configuration, no action is performed. This, of course, is the scenario I need to work. From everything I have read, I understand that this URL should work as a webhook endpoint.
Azure Function's URL
This is from Get function URL > default (Function key).
https://<app_name>.azurewebsites.net/api/content?code=<api_key>
Other Azure Function config settings
Allowed HTTP methods: GET, POST
Authorization level: Function
The JSON I believe to be coming over the webhook
{
  "headers": {
    "Expect": "100-continue",
    "Host": "redacted",
    "X-Telligent-Webhook-Sender": "redacted",
    "Content-Length": "16908",
    "Content-Type": "application/json; charset=utf-8"
  },
  "body": {
    "events": [{
      "TypeId": "ec9da4f4-0703-4029-b01e-7ca9c9ed6c85",
      "DateOccurred": "2018-12-17T22:55:37.7846546Z",
      "EventData": {
        "ActorUserId": 9999,
        "ContentId": "redacted",
        "ContentTypeId": "redacted",
        "ForumReplyId": 9999,
        "ForumThreadId": 9999,
        "ForumId": 9999
      }
    }]
  }
}
I also tried the following test JSON, with the same results. It aligns more closely with the sample payload data provided by the software company:
What I tried
{
  "events": [{
    "TypeId": "ec9da4f4-0703-4029-b01e-7ca9c9ed6c85",
    "DateOccurred": "2018-12-17T22:55:37.7846546Z",
    "EventData": {
      "ActorUserId": 9999,
      "ContentId": "redacted",
      "ContentTypeId": "redacted",
      "ForumReplyId": 9999,
      "ForumThreadId": 9999,
      "ForumId": 9999
    }
  }]
}
Sample payload data
{
  "events": [
    {
      "TypeId": "407ad3bc-8269-493e-ac56-9127656527df",
      "DateOccurred": "2015-12-04T16:31:55.5383926Z",
      "EventData": {
        "ActorUserId": 2100,
        "ContentId": "4c792b81-6f09-4a45-be8c-476198ba47be"
      }
    },
    {
      "TypeId": "3b75c5b9-4705-4a97-93f5-a4941dc69bc9",
      "DateOccurred": "2015-12-04T16:48:03.7343926Z",
      "EventData": {
        "ActorUserId": 2100,
        "ContentId": "4c792b81-6f09-4a45-be8c-476198ba47be"
      }
    }
  ]
}
I do not know how to determine why the Azure Function is not triggered by the webhook. The software's API documentation does not seem to provide a way to look at the JSON being sent over the webhook, although in my inexperience I may be wrong.
Is there a mechanism within Azure, or Postman, or another tool that lets me see what JSON is being sent over the webhook? Or perhaps is there another approach to determining the cause of the issue?
Thank you for any help.
This is how I got the JSON file from Azure alerts.
Install Ruby on the server
Install Sinatra with the following command: gem install sinatra
Create a file webhook.rb and paste the code below
require 'sinatra'
require 'json'

set :port, 80
set :bind, '0.0.0.0'

post '/event' do
  status 204 # successful request with no body content
  request.body.rewind
  request_payload = JSON.parse(request.body.read)
  # append the payload to a file
  File.open("events.txt", "a") do |f|
    f.puts(request_payload)
  end
end
Run the web service with the command ruby webhook.rb
The JSON will be written to the file events.txt
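If standing up a separate server is not convenient, the Azure Function itself can capture whatever the webhook sends by logging the raw request before doing anything else. A minimal JavaScript sketch, assuming the classic context/req programming model used by HTTP-triggered Node.js functions:

// Sketch: log the incoming webhook request so it appears in the Function's logs.
module.exports = async function (context, req) {
  context.log('Headers: ' + JSON.stringify(req.headers));
  context.log('Body: ' + JSON.stringify(req.body));
  context.res = { status: 204 }; // acknowledge without a body
};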

"Insufficient Permission" when trying to authenticate to cloud-storage via apps-script

I am about to give up on this as I can't find out what I am doing wrong.
I have a Cloud Storage bucket with our company's billing data (JSON file objects written by Google) that I am supposed to process into spreadsheets.
As there is no built-in Apps Script service for OAuth2, I am using the OAuth2 library provided by Google (library key "1B7FSrk5Zi6L1rSxxTDgDEUsPzlukDsi4KGuTMorsTQHhGBzBkMun4iDF") and have set up the auth request as shown in this example for service accounts: https://github.com/googlesamples/apps-script-oauth2/blob/master/samples/GoogleServiceAccount.gs
The token is being created and put into the scripts property store, where I can view it. So far so good.
I have this code for requesting the token and then I am trying to list the contents of the bucket in the function "getFilesList()":
function getService() {
  return OAuth2.createService('CloudStoreGrab-Service')
    .setTokenUrl('https://accounts.google.com/o/oauth2/token')
    .setPrivateKey(creds_private_key)
    .setIssuer(creds_client_email)
    .setSubject(creds_user_email)
    .setPropertyStore(PropertiesService.getScriptProperties())
    .setScope(['https://www.googleapis.com/auth/drive', 'https://www.googleapis.com/auth/script.external_request', 'https://www.googleapis.com/auth/script.storage', 'https://www.googleapis.com/auth/spreadsheets']);
}

function getFilesList() {
  var service = getService();
  service.reset();
  if (service.hasAccess()) {
    var url = 'https://www.googleapis.com/storage/v1/b/' + bucket + '/o';
    var response = UrlFetchApp.fetch(url, {
      method: "GET",
      muteHttpExceptions: true,
      headers: {
        Authorization: 'Bearer ' + service.getAccessToken()
      }
    });
    Logger.log("Response: " + response.getContentText());
  }
}
No matter what I try, the request always returns "403 Insufficient Permission". The service account has all the necessary roles and permissions (DwD, Storage Administrator, Project Owner). When I authenticate with the same credentials from gcloud and then browse the bucket with gsutil, I can see the listing. This leads me to believe that I am still requesting the auth token incorrectly. I tried using the generated token with curl and got the same Insufficient Permission response.
What am I doing wrong while requesting the token?
Are the requested scopes too narrow?
Are the requested scopes too narrow?
That they are. The OAuth scopes for Google's Cloud Storage API are listed below (you won't need all of them; pick the ones best suited to your use case, the 1st and 5th scopes in the list should be sufficient):
https://www.googleapis.com/auth/cloud-platform
https://www.googleapis.com/auth/cloud-platform.read-only
https://www.googleapis.com/auth/devstorage.full_control
https://www.googleapis.com/auth/devstorage.read_only
https://www.googleapis.com/auth/devstorage.read_write
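Applied to the getService() builder from the question, the only change needed is the scope list; a sketch keeping read/write access as suggested above:

// Sketch: request Cloud Storage scopes instead of Drive/Sheets scopes.
function getService() {
  return OAuth2.createService('CloudStoreGrab-Service')
    .setTokenUrl('https://accounts.google.com/o/oauth2/token')
    .setPrivateKey(creds_private_key)
    .setIssuer(creds_client_email)
    .setSubject(creds_user_email)
    .setPropertyStore(PropertiesService.getScriptProperties())
    .setScope([
      'https://www.googleapis.com/auth/cloud-platform',
      'https://www.googleapis.com/auth/devstorage.read_write'
    ]);
}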
In future, you can find the required OAuth scopes for any Google API you need at the following link:
https://developers.google.com/identity/protocols/googlescopes