How to change an Apify actor parameter via API - google-apps-script

I want to call an Apify actor and specify a parameter value via a call to the Apify API.
The actor is the Google Search Results Scraper located here.
Here is where the docs say to use queries as the object property name in the API call payload.
The following table shows specification of the actor INPUT fields as defined by its input schema. These fields can be [...] provided in a JSON object when running the actor using the API. Read more in docs.
...
Search queries or URLs
Google Search queries (e.g. food in NYC) and/or full URLs (e.g. https://www.google.com/search?q=food+NYC).
Enter one item per line.
Optional
Type: String
JSON example
"queries": "Hotels in NYC
Restaurants in NYC
https://www.google.com/search?q=restaurants+in+NYC"
After I run my Google Apps Script code, I expect the searchQuery.term parameter to change to the following.
Apify — what I expect to see
"searchQuery": {
"term": "Banks in Phoenix", // what I am trying to change to by API call
// [...]
},
But what I actually get is the same parameter value as existed the last time I ran the actor manually. As follows.
Apify — what I actually see
"searchQuery": {
"term": "CPA firms in Newark", // remaining from last time I ran the actor manually
// [...]
},
Here is the code I'm running from Google Apps Script.
Code.gs
const runSearch = () => {
  const apiEndpoint = `https://api.apify.com/v2/actor-tasks/<MY-TASK-NAME>/run-sync?token=<MY-TOKEN>`;
  const formData = {
    method: 'post',
    queries: 'Banks in Phoenix',
  };
  const options = {
    body: formData,
    headers: {
      'Content-Type': 'application/json',
    },
  };
  UrlFetchApp.fetch(apiEndpoint, options);
};
What am I doing wrong?

You are missing the payload property in the request object. UrlFetchApp.fetch takes a single options object and ignores a body key, and the HTTP method belongs in that options object, not inside your data. Because of this, your actor input never reaches the API, and the task runs with the input saved from your last manual run. Send the JSON input as a stringified payload instead:
Code.gs
const runSearch = () => {
  const apiEndpoint = `https://api.apify.com/v2/actor-tasks/<MY-TASK-NAME>/run-sync?token=<MY-TOKEN>`;
  const options = {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify({
      queries: 'Banks in Phoenix',
    }),
  };
  UrlFetchApp.fetch(apiEndpoint, options);
};
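The key point is which key goes where; a tiny helper makes the split explicit (a sketch for illustration, not part of the Apify API — the function name is made up):

```javascript
// Hypothetical helper: builds a UrlFetchApp options object for a JSON
// POST. The actor input object becomes the stringified `payload`,
// while `method` and `contentType` live at the top level of options.
function buildApifyOptions(input) {
  return {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify(input),
  };
}

// Usage: UrlFetchApp.fetch(apiEndpoint, buildApifyOptions({ queries: 'Banks in Phoenix' }))
```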

Related

Google Apps Script - URL Fetch App returning random numbers

I am new to Apps Script and was trying to build an API and call that API through a different script. I created the web app and published it.
This is the URL:
https://script.google.com/macros/s/AKfycbxKVmGy3fxDfoHxyDtQh7psqj7IdKF7qHbgxLAwNRoiKTA-bpKN4QKtArzwsYdFb-Hb/exec
When I open this link, I can see the data correctly, but when I try to fetch this data from a different script using UrlFetchApp, it returns random numbers. I need help understanding what I am doing incorrectly.
Script which I am using to call this data:
function GetCopies() {
  var options = {
    'contentType': "application/json",
    'method': 'get',
  };
  var Data = UrlFetchApp.fetch('https://script.google.com/macros/s/AKfycbxKVmGy3fxDfoHxyDtQh7psqj7IdKF7qHbgxLAwNRoiKTA-bpKN4QKtArzwsYdFb-Hb/exec', options);
  Logger.log(Data.getContent());
}
This is the log I get:
I tried parsing it, but it throws an error:
How can I get data from URL correctly?
The "random numbers" are the raw response bytes: getContent() returns the body as a byte array, while getContentText() returns it as a string you can parse. A working sample:
Create two Google Apps Script projects, in my case API and fetcher.
API
const doGet = () => {
  const myObj = {
    "name": "Mr.GAS",
    "email": "mrgas@blabla.com"
  }
  return ContentService
    .createTextOutput(JSON.stringify(myObj))
    .setMimeType(
      ContentService.MimeType.JSON
    )
}
fetcher
const LINK = "API_LINK"
const fetchTheAPI = () => {
  const options = {
    'contentType': "application/json",
    'method': 'get',
  }
  const res = UrlFetchApp.fetch(LINK, options) // UrlFetchApp is synchronous
  const text = res.getContentText()
  console.log(JSON.parse(text))
}
Deploy the API: select type > Web app and Who has access > Anyone, then copy the URL (it is important to copy that URL, not the one the browser redirects to).
Replace "API_LINK" with that URL.
Run the function.
You only need to adapt this example to suit your needs.
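To see why the original code printed numbers, here is a plain-JavaScript illustration of the byte-array vs. text difference (no Apps Script needed; the string stands in for an HTTP response body):

```javascript
// getContent() returns the response as raw bytes, which log as numbers;
// getContentText() decodes the same bytes to a string you can JSON.parse.
const text = '{"name":"Mr.GAS"}';

// What getContent() resembles: one character code per byte.
const bytes = Array.from(text, (ch) => ch.charCodeAt(0));
console.log(bytes.slice(0, 3)); // [ 123, 34, 110 ] – the "random numbers"

// What getContentText() + JSON.parse gives you: usable data.
console.log(JSON.parse(text).name); // Mr.GAS
```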
Documentation:
Content Service
Web Apps

How to use Resmush.it API from Apps Script (multi files upload)

I'm trying to use the resmush.it API from Apps Script, and I'm struggling when making the request with UrlFetchApp.
From their documentation, I understand they need a files array in a multipart/form-data request.
This is what I'm doing so far (the images come from a Google Slides presentation):
function myFunction() {
  const OPTIMIZER_URL = "http://api.resmush.it/ws.php"
  let slides = SlidesApp.openById("1TUZSgG_XXX_ni6VhdbE5usRyMc");
  let pages = slides.getSlides();
  pages.forEach((page) => {
    let images = page.getImages();
    images.forEach(image => {
      let payload = {
        files: [{
          file: image.getBlob()
        }]
      }
      let options = {
        method: "POST",
        payload: payload,
        muteHttpExceptions: true
      }
      let response = UrlFetchApp.fetch(OPTIMIZER_URL, options)
      let jsonResponse = JSON.parse(response.getContentText())
      console.log(jsonResponse);
    })
  })
}
I also tried with payload = { file: image.getBlob() }, but with no success.
I get this error every time:
{ error: 400,
  error_long: 'No file or url provided',
  generator: 'reSmush.it rev.3.0.4.20210124' }
Can you see what is wrong?
Thank you
Although I'm not sure I have correctly understood the specification of the API from the documentation you provided, how about the following modified script?
Looking at your script, I'm worried that your image.getBlob() has no name. In that case, the data is not sent as a file, and I suspect this is the reason for your issue. Reflecting this guess in your script gives the following.
Modified script:
function myFunction() {
  const OPTIMIZER_URL = "http://api.resmush.it/ws.php"
  let slides = SlidesApp.openById("1TUZSgG_XXX_ni6VhdbE5usRyMc");
  let pages = slides.getSlides();
  pages.forEach((page) => {
    let images = page.getImages();
    images.forEach(image => {
      let options = {
        method: "POST",
        payload: { files: image.getBlob().setName("sample") },
        muteHttpExceptions: true
      }
      let response = UrlFetchApp.fetch(OPTIMIZER_URL, options)
      console.log(response.getContentText());
    })
  })
}
In this modification, the request body uses { files: image.getBlob().setName("sample") }. I'm not sure about the detailed specification of the API you want to use, so if the key files cannot be used, please change files to file and test again.
And if the API requires a unique name for each uploaded file per request, please set unique names by changing the "sample" in image.getBlob().setName("sample").
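For the unique-name case, a tiny helper could replace the fixed "sample" (entirely hypothetical; the naming scheme and the .png extension are assumptions, not anything the API requires):

```javascript
// Hypothetical helper: derives a unique blob name from the slide and
// image indices so repeated uploads in one run don't collide.
function uniqueImageName(pageIndex, imageIndex) {
  return 'slide' + pageIndex + '_image' + imageIndex + '.png';
}

// Usage inside the loops above:
// image.getBlob().setName(uniqueImageName(pageIndex, imageIndex))
```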
Reference:
fetch(url, params)

How to call the API with dynamic url parameters based on number of id's using angular

I have the API from the backend and I need to call the API with id's changing dynamically.
ex API:
http://13.567.544/api/meters/start?id=m1 (for id 'm1')
and for m2
http://13.567.544/api/meters/start?id=m2 (for id 'm2') and so on.
How can I call the above API with the ids (m1, m2, m4, ...) passed dynamically into the URL? I would like the TypeScript and service code for this, and I also want to check in the browser developer tools console whether it is working.
In Postman, I received the following.
{
  "status": true,
  "action": "m1 meter started"
}
and for 'm4'
{
  "status": true,
  "action": "m4 meter started"
}
the 'm' values will be changed according to the server.
.service.ts
meterstart(token) {
  let httpOptions = {
    headers: new HttpHeaders({
      'Content-Type': 'application/json',
      'Authorization': 'Token ' + token
    }),
    Params: new URLSearchParams({
      'id': this.meters.name;
    })
  };
  this.http.get(environment.apiUrl + '/api/meters/start' + this.id, httpOptions).subscribe(
    (data: any) => {
      console.log(data);
    }
.component.ts
jammerstop() {
  this.data = JSON.parse(localStorage.getItem("data"));
  console.log(data);
}
I would like the actual component and service code for calling this API, because I have not worked on this kind of scenario before.
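The core of the question is interpolating the id into the query string. A framework-free sketch of that part (plain JavaScript, not the Angular service itself; the function name is made up and the base URL is taken from the question):

```javascript
// Builds the meter-start URL for a given meter id. encodeURIComponent
// keeps the URL valid even if an id ever contains special characters.
const API_BASE = 'http://13.567.544/api/meters/start';

function meterStartUrl(id) {
  return API_BASE + '?id=' + encodeURIComponent(id);
}

// One URL per meter id:
const urls = ['m1', 'm2', 'm4'].map(meterStartUrl);
console.log(urls[0]); // http://13.567.544/api/meters/start?id=m1
```

In the Angular service, the same interpolation would feed `this.http.get(...)`, with one call per id.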

How to add media upload for BigQuery Rest API using UrlFetchApp?

I need to stream data into BigQuery from my Google Apps Script add-on.
But I need to use my service account only (I need to insert data into my own BigQuery table, not the user's BigQuery table).
I followed this example: https://developers.google.com/apps-script/advanced/bigquery#load_csv_data
Because the Apps Script advanced BigQuery service doesn't support service accounts natively, I need to change this example a bit:
Instead of using the advanced BigQuery service, I need to get an OAuth token from my service account, then use the BigQuery REST API to handle the same job.
This is what I did:
function getBigQueryService() {
  return (
    OAuth2.createService('BigQuery')
      // Set the endpoint URL.
      .setTokenUrl('https://accounts.google.com/o/oauth2/token')
      // Set the private key and issuer.
      .setPrivateKey(PRIVATE_KEY)
      .setIssuer(CLIENT_EMAIL)
      // Set the property store where authorized tokens should be persisted.
      .setPropertyStore(PropertiesService.getScriptProperties())
      // Caching
      .setCache(CacheService.getUserCache())
      // Locking
      .setLock(LockService.getUserLock())
      // Set the scopes.
      .setScope('https://www.googleapis.com/auth/bigquery')
  )
}
export const insertLog = (userId, type) => {
  const bigQueryService = getBigQueryService()
  if (!bigQueryService.hasAccess()) {
    console.error(bigQueryService.getLastError())
    return
  }
  const projectId = bigqueryCredentials.project_id
  const datasetId = 'usage'
  const tableId = 'logs'
  const row = {
    timestamp: new Date().toISOString(),
    userId,
    type,
  }
  const data = Utilities.newBlob(convertToNDJson(row), 'application/octet-stream')
  // Create the data upload job.
  const job = {
    configuration: {
      load: {
        destinationTable: {
          projectId,
          datasetId,
          tableId,
        },
        sourceFormat: 'NEWLINE_DELIMITED_JSON',
      },
    },
  }
  const url = `https://bigquery.googleapis.com/upload/bigquery/v2/projects/${projectId}/jobs`
  const headers = {
    Authorization: `Bearer ${bigQueryService.getAccessToken()}`,
    'Content-Type': 'application/json',
  }
  const options = {
    method: 'post',
    headers,
    payload: JSON.stringify(job),
  }
  try {
    const response = UrlFetchApp.fetch(url, options)
    const result = JSON.parse(response.getContentText())
    console.log(JSON.stringify(result, null, 2))
  } catch (err) {
    console.error(err)
  }
}
As you can see in my code, I get the Blob data (which is the actual json data that I need to put in BigQuery table) using this line:
const data = Utilities.newBlob(convertToNDJson(row), 'application/octet-stream')
But I don't know where to use this data with the BigQuery REST API.
The documentation doesn't mention it: https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/insert
How can this be done? Thank you.
I solved this problem using Tanaike's FetchApp library:
https://github.com/tanaikech/FetchApp#fetch
If anyone has this issue in the future, please check the comments in the code to understand what was done.
It turns out the job variable is treated as the metadata part, and the data variable is treated as the file part of the multipart form data object.
// First you need to convert the JSON to newline-delimited JSON,
// then turn the whole thing into a Blob using Utilities.newBlob.
const data = Utilities.newBlob(convertToNDJson(row), 'application/octet-stream')
// Create the data upload job.
const job = {
  configuration: {
    load: {
      destinationTable: {
        projectId,
        datasetId,
        tableId,
      },
      sourceFormat: 'NEWLINE_DELIMITED_JSON',
    },
  },
}
const url = `https://bigquery.googleapis.com/upload/bigquery/v2/projects/${projectId}/jobs?uploadType=multipart`
const headers = {
  Authorization: `Bearer ${bigQueryService.getAccessToken()}`,
}
const form = FetchApp.createFormData() // Create the multipart form data
form.append('metadata', Utilities.newBlob(JSON.stringify(job), 'application/json'))
form.append('file', data)
const options = {
  method: 'post',
  headers,
  muteHttpExceptions: true,
  body: form,
}
try {
  FetchApp.fetch(url, options)
} catch (err) {
  console.error(err)
}
Note: when you create the service account, choose the BigQuery Admin role, or any role that has the bigquery.jobs.create permission (https://cloud.google.com/bigquery/docs/access-control#bigquery-roles).
If you don't, you will get the error:
User does not have bigquery.jobs.create permission...
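The convertToNDJson helper referenced above isn't shown anywhere in the post; a minimal version might look like this (an assumption about its shape, not the author's actual code):

```javascript
// Hypothetical implementation of the convertToNDJson helper used above.
// BigQuery's NEWLINE_DELIMITED_JSON source format expects one complete
// JSON object per line, so serialize each row and join with newlines.
function convertToNDJson(rows) {
  const list = Array.isArray(rows) ? rows : [rows]; // accept one row or many
  return list.map((row) => JSON.stringify(row)).join('\n') + '\n';
}
```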

Calling third party JIRA REST API in Cloud Function for Dialogflow always timeout

We need to call the JIRA REST API to get specific information for a given query in Dialogflow.
We need to respond to the user based on the response from the API. However, Dialogflow is unable to retrieve any response from the JIRA API through the fulfillment in the Firebase Cloud Function, as it always times out.
Based on the log in the Firebase console, execution always takes more than 6000 ms.
Meanwhile, if I use Postman to call the JIRA REST API, it takes less than 1 second to get the response.
Some said we need to use a promise, but that does not seem to work either.
How should I solve this problem?
Please see my code below.
function checkcontract(agent) {
  var parameters = request.body.queryResult.parameters;
  var customer_id = parameters.customer_id;
  var bodyData = JSON.stringify({
    "jql": "project = CDB AND 'Customer ID' ~ " + customer_id,
    "maxResults": 1,
    "fieldsByKeys": false,
    "fields": [
      "summary",
      "customfield_11949", // Customer ID custom field
      "customfield_11937", // Contract Start Date
      "customfield_11938", // Contract End Date
      "customfield_11936", // email
      "customfield_11946", // default JSD request id
      "customfield_11943", // project id
      "customfield_11941"  // project key
    ],
    "startAt": 0
  });
  var options = {
    method: 'POST',
    url: '/rest/api/3/search',
    auth: { bearer: authorization_token },
    headers: {
      'Accept': 'application/json',
      'Content-Type': 'application/json'
    },
    body: bodyData
  };
  request(options, function (error, response, body) {
    if (error) throw new Error(error);
    console.log(
      'Response: ' + response.statusCode + ' ' + response.statusMessage
    );
    console.log(body);
  });
}
EDIT:
The JIRA API does return a response to the function, but agent.add("message") does not return anything to the chat.
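The symptom in the edit is typical when the intent handler returns before the callback fires: Dialogflow fulfillment needs the handler to return a Promise so it waits for the asynchronous call before replying. A sketch of that fix (the wrapper name is made up; requestFn stands for the callback-style request library used in the code above):

```javascript
// Wraps a Node-style callback API in a Promise so an intent handler
// can return it and the fulfillment library waits for the result.
function searchJira(requestFn, options) {
  return new Promise((resolve, reject) => {
    requestFn(options, (error, response, body) => {
      if (error) return reject(error);
      resolve(body);
    });
  });
}

// Inside checkcontract, return the promise and call agent.add in .then:
// return searchJira(request, options).then((body) => {
//   agent.add('Found: ' + JSON.parse(body).issues[0].fields.summary);
// });
```

Without the `return`, the function resolves immediately and agent.add runs after Dialogflow has already sent an empty response, which matches both the timeout and the silent-chat behavior described.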