How to pass data from file when calling gcloud function from CLI? - google-cloud-functions

I would like to pass data to my cloud function from a local JSON file by using the command gcloud functions call <MY-FUNCTION>.
I can successfully pass data following the official docs, but I would like to know if something like this is possible:
gcloud functions call <MY-FUNCTION> --data './path/to/my/file.json'
Copy-pasting the data works, but it gets inconvenient when the file contains hundreds of lines.
Is there a way to do such a thing? Perhaps a workaround using something other than the gcloud CLI?
Thanks!

You can do this:
gcloud beta functions call [[YOUR-FUNCTION]] \
--data="$(cat ./path/to/your/file.json)" \
--region=[[YOUR-REGION]] \
--project=[[YOUR-PROJECT]]
Alternatively, you can do this with curl:
curl \
--data @./path/to/your/file.json \
https://[[YOUR-REGION]]-[[YOUR-PROJECT]].cloudfunctions.net/[[YOUR-FUNCTION]]
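Note that curl sends --data with Content-Type: application/x-www-form-urlencoded by default, so if the function parses the request body as JSON you will likely need to set the header yourself. A minimal sketch, assuming an HTTP-triggered function:
curl \
-H "Content-Type: application/json" \
--data @./path/to/your/file.json \
https://[[YOUR-REGION]]-[[YOUR-PROJECT]].cloudfunctions.net/[[YOUR-FUNCTION]]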

Related

Is There A Better Way To Scrape Data Off An API Curl Request?

I’m trying to scrape some data off an API and export it as a JSON text file. I have about 10,000 separate requests to make; unfortunately the numbering is not sequential, and each request has a distinct number that I need to insert into the URL.
I’ve been doing them manually with a curl command in Terminal (macOS), and that seems to work fine, although it is somewhat time-consuming. An example is shown below…
Request 1
curl --compressed -o 182969088.txt 'https://example.com/example/example/182969088/example' \
-X 'GET' \
-H 'x-api-key: i74lIf1J3CFa49sCZYmizr4oMtUS0t2U49m7YRNeF'
Request 2
curl --compressed -o 182962045.txt 'https://example.com/example/example/182962045/example' \
-X 'GET' \
-H 'x-api-key: i74lIf1J3CFa49sCZYmizr4oMtUS0t2U49m7YRNeF'
Does anyone know of a better way? All 10,000 numbers are stored in an Excel sheet. I was hoping there would be a way to create a template and have the numbers filled in automatically, so I don't have to copy each number in twice by hand and then switch to the terminal; see the sketch below.
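One approach (a sketch, not from the original thread): export the numbers from Excel to a plain text file, one number per line, and loop over it in bash. The file name ids.txt is an assumption; the URL and API key are the ones from the question.
#!/bin/bash
# ids.txt: one request number per line, exported from the Excel sheet
while read -r id; do
  curl --compressed -o "${id}.txt" "https://example.com/example/example/${id}/example" \
    -X 'GET' \
    -H 'x-api-key: i74lIf1J3CFa49sCZYmizr4oMtUS0t2U49m7YRNeF'
done < ids.txt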

Is it possible to include multiple buckets as --trigger-resource for a google cloud function?

Is it possible to include two buckets as --trigger-resource for a gcloud function? I tried the following deployment, but it only seems to be listening to events occurring in bucket-2.
gcloud functions deploy my-function \
--entry-point functions.MyFunction \
--runtime java11 \
--memory 512MB \
--trigger-resource gs://bucket-1 \
--trigger-resource gs://bucket-2 \
--trigger-event google.storage.object.finalize \
--allow-unauthenticated \
--region=europe-west1
Would appreciate any sort of help.
A single deployed function can only trigger on changes to a single bucket at a time. If you want to trigger on multiple buckets, you can deploy the function once for each bucket. You will have to give each function a different name and --trigger-resource flag, but everything else can stay the same.
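As a sketch, the deploy-once-per-bucket approach could be scripted like this (the function naming scheme is an assumption):
#!/bin/bash
# Deploy one copy of the function per bucket; only the name and
# --trigger-resource differ between the two deployments
for bucket in bucket-1 bucket-2; do
  gcloud functions deploy "my-function-${bucket}" \
    --entry-point functions.MyFunction \
    --runtime java11 \
    --memory 512MB \
    --trigger-resource "gs://${bucket}" \
    --trigger-event google.storage.object.finalize \
    --allow-unauthenticated \
    --region=europe-west1
done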
Since the documentation doesn't mention any option for multiple trigger resources, I believe it is not possible.
If you don't want to create extra functions, you can route the bucket notifications to Pub/Sub and trigger your function on Pub/Sub events.
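A sketch of that approach, assuming a topic named my-topic: route OBJECT_FINALIZE notifications from both buckets to one Pub/Sub topic, then deploy the function once with a Pub/Sub trigger. Note that the function then receives a Pub/Sub message describing the object rather than a storage event, so the entry point has to change accordingly.
# Route finalize events from both buckets to a single Pub/Sub topic
gsutil notification create -t my-topic -f json -e OBJECT_FINALIZE gs://bucket-1
gsutil notification create -t my-topic -f json -e OBJECT_FINALIZE gs://bucket-2
# Deploy the function once, triggered by the topic
gcloud functions deploy my-function \
  --entry-point functions.MyFunction \
  --runtime java11 \
  --trigger-topic my-topic \
  --region=europe-west1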

Filtering out jenkins console output of a job

I'm quite new to Jenkins, and I would like to extract from the Jenkins console output only the JSON output of my Unix script, which is run via a Jenkins job.
To simplify the scenario: I have a Unix script, MyScript, that returns JSON output. A Jenkins job wraps the MyScript execution in an "Execute shell" build step.
When I run the Jenkins job, MyScript is executed and the Jenkins console output shows the following:
Started by remote host ...
Building remotely on ... in workspace ...
Set build name.
New build name is '#11-/products/software/myScript.py'
[ScriptWrapper] $ /bin/sh -xe /tmp/hudson9139846468482145951.sh
+ /products/software/myScript.py -t ...
{'ip': '...', 'host': '...'}
Set build name.
New build name is '#11-/products/software/myScript.py'
Variable with name 'BUILD_DISPLAY_NAME' already exists, ...
Finished: SUCCESS
From the above output I would like to extract only the JSON output of my Unix script, i.e. "{'ip': '...', 'host': '...'}".
This is needed because we call the Jenkins job via the REST API and need to get back only the JSON output of the invoked Unix script:
curl -s -k -u ... --request GET "https://<jenkins uri>/jenkins/view/ScriptWrapper/job/ScriptWrapper/19/consoleText"
We tried defining a parsing-rules file, but that only lets us highlight some lines of the console output in the "Parsed Console Output" Jenkins view.
In addition, this "Parsed Console Output" does not seem to be accessible via the REST API:
curl -s -k -u ... --request GET "https://<jenkins uri>/jenkins/view/ScriptWrapper/job/ScriptWrapper/19/parsed_console"
-> it doesn't work
Is there any way to filter the Jenkins console output so that only the script's JSON comes back?
We are also evaluating the Jenkins Groovy Postbuild Plugin. Do you think it can help?
I thank you in advance for any suggestion.
If I understand the question correctly, you wish to generate clean output containing only the text you want?
If so, then I'd suggest you modify your shell script to output the desired text to a file, and then use either the "archive artifact" function in Jenkins to make the file content available, or the "html publisher" plugin to "publish" that file.
https://wiki.jenkins-ci.org/display/JENKINS/HTML+Publisher+Plugin
A third option could be to modify your shell script to output "magic cookies" as delimiters around the string you want.
That way you can fetch the entire console output using the REST API, and then easily filter out the text you want using a simple regex.
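A rough sketch of the delimiter idea (the marker strings are made up). In the "Execute shell" build step:
# set +x stops the shell trace so the markers appear only once in the log
set +x
echo "===JSON_BEGIN==="
/products/software/myScript.py -t ...
echo "===JSON_END==="
Then slice the JSON out of consoleText:
curl -s -k -u ... "https://<jenkins uri>/jenkins/view/ScriptWrapper/job/ScriptWrapper/19/consoleText" \
  | sed -n '/===JSON_BEGIN===/,/===JSON_END===/p' | sed '1d;$d'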

curl command unable to pass bash parameter

So I am new to curl and am trying to write a bash script that downloads a file. I start off by authenticating, then make a POST to request a download. The response contains a Foo_ID, which I parse with bash and store in a variable. I then try to GET the Foo data via a download URL. The issue I am having is that whenever I pass in the parameter I parsed from the POST response, I get nothing. Here is an example of what I am doing.
#!/bin/bash
curl -b cookies -c cookies -X POST -d @login_info "https://foo.bar.com/auth"
curl -b cookies -c cookies -X POST -d @Foo_info "https://foo.bar.com/foos" > ./tmp/stuff.t
myFooID=`cat ./tmp/stuff.t | grep -Po '"foo_id":.*?",' | cut -d '"' -f 4`
curl -b cookies -c cookies "http://foo.bar.com/foo-download?id=${myFooID}" > ./myFoos/Foo1.data
I have echoed myFooID to make sure it is correct, and it is. I have also echoed "https://foo.bar.com/foo-download?id=${myFooID}" and it shows exactly the URL I need. Could anyone help me with this? As I said, I am new to curl and a little rusty with bash.
So I have solved the issue. The problem was that, after the POST that creates the Foo, I didn't give the server enough time to create it before trying to download it. I added a sleep command between the last two curl commands and now it works perfectly. I would like to thank Dennis Williamson for helping me clean up my code, which led me to understanding my issue. I have created a shrine for him on my desk.
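For reference, a retry loop is usually more robust than a fixed sleep (a sketch; -f makes curl exit non-zero on HTTP errors, so the loop retries until the resource exists):
# Poll until the download succeeds instead of sleeping a fixed time
for attempt in 1 2 3 4 5; do
  curl -f -b cookies -c cookies \
    "http://foo.bar.com/foo-download?id=${myFooID}" > ./myFoos/Foo1.data && break
  sleep 5
done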

Send json HTTP post with bash

I would like to send some data to a web service that accepts only a JSON object. I will use curl to make the HTTP POST, but I am wondering if there is a library to create a JSON object in bash.
Another requirement is to avoid installing extra packages (rpm/apt-get); only other bash files may be used as libraries.
Here is an example taken from the BigQuery API:
curl -H "Authorization: GoogleLogin auth=<<YOUR_TOKEN>>" \
-X POST \
-H "Content-type: application/json" \
-d '{"params":{"q":"select count(*) from [bigquery/samples/shakespeare];"},"method":"bigquery.query"}' \
'https://www.googleapis.com/rpc'
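If the payload is simple, a library may not even be needed: printf can assemble the JSON string, though it performs no JSON escaping, so the interpolated values must not contain double quotes or backslashes. A sketch using the BigQuery example above:
#!/bin/bash
QUERY='select count(*) from [bigquery/samples/shakespeare];'
# printf does no escaping, so QUERY must not contain " or \
BODY=$(printf '{"params":{"q":"%s"},"method":"bigquery.query"}' "$QUERY")
curl -H "Authorization: GoogleLogin auth=<<YOUR_TOKEN>>" \
  -X POST \
  -H "Content-type: application/json" \
  -d "$BODY" \
  'https://www.googleapis.com/rpc'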
Check out TickTick.
It's a true Bash JSON parser.
A stringifier should be in the works shortly (though it wouldn't be difficult to create your own using a bash for loop).
#!/bin/bash
. /path/to/ticktick.sh
# File
DATA=`cat data.json`
# cURL
#DATA=`curl http://foobar3000.com/echo/request.json`
tickParse "$DATA"
echo ``pathname``
echo ``headers["user-agent"]``
I recommend Jshon.
http://kmkeen.com/jshon/
It is designed to be as usable as possible from within the shell, and replaces fragile ad hoc parsers made from grep/sed/awk as well as heavyweight one-line parsers made from Perl/Python.
It requires Jansson, but you can build a statically linked version.
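A minimal usage sketch (the sample document is made up; -e extracts a key, -u unquotes the result):
echo '{"ip": "10.0.0.1", "host": "example"}' | jshon -e ip -u
# prints: 10.0.0.1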