I'm trying to take a JSON object from a REST API GET call, then add the contents of a raw file (a .md file in this case) into it using jq, before updating the same object with a PUT call.
I'm using the following to GET the file and write it locally:
curl -u USERNAME:PASSWORD 'https://example.com/myfile.json' | cat > ./test.json
The format of the JSON file is as follows:
{
"content" : "My JSON content",
"owner":...
}
I'd like to add the content of the raw file into content, so the result is as follows:
{
"content" : "My markdown file contents.\n\nMy JSON content",
"owner":...
}
and then update the original JSON using a PUT call:
curl -u USERNAME:PASSWORD -d @./test.json -H "Content-Type: application/json" -X PUT 'https://example.com/myfile.json'
Is it possible to add the file content into my JSON like that using jq, and if so, how?
The key to a simple jq solution here is to read the raw text file using 'jq -R -s', and to read the JSON using one of the options in the --arg family. Here, I'll use --argfile for simplicity and robustness across jq versions, but please note that the documentation says it is deprecated.
With the following jq program in a file, say program.jq:
. as $file
| $json
| (.content = $file + "\n" + .content)
and the following text in the file contents.txt:
Line 1 of contents.txt;
line 2.
and the following JSON in curl.json:
{
"content": "My JSON content",
"owner": "etc"
}
the invocation:
jq -R -s --argfile json curl.json -f program.jq contents.txt
produces:
{
"content": "Line 1 of contents.txt;\nline 2.\n\nMy JSON content",
"owner": "etc"
}
If using bash, instead of putting the curl output into a file, you could use: --argfile json <(curl ....)
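Putting it all together in bash, the whole round trip might look like this (a minimal sketch; notes.md is an assumed name for the markdown file, and USERNAME:PASSWORD are the placeholders from the question):
# GET the current JSON object
curl -s -u USERNAME:PASSWORD 'https://example.com/myfile.json' > curl.json
# prepend the raw markdown file to .content
jq -R -s --argfile json curl.json \
  '. as $file | $json | .content = $file + "\n" + .content' notes.md > test.json
# PUT the updated object back
curl -u USERNAME:PASSWORD -d @./test.json -H "Content-Type: application/json" -X PUT 'https://example.com/myfile.json'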
While there are several posts about this topic on Stack Overflow, none match my exact use case. I am using a Linux shell script to run SnowSQL to generate a JSON file.
My JSON file needs to have a comma between the JSON objects.
This:
{
"CAMPAIGN": "Welcome_New",
"UUID": "fe881781-bdc2-41b2-95f2-e0e8c19dc597"
}
{
"CAMPAIGN": "Welcome_Existing",
"UUID": "77a41c02-beb9-48bf-ada4-b2074c1a78cb"
}
...needs to look like this:
{
"CAMPAIGN": "Welcome_New",
"UUID": "fe881781-bdc2-41b2-95f2-e0e8c19dc597"
},
{
"CAMPAIGN": "Welcome_Existing",
"UUID": "77a41c02-beb9-48bf-ada4-b2074c1a78cb"
}
Here is my complete ksh script:
#!/usr/bin/ksh
. /appl/.snf_logon
export SNOW_PKEY_FILE=$(mktemp ./pkey-XXXXXX)
trap "rm -f ${SNOW_PKEY_FILE}" EXIT
LibGetSnowCred
{
outFile=JSON_FILE_TYPE_TEST.json
inDir=/testing
outFileNm=@my_db.my_schema.my_file_stage/${outFile}
snowsql \
--private-key-path $SNOW_PKEY_FILE \
-o exit_on_error=true \
-o friendly=false \
-o timing=false \
-o log_level=ERROR \
-o echo=true <<!
COPY INTO ${outFileNm}
FROM (SELECT object_construct(
'UUID',UUID
,'CAMPAIGN',CAMPAIGN)
FROM my_db.my_schema.JSON_Test_Table
LIMIT 2)
FILE_FORMAT=(
TYPE=JSON
COMPRESSION=NONE
)
OVERWRITE=True
HEADER=False
SINGLE=True
MAX_FILE_SIZE=4900000000
;
get ${outFileNm} file://${inDir}/;
rm ${outFileNm};
!
if [ $? -eq 0 ]; then
echo "Export successful"
else
echo "ERROR in export"
fi
}
Is it best practice to add the comma during the SELECT, or after the file is generated, and how?
With or without that comma, the text is still not JSON, just text that looks like JSON. You are exporting several rows, each row as an independent object. You need to gather all these objects into an array to produce valid JSON.
A JSON document that encodes an array of rows looks like this:
[
{
"CAMPAIGN": "Welcome_New",
"UUID": "fe881781-bdc2-41b2-95f2-e0e8c19dc597"
},
{
"CAMPAIGN": "Welcome_Existing",
"UUID": "77a41c02-beb9-48bf-ada4-b2074c1a78cb"
}
]
The easiest way to produce this output would be to ask the database to wrap all the records into an array before generating the JSON, rather than exporting each record as a separate JSON document, if it supports that option.
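In Snowflake, for instance, ARRAY_AGG can gather the rows before export; here is a hedged sketch of a modified SELECT for the snowsql heredoc above (untested against your schema, and note the aggregated array is subject to Snowflake's variant size limit):
COPY INTO ${outFileNm}
FROM (SELECT array_agg(object_construct(  -- wrap all rows in one JSON array
        'UUID',UUID
       ,'CAMPAIGN',CAMPAIGN))
      FROM my_db.my_schema.JSON_Test_Table)
FILE_FORMAT=(TYPE=JSON COMPRESSION=NONE)
OVERWRITE=True
SINGLE=True
;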
If this is not possible, then you have a file that contains multiple JSON documents. You can use jq to combine these individual documents into a single JSON array like the one described above.
It is as simple as this:
jq --slurp '.' input_file > output_file
The option --slurp tells jq to read all the JSON documents from input_file into memory, parse them, and put them into an array. That array is the program input.
'.' is the jq program. It means "output the current value" and does not do any processing of the input data; here, the current value is the array.
After it executes the program (which, in this case, doesn't change anything), jq dumps the resulting value (as JSON, of course) to the standard output (by default, the screen).
The > output_file part redirects this output to a file (named output_file) instead of showing it on screen.
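For a quick local check, you can feed the two sample objects from the question straight into jq with a here-document:
# slurp two independent objects into one array
jq --slurp '.' <<'EOF'
{
  "CAMPAIGN": "Welcome_New",
  "UUID": "fe881781-bdc2-41b2-95f2-e0e8c19dc597"
}
{
  "CAMPAIGN": "Welcome_Existing",
  "UUID": "77a41c02-beb9-48bf-ada4-b2074c1a78cb"
}
EOF
This prints the array shown above.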
You can see how it works on the jq playground.
I need to grab variables from JSON properties.
The JSON array, which I obtain from a curl request (GitHub API for repository tags), looks like this:
[
{
"name": "my-tag-name",
"zipball_url": "https://api.github.com/repos/path-to-my-tag-name",
"tarball_url": "https://api.github.com/repos/path-to-my-tag-name-tarball",
"commit": {
"sha": "commit-sha",
"url": "https://api.github.com/repos/path-to-my-commit-sha"
},
"node_id": "node-id"
},
{
"name": "another-tag-name",
"zipball_url": "https://api.github.com/repos/path-to-my-tag-name",
"tarball_url": "https://api.github.com/repos/path-to-my-tag-name-tarball",
"commit": {
"sha": "commit-sha",
"url": "https://api.github.com/repos/path-to-my-commit-sha"
},
"node_id": "node-id"
}
]
In my actual JSON there are hundreds of objects like these.
As I loop over each one of these, I need to grab the name and the commit URL, then perform more operations with these two variables before I get to the next object and repeat.
I tried (with and without -r)
tags=$(curl -s -u "${GITHUB_USERNAME}:${GITHUB_TOKEN}" -H "Accept: application/vnd.github.v3+json" "https://api.github.com/repos/path-to-my-repository/tags?per_page=100&page=${page}")
for row in $(jq -r '.[]' <<< "$tags"); do
tag=$(jq -r '.name' <<< "$row")
# I have also tried with the syntax:
url=$(echo "${row}" | jq -r '.commit.url')
# do stuff with $tag and $url...
done
But I get errors like:
parse error: Unfinished JSON term at EOF at line 2, column 0
jq: error (at <stdin>:1): Cannot index string with string "name"
}
parse error: Unmatched '}' at line 1, column 1
And from the terminal output it appears that it is trying to parse $row in a strange way, trying to grab .name from every substring? Not sure.
I am assuming the output from $(jq '.[]' <<< "$tags") could be valid JSON, from which I could again use jq to grab the object properties I need, but maybe that is not the case? If I output ${row} it does look like valid JSON to me, and I tried pasting the results into a JSON validator; everything seems to check out...
How do I grab the .name and .commit.url for each of these objects before I move on to the next one?
Thanks
It would be better to avoid calling jq more than once. Consider, for example:
while read -r name ; do
  read -r url
  echo "$name" "$url"
done < <( curl .... | jq -r '.[] | .name, .commit.url' )
where curl .... signifies the relevant invocation of curl.
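If you would rather get both values in a single read per element, a variation on the same idea (a sketch using jq's @tsv filter; the fields must not themselves contain tabs) is:
# one tab-separated line per element: name<TAB>commit-url
while IFS=$'\t' read -r tag url ; do
  echo "tag=$tag url=$url"   # do stuff with $tag and $url...
done < <( curl .... | jq -r '.[] | [.name, .commit.url] | @tsv' )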
I'm using the Conversocial API:
https://api-docs.conversocial.com/1.1/reports/
Using the sample from the documentation, after all tweaks I receive this output:
{
"report": {
"name": "dump", "generation_start_date": "2012-05-30T17:09:40",
"url": "https://api.conversocial.com/v1.1/reports/5067",
"date_from": "2012-05-21",
"generated_by": {
"url": "https://api.conversocial.com/v1.1/moderators/11599",
"id": "11599"
},
"generated_date": "2012-05-30T17:09:41",
"channel": {
"url": "https://api.conversocial.com/v1.1/channels/387",
"id": "387"
},
"date_to": "2012-05-28",
"download": "https://s3.amazonaws.com/conversocial/reports/70c68360-1234/#twitter-from-may-21-2012-to-may-28-2012.zip",
"id": "5067"
}
}
Currently, I can filter this JSON output down to download only, and receive this output:
{
"report" : {
"download" : "https://s3.amazonaws.com/conversocial/reports/70c68360-1234/#twitter-from-may-21-2012-to-may-28-2012.zip"
}
}
Is there any way of automating this process using curl, to make curl download this file?
To download, I'm planning to use a simple approach:
curl URL_LINK > FILEPATH/EXAMPLE.ZIP
Is there a way to replace URL_LINK with the download link, or any other way around this?
Give this a try:
curl $(curl -s https://httpbin.org/get | jq ".url" -r) > file
Just replace the URL and the jq filter based on your JSON; in your case that would be:
jq ".report.download" -r
The -r will remove the double quotes.
The way it works is by using a command substitution $():
$(curl -s https://httpbin.org/get | jq ".url" -r)
This fetches your URL and extracts the download URL from the returned JSON using jq; the latter is then passed to curl as an argument.
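Applied to the report JSON above, the whole thing might look like this (a sketch; the report URL comes from the sample output, and any authentication flags the Conversocial API requires are omitted here):
# extract .report.download from the API response and hand it to a second curl
curl $(curl -s 'https://api.conversocial.com/v1.1/reports/5067' | jq ".report.download" -r) > FILEPATH/EXAMPLE.ZIP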
I am writing a shell script to run some APIs. It returns the response fine, but I need to grep some specific parameter from the response and save it to a file.
My script looks like:
#!/bin/sh
response=$(curl 'https://example.com' -H 'Content-Type: application/json' )
echo "$response"
The response is something like:
{
status:"success",
response:{
"target":"",
"content":"test content"
}
}
The response is fine and I am able to write the whole response to a file, but my requirement is to save only "content" inside the "response" object using the script, which I need for another API.
Note: I cannot change the API responses, as I am working with third-party APIs.
Thank you
If the output is proper JSON:
$ cat proper.json
{
"status": "success",
"response": {
"target": "",
"content": "test content"
}
}
$ response=$(cat proper.json)
You could use jq:
$ echo "$response" | jq -r '.response.content'
test content
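To keep just that value for the next API call (a minimal sketch; content.txt is an assumed output file name):
# save only .response.content, both in a variable and in a file
content=$(echo "$response" | jq -r '.response.content')
printf '%s\n' "$content" > content.txt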
You can grep for the content and then use awk to split by : and take only the value, not the key:
echo "$response" | grep "\"content\":" | awk -F":" '{ print $2 }'
Will print "test content"
You can do this to get the value of contents into a variable ($content).
content=$(echo "$response" | cut -d'"' -f 7)
Explanation: split $response using " (double quote) as the delimiter and take the 7th field of the output, i.e. the value (test content) of content in the JSON response.
Here is an excerpt from the description and usage of the cut command:
If you'd like to extract a whole field, you can combine the options -f and -d. The option -f specifies which field you want to extract, and the option -d specifies the field delimiter that is used in the input file.
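For example:
$ echo "a:b:c" | cut -d":" -f 2
b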
I have a JSON and I need to extract a base64-encoded value by particular key and decode it.
JSON has the following structure:
[
{
"LockIndex": 0,
"Key": "Arul/key1",
"Flags": 0,
"Value": "MzAKCg==",
"CreateIndex": 369,
"ModifyIndex": 554
}
]
In the above JSON, I need to extract only "Value":"MzAKCg==" and decode the base64-encoded "MzAKCg==" value. I would like to perform this using shell scripting.
Please assist.
jq has recently added support for base64 encoding and decoding
https://stedolan.github.io/jq/manual/#Formatstringsandescaping
@base64:
The input is converted to base64 as specified by RFC 4648.
@base64d:
The inverse of @base64; input is decoded as specified by RFC 4648.
Note: If the decoded string is not UTF-8, the results are undefined.
For your data, the command would be
jq -r 'map(.Value | @base64d)' < file.json
https://github.com/stedolan/jq/issues/47
It's not released yet, but you can install the latest development version to use it.
brew reinstall --HEAD jq
Once the next version of jq is released, you can switch back to the latest stable version.
Using jq and base64:
jq -r '.[].Value' < file.json | base64 --decode
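To capture the decoded value in a shell variable for later use (a sketch):
# decode the first element's Value into a variable
value=$(jq -r '.[0].Value' < file.json | base64 --decode)
echo "$value"   # prints 30 (trailing newlines are stripped by $(...))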
If this JSON will always have the same structure, you can use cut -d and later decode the value, for example:
$ echo "Value": "MzAKCg==" | cut -d ":" -f 2 | base64 -D
30
(Note: -D is the macOS decode flag; on GNU/Linux use base64 -d.)
This worked for me: curl --silent localhost:8500/v1/kv/key1 | jq -r '.[0].Value | @base64d'
Use the built-in jq base64 decoder:
echo '{"val" : "MzAKCg=="}' | jq -r '.val |= @base64d'
yields:
{ "val": "30\n\n" }
Side note: here, jq is simply a symbolic link to jq-win64.exe.
If you want to keep the original structure, you could use something like jq -r '. |= map(.Value |= @base64d)'.
With xidel:
xidel -s input.json -e '$json()/binary-to-string(base64Binary(Value))'
or
xidel -s input.json -e 'binary-to-string(base64Binary($json//Value))'