Get specific string line from file bash - json

I have a file containing text with this kind of pattern:
[{"foo":"bar:baz:foo*","bar*":"baz*","etc":"etc"},
{"foo2":"bar2:baz2:foo2*","bar2*":"baz2*","etc":"etc"},
{"foo3":"bar3:baz3:foo3*","bar3*":"baz3*","etc":"etc"},
{"foo4":"bar4:baz4:foo4*","bar4*":"baz4*","etc":"etc"}]
I need to take every string like this
{"foo":"bar:baz:foo*","bar*":"baz*","etc":"etc"} and send each of them to some url via curl
for i in text.txt
do (awk,sed,grep etc)
then curl $string
I can't figure out how to properly extract the desired strings from the file without the unnecessary surrounding symbols.

I suggest using jq to process your JSON file. jq can read JSON and format its output. Here's an example jq script that processes your JSON file (which I unimaginatively call 'jsonfile'):
jq -r '.[] | "curl -d '\'' \(.) '\'' http://restful.com/api " ' jsonfile
Here's the output:
curl -d ' {"foo":"bar:baz:foo*","bar*":"baz*","etc":"etc"} ' http://restful.com/api
curl -d ' {"foo2":"bar2:baz2:foo2*","bar2*":"baz2*","etc":"etc"} ' http://restful.com/api
curl -d ' {"foo3":"bar3:baz3:foo3*","bar3*":"baz3*","etc":"etc"} ' http://restful.com/api
curl -d ' {"foo4":"bar4:baz4:foo4*","bar4*":"baz4*","etc":"etc"} ' http://restful.com/api
Here's what's going on:
We pass three arguments to the jq program: jq -r <script> <inputfile>.
The -r tells jq to output the results in raw format (that is, please don't escape quotes and stuff).
The script looks like this:
.[] | "some string \(.)"
The leading . means take the whole JSON structure, and the [] iterates through each array element in that structure. The | pipes each element into the next filter, which outputs a string. We are using \(.) to interpolate the whole element passed through the pipe.
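As a tiny illustration of that iteration and interpolation, on a toy array (not the question's file):
$ echo '["a","b"]' | jq -r '.[] | "item: \(.)"'
item: a
item: b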
Wow... I've never really explained a jq script before (and it shows). But the crux of it is, we are using jq to find each element in the json array and insert it into a string. Our string is this:
curl -d '<the json dictionary array element>' http://restful.com/api
Ok. And you see the output. It works. But wait a second, we only have output. Let's tell the shell to run each line like this:
jq -r '.[] | "curl -d '\'' \(.) '\'' http://restful.com/api " ' jsonfile | bash
By piping the output to bash, we execute each line that we output. Essentially, we are writing a bash script with jq to curl http://restful.com/api passing the json element as the -d data parameter to POST the json element.
Revisiting for single quote issue
@oguz ismail pointed out that bash will choke if there is a single quote in the JSON input file. This is true. We can work around the quote by escaping it, but that adds complexity, making this a non-ideal approach.
Here's the problem input (I just inserted a single quote):
[{"foo":"bar:'baz:foo*","bar*":"baz*","etc":"etc"},
{"foo2":"bar2:baz2:foo2*","bar2*":"baz2*","etc":"etc"},
{"foo3":"bar3:baz3:foo3*","bar3*":"baz3*","etc":"etc"},
{"foo4":"bar4:baz4:foo4*","bar4*":"baz4*","etc":"etc"}]
Notice above that baz is now 'baz. The problem is that a lone single quote makes the bash shell complain about unmatched quotes:
$ jq -r '.[] | "curl -d '\'' \(.) '\'' http://restful.com/api " ' jsonfile | bash
bash: line 4: unexpected EOF while looking for matching `"'
bash: line 5: syntax error: unexpected end of file
Here's the solution:
$ jq -r $'.[] | "\(.)" | gsub( "\'" ; "\\\\\'" ) | "echo $\'\(.)\'" ' jsonfile | bash
{"foo":"bar'baz:foo*","bar*":"baz*","etc":"etc"}
{"foo2":"bar2:baz2:foo2*","bar2*":"baz2*","etc":"etc"}
{"foo3":"bar3:baz3:foo3*","bar3*":"baz3*","etc":"etc"}
{"foo4":"bar4:baz4:foo4*","bar4*":"baz4*","etc":"etc"}
Above I am using $'...' to quote the jq script. This allows me to escape single quotes as \'. I've also changed the curl command to echo so I can test the bash script without bothering the folks at http://restful.com/api.
The 'trick' is to make sure that the bash script we generate also escapes all single quotes with a backslash. So we have to change ' to \'. That's what gsub is doing:
gsub( "\'" ; "\\\\\'" )
After making that substitution ( ' --> \' ) we pipe the entire string to this:
"echo $\'\(.)\'"
which surrounds the output of gsub with echo $'...'. Now we are using $'...' again so the \' is properly understood by bash.
So we wind up with this when we put the curl back in:
jq -r $'.[] | "\(.)" | gsub( "\'" ; "\\\\\'" ) | "curl -d $\'\(.)\' http://restful.com/api " ' jsonfile | bash

Use the jq command. This is just an example of the parsing:
for k in $(jq -c '.[]' a.txt); do
  echo "hello-" $k
done
Output:
hello- {"foo":"bar:baz:foo*","bar*":"baz*","etc":"etc"}
hello- {"foo2":"bar2:baz2:foo2*","bar2*":"baz2*","etc":"etc"}
hello- {"foo3":"bar3:baz3:foo3*","bar3*":"baz3*","etc":"etc"}
hello- {"foo4":"bar4:baz4:foo4*","bar4*":"baz4*","etc":"etc"}
You can use $k anywhere inside the loop you want:
for k in $(jq -c '.[]' a.txt); do
  curl -d "$k" <url>
done
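If the objects could ever contain whitespace, the for loop above would split them apart; piping the same jq -c output into a while read loop is a bit more robust (a sketch, with <url> again standing in for your endpoint):
jq -c '.[]' a.txt | while IFS= read -r k; do
  curl -d "$k" <url>
done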


/bin/jq: Argument list too long

I'm using a Groovy script and wrote this line:
Version_List=$(jq -r '.items[] .version' "${projectContent}")
${projectContent} is another variable which is the result of a curl command.
When I run the pipeline I get this error: /bin/jq: Argument list too long
Use jq -Rs . <<< "${projectContent}" to convert the variable content into a JSON string (you may even use curl directly, like so: curl … | jq -Rs), then import that string into a jq variable using --slurpfile with process substitution:
jq --slurpfile projectContent <(
  jq -Rs . <<< "${projectContent}"
) '...' # your jq code here using variable $projectContent
Don't forget to also add either -n or an input file.
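Pieced together for the original .items[].version extraction, it might look roughly like this (a sketch; $projectContent[0] is the slurped JSON string and fromjson turns it back into a JSON value):
Version_List=$(
  jq -rn --slurpfile projectContent <(jq -Rs . <<< "${projectContent}") \
    '$projectContent[0] | fromjson | .items[].version'
)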

use curl/bash command in jq

I am trying to get a list of URLs after redirection using bash scripting. Say, google.com gets redirected to http://www.google.com with a 301 status.
What I have tried is:
json='[{"url":"google.com"},{"url":"microsoft.com"}]'
echo "$json" | jq -r '.[].url' | while read line; do
curl -LSs -o /dev/null -w %{url_effective} $line 2>/dev/null
done
So, is it possible to use commands like curl inside jq for processing JSON objects?
I want to add the resulting URL to existing JSON structure like:
[
  {
    "url": "google.com",
    "redirection": "http://www.google.com"
  },
  {
    "url": "microsoft.com",
    "redirection": "https://www.microsoft.com"
  }
]
Thank you in advance..!
curl is capable of making multiple transfers in a single process, and it can also read command line arguments from a file or stdin, so you don't need a loop at all; just put that JSON into a file and run this:
jq -r '"-o /dev/null\nurl = \(.[].url)"' file |
curl -sSLK- -w'%{url_effective}\n' |
jq -R 'fromjson | map(. + {redirection: input})' file -
This way only 3 processes will be spawned for the whole task, instead of n + 2 where n is the number of URLs.
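For reference, with the question's two URLs the first jq stage writes a curl config like this to stdout, which the -K- option then reads:
-o /dev/null
url = google.com
-o /dev/null
url = microsoft.com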
I would generate a dictionary with jq per url and slurp those dictionaries into the final list with jq -s:
json='[{"url":"google.com"},{"url":"microsoft.com"}]'
echo "$json" | jq -r '.[].url' | while read url; do
redirect=$(curl -LSs \
-o /dev/null \
-w '%{url_effective}' \
"${url}" 2>/dev/null)
jq --null-input --arg url "${url}" --arg redirect "${redirect}" \
'{url:$url, redirect: $redirect}'
done | jq -s
Alternative (first) solution:
You can output the url and the effective_url as tab separated data and create the output json with jq:
json='[{"url":"google.com"},{"url":"microsoft.com"}]'
echo "$json" | jq -r '.[].url' | while read line; do
prefix="${line}\t"
curl -LSs -o /dev/null -w "${prefix}"'%{url_effective}'"\n" "$line" 2>/dev/null
done | jq -r --raw-input 'split("\t")|{"url":.[0],"redirection":.[1]}'
Both solutions will generate valid json, independently of whatever characters the url/effective_url might contain.
Trying to keep this in JSON all the way is pretty cumbersome. I would simply try to make Bash construct a new valid JSON fragment inside the loop.
So in other words, if $url is the URL and $redirect is where it redirects to, you can do something like
printf '{"url": "%s", "redirection": "%s"}\n' "$url" "$redirect"
to produce JSON output from these strings. So tying it all together
jq -r '.[].url' <<<"$json" |
while read -r url; do
  printf '{"url": "%s", "redirection": "%s"}\n' \
    "$url" "$(curl -LSs -o /dev/null -w '%{url_effective}' "$url")"
done |
jq -s .
This is still pretty brittle; in particular, if either of the printf input strings could contain a literal double quote, that should properly be escaped.

Extract json response to shell variable using jq

I have a sample JSON response as shown below which I am trying to parse using jq in a shell script:
[{"id":1,"notes":"Demo1\nDemo2"}]
This is the command through which I am trying to access notes in the shell script.
value=($(curl $URL | jq -r '.[].notes'))
When I echo "$value" I only get Demo1. How do I get the exact value: Demo1\nDemo2?
To clarify, there is no backslash or n in the notes field. \n is JSON's way of encoding a literal linefeed, so the value you should be expecting is:
Demo1
Demo2
The issue you're seeing is because you have split the value on whitespace and created an array. Each value can be accessed by index:
$ cat myscript
data='[{"id":1,"notes":"Demo1\nDemo2"}]'
value=($(printf '%s' "$data" | jq -r '.[].notes'))
echo "The first value was ${value[0]} and the second ${value[1]}"
$ bash myscript
The first value was Demo1 and the second Demo2
To instead get it as a simple string, remove the parens from value=(..):
$ cat myscript2
data='[{"id":1,"notes":"Demo1\nDemo2"}]'
value=$(printf '%s' "$data" | jq -r '.[].notes')
echo "$value"
$ bash myscript2
Demo1
Demo2
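If you actually want an array with one element per line (rather than per whitespace-separated word), mapfile/readarray is a safer way to build it; a minimal sketch using the same data:
data='[{"id":1,"notes":"Demo1\nDemo2"}]'
mapfile -t value < <(printf '%s' "$data" | jq -r '.[].notes')
echo "The first line was ${value[0]} and the second ${value[1]}"
# The first line was Demo1 and the second Demo2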

Build a JSON string with Bash variables

I need to read these bash variables into my JSON string and I am not familiar with bash. Any help is appreciated.
#!/bin/sh
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING='{"bucketname":"$BUCKET_NAME"","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}'
echo $JSON_STRING
You are better off using a program like jq to generate the JSON, if you don't know ahead of time if the contents of the variables are properly escaped for inclusion in JSON. Otherwise, you will just end up with invalid JSON for your trouble.
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING=$( jq -n \
--arg bn "$BUCKET_NAME" \
--arg on "$OBJECT_NAME" \
--arg tl "$TARGET_LOCATION" \
'{bucketname: $bn, objectname: $on, targetlocation: $tl}' )
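With those values, echo "$JSON_STRING" should print something like this (jq pretty-prints by default):
{
  "bucketname": "testbucket",
  "objectname": "testworkflow-2.0.1.jar",
  "targetlocation": "/opt/test/testworkflow-2.0.1.jar"
}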
You can use printf:
JSON_FMT='{"bucketname":"%s","objectname":"%s","targetlocation":"%s"}\n'
printf "$JSON_FMT" "$BUCKET_NAME" "$OBJECT_NAME" "$TARGET_LOCATION"
Much clearer and simpler.
A possibility:
#!/bin/bash
BUCKET_NAME="testbucket"
OBJECT_NAME="testworkflow-2.0.1.jar"
TARGET_LOCATION="/opt/test/testworkflow-2.0.1.jar"
# one line
JSON_STRING='{"bucketname":"'"$BUCKET_NAME"'","objectname":"'"$OBJECT_NAME"'","targetlocation":"'"$TARGET_LOCATION"'"}'
# multi-line
JSON_STRING="{
\"bucketname\":\"${BUCKET_NAME}\",
\"objectname\":\"${OBJECT_NAME}\",
\"targetlocation\":\"${TARGET_LOCATION}\"
}"
# [optional] validate the string is valid json
echo "${JSON_STRING}" | jq
In addition to chepner's answer, it's also possible to construct the object completely from args with this simple recipe:
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING=$(jq -n \
--arg bucketname "$BUCKET_NAME" \
--arg objectname "$OBJECT_NAME" \
--arg targetlocation "$TARGET_LOCATION" \
'$ARGS.named')
Explanation:
--null-input | -n disables reading input. From the man page: Don't read any input at all! Instead, the filter is run once using null as the input. This is useful when using jq as a simple calculator or to construct JSON data from scratch.
--arg name value passes values to the program as predefined variables: value is available as $name. All named arguments are also available as $ARGS.named.
Because the format of $ARGS.named is already an object, jq can output it as is.
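A nice side effect is that adding another member only needs another --arg; for example (a sketch, with "region" as a made-up extra key):
$ jq -n --arg bucketname "$BUCKET_NAME" --arg region "eu-west-1" '$ARGS.named'
{
  "bucketname": "testbucket",
  "region": "eu-west-1"
}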
First, don't use ALL_CAPS_VARNAMES: it's too easy to accidentally overwrite a crucial shell variable (like PATH)
Mixing single and double quotes in shell strings can be a hassle. In this case, I'd use printf:
bucket_name=testbucket
object_name=testworkflow-2.0.1.jar
target_location=/opt/test/testworkflow-2.0.1.jar
template='{"bucketname":"%s","objectname":"%s","targetlocation":"%s"}'
json_string=$(printf "$template" "$bucket_name" "$object_name" "$target_location")
echo "$json_string"
For homework, read this page carefully: Security implications of forgetting to quote a variable in bash/POSIX shells
A note on creating JSON with string concatenation: there are edge cases. For example, if any of your strings contain double quotes, you can get broken JSON:
$ bucket_name='a "string with quotes"'
$ printf '{"bucket":"%s"}\n' "$bucket_name"
{"bucket":"a "string with quotes""}
To do this more safely with bash, we need to escape that string's double quotes:
$ printf '{"bucket":"%s"}\n' "${bucket_name//\"/\\\"}"
{"bucket":"a \"string with quotes\""}
I had to work out all the possible ways to deal with JSON strings in a command request. Please look at the following code to see why using single quotes can fail if used incorrectly.
# Create Release and Tag commit in Github repository
# returns string with in-place substituted variables
json=$(cat <<-END
{
"tag_name": "${version}",
"target_commitish": "${branch}",
"name": "${title}",
"body": "${notes}",
"draft": ${is_draft},
"prerelease": ${is_prerelease}
}
END
)
# returns raw string without any substitutions
# single or double quoted delimiter - check HEREDOC specs
json=$(cat <<-!"END" # or 'END'
{
"tag_name": "${version}",
"target_commitish": "${branch}",
"name": "${title}",
"body": "${notes}",
"draft": ${is_draft},
"prerelease": ${is_prerelease}
}
END
)
# prints fully formatted string with substituted variables as follows:
echo "${json}"
{
"tag_name" : "My_tag",
"target_commitish":"My_branch"
....
}
Note 1: Use of single vs double quotes
# enclosing in single quotes means no variable substitution
# (treats everything as raw char literals)
echo '${json}'
${json}
echo '"${json}"'
"${json}"
# enclosing in single quotes and outer double quotes causes
# variable expansion surrounded by single quotes(treated as raw char literals).
echo "'${json}'"
'{
"tag_name" : "My_tag",
"target_commitish":"My_branch"
....
}'
Note 2: Caution with Line terminators
Note the json string is formatted with line terminators such as LF \n or carriage return \r (if it's encoded on Windows it contains CRLF \r\n).
Using the tr (translate) utility from the shell, we can remove the line terminators, if any:
# following code serializes json and removes any line terminators
# in substituted value/object variables too
json=$(echo "$json" | tr -d '\n' | tr -d '\r' )
# string enclosed in single quotes are still raw literals
echo '${json}'
${json}
echo '"${json}"'
"${json}"
# After CRLF/LF are removed
echo "'${json}'"
'{ "tag_name" : "My_tag", "target_commitish":"My_branch" .... }'
Note 3: Formatting
While manipulating a JSON string with variables, we can use a combination of ' and " as follows, if we want to protect some raw literals while using outer double quotes for in-place substitution/string interpolation:
# mixing ' and "
username=admin
password=pass
echo "$username:$password"
admin:pass
echo "$username"':'"$password"
admin:pass
echo "$username"'[${delimiter}]'"$password"
admin[${delimiter}]pass
Note 4: Using in a command
The following curl request already removes existing \n (i.e. serializes the JSON):
response=$(curl -i \
--user ${username}:${api_token} \
-X POST \
-H 'Accept: application/vnd.github.v3+json' \
-d "$json" \
"https://api.github.com/repos/${username}/${repository}/releases" \
--output /dev/null \
--write-out "%{http_code}" \
--silent
)
So when using it for command variables, validate if it is properly formatted before using it :)
If you need to build a JSON representation where members mapped to undefined or empty variables should be omitted, then jo can help.
#!/bin/bash
BUCKET_NAME=testbucket
OBJECT_NAME=""
JO_OPTS=()
if [[ ! "${BUCKET_NAME}x" = "x" ]] ; then
JO_OPTS+=("bucketname=${BUCKET_NAME}")
fi
if [[ ! "${OBJECT_NAME}x" = "x" ]] ; then
JO_OPTS+=("objectname=${OBJECT_NAME}")
fi
if [[ ! "${TARGET_LOCATION}x" = "x" ]] ; then
JO_OPTS+=("targetlocation=${TARGET_LOCATION}")
fi
jo "${JO_OPTS[#]}"
The output of the commands above would be just (note the absence of objectname and targetlocation members):
{"bucketname":"testbucket"}
It can be done the following way:
JSON_STRING='{"bucketname":"'$BUCKET_NAME'","objectname":"'$OBJECT_NAME'","targetlocation":"'$TARGET_LOCATION'"}'
For Node.js Developer, or if you have node environment installed, you can try this:
JSON_STRING=$(node -e "console.log(JSON.stringify({bucketname: '$BUCKET_NAME', objectname: '$OBJECT_NAME', targetlocation: '$TARGET_LOCATION'}))")
The advantage of this method is that you can easily convert a very complicated JSON object (e.g. an object containing arrays, or one where you need an int value instead of a string) to a JSON string without worrying about invalid JSON errors.
The disadvantage is that it relies on a Node.js environment.
These solutions come a little late but I think they are inherently simpler than previous suggestions (avoiding the complications of quoting and escaping).
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
# Initial unsuccessful solution
JSON_STRING='{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}'
echo $JSON_STRING
# If your substitution variables have NO whitespace this is sufficient
JSON_STRING=$(tr -d '[:space:]' <<JSON
{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}
JSON
)
echo $JSON_STRING
# If your substitution variables are more general and maybe have whitespace this works
JSON_STRING=$(jq -c . <<JSON
{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}
JSON
)
echo $JSON_STRING
#... A change in layout could also make it more maintainable
JSON_STRING=$(jq -c . <<JSON
{
"bucketname" : "$BUCKET_NAME",
"objectname" : "$OBJECT_NAME",
"targetlocation" : "$TARGET_LOCATION"
}
JSON
)
echo $JSON_STRING
To build upon Hao's answer using NodeJS: you can split up the lines, and use the -p option which saves having to use console.log.
JSON_STRING=$(node -pe "
JSON.stringify({
bucketname: process.env.BUCKET_NAME,
objectname: process.env.OBJECT_NAME,
targetlocation: process.env.TARGET_LOCATION
});
")
An inconvenience is that you need to export the variables beforehand, i.e.
export BUCKET_NAME=testbucket
# etc.
Note: You might be thinking, why use process.env? Why not just use single quotes and have bucketname: '$BUCKET_NAME', etc so bash inserts the variables? The reason is that using process.env is safer - if you don't have control over the contents of $TARGET_LOCATION it could inject JavaScript into your node command and do malicious things (by closing the single quote, e.g. the $TARGET_LOCATION string contents could be '}); /* Here I can run commands to delete files! */; console.log({'a': 'b. On the other hand, process.env takes care of sanitising the input.
You could use envsubst:
export VAR="some_value_here"
echo '{"test":"$VAR"}' | envsubst > json.json
It can also be a "template" file:
//json.template
{"var": "$VALUE", "another_var":"$ANOTHER_VALUE"}
So afterwards you could do:
export VALUE="some_value_here"
export ANOTHER_VALUE="something_else"
cat json.template | envsubst > misha.json
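One caveat: envsubst replaces every variable reference it finds, so if the template contains dollar signs that must survive, GNU gettext's envsubst lets you restrict substitution to specific variables (a sketch):
export VALUE="some_value_here" ANOTHER_VALUE="something_else"
envsubst '$VALUE $ANOTHER_VALUE' < json.template > misha.json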
For a general case of building JSON from bash with arbitrary inputs, many of the previous responses (even the highly voted ones with jq) omit cases where the variables contain a " double quote or a \n newline escape sequence, and where you need complex string concatenation of the inputs.
When using jq you need to printf %b the input first to get the \n converted to real newlines, so that once you pass through jq you get \n back and not \\n.
I found this version with Node.js quite easy to reason about if you know JavaScript/Node.js well:
TITLE='Title'
AUTHOR='Bob'
JSON=$( TITLE="$TITLE" AUTHOR="$AUTHOR" node -p 'JSON.stringify( {"message": `Title: ${process.env.TITLE}\n\nAuthor: ${process.env.AUTHOR}`} )' )
It's a bit verbose due to process.env, but it allows you to pass the variables properly from the shell and then format things inside the (Node.js) backticks in a safe way.
This outputs:
printf "%s\n" "$JSON"
{"message":"Title: Title\n\nAuthor: Bob"}
(Note: when having a variable with \n always use printf "%s\n" "$VAR" and not echo "$VAR", whose output is platform-dependent! See here for details)
A similar thing with jq would be:
TITLE='Title'
AUTHOR='Bob'
MESSAGE="Title: ${TITLE}\n\nAuthor: ${AUTHOR}"
MESSAGE_ESCAPED_FOR_JQ=$(printf %b "${MESSAGE}")
JSON=$( jq '{"message": $jq_msg}' --arg jq_msg "$MESSAGE_ESCAPED_FOR_JQ" --null-input --compact-output --raw-output --monochrome-output )
(The last two params are not necessary when running in a subshell, but I just added them so that the output is the same when you run the jq command in a top-level shell.)
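For reference, this should print the same JSON as the Node.js variant above:
printf "%s\n" "$JSON"
{"message":"Title: Title\n\nAuthor: Bob"}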
Bash will not expand variables inside a single-quoted string; to get the variables expanded, bash needs a double-quoted string.
You need to use a double-quoted string for the JSON and escape the double-quote characters inside the JSON string.
Example:
#!/bin/sh
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING="{\"bucketname\":\"$BUCKET_NAME\",\"objectname\":\"$OBJECT_NAME\",\"targetlocation\":\"$TARGET_LOCATION\"}"
echo $JSON_STRING
If you have Node.js and have minimist installed globally:
jc() {
  node -p "JSON.stringify(require('minimist')(process.argv), (k,v) => k=='_'?undefined:v)" -- "$@"
}
jc --key1 foo --number 12 --boolean \
--under_score 'abc def' --'white space' ' '
# {"key1":"foo","number":12,"boolean":true,"under_score":"abc def","white space":" "}
You can then post it with curl:
curl --data "$(jc --type message --value 'hello world!')" \
--header 'content-type: application/json' \
http://server.ip/api/endpoint
Be careful that minimist will parse dots:
jc --m.room.member '#gholk:ccns.io'
# {"m":{"room":{"member":"#gholk:ccns.io"}}}
Used this for AWS Macie configuration:
JSON_CONFIG=$( jq -n \
--arg bucket_name "$BUCKET_NAME" \
--arg kms_key_arn "$KMS_KEY_ARN" \
'{"s3Destination":{"bucketName":$bucket_name,"kmsKeyArn":$kms_key_arn}}'
)
aws macie2 put-classification-export-configuration --configuration "$JSON_CONFIG"
You can simply make a call like this to print the JSON.
#!/bin/sh
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
echo '{ "bucketName": "'"$BUCKET_NAME"'", "objectName": "'"$OBJECT_NAME"'", "targetLocation": "'"$TARGET_LOCATION"'" }'
or
JSON_STRING='{ "bucketName": "'"$BUCKET_NAME"'", "objectName": "'"$OBJECT_NAME"'", "targetLocation": "'"$TARGET_LOCATION"'" }'
echo $JSON_STRING

Shell Script CURL JSON value to variable

I was wondering how to parse the CURL JSON output from the server into variables.
Currently, I have -
curl -X POST -H "Content: agent-type: application/x-www-form-urlencoded" https://www.toontownrewritten.com/api/login?format=json -d username="$USERNAME" -d password="$PASSWORD" | python -m json.tool
But it only outputs the JSON from the server and has it pretty-printed, like so:
{
    "eta": "0",
    "position": "0",
    "queueToken": "6bee9e85-343f-41c7-a4d3-156f901da615",
    "success": "delayed"
}
But how do I put, for example, the success value returned from the server into a variable $SUCCESS with the value delayed, and queueToken into a variable $queueToken with 6bee9e85-343f-41c7-a4d3-156f901da615 as its value?
Then when I use
echo "$SUCCESS"
it shows this as the output:
delayed
And when I use
echo "$queueToken"
the output is
6bee9e85-343f-41c7-a4d3-156f901da615
Thanks!
Find and install jq (https://stedolan.github.io/jq/). jq is a JSON parser. JSON is not reliably parsed by line-oriented tools like sed because, like XML, JSON is not a line-oriented data format.
In terms of your question:
source <(
  curl -X POST -H "$content_type" "$url" -d username="$USERNAME" -d password="$PASSWORD" |
    jq -r '. as $h | keys | map(. + "=\"" + $h[.] + "\"") | .[]'
)
The jq syntax is a bit weird, I'm still working on it. It's basically a series of filters, each pipe taking the previous input and transforming it. In this case, the end result is some lines that look like variable="value"
This answer uses bash's "process substitution" to take the results of the jq command, treat it like a file, and source it into the current shell. The variables will then be available to use.
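With the response shown in the question, the jq filter emits lines like these (one per key, alphabetically), which source then evaluates in the current shell:
eta="0"
position="0"
queueToken="6bee9e85-343f-41c7-a4d3-156f901da615"
success="delayed"
So afterwards echo "$success" prints delayed and echo "$queueToken" prints the token. Note the variable names follow the JSON keys, so they are not upper-cased like $SUCCESS in the question.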
Here's an example of extracting a JSON value from a bash script:
#!/bin/bash
function jsonval {
temp=`echo $json | sed 's/\\\\\//\//g' | sed 's/[{}]//g' | awk -v k="text" '{n=split($0,a,","); for (i=1; i<=n; i++) print a[i]}' | sed 's/\"\:\"/\|/g' | sed 's/[\,]/ /g' | sed 's/\"//g' | grep -w $prop`
echo ${temp##*|}
}
json=`curl -s -X GET http://twitter.com/users/show/$1.json`
prop='profile_image_url'
picurl=`jsonval`
`curl -s -X GET $picurl -o $1.png`
A bash script which demonstrates parsing a JSON string to extract a
property value. The script contains a jsonval function which operates
on two variables, json and prop. When the script is passed the name of
a twitter user it attempts to download the user's profile picture.
You could use a Perl module on the command line.
First, ensure it is installed. On Debian-based systems, you could:
sudo apt-get install libjson-xs-perl
For other OSes, you can install Perl modules via CPAN (the Comprehensive Perl Archive Network):
cpan App::cpanminus
cpan JSON::XS
Note: You may have to run this with superuser privileges.
then:
curlopts=(-X POST -H
  "Content: agent-type: application/x-www-form-urlencoded"
  -d username="$USERNAME" -d password="$PASSWORD")
curlurl=https://www.toontownrewritten.com/api/login?format=json
. <(
  perl -MJSON::XS -e '
    $/=undef;my $a=JSON::XS::decode_json <> ;
    printf "declare -A Json=\047(%s)\047\n", join " ",map {
        "[".$_."]=\"".$a->{$_}."\""
    } qw|queueToken success eta position|;
  ' < <(
    curl "${curlopts[@]}" $curlurl
  )
)
The qw|...| list lets you specify which variables you want extracted... This could be replaced by keys $a, but that might need debugging, as some characters are forbidden in associative array key names.
echo ${Json[queueToken]}
6bee9e85-343f-41c7-a4d3-156f901da615
echo ${Json[eta]}
0