JSON string in a bash variable

I am trying to use a bash variable to store JSON:

testConfig=",{
  \"Classification\": \"mapred-site\",
  \"Properties\": {
    \"mapreduce.map.java.opts\": \"-Xmx2270m\",
    \"mapreduce.map.memory.mb\": \"9712\"
  }
}"
echo $testConfig
Output: ,{
If I give it on a single line it works, but I would like to keep the value in my variable in a clean, multi-line format.
I also tried using cat >>ECHO, but that didn't work either.
Any help is appreciated on how I can store this so that the output comes out in the expected format.
Thanks.

You may use a here document:
read -r -d '' testConfig <<'EOF'
{
  "Classification": "mapred-site",
  "Properties": {
    "mapreduce.map.java.opts": "-Xmx2270m",
    "mapreduce.map.memory.mb": "9712"
  }
}
EOF
# Just to demonstrate that it works ...
echo "$testConfig" | jq .
Doing it this way, you avoid having to escape the inner quotes.
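One caveat worth adding here (my note, not part of the original answer): read -d '' returns a non-zero exit status because it never sees the NUL delimiter, so under set -e you may want to guard the call, for example:

# read -r -d '' fills the variable but exits non-zero at end of input,
# so append || true when the script runs under set -e (illustrative sketch).
read -r -d '' testConfig <<'EOF' || true
{ "Classification": "mapred-site" }
EOF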

You can use read -d '':

read -d '' testConfig <<EOF
,{
  "Classification": "mapred-site",
  "Properties": {
    "mapreduce.map.java.opts": "-Xmx2270m",
    "mapreduce.map.memory.mb": "9712"
  }
}
EOF
echo "$testConfig"

Related

Create a JSON file with parameters from a shell script

Let's say that with a bash script I want to create a file, so the command to create the file will be something like this:
myscript hostgroup_one 2
hostgroup_one and the number 2 are parameters.
How can I insert the parameters in the lines below and output all the lines as a file?
{
  "bool": {
    "should": [
      {
        "match_phrase": {
          "hostgroup.keyword": "$1"
        }
      }
    ],
    "minimum_should_match": $2
  }
}
I'd use jq to build the JSON:
jq -n \
  --arg hostgroup "$1" \
  --argjson minimum "$2" \
  '{bool: {should: [{match_phrase: {"hostgroup.keyword": $hostgroup}}], minimum_should_match: $minimum}}'
will produce your desired output.
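For example, called with hostgroup_one and 2 as the two parameters, that jq invocation should print output along these lines (jq's default 2-space indentation):

{
  "bool": {
    "should": [
      {
        "match_phrase": {
          "hostgroup.keyword": "hostgroup_one"
        }
      }
    ],
    "minimum_should_match": 2
  }
}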
While jq is a great tool for manipulating json, as glenn jackman recommends, if you don't have it and your sysadmin won't install it (?!?)...
You can use a "here document"
Your myscript could look something like this:
#!/bin/bash
echo "dollar-1 is '${1}' dollar-2 is '${2}' dollar-3 is '${3}'"
cat <<EOF >"${1}"
{
  "bool": {
    "should": [
      {
        "match_phrase": {
          "hostgroup.keyword": "${2}"
        }
      }
    ],
    "minimum_should_match": ${3}
  }
}
EOF
echo "done"
I've made the output filename the first parameter, and then your two parameters, so it would be run like:
myscript output.json hostgroup_one 2
You don't need to do that, you could use 2 params and redirect output:
myscript hostgroup_one 2 > output.json
(note you don't have to use EOF as your here-document delimiter, I just like it)
Of course you don't need the echo statements, and you should add error checking (does ${#} equal 3 parameters?), for example:
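A minimal sketch of that parameter check, assuming the three-argument form used above (output file, hostgroup, minimum_should_match):

# Abort early unless exactly three arguments were supplied (illustrative only).
if [ "${#}" -ne 3 ]; then
  echo "usage: myscript <output-file> <hostgroup> <minimum_should_match>" >&2
  exit 1
fi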

Accessing JSON value via jq using variable key

I have a file with JSON like:
test.json
{
  "NUTS|/nuts/2010": {
    "type": "small",
    "mfg": "TSQQ",
    "colors": []
  }
}
I am getting "NUTS|/nuts/2010" from outside and storing it in a shell variable. I am trying to use the snippet below with the jq utility, but I am not able to access the corresponding JSON for that key.
test.sh
#!/bin/bash
NUTS_PATH="NUTS|/nuts/2010" #Storing in shell variable
INPUT_FILE="test.json"
RESULT=($(jq -r --arg NUTS_PATH_ALIAS "$NUTS_PATH" '.[$NUTS_PATH_ALIAS]' $INPUT_FILE))
echo "Result: $RESULT"
echo $RESULT > item.json
When I run this, I am getting:
Result: {
But it should return
{
  "type": "small",
  "mfg": "TSQQ",
  "colors": []
}
Any help? Thanks.
The problem isn't associated with jq at all. What you have should work fine, but the issue is the assignment of the result to an array when you probably intended to store it in a plain variable:

RESULT=($(jq -r --arg NUTS_PATH_ALIAS "$NUTS_PATH" '.[$NUTS_PATH_ALIAS]' $INPUT_FILE))
#      ^^ ( .. ) the result is stored into an array

Expanding an array like a plain variable, as in $RESULT, gives you the element at index 0, i.e. ${RESULT[0]}, which here contains just the { character of the raw JSON output.
You should ideally be doing
RESULT="$(jq -r --arg NUTS_PATH_ALIAS "$NUTS_PATH" '.[$NUTS_PATH_ALIAS]' "$INPUT_FILE")"
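As an optional refinement (my addition, not from the original answer), jq's -e flag makes the exit status reflect whether the filter produced null or false, so a missing key can be detected instead of silently storing the string "null":

# Sketch: -e makes jq exit non-zero when the key lookup yields null.
if RESULT="$(jq -er --arg NUTS_PATH_ALIAS "$NUTS_PATH" '.[$NUTS_PATH_ALIAS]' "$INPUT_FILE")"; then
  printf '%s\n' "$RESULT" > item.json
else
  echo "Key not found: $NUTS_PATH" >&2
fi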
I always end up swearing at jq, too!
For me, this jq query works:
$ jq '.["NUTS|/nuts/2010"]' test.json
{
  "type": "small",
  "mfg": "TSQQ",
  "colors": []
}
However, because you've got pipes and slashes in your string, the variable quoting gets a bit funny.
NUTS_PATH='"NUTS|/nuts/2010"' #Note the two sets of quotes
INPUT_FILE="test.json"
RESULT=$(jq ".[$NUTS_PATH]" $INPUT_FILE)
echo "Result: $RESULT"
Result: {
  "type": "small",
  "mfg": "TSQQ",
  "colors": []
}
Disclaimer: I'm not a bash expert; there may well be (and probably is) a better way to sort out the quoting.

How to iterate a JSON array of objects with jq and grab multiple variables from each object in each loop

I need to grab variables from JSON properties.
The JSON array looks like this (GitHub API for repository tags), which I obtain from a curl request.
[
  {
    "name": "my-tag-name",
    "zipball_url": "https://api.github.com/repos/path-to-my-tag-name",
    "tarball_url": "https://api.github.com/repos/path-to-my-tag-name-tarball",
    "commit": {
      "sha": "commit-sha",
      "url": "https://api.github.com/repos/path-to-my-commit-sha"
    },
    "node_id": "node-id"
  },
  {
    "name": "another-tag-name",
    "zipball_url": "https://api.github.com/repos/path-to-my-tag-name",
    "tarball_url": "https://api.github.com/repos/path-to-my-tag-name-tarball",
    "commit": {
      "sha": "commit-sha",
      "url": "https://api.github.com/repos/path-to-my-commit-sha"
    },
    "node_id": "node-id"
  },
]
In my actual JSON there are 100s of objects like these.
While I loop each one of these I need to grab the name and the commit URL, then perform more operations with these two variables before I get to the next object and repeat.
I tried (with and without -r)
tags=$(curl -s -u "${GITHUB_USERNAME}:${GITHUB_TOKEN}" -H "Accept: application/vnd.github.v3+json" "https://api.github.com/repos/path-to-my-repository/tags?per_page=100&page=${page}")

for row in $(jq -r '.[]' <<< "$tags"); do
  tag=$(jq -r '.name' <<< "$row")
  # I have also tried with the syntax:
  url=$(echo "${row}" | jq -r '.commit.url')
  # do stuff with $tag and $url...
done
But I get errors like:
parse error: Unfinished JSON term at EOF at line 2, column 0
jq: error (at :1): Cannot index string with string "name"
}
parse error: Unmatched '}' at line 1, column 1
And from the terminal output it appears that it is trying to parse $row in a strange way, trying to grab .name from every substring? Not sure.
I am assuming the output from $(jq '.[]' <<< "$tags") could be valid JSON, from which I could again use jq to grab the object properties I need, but maybe that is not the case? If I output ${row} it does look like valid JSON to me, and I tried pasting the results in a JSON validator, everything seems to check out...
How do I grab the ".name" and ".commit.url" for each of these object before I move onto the next one?
Thanks
It would be better to avoid calling jq more than once. Consider, for example:
while read -r name ; do
  read -r url
  echo "$name" "$url"
done < <( curl .... | jq -r '.[] | .name, .commit.url' )
where curl .... signifies the relevant invocation of curl.
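If each iteration needs the whole object rather than just those two fields, another common pattern, at the cost of extra jq calls per iteration, is to have jq emit one compact object per line with -c and read those lines; a sketch assuming the same $tags variable as in the question:

# Emit each array element as a single compact JSON line, then parse each line.
while IFS= read -r row; do
  tag=$(jq -r '.name' <<< "$row")
  url=$(jq -r '.commit.url' <<< "$row")
  echo "$tag -> $url"
done < <(jq -c '.[]' <<< "$tags")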

Parse JSON and put it in an array

I am trying to parse a JSON file that looks like the one below, store each field in an array, and then read it back. However, it seems to go into an infinite loop. Does anyone know what I am doing wrong?
#!/bin/bash
test(){
  local file="/Users/f.json"
  if [ -f "$file" ]; then
    echo "present"
  else
    echo "absent"
  fi
  #jq . f.json
  while read rule; do
    local idd
    local username
    idd=$(jq --raw-output '.id' <<< ${rule})
    username=$(jq --raw-output '.username' <<< ${rule})
    #username=$(jq --raw-output '.username')
  done
  for (( i=0; i<${#idd[#]}; i++ )); do
    echo "${idd[i]}"
  done
}
test
Here is json:
{
  "id": 5679162,
  "username": "ryderw1"
}
{
  "id": 5679163,
  "username": "ryderw3"
}
{
  "id": 5679164,
  "username": "ryderw4"
}
My desired output should be:
5679162
5679163
5679164
I suggest this to read the output from jq into an array:
mapfile -t idd < <(jq '.id' /Users/f.json)
declare -p idd
Output:
declare -a idd=([0]="5679162" [1]="5679163" [2]="5679164")
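If the usernames are needed as well, the same pattern extends to a second array read in parallel; a sketch assuming bash 4+ (for mapfile) and the same /Users/f.json:

# Read ids and usernames into two parallel arrays, one jq call per field.
mapfile -t idd < <(jq -r '.id' /Users/f.json)
mapfile -t username < <(jq -r '.username' /Users/f.json)

for (( i=0; i<${#idd[@]}; i++ )); do
  echo "${idd[i]} ${username[i]}"
done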

How to delete the last comma in a JSON file using Bash?

I wrote a script that takes all the user data of an AWS EC2 instance and echoes it to local.json. All this happens when I install my node.js modules.
I don't know how to delete the last comma in the JSON file. Here is the bash script:
#!/bin/bash
export DATA_DIR=/data
export PATH=$PATH:/usr/local/bin

#install package from git repository
sudo -- sh -c "export PATH=$PATH:/usr/local/bin; export DATA_DIR=/data; npm install git+https://reader:secret#bitbucket.org/somebranch/$1.git#$2"

#update config files from instance user-data
InstanceConfig=`cat /instance-config`

echo '{' >> node_modules/$1/config/local.json

while read line
do
  if [ ! -z "$line" -a "$line" != " " ]; then
    Key=`echo $line | cut -f1 -d=`
    Value=`echo $line | cut -f2 -d=`
    if [ "$Key" = "Env" ]; then
      Env="$Value"
    fi
    printf '"%s" : "%s",\n' "$Key" "$Value" >> node_modules/*/config/local.json
  fi
done <<< "$InstanceConfig"

sed -i '$ s/.$//' node_modules/$1/config/local.json
echo '}' >> node_modules/$1/config/local.json
I run it like this: ./script
I get JSON output, but every property line ends with a comma, including the last one. Here is the local.json that I get:
{
"Env" : "dev",
"EngineUrl" : "engine.url.net",
}
All I am trying to do is delete the comma at the end of the last property line of the JSON file.
I have tried many approaches that I found on the internet. I know the fix should go after the last fi (at the end of the loop), and that it should be something like this line:

sed -i "s?${Key} : ${Value},?${Key} : ${Value}?g" node_modules/$1/config/local.json

Or this:

sed '$ s/,$//' node_modules/$1/config/local.json

But they do not work for me.
Can someone who knows Bash scripting well help me with this?
Thanks!
If you know that it is the last comma that needs to be replaced, a reasonably robust way is to use GNU sed in "slurp" mode like this:
sed -zr 's/,([^,]*$)/\1/' local.json
Output:
{
"Env" : "dev",
"EngineUrl" : "engine.url.net"
}
If you'd just post some sample input/output it'd remove the guess-work but IF this is your input file:
$ cat file
Env=dev
EngineUrl=engine.url.net
Then IF you're trying to do what I think you are then all you need is:
$ cat tst.awk
BEGIN { FS="="; sep="{\n" }
{
printf "%s \"%s\" : \"%s\"", sep, $1, $2
sep = ",\n"
}
END { print "\n}" }
which you'd execute as:
$ awk -f tst.awk file
{
"Env" : "dev",
"EngineUrl" : "engine.url.net"
}
Or you can execute the awk script inline within a shell script if you prefer:
awk '
BEGIN { FS="="; sep="{\n" }
{
printf "%s \"%s\" : \"%s\"", sep, $1, $2
sep = ",\n"
}
END { print "\n}" }
' file
{
"Env" : "dev",
"EngineUrl" : "engine.url.net"
}
The above is far more robust, portable, efficient and better in every other way than the shell script you posted because it's using the right tool for the job. A UNIX shell is an environment from which to call tools with a language to sequence those calls. It is NOT a language to process text which is why it's so difficult to get it right. The UNIX tool for general text processing is awk so when you need to process text in UNIX, you just have shell call awk, that's all.
Here is a jq version, if jq is available:

jq --raw-input 'split("=") | {(.[0]):.[1]}' /instance-config | jq --slurp 'add'

There might be a way to do it with one jq pass, but I couldn't see it.
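For what it's worth, a possible single-pass variant (a sketch, assuming jq 1.5+ for inputs and that every non-empty line of /instance-config has the Key=Value form):

# Read raw lines, split each on '=', build one object per line, then merge them all.
jq -Rn '[inputs | select(length > 0) | split("=") | {(.[0]): .[1]}] | add' /instance-config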
You can remove all trailing commas from invalid JSON with:

sed -i.bak ':begin;$!N;s/,\n}/\n}/g;tbegin;P;D' FILE

sed -i.bak creates a backup of the original file, then applies the changes to the file.
':begin;$!N;s/,\n}/\n}/g;tbegin;P;D' matches anything ending with a comma followed by a newline and }, and removes the comma on the previous line.
FILE is the file you want to change.
If you're willing to use it, xidel is rather forgiving for trailing commas:
xidel -s local.json -e '$json'
{
"Env": "dev",
"EngineUrl": "engine.url.net"
}
xidel - -se '$json' <<< '{"Env":"dev","EngineUrl":"engine.url.net",}'
#or
xidel - -se 'parse-json($raw,{"liberal":true()})' <<< '{"Env":"dev","EngineUrl":"engine.url.net",}'
{
"Env": "dev",
"EngineUrl": "engine.url.net"
}