I'm looking for a solution to add a new attribute with a JSON object value into an existing JSON file.
My current script:
if [ ! -f "$src_file" ]; then
echo "Source file $src_file does not exist"
exit 1
fi
if [ ! -f "$dst_file" ]; then
echo "Destination file $dst_file does not exist"
exit 1
fi
if ! jq -e '.devDependencies' "$src_file" >/dev/null 2>&1; then
echo "The key \"devDependencies\" does not exist in source file $src_file"
exit 1
fi
dev_dependencies=$(jq '.devDependencies' "$src_file" | xargs )
# Extract data from source file
data=$(cat $src_file)
# Add new key-value
data=$(echo $data | jq --arg key "devDependencies" --arg value "$dev_dependencies" '. + {($key): ($value)}')
# Write data into destination file
echo $data > $dst_file
It's working, but the devDependencies value from $dev_dependencies is written as a string:
"devDependencies": "{ @nrwl/esbuild: 15.6.3, @nrwl/eslint-pl[...]".
How can I write it as raw JSON?
I think you want the --argjson option instead of --arg. Compare
$ jq --arg k '{"foo": "bar"}' -n '{x: $k}'
{
"x": "{\"foo\": \"bar\"}"
}
with
$ jq --argjson k '{"foo": "bar"}' -n '{x: $k}'
{
"x": {
"foo": "bar"
}
}
--arg will create a string variable. Use --argjson to parse the value as JSON (it can be an object, an array, or a number).
From the docs:
--arg name value:
This option passes a value to the jq program as a predefined variable.
If you run jq with --arg foo bar, then $foo is available in the
program and has the value "bar". Note that value will be treated as a
string, so --arg foo 123 will bind $foo to "123".
Named arguments are also available to the jq program as $ARGS.named.
--argjson name JSON-text:
This option passes a JSON-encoded value to the jq program as a
predefined variable. If you run jq with --argjson foo 123, then $foo
is available in the program and has the value 123.
Note that you don't need multiple invocations of jq, xargs, command substitution or variables (don't forget to quote all your variables when expanding).
To "merge" the contents of two files, read both files with jq and let jq do the work. This avoids all the complications that arise from jumping between jq and shell context. A single line is all that's needed:
jq --slurpfile deps "$dep_file" '. + { devDependencies: $deps[0].devDependencies }' "$source_file" > "$dest_file"
or
jq --slurpfile deps "$dep_file" '. + ($deps[0]|{devDependencies})' "$source_file" > "$dest_file"
alternatively (still a one-liner):
jq --slurpfile deps "$dep_file" '.devDependencies = $deps[0].devDependencies' "$source_file" > "$dest_file"
peak's answer here reminded me of the very useful input filter, which can make the program even shorter as it avoids the variable:
jq '. + (input|{devDependencies})' "$source_file" "$dep_file" > "$dest_file"
Related
for region in $(jq '.data | keys | .[]' <<< "$data"); do
value=$(jq -r ".data[$region]" <<< "$data");
deliveryRegionId=$(jq -r '.deliveryRegionId' <<< "$value");
json_template='{}';
json_data=$(jq --argjson deliveryRegionId "$deliveryRegionId" --arg deliverableDistance 5000 '.deliveryRegionId=$deliveryRegionId | .deliverableDistance=5000' <<<"$json_template"); echo $json_data;
requestArray=$(jq '. += [$json_data]' <<< $requestArray)
done;
As in the code above, I'm going to create a JSON value called json_data and add it to the array.
What should I do to make this work?
jq: error: $json_data is not defined at <top-level>, line 1:
. += [$json_data]
jq: 1 compile error
This is the error I get.
There's no jq variable named $json_data.
There is a shell variable named that, but you can't access another program's variables.
Provide the value via the environment (just for the one command):
json_data="$json_data" jq '. += [ env.json_data | fromjson ]' <<<"$requestArray"
Provide the value via the environment (exported):
export json_data
jq '. += [ env.json_data | fromjson ]' <<<"$requestArray"
Provide the value as an argument (--argjson parses it as JSON rather than a string):
jq --argjson json_data "$json_data" '. += [ $json_data ]' <<<"$requestArray"
There's no reason to use jq so many times! Your entire program can be replaced with this:
requestArray="$(
  jq '.data | map( { deliveryRegionId, deliverableDistance: 5000 } )' \
    <<<"$data"
)"
Demo on jqplay
I have a .json file which I want to read, take all the contents, and append it to a JSON string that's in a bash variable.
input.json
[{
"maven": {
"coordinates": "somelib",
"repo": "somerepo"
}
},
{
"maven": {
"coordinates": "anotherlib",
"exclusions": ["exclude:this", "*:and-all-that"]
}
}]
OUR_BIG_JSON variable
{
"name": "",
"myarray": [
{
"generic": {
"package": "somepymodule"
}
},
{
"rcann": {
"package": "anothermodule==3.0.0"
}
}
]
}
This is the JSON I want to append to; it resides in a variable, so we must echo it as we use jq.
command attempt
echo "starting..."
inputs_json=`cat input.json`
echo "$inputs_json"
echo "$OUR_BIG_JSON"
OUR_BIG_JSON=$(echo "${OUR_BIG_JSON}" |./jq '.myarray += [{"date":"today"}]') # This worked..
# next line fails
OUR_BIG_JSON=$(echo "${OUR_BIG_JSON}" |./jq '.myarray += ' $inputs_json)
Out come errors when trying to take the $inputs_json contents and just pipe them in, basically appending to the JSON in memory:
jq: error: syntax error, unexpected INVALID_CHARACTER (Windows cmd shell quoting issues?) at <top-level>, line 1: .myarray += \"$inputs_json\" jq: 1 compile error
Using the --argjson option to read in the contents of your shell variable enables you to use it as a variable inside the jq filter, too. Your input file can be read in as usual via a positional parameter.
jq --argjson v "${OUR_BIG_JSON}" '…your filter here using $v…' input.json
For example:
jq --argjson v "${OUR_BIG_JSON}" '$v + {myarray: ($v.myarray + .)}' input.json
Or by making the input a variable inside the jq filter as well:
jq --argjson v "${OUR_BIG_JSON}" '. as $in | $v | .myarray += $in' input.json
With the -n option you could also reference the input using input:
jq --argjson v "${OUR_BIG_JSON}" -n '$v | .myarray += input' input.json
And there are also the options --argfile and --slurpfile to read in the input file as a variable directly.
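For instance, a minimal sketch using --slurpfile (note that it wraps the file's contents in an array, so the parsed object is $in[0]):
jq --argjson v "${OUR_BIG_JSON}" --slurpfile in input.json -n '$v | .myarray += $in[0]'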
As you have tagged bash, you could also do it the other way around and use a herestring <<< to read in the bash variable as input, and then use either --argfile or --slurpfile to reference the file content through a variable. For instance:
jq --argfile in input.json '.myarray += $in' <<< "${OUR_BIG_JSON}"
I parsed a JSON file with jq like this:
# cat test.json | jq '.logs' | jq '.[]' | jq '._id' | jq -s
It returns an array like this: [34,235,436,546,.....]
Using a bash script I declared an array:
# declare -a msgIds = ...
This array uses () instead of [], so when I pass the array given above to it, it won't work: ([324,32,45..]) causes a problem. If I remove the jq -s, an array forms with only 1 member in it.
Is there a way to solve this issue?
We can solve this problem in two ways:
Input string:
// test.json
{
"keys": ["key1","key2","key3"]
}
Approach 1:
1) Use jq -r (output raw strings, not JSON texts).
KEYS=$(jq -r '.keys' test.json)
echo $KEYS
# Output: [ "key1", "key2", "key3" ]
2) Use @sh (converts the input to a series of space-separated, shell-quoted strings). It removes the square brackets [] and the commas (,) from the string.
KEYS=$(<test.json jq -r '.keys | @sh')
echo $KEYS
# Output: 'key1' 'key2' 'key3'
3) Use tr to remove the single quotes from the string output. To delete specific characters, use the -d option of tr.
KEYS=$( (<test.json jq -r '.keys | @sh') | tr -d \' )
echo $KEYS
# Output: key1 key2 key3
4) We can convert the space-separated string to an array by placing our string output in round brackets ().
This is also called compound assignment, where we declare the array with a bunch of values.
ARRAYNAME=(value1 value2 .... valueN)
#!/bin/bash
KEYS=($( (<test.json jq -r '.keys | @sh') | tr -d \'\" ))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
Approach 2:
1) Use jq -r to get the string output, then use tr to delete characters like the square brackets, double quotes, and commas.
#!/bin/bash
KEYS=$(jq -r '.keys' test.json | tr -d '[],"')
echo $KEYS
# Output: key1 key2 key3
2) Then we can convert the resulting string to an array by placing our string output in round brackets ().
#!/bin/bash
KEYS=($(jq -r '.keys' test.json | tr -d '[]," '))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
To correctly parse values that have spaces, newlines (or any other arbitrary characters) just use jq's @sh filter and bash's declare -a. (No need for a while read loop or any other pre-processing)
// foo.json
{"data": ["A B", "C'D", ""]}
str=$(jq -r '.data | @sh' foo.json)
declare -a arr="($str)" # must be quoted like this
$ declare -p arr
declare -a arr=([0]="A B" [1]="C'D" [2]="")
The reason that this works correctly is that @sh will produce a space-separated list of shell-quoted words:
$ echo "$str"
'A B' 'C'\''D' ''
and this is exactly the format that declare expects for an array definition.
Use jq -r to output a string "raw", without JSON formatting, and use the @sh formatter to format your results as a string for shell consumption. Per the jq docs:
@sh:
The input is escaped suitable for use in a command-line for a POSIX shell. If the input is an array, the output will be a series of space-separated strings.
So you can do e.g.
msgids=($(<test.json jq -r '.logs[]._id | @sh'))
and get the result you want.
From the jq FAQ (https://github.com/stedolan/jq/wiki/FAQ):
Q: How can a stream of JSON texts produced by jq be converted into a bash array of corresponding values?
A: One option would be to use mapfile (aka readarray), for example:
mapfile -t array <<< $(jq -c '.[]' input.json)
An alternative that might be indicative of what to do in other shells is to use read -r within a while loop. The following bash script populates an array, x, with JSON texts. The key points are the use of the -c option, and the use of the bash idiom while read -r value; do ... done < <(jq .......):
#!/bin/bash
x=()
while read -r value
do
x+=("$value")
done < <(jq -c '.[]' input.json)
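Applied to the test.json from the question (a sketch, assuming the .logs[]._id structure shown above), this becomes:
mapfile -t msgIds < <(jq -c '.logs[]._id' test.json)
declare -p msgIds   # e.g. declare -a msgIds=([0]="34" [1]="235" [2]="436" ...)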
++ To resolve this, we can use a very simple approach:
++ Since I am not aware of your input file, I am creating a file input.json with the following contents:
input.json:
{
"keys": ["key1","key2","key3"]
}
++ Use jq to get the value from the above file input.json:
Command: cat input.json | jq -r '.keys | @sh'
Output: 'key1' 'key2' 'key3'
Explanation: | @sh removes the [ ] and " characters.
++ To remove the ' quotes as well, we use tr
command: cat input.json | jq -r '.keys | @sh' | tr -d \'
Explanation: use tr with -d (delete) to remove the ' characters
++ To store this in a bash array, we use () with backticks `` and print it:
command:
KEYS=(`cat input.json | jq -r '.keys | @sh' | tr -d \'`)
To print all the entries of the array: echo "${KEYS[*]}"
While trying to write a bash script that replaces values in a JSON file, we are running into issues with environment variables that contain whitespace.
Given an original JSON file.
{
"version": "base",
"myValue": "to be changed",
"channelId": 0
}
We want to run a command to update some variables in it, so that after we run:
CHANNEL_ID=1701 MY_VALUE="new value" ./test.sh
The JSON should look like this:
{
"version": "base",
"myValue": "new value",
"channelId": 1701
}
Our script is currently at something like this:
#!/bin/sh
echo $MY_VALUE
echo $CHANNEL_ID
function replaceValue {
if [ -z $2 ]; then echo "Skipping $1"; else jq --argjson newValue \"${2}\" '. | ."'${1}'" = $newValue' build/config.json > tmp.json && mv tmp.json build/config.json; fi
}
replaceValue channelId ${CHANNEL_ID}
replaceValue myValue ${MY_VALUE}
In the above, all values are replaced as strings, and strings are getting truncated at whitespace. We keep alternating between this issue and a version of the code where substitutions just stop working entirely.
This is surely an issue with expansions, but we would love to figure out how we can:
- Replace values in the JSON with both strings and numeric values.
- Use whitespace in the strings we pass to our script.
You don't have to mess with --arg or --argjson to import the environment variables into jq's context. It can very well read the environment on its own. You don't need a separate script; just set the values along with the invocation of jq:
CHANNEL_ID=1701 MY_VALUE="new value" \
jq '{"version": "base", myValue: env.MY_VALUE, channelId: (env.CHANNEL_ID | tonumber)}' build/config.json
Note that in the case above the variables need not be exported globally, only set locally for the jq command. This lets you avoid exporting multiple variables into the shell and polluting the environment; only the ones jq needs to construct the desired JSON are set.
To write the changes back to the original file, do > tmp.json && mv tmp.json build/config.json, or, more cleanly, use the sponge(1) utility from the moreutils package. If present, you can pipe the output of jq as
| sponge build/config.json
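For instance, a minimal sketch of the full in-place update with sponge (assuming moreutils is installed, and using tonumber so channelId stays numeric):
CHANNEL_ID=1701 MY_VALUE="new value" \
jq '{"version": "base", myValue: env.MY_VALUE, channelId: (env.CHANNEL_ID | tonumber)}' build/config.json \
| sponge build/config.json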
Pass variables with --arg. Do:
jq --arg key "$1" --arg value "$2" '.[$key] = $value'
Notes:
#!/bin/sh indicates that this is a POSIX shell script, not bash. Use #!/bin/bash in bash scripts.
function replaceValue { is ksh syntax. Prefer replaceValue() { to declare functions; the function keyword is obsolete and deprecated syntax in bash.
Use newlines in your script to make it readable.
--argjson passes a JSON-formatted argument, not a string. Use --arg for that.
\"${2}\" doesn't quote the $2 expansion - it only prefixes and suffixes the string with ". Because the expansion is not quoted, word splitting is performed, which causes your input to be split on whitespace when creating arguments for jq.
Remember to quote variable expansions.
Use http://shellcheck.net to check your scripts.
A leading . | means nothing in jq; it's like echo $(echo $(echo)). You could do it any number of times - jq '. | . | . | . | . | .' - and it passes through the same thing. Just write the filter you want.
Do:
#!/bin/bash
echo "$MY_VALUE"
echo "$CHANNEL_ID"
replaceValue() {
if [ -z "$2" ]; then
echo "Skipping $1"
else
jq --arg key "$1" --arg value "$2" '.[$key] = $value' build/config.json > tmp.json &&
mv tmp.json build/config.json
fi
}
replaceValue channelId "${CHANNEL_ID}"
replaceValue myValue "${MY_VALUE}"
Edit: replaced ."\($key)" with the simpler .[$key]
jq allows you to build new objects:
MY_VALUE=foo;
CHANNEL_ID=4
echo '{
"version": "base",
"myValue": "to be changed",
"channelId": 0
}' | jq ". | {\"version\": .version, \"myValue\": \"$MY_VALUE\", \"channelId\": $CHANNEL_ID}"
The . selects the whole input, and inputs that (|) to the construction of a new object (marked by {}). For version it selects .version from the input, but you can set your own values for the other two. We use double quotes to allow the Bash variable expansion, which means escaping the double quotes in the JSON.
You'll need to adapt my snippet above to scriptify it.
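As a variation (a sketch, not part of the snippet above), the same object can be built without escaping the JSON quotes by passing the shell variables in with --arg/--argjson:
echo '{
"version": "base",
"myValue": "to be changed",
"channelId": 0
}' | jq --arg myValue "$MY_VALUE" --argjson channelId "$CHANNEL_ID" \
  '{version: .version, myValue: $myValue, channelId: $channelId}'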
I have a JSON file named test.json with the below content.
{
"run_list": ["recipe[cookbook-ics-op::setup_server]"],
"props": {
"install_home": "/test/inst1",
"tmp_dir": "/test/inst1/tmp",
"user": "tuser"
}
}
I want to read this file into a variable in a shell script and then extract the values of install_home, user & tmp_dir using expr. Can someone help, please?
props=$(cat test.json)
works to get the json file into a variable. Now how can I extract the values using expr? Any help would be greatly appreciated.
Install jq
yum -y install epel-release
yum -y install jq
Get the values in the following way
install_home=$(cat test.json | jq -r '.props.install_home')
tmp_dir=$(cat test.json | jq -r '.props.tmp_dir')
user=$(cat test.json | jq -r '.props.user')
For a pure bash solution I suggest this:
github.com/dominictarr/JSON.sh
It could be used like this:
./JSON.sh -l -p < example.json
It prints output like:
["name"] "JSON.sh"
["version"] "0.2.1"
["description"] "JSON parser written in bash"
["homepage"] "http://github.com/dominictarr/JSON.sh"
["repository","type"] "git"
["repository","url"] "https://github.com/dominictarr/JSON.sh.git"
["bin","JSON.sh"] "./JSON.sh"
["author"] "Dominic Tarr <dominic.tarr@gmail.com> (http://bit.ly/dominictarr)"
["scripts","test"] "./all-tests.sh"
From here it is pretty trivial to achieve what you are looking for.
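For instance, a sketch of pulling a single value out of that output (assuming the test.json from the question and the tab-separated format shown above):
install_home=$(./JSON.sh -l -p < test.json | grep -F '["props","install_home"]' | cut -f2 | tr -d '"')
echo "$install_home"   # /test/inst1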
jq is a dedicated parser for JSON files. Install jq.
Values in the JSON can be retrieved as:
jq .<top-level-attr>.<next-level-attr> <json-file-path>
If the JSON contains an array:
jq .<top-level-attr>.<next-level-array>[].<elem-in-array> <json-file-path>
If you want a value in a shell variable:
id=$(jq -r .<top-level-attr>.<next-level-array>[].<next-level-attr> <json-file-path>)
echo "$id"
Use -r if you need the unquoted value.
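Concretely, for the test.json in the question, that pattern looks like this (a sketch; install_home and first_recipe are just illustrative variable names):
install_home=$(jq -r '.props.install_home' test.json)
first_recipe=$(jq -r '.run_list[0]' test.json)
echo "$install_home"   # /test/inst1
echo "$first_recipe"   # recipe[cookbook-ics-op::setup_server]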
For simple JSON, it may be treated as a plain text file.
In that case, we can use simple text pattern matching to extract the information we need.
If you observe the following lines:
"install_home": "/test/inst1",
"tmp_dir": "/test/inst1/tmp",
"user": "tuser"
There exists a pattern on each line that can be described as key and value:
"key" : "value"
We can use perl with regular expressions to extract the value for any given key:
"key" hardcoded for each case "install_home", "tmp_dir" and "user"
"value" as (.*) regular expression
Then we use the $1 matching group to retrieve the value.
i=$(perl -ne 'if (/"install_home": "(.*)"/) { print $1 . "\n" }' test.json)
t=$(perl -ne 'if (/"tmp_dir": "(.*)"/) { print $1 . "\n" }' test.json)
u=$(perl -ne 'if (/"user": "(.*)"/) { print $1 . "\n" }' test.json)
cat <<EOF
install_home: $i
tmp_dir : $t
user : $u
EOF
Which outputs:
install_home: /test/inst1
tmp_dir : /test/inst1/tmp
user : tuser