Looping through a JSON file and retrieving the values into bash variables

I have a JSON file which has the data below:
{"Item": {"ID": {"S": "4869949"},"no":{"N": "2"}}}
I need to retrieve the S value from ID and the N value from no and put them into a query as parameters. The query looks like this:
query --table-name validation \
--key-condition-expression "ID = :v1 AND no = :v2" \
--expression-attribute-values '{":v1": {"S": "${ID}"},":v2": {"N": "${no}"}}' \
--region us-east-1
I have tried using jq but couldn't figure out how to read a JSON file and retrieve the values as parameters. This is the first time I am using jq. Can someone help with it?

The following may not be exactly what you're after but does show how you can avoid calling jq more than once, and how the command can partly be constructed by jq. In other words, assuming the templates for --key-condition-expression and --expression-attribute-values are fixed, you should be able to adapt the following to your needs:
Assuming a bash or bash-like shell, and that data.json holds:
{"Item": {"ID": {"S": "4869949"},"no":{"N": "2"}}}
we could write:
ID=ID
no=no
< data.json jq --arg ID "$ID" --arg no "$no" -cr '
  .Item
  | .ID.S as $v1
  | .no.N as $v2
  | $v1, $v2, {($v1): {"S": $ID}, ($v2): {"N": $no}}
' | while read -r v1
do
  read -r v2
  read -r query
  echo query --table-name validation \
    --key-condition-expression "\"ID = $v1 AND no = $v2\"" \
    --expression-attribute-values "'""$query""'" \
    --region us-east-1
done
jq as a template engine
If the templates with :v1 and :v2 in the question are themselves variable, then you could adapt the above by using jq as a template engine, a topic which is covered in the jq Cookbook: https://github.com/stedolan/jq/wiki/Cookbook#using-jq-as-a-template-engine
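For illustration, here is a minimal sketch of that idea (not the cookbook's exact technique): the placeholder names themselves are supplied from the shell via --arg, and jq fills in the values read from data.json.
# sketch: placeholder names come from the shell, values from data.json
< data.json jq -c --arg k1 ":v1" --arg k2 ":v2" '
  .Item
  | {($k1): {S: .ID.S}, ($k2): {N: .no.N}}
'
With the sample data this prints {":v1":{"S":"4869949"},":v2":{"N":"2"}}, which can then be spliced into --expression-attribute-values.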

It's basic jq syntax:
$ ID=$(cat data.json | jq .Item.ID.S -r)
$ no=$(cat data.json | jq .Item.no.N -r)
$ echo $ID $no
4869949 2
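From there, a sketch of splicing those values into the query from the question; "query" here stands for whatever command the question is actually wrapping, and the double-quoted attribute-values string is just one way of letting the shell expand $ID and $no inside the JSON:
ID=$(jq -r '.Item.ID.S' data.json)
no=$(jq -r '.Item.no.N' data.json)
query --table-name validation \
  --key-condition-expression "ID = :v1 AND no = :v2" \
  --expression-attribute-values "{\":v1\": {\"S\": \"$ID\"}, \":v2\": {\"N\": \"$no\"}}" \
  --region us-east-1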
I'd suggest you read something like https://shapeshed.com/jq-json/ to get more familiar though.

Related

bash & jq: add attribute with object value

I'm looking for a solution to add a new attribute with a JSON object value into an existing JSON file.
My current script:
if [ ! -f "$src_file" ]; then
echo "Source file $src_file does not exists"
exit 1
fi
if [ ! -f "$dst_file" ]; then
echo "Destination file $dst_file does not exists"
exit 1
fi
if ! jq '.devDependencies' "$src_file" >/dev/null 2>&1; then
echo "The key "devDependencies" does not exists into source file $src_file"
exit 1
fi
dev_dependencies=$(jq '.devDependencies' "$src_file" | xargs )
# Extract data from source file
data=$(cat $src_file)
# Add new key-value
data=$(echo $data | jq --arg key "devDependencies" --arg value "$dev_dependencies" '. + {($key): ($value)}')
# Write data into destination file
echo $data > $dst_file
It's working, but the devDependencies value from $dev_dependencies is written as a string:
"devDependencies": "{ @nrwl/esbuild: 15.6.3, @nrwl/eslint-pl[...]".
How can I write it as raw JSON?
I think you want the --argjson option instead of --arg. Compare
$ jq --arg k '{"foo": "bar"}' -n '{x: $k}'
{
"x": "{\"foo\": \"bar\"}"
}
with
$ jq --argjson k '{"foo": "bar"}' -n '{x: $k}'
{
"x": {
"foo": "bar"
}
}
--arg will create a string variable. Use --argjson to parse the value as JSON (can be object, array or number).
From the docs:
--arg name value:
This option passes a value to the jq program as a predefined variable.
If you run jq with --arg foo bar, then $foo is available in the
program and has the value "bar". Note that value will be treated as a
string, so --arg foo 123 will bind $foo to "123".
Named arguments are also available to the jq program as $ARGS.named.
--argjson name JSON-text:
This option passes a JSON-encoded value to the jq program as a
predefined variable. If you run jq with --argjson foo 123, then $foo
is available in the program and has the value 123.
Note that you don't need multiple invocations of jq, xargs, command substitution or variables (don't forget to quote all your variables when expanding).
To "merge" the contents of two files, read both files with jq and let jq do the work. This avoids all the complications that arise from jumping between jq and shell context. A single line is all that's needed:
jq --slurpfile deps "$dep_file" '. + { devDependencies: $deps[0].devDependencies }' "$source_file" > "$dest_file"
or
jq --slurpfile deps "$dep_file" '. + ($deps[0]|{devDependencies})' "$source_file" > "$dest_file"
alternatively (still a one-liner):
jq --slurpfile deps "$dev_file" '.devDependencies = $deps[0].devDependencies' "$source_file" > "$dest_file"
peak's answer here reminded me of the very useful input filter, which can make the program even shorter as it avoids the variable:
jq '. + (input|{devDependencies})' "$source_file" "$dep_file" > "$dest_file"
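As a quick illustration of that last one-liner, with hypothetical file contents (not taken from the question):
cat > source.json <<'EOF'
{"name": "app", "version": "1.0.0"}
EOF
cat > deps.json <<'EOF'
{"devDependencies": {"@nrwl/esbuild": "15.6.3"}}
EOF
jq '. + (input|{devDependencies})' source.json deps.json
Here . is bound to the JSON read from source.json and input reads the next input, i.e. deps.json, so the result is the source object with devDependencies merged in as real JSON rather than a string.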

jq how to pass json keys from a shell variable

I have a json file I am parsing with jq. This is a sample of the file
[{
"key1":{...},
"key2":{...}
}]
[{
"key1":{...},
"key2":{...}
}]
...
Each line is an array containing a JSON object (which I know is not technically valid JSON as a whole, but jq still works on such a file).
The below jq command works:
cat file.json | jq -r '.[] | [.key1,.key2]'
The above correctly shows:
[
<value_of_key1>,<value_of_key2>
]
[
<value_of_key1>,<value_of_key2>
]
However, I want .key1,.key2 to be dynamic since these keys can change. So I want to pass a variable to jq. Something like:
KEYS=.key1,.key2
cat file.json | jq -r --arg var "$KEYS" '.[] | [$var]'
But the above is returning the keys themselves:
[
".key1,.key2"
]
[
".key1,.key2"
]
Why is this happening? What is the correct command to make this work?
This answer does not help me; I am not getting any errors like the OP in that question.
Fetching the value of a jq variable doesn't cause it to be executed as jq code.
Furthermore, jq lacks the facility to take a string, compile it as jq code, and evaluate the result. (This is commonly known as eval.)
So, short of writing a jq parser and evaluator in jq, you will need to impose limits and/or accept a different format.
For example,
keys='[ [ "key1", "childkey" ], [ "key2", "childkey2" ] ]' # JSON
jq --argjson keys "$keys" '.[] | [ getpath( $keys[] ) ]' file.json
or
keys='key1.childkey,key2.childkey2'
jq --arg keys "$keys" '
( ( $keys / "," ) | map( . / "." ) ) as $keys |
.[] | [ getpath( $keys[] ) ]
' file.json
Suppose you have:
cat file
[{
"key1":1,
"key2":2
}]
[{
"key1":1,
"key2":2
}]
You can use a jq command like so:
jq '.[] | [.key1,.key2]' file
[
1,
2
]
[
1,
2
]
You can use -f to read the filter from a file, and nothing keeps you from creating that file from your shell variables first.
Example:
keys=".key1"
echo ".[] | [${keys}]" >jqf
jq -f jqf file
[
1
]
[
1
]
Or just build the string directly into jq:
# note: the double quotes let the shell expand ${keys} inside the filter
jq ".[] | [${keys}]" file
You can use --argjson option and destructuring.
file.json
[{"key1":{"a":1},"key2":{"b":2}}]
[{"key1":{"c":1},"key2":{"d":2}}]
$ in='["key1","key2"]' jq -c --argjson keys "$in" '$keys as [$key1,$key2] | .[] | [.[$key1,$key2]]' file.json
output:
[{"a":1},{"b":2}]
[{"c":1},{"d":2}]
Elaborating on ikegami's answer.
To start with here's my version of the answer:
$ in='key1.a,key2.b'; jq -c --arg keys "$in" '($keys/","|map(./".")) as $paths | .[] | [getpath($paths[])]' <<<$'[{"key1":{"a":1},"key2":{"b":2}}] [{"key1":{"a":3},"key2":{"b":4}}]'
This gives output
[1,2]
[3,4]
Let's try it.
We have input
[{"key1":{"a":1},"key2":{"b":2}}]
[{"key1":{"a":3},"key2":{"b":4}}]
And we want to construct array
[["key1","a"],["key2","b"]]
then use it on getpath(PATHS) builtin to extract values out of our input.
To start with, we are given the shell variable in with the string value key1.a,key2.b, which is passed to jq as $keys.
Then $keys/"," gives
["key1.a","key2.b"]
["key1.a","key2.b"]
After that $keys/","|map(./".") gives what we want.
[["key1","a"],["key2","b"]]
[["key1","a"],["key2","b"]]
Let's call this $paths.
Now if we do .[]|[getpath($paths[])] we get the values from our input equivalent to
[.[] | .key1.a, .key2.b]
which is
[1,2]
[3,4]

Create a JSON from a given list of filenames in a Unix script

Hello, I am trying to write a Unix script/command that lists all filenames in a given directory matching the format string-{number}.txt (e.g. filename-1.txt, filename-2.txt) and builds a JSON object from them. Any pointers would be helpful.
[{
"filenumber": "1",
"name": "filename-1.txt"
},
{
"filenumber": "2",
"name": "filename-2.txt"
}
]
In the above JSON, filenumber should be read from the {number} part of each filename.
A single call to jq should suffice:
shopt -s extglob
printf "%s\0" *-+([0-9]).txt | \
jq -sR 'split("\u0000") |
map({filenumber:capture(".*-(?<n>.*)\\.txt").n,
name:.})'
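For example, with filename-1.txt and filename-2.txt in the current directory, this should print:
[
  {
    "filenumber": "1",
    "name": "filename-1.txt"
  },
  {
    "filenumber": "2",
    "name": "filename-2.txt"
  }
]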
Very easy for the command-line tool xidel and its integrated EXPath File Module:
$ xidel -se '
array{
for $x in file:list(.,false(),"*.txt")
return {
"filenumber":extract($x,"(\d+)\.txt",1),
"name":$x
}
}
'
Intuitively, I'd say you can do this with jq. However, in practice I've rarely been able to achieve what I wanted with jq :-)
With some lunch break puzzling, I've come up with this beauty:
ls | jq -R '{filenumber:input_line_number, name:.}' | jq -s .
Instead of ls you could use any other command that produces a newline separated list of strings.
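A small variation on that (hypothetical, not from the answer) pulls the number out of the filename itself instead of relying on the line number:
ls | jq -R 'capture("-(?<filenumber>[0-9]+)\\.txt$") + {name: .}' | jq -s .
Filenames that don't match the pattern are silently dropped by capture.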
I tried multiple examples to achieve my exact use case and finally found the following working exactly how I wanted. Thanks!
for file in $(ls *.txt); do
  file_version=$(echo $file | sed 's/\(^.*-\)\(.*\)\(.txt.*$\)/\2/')
  jq -n --arg name "$file_version" --arg path "$file" '{name: $name, path: $path}'
done | jq -n '.urls |= [inputs]'

How to iterate over an array of objects using jq

I have a javascript file which prints a JSON array of objects:
// myfile.js output
[
{ "id": 1, "name": "blah blah", ... },
{ "id": 2, "name": "xxx", ... },
...
]
In my bash script, I want to iterate through each object.
I've tried the following, but it doesn't work.
#!/bin/bash
output=$(myfile.js)
for row in $(echo ${output} | jq -c '.[]'); do
echo $row
done
You are trying to invoke myfile.js as a command. You need this:
output=$(cat myfile.js)
instead of this:
output=$(myfile.js)
But even then, your current approach isn't going to work well if the data has whitespace in it (which it does, based on the sample you posted). I suggest the following alternative:
jq -c '.[]' < myfile.js |
while read -r row
do
echo "$row"
done
Output:
{"id":1,"name":"blah blah"}
{"id":2,"name":"xxx"}
Edit:
If your data is arising from a previous process invocation, such as mongo in your case, you can pipe it directly to jq (to remain portable), like this:
mongo myfile.js |
jq -c '.[]' |
while read -r row
do
echo "$row"
done
How can I make jq -c '.[]' < (mongo myfile.js) work?
In a bash shell, you would write an expression along the following lines:
while read -r line ; do .... done < <(mongo myfile.js | jq -c .[])
Note that there are two occurrences of "<" in the above expression.
Also, the above assumes mongo is emitting valid JSON. If it emits //-style comments, those would have somehow to be removed.
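A crude sketch of stripping such comments before jq, assuming // never appears inside a JSON string value:
mongo myfile.js | sed 's|//.*$||' | jq -c '.[]'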
Comparison with piping into while
If you use the idiom:
... | while read -r line ; do .... done
then the bindings of any variables in .... will be lost, because the while loop runs in a subshell.
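A minimal illustration of that, assuming bash with its default settings (shopt lastpipe off):
count=0
printf '1\n2\n3\n' | while read -r line; do count=$((count + 1)); done
echo "$count"   # prints 0: the loop ran in a subshell
count=0
while read -r line; do count=$((count + 1)); done < <(printf '1\n2\n3\n')
echo "$count"   # prints 3: the loop ran in the current shell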

How to az group list using ubuntu bash foreach json array name value

The Azure CLI command az group list returns data like:
[
"demo3",
"demo",
"demo2",
"NetworkWatcherRG"
]
I'd like to loop over its values in Ubuntu bash and print the result below:
demo3
demo
demo2
NetworkWatcherRG
What I've tried:
I tried the script below:
jq -c '.[]' $(az group list) | while read i; do echo $i ;done
but it fails with errors.
Your command expands to this (see it for yourself with set -x):
jq -c '.[]' '[' '"demo3",' '"demo",' '"demo2",' '"NetworkWatcherRG"' ']'
The command substitution is replaced with the command output, but jq does not take a JSON body as a parameter: it expects either names of files containing JSON, or a stream on standard input. Since all the parameters ([, "demo3" etc.) are not filenames, you see the errors you do.
You could have Bash make it look like it's a file with process substitution:
jq -c '.[]' <(az group list)
or, more portably, use pipes:
az group list | jq -c '.[]'
Notice that quoting wouldn't help here either: if you ran
jq -c '.[] "$(az group list)"
it would expand to
jq -c '.[]' '[
"demo3",
"demo",
"demo2",
"NetworkWatcherRG"
]'
and jq would try to open a file with the name
[
"demo3",
"demo",
"demo2",
"NetworkWatcherRG"
]
which does not exist.
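As a follow-up sketch (not part of the original answer): since the desired output has no quotes around the names, jq's -r flag is handy, and the whole loop could look like:
az group list | jq -r '.[]' | while read -r name; do
  echo "$name"
done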