I am trying to write a shell script that loops through a JSON file and does some logic based on each object's properties. The script was initially written for Windows, but it does not work properly on macOS.
The initial code is as follows:
documentsJson=""
jsonStrings=$(cat "$file" | jq -c '.[]')
while IFS= read -r document; do
# Get the properties from the document (JSON string)
currentKey=$(echo "$document" | jq -r '.Key')
encrypted=$(echo "$document" | jq -r '.IsEncrypted')
# If not encrypted then don't do anything with it
if [[ $encrypted != true ]]; then
echoComment " Skipping '$currentKey' as it's not marked for encryption"
documentsJson+="$document,"
continue
fi
# ... some more code
done <<< $jsonStrings
When run on macOS, the whole file is processed at once, so the loop does not iterate over the objects.
The closest I got to making it work - after trying a lot of suggestions - is as follows:
jq -r '.[]' "$file" | while read i; do
for config in $i ; do
currentKey=$(echo "$config" | jq -r '.Key')
echo "$currentKey"
done
done
The console result is parse error: Invalid numeric literal at line 1, column 6
I just cannot find a proper way of grabbing the JSON object and reading its properties.
JSON file example
[
{
"Key": "PdfMargins",
"Value": {
"Left":0,
"Right":0,
"Top":20,
"Bottom":15
}
},
{
"Key": "configUrl",
"Value": "someUrl",
"IsEncrypted": true
}
]
Thank you in advance!
Try putting $jsonStrings in double quotes: done <<< "$jsonStrings"
Otherwise standard shell word splitting applies to the variable expansion, and you probably want to retain the line structure of jq's output.
You could also use this in bash:
while IFS= read -r document; do
...
done < <(jq -c '.[]' < "$file")
That would save some resources. I am not sure about making this work on macOS, though, so test this first.
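Putting both suggestions together, a corrected version of the loop from the question might look like this (a sketch; echoComment and the loop body are taken from the question as-is):
documentsJson=""
while IFS= read -r document; do
    # Get the properties from the document (JSON string)
    currentKey=$(echo "$document" | jq -r '.Key')
    encrypted=$(echo "$document" | jq -r '.IsEncrypted')
    # If not encrypted then don't do anything with it
    if [[ $encrypted != true ]]; then
        echoComment " Skipping '$currentKey' as it's not marked for encryption"
        documentsJson+="$document,"
        continue
    fi
    # ... some more code
done < <(jq -c '.[]' "$file")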
Hello, I am trying to write a Unix script/command where I have to list all filenames from a given directory matching the format string-{number}.txt (e.g. filename-1.txt, filename-2.txt), from which I have to build a JSON object. Any pointers would be helpful.
[{
"filenumber": "1",
"name": "filename-1.txt"
},
{
"filenumber": "2",
"name": "filename-2.txt"
}
]
In the above JSON, filenumber should be read from the {number} part of each filename.
A single call to jq should suffice:
shopt -s extglob
printf "%s\0" *-+([0-9]).txt | \
jq -sR 'split("\u0000") |
map({filenumber:capture(".*-(?<n>.*)\\.txt").n,
name:.})'
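With the two example filenames from the question in the current directory, this should print something like:
[
  {
    "filenumber": "1",
    "name": "filename-1.txt"
  },
  {
    "filenumber": "2",
    "name": "filename-2.txt"
  }
]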
This is very easy with the command-line tool xidel and its integrated EXPath File Module:
$ xidel -se '
array{
for $x in file:list(.,false(),"*.txt")
return {
"filenumber":extract($x,"(\d+)\.txt",1),
"name":$x
}
}
'
Intuitively, I'd say you can do this with jq. However, in practice I've rarely been able to achieve what I wanted with jq :-)
With some lunch break puzzling, I've come up with this beauty:
ls | jq -R '{filenumber:input_line_number, name:.}' | jq -s .
Instead of ls you could use any other command that produces a newline separated list of strings.
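Note that input_line_number gives the item's position in the listing, not the number embedded in the filename; if the two can differ, a variant that extracts the digits instead (a sketch, assuming jq 1.5+ for capture) would be:
ls *.txt | jq -R -s 'split("\n")
  | map(select(length > 0)
        | {filenumber: capture(".*-(?<n>[0-9]+)\\.txt").n, name: .})'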
I have tried multiple examples to achieve my exact use case and finally found the following to work exactly how I wanted. Thanks!
for file in $(ls *.txt); do
    file_version=$(echo "$file" | sed 's/\(^.*-\)\(.*\)\(.txt.*$\)/\2/')
    jq -n --arg name "$file_version" --arg path "$file" '{filenumber: $name, name: $path}'
done | jq -n '.urls |= [inputs]'
I have a JavaScript file which prints a JSON array of objects:
// myfile.js output
[
{ "id": 1, "name": "blah blah", ... },
{ "id": 2, "name": "xxx", ... },
...
]
In my bash script, I want to iterate through each object.
I've tried the following, but it doesn't work.
#!/bin/bash
output=$(myfile.js)
for row in $(echo ${output} | jq -c '.[]'); do
echo $row
done
You are trying to invoke myfile.js as a command. You need this:
output=$(cat myfile.js)
instead of this:
output=$(myfile.js)
But even then, your current approach isn't going to work well if the data has whitespace in it (which it does, based on the sample you posted). I suggest the following alternative:
jq -c '.[]' < myfile.js |
while read -r row
do
echo "$row"
done
Output:
{"id":1,"name":"blah blah"}
{"id":2,"name":"xxx"}
Edit:
If your data comes from a previous process invocation, such as mongo in your case, you can pipe it directly to jq (to remain portable), like this:
mongo myfile.js |
jq -c '.[]' |
while read -r row
do
echo "$row"
done
How can I make jq -c '.[]' < (mongo myfile.js) work?
In a bash shell, you would write an expression along the following lines:
while read -r line ; do .... done < <(mongo myfile.js | jq -c .[])
Note that there are two occurrences of "<" in the above expression.
Also, the above assumes mongo is emitting valid JSON. If it emits //-style comments, those would somehow have to be removed.
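For example, if the // comments only ever appear on lines of their own, a rough sketch for stripping them before jq could look like this (it will not handle // inside string values or trailing comments):
# Assumption: // comments only appear on whole lines, never inside JSON strings
mongo myfile.js |
  sed '/^[[:space:]]*\/\//d' |
  jq -c '.[]' |
  while read -r row
  do
      echo "$row"
  done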
Comparison with piping into while
If you use the idiom:
... | while read -r line ; do .... done
then any variable bindings made inside .... will be lost, because the pipeline runs the while loop in a subshell.
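A quick way to see the difference (just a sketch, using the myfile.js data from above):
# Variant 1: pipeline - count is incremented in a subshell and lost
count=0
jq -c '.[]' < myfile.js | while read -r row; do count=$((count + 1)); done
echo "$count"    # prints 0

# Variant 2: process substitution - the loop runs in the current shell
count=0
while read -r row; do count=$((count + 1)); done < <(jq -c '.[]' < myfile.js)
echo "$count"    # prints the number of objects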
Trying to write a bash script that replaces values in a JSON file, we are running into issues with environment variables that contain whitespace.
Given this original JSON file:
{
"version": "base",
"myValue": "to be changed",
"channelId": 0
}
We want to run a command to update some variables in it, so that after we run:
CHANNEL_ID=1701 MY_VALUE="new value" ./test.sh
The JSON should look like this:
{
"version": "base",
"myValue": "new value",
"channelId": 1701
}
Our script currently looks something like this:
#!/bin/sh
echo $MY_VALUE
echo $CHANNEL_ID
function replaceValue {
if [ -z $2 ]; then echo "Skipping $1"; else jq --argjson newValue \"${2}\" '. | ."'${1}'" = $newValue' build/config.json > tmp.json && mv tmp.json build/config.json; fi
}
replaceValue channelId ${CHANNEL_ID}
replaceValue myValue ${MY_VALUE}
In the above, all values are replaced as strings, and the strings get truncated at whitespace. We keep alternating between this issue and a version of the code where the substitutions just stop working entirely.
This is surely an issue with expansions, but we would love to figure out how we can:
- Replace values in the JSON with both strings and numbers.
- Use whitespace in the strings we pass to our script.
You don't have to mess with --arg or --argjson to import the environment variables into jq's context; it can read the environment on its own. You don't need a separate script, just set the values along with the invocation of jq:
CHANNEL_ID=1701 MY_VALUE="new value" \
jq '{"version": "base", myValue: env.MY_VALUE, channelId: env.CHANNEL_ID}' build/config.json
Note that in the case above, the variables need not be exported globally, only locally to the jq command. This way you don't have to export multiple variables into the shell and pollute the environment, just the ones jq needs to construct the desired JSON.
To write the changes back to the original file, do > tmp.json && mv tmp.json build/config.json, or more cleanly, use the sponge(1) utility from the moreutils package. If present, you can pipe the output of jq as
| sponge build/config.json
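One caveat to add (not part of the answer above): values read via env are always strings, so channelId would end up as "1701" rather than 1701. If it has to stay numeric, a sketch that updates the existing file and converts explicitly could look like:
CHANNEL_ID=1701 MY_VALUE="new value" \
jq '.myValue = env.MY_VALUE | .channelId = (env.CHANNEL_ID | tonumber)' build/config.json | sponge build/config.json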
Pass variables with --arg. Do:
jq --arg key "$1" --arg value "$2" '.[$key] = $value'
Notes:
#!/bin/sh indicates that this is a POSIX shell script, not bash. Use #!/bin/bash in bash scripts.
function replaceValue { comes from the ksh shell. Prefer replaceValue() { to declare functions; the function keyword is obsolete, deprecated syntax in bash.
Use newlines in your script to make it readable.
--argjson passes a JSON-formatted argument, not a string. Use --arg for strings.
\"${2}\" doesn't quote the $2 expansion - it only prefixes and suffixes the string with a literal ". Because the expansion is not quoted, word splitting is performed, which causes your input to be split on whitespace when building the arguments for jq.
Remember to quote variable expansions.
Use http://shellcheck.net to check your scripts.
. | means nothing in jq; it's like echo $(echo $(echo)). You could write jq '. | . | . | . | . | .' any number of times - it passes the same thing through. Just write the filter you actually want.
Do:
#!/bin/bash
echo "$MY_VALUE"
echo "$CHANNEL_ID"
replaceValue() {
if [ -z "$2" ]; then
echo "Skipping $1"
else
jq --arg key "$1" --arg value "$2" '.[$key] = $value' build/config.json > tmp.json &&
mv tmp.json build/config.json
fi
}
replaceValue channelId "${CHANNEL_ID}"
replaceValue myValue "${MY_VALUE}"
Edit: replaced ."\($key)" with the simpler .[$key]
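To also cover the numeric channelId from the question (not part of the answer above, just a sketch), a second helper can pass the value with --argjson so it stays a number:
replaceNumber() {
    if [ -z "$2" ]; then
        echo "Skipping $1"
    else
        # --argjson parses the value as JSON, so 1701 stays a number
        jq --arg key "$1" --argjson value "$2" '.[$key] = $value' build/config.json > tmp.json &&
            mv tmp.json build/config.json
    fi
}
replaceNumber channelId "${CHANNEL_ID}"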
jq allows you to build new objects:
MY_VALUE=foo;
CHANNEL_ID=4
echo '{
"version": "base",
"myValue": "to be changed",
"channelId": 0
}' | jq ". | {\"version\": .version, \"myValue\": \"$MY_VALUE\", \"channelId\": $CHANNEL_ID}"
The . selects the whole input and pipes it (|) into the construction of a new object (marked by {}). For version it selects .version from the input, but you can set your own values for the other two. We use double quotes to allow Bash variable expansion, which means escaping the double quotes in the JSON.
You'll need to adapt my snippet above to scriptify it.
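If the values can contain quotes or other characters that are special to jq, a safer variant of the same construction (a sketch, adapted to read the question's build/config.json) passes the values in with --arg/--argjson instead of interpolating them into the program:
# $MY_VALUE stays a string, $CHANNEL_ID is parsed as JSON (here: a number)
jq --arg myValue "$MY_VALUE" --argjson channelId "$CHANNEL_ID" \
   '{version: .version, myValue: $myValue, channelId: $channelId}' build/config.json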
I have the following file
[
{
"id": 1,
"name": "Arthur",
"age": "21"
},
{
"id": 2,
"name": "Richard",
"age": "32"
}
]
To display login and id together, I am using the following command
$ jq '.[] | .name' test
"Arthur"
"Richard"
But when I put it in a shell script and assign the output to a variable, the whole output is displayed on a single line, like below:
#!/bin/bash
names=$(jq '.[] | .name' test)
echo $names
$ ./script.sh
"Arthur" "Richard"
I want a line break at every iteration, similar to how it works on the command line.
There are a couple of issues in the information you have provided. The jq filter .[] | .login, .id will not produce the output you claimed on jq 1.5. For your original JSON
{
"login":"dmaxfield",
"id":7449977
}
{
"login":"stackfield",
"id":2342323
}
It will produce four lines of output:
jq -r '.login, .id' < json
dmaxfield
7449977
stackfield
2342323
If you are interested in storing them side by side, you need to use string interpolation:
jq -r '"\(.login), \(.id)"' < json
dmaxfield, 7449977
stackfield, 2342323
If you feel the output stored in a variable is not working, it is probably because of a lack of double quotes when you printed the variable in the shell.
jqOutput=$(jq -r '"\(.login), \(.id)"' < json)
printf "%s\n" "$jqOutput"
dmaxfield, 7449977
stackfield, 2342323
This way the embedded newlines in the command output are not swallowed by the shell.
For your updated JSON (a totally new one compared to the old one), all you need to do is:
jqOutput=$(jq -r '.[] | .name' < json)
printf "%s\n" "$jqOutput"
Arthur
Richard
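If you would rather have each name in its own array element than in one multi-line string, readarray (bash 4+) is another option - a sketch, not part of the answer above:
# Read each output line of jq into one array element
readarray -t names < <(jq -r '.[] | .name' < json)
for name in "${names[@]}"; do
    echo "$name"
done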
In case the .login or .id contains embedded spaces or other characters that might cause problems, a more robust approach is to ensure each JSON value is on a separate line. Consider, for example:
jq -c .login,.id input.json | while read login ; do read id; echo login="$login" and id="$id" ; done
login="dmaxfield" and id=7449977
login="stackfield" and id=2342323
I have a JSON file in which I want to modify a particular value, but the terminal only displays the JSON with the modified value; it does not actually change the value in the file. Sample JSON:
{
name: 'abcd',
age: 30,
address: 'abc'
}
I want to change the value of address in the file itself but so far I've been unable to do so. I tried using:
jq '.address = "abcde"' test.json
but it didn't work. Any suggestions?
Use a temporary file; that's what any program that claims to do in-place editing actually does.
tmp=$(mktemp)
jq '.address = "abcde"' test.json > "$tmp" && mv "$tmp" test.json
If the address isn't hard-coded, pass the correct address via a jq argument:
address=abcde
jq --arg a "$address" '.address = $a' test.json > "$tmp" && mv "$tmp" test.json
AFAIK jq does not support in-place editing, so you must redirect to a temporary file first and then replace your original file with it, or use the sponge utility from the moreutils package, like this:
jq '.address = "abcde"' test.json | sponge test.json
There are other techniques to "redirect to the same file", like saving your output in a variable, etc. "Unix & Linux Stack Exchange" is a good place to start if you want to learn more about this.
Temp files add more complexity when they are not needed (unless you are truly dealing with JSON files so large that they cannot fit in memory: GBs to hundreds of GBs or TBs, depending on how much RAM/parallelism you have).
The pure bash way:
contents="$(jq '.address = "abcde"' test.json)" && \
echo -E "${contents}" > test.json
Pros
No temp file to juggle
Pure bash
Don't need an admin to install sponge, which is not installed by default
Simpler
Cons
This works perfectly fine for JSON because it cannot contain a literal NUL character. If you were to try this outside the JSON arena, it would fail when a NUL is encountered (and you would have to do some encoding/decoding workarounds). Bash variables cannot store literal NULs.
Note: this cannot be combined into "one command" (as #codekandis suggested), since redirection sometimes starts before the left-hand side (LHS) of an expression is run, and starting the redirection before running jq erroneously empties the file; hence two separate commands. It may "seem" to work when you try it, but this is misleading and has a very high probability of failing as soon as the circumstances change.
Update: added the -E option to disable escape-character interpretation, just in case you are on a system where escapes are interpreted by default (which I've never actually seen).
Just to add to chepner's answer, if you want it in a shell script:
test.json
{
"name": "abcd",
"age": 30,
"address": "abc"
}
script.sh
#!/bin/bash
address="abcde"
age=40
# Strings:
jq --arg a "${address}" '.address = $a' test.json > "tmp" && mv "tmp" test.json
# Integers:
jq --argjson a "${age}" '.age = $a' test.json > "tmp" && mv "tmp" test.json
An example for nested JSON, changing single and multiple values.
config.json
{
"Parameters": {
"Environment": "Prod",
"InstanceType": "t2.micro",
"AMIID": "ami-02d8e11",
"ConfigRegion": "eu-west-1"
}
}
With the command below, you can edit multiple values:
tmp=$(mktemp)
jq '.Parameters.AMIID = "ami-02d8sdfsdf" | .Parameters.Environment = "QA"' config.json > "$tmp" && mv "$tmp" config.json
With the command below, you can edit a single value:
tmp=$(mktemp)
jq '.Parameters.AMIID = "ami-02d8sdfsdf"' config.json > "$tmp" && mv "$tmp" config.json
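If the replacement values come from shell variables rather than being hard-coded, the same pattern works with --arg (a sketch; AMI_ID and ENVIRONMENT are just example variable names):
AMI_ID="ami-02d8sdfsdf"
ENVIRONMENT="QA"
tmp=$(mktemp)
jq --arg ami "$AMI_ID" --arg environment "$ENVIRONMENT" \
   '.Parameters.AMIID = $ami | .Parameters.Environment = $environment' config.json > "$tmp" && mv "$tmp" config.json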
This should work:
address=aaaaa
echo $(jq --arg a "$address" '.address = ($a)' test.json) > test.json
For whatever reason, without the echo it makes a bin file and my Python script was not able to parse it.
I took the best of a couple answers here and here.
This uses a parameter named actionname as an input to an assignment of the name property at the document level. ACTION_NAME is just an envvar I want to use as the replacement value.
contents="$(jq --arg actionname ${ACTION_NAME} '.name = $actionname' ./${ACTION_NAME}/package.json)" && \
echo -E "${contents}" > ${ACTION_NAME}/package.json;
I didn't like any of the solutions and created the sde utility.
pip install sde
Then, e.g. for the following JSON data:
{
"Parameters": {
"Environment": "Prod",
"InstanceType": "t2.micro",
"AMIID": "ami-02d8e11",
"ConfigRegion": "eu-west-1"
}
}
you can simply do:
sde Parameters.Environment Dev test.json