Bring multiple matches with regex in bash, using while - json

So I'm trying to make a bash script that fetches the current weather plus some extra info. It calls an API with curl, gets the response as JSON, and then parses the information with a regex. The API returns the temp for each hour of the day, which is a lot.
#!/bin/bash
text='{
"temp": 286.92,
"feels_like": 286.05,
"temp_min": 286.17,
"temp_max": 286.92,
"pressure": 1019,
"sea_level": 1019,
"grnd_level": 959,
"humidity": 65,
"temp_kf": 0.75,
"temp": 263.92, #<--these 2 "temp" are just examples I put
"temp": 277.92, #in here so I can work better
}'
regex='((temp)...([0-9]+[.]?[0-9]+))'
while [[ $text =~ $regex ]]; do
echo ${BASH_REMATCH}
done
I want to get each one of the "temp": values and print them to the user. It didn't work with an if statement because that matches only a single time, so I found that this can be done with a while loop. But the one I wrote results in an infinite loop of results!
My desired result is:
temp: 286.92
temp: 263.92
temp: 277.92
But what I get is an infinite loop of:
temp: 286.92
temp: 286.92
temp: 286.92
temp: 286.92
temp: 286.92
temp: 286.92
temp: 286.92
temp: 286.92
temp: 286.92
temp: 286.92
...

Read each line of the text, and do the pattern matching.
regex='((temp)...([0-9]+[.]?[0-9]+))'
IFS=$'\n'; for line in $text; do
if [[ $line =~ $regex ]]; then
echo ${BASH_REMATCH[0]}
fi
done
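Another way to avoid the infinite loop, without splitting into lines, is to consume the matched portion of the string after each iteration, so the next `[[ =~ ]]` test sees new input. A minimal sketch on a trimmed-down sample (the sample text and the tightened regex are my own, adapted from the question):

```shell
#!/bin/bash
# Trimmed-down stand-in for the API response in the question
text='"temp": 286.92,
"feels_like": 286.05,
"temp": 263.92,
"temp": 277.92,'

# Anchor on the quoted key so "temp_min" etc. are not matched
regex='"(temp)": ([0-9]+\.[0-9]+)'
results=()
while [[ $text =~ $regex ]]; do
    results+=("${BASH_REMATCH[1]}: ${BASH_REMATCH[2]}")
    # Drop everything up to and including this match so the loop advances
    text=${text#*"${BASH_REMATCH[0]}"}
done
printf '%s\n' "${results[@]}"
```

Stripping `${BASH_REMATCH[0]}` off the front of `$text` is what breaks the infinite loop: each pass matches against the remainder only.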

Try learning to use jq to deal with JSON in bash. It is powerful and is designed exactly for this.
If your curl query returns a single JSON object, this command will do the job:
curl https://someurl | jq -r '"temp: \(.temp)"'
If it returns a list of JSON objects, use the following instead:
curl https://someurl | jq -r '"temp: \(.[].temp)"'
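To try both forms without the API, a here-string can stand in for `curl https://someurl`; the JSON shapes below are assumptions modeled on the question's snippet:

```shell
#!/bin/bash
# Single-object case: \(.temp) interpolates the value into the string
single=$(jq -r '"temp: \(.temp)"' <<< '{"temp": 286.92, "feels_like": 286.05}')
echo "$single"

# List-of-objects case: .[].temp emits one line per element
many=$(jq -r '"temp: \(.[].temp)"' <<< '[{"temp": 263.92}, {"temp": 277.92}]')
echo "$many"
```

The `-r` flag prints raw strings instead of JSON-quoted ones, which is what you want for user-facing output.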

Did it.
regex='((temp)...([0-9]+[.]?[0-9]+))'
IFS=$'\n ,;'; while read -r line; do
#echo $line
if [[ $line =~ $regex ]]; then
echo $BASH_REMATCH
fi
done <<< "$text"
Don't even know if IFS=$'\n ,;' is doing something, but I searched a lot about for loops and while loops and it worked. Thanks everyone for the help ^^


How to parse json in shell script using jq add store into array in shell script [duplicate]

I'm parsing a JSON response with a tool called jq.
The output from jq will give me a list of full names in my command line.
I have the variable getNames which contains JSON, for example:
{
"count": 49,
"user": [{
"username": "jamesbrown",
"name": "James Brown",
"id": 1
}, {
"username": "matthewthompson",
"name": "Matthew Thompson",
"id": 2
}]
}
I pass this through JQ to filter the json using the following command:
echo $getNames | jq -r .user[].name
Which gives me a list like this:
James Brown
Matthew Thompson
I want to put each one of these entries into a bash array, so I enter the following commands:
declare -a myArray
myArray=( `echo $getNames | jq -r .user[].name` )
However, when I try to print the array using:
printf '%s\n' "${myArray[@]}"
I get the following:
James
Brown
Matthew
Thompson
How do I ensure that a new index is created after a new line and not a space? Why are the names being separated?
Thanks.
A simple script in bash to feed each line of the output into the array myArray.
#!/bin/bash
myArray=()
while IFS= read -r line; do
[[ $line ]] || break # break if line is empty
myArray+=("$line")
done < <(jq -r .user[].name <<< "$getNames")
# To print the array
printf '%s\n' "${myArray[@]}"
Just use the mapfile command to read multiple lines into an array, like this:
mapfile -t myArray < <(jq -r .user[].name <<< "$getNames")
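Putting it together with a sample value of `getNames` (trimmed from the question's JSON, keeping only the fields that are read):

```shell
#!/bin/bash
# Sample input, abridged from the question
getNames='{"count": 2, "user": [{"name": "James Brown"}, {"name": "Matthew Thompson"}]}'

# -t strips the trailing newline from each element; process substitution
# keeps myArray in the current shell (a pipe would run mapfile in a subshell)
mapfile -t myArray < <(jq -r '.user[].name' <<< "$getNames")
printf '%s\n' "${myArray[@]}"
```

Because mapfile splits on newlines only, the space inside "James Brown" survives intact.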

shell script parsing issue: unexpected INVALID_CHARACTER (Windows cmd shell quoting issues?)

What I am doing
I have one JSON file, sonar-report.json. I want to iterate over sonar-report.json in a shell script to read the values of the JSON.
To parse JSON file I am using jq https://stedolan.github.io/jq/
So I was trying to execute the following code in a shell script:
alias jq=./jq-win64.exe
for key in $(jq '.issues | keys | .[]' sonar-report.json); do
echo "$key"
line=$(jq -r ".issues[$key].line" sonar-report.json)
done
Problem
When I execute this, the console gives me the error:
jq: error: syntax error, unexpected INVALID_CHARACTER (Windows cmd shell quoting issues?) at <top-level>, line 1:
If I update my script above and add a static array index, then the script works fine:
alias jq=./jq-win64.exe
for key in $(jq '.issues | keys | .[]' sonar-report.json); do
echo "$key"
line0=$(jq -r ".issues[0].line" sonar-report.json)
line1=$(jq -r ".issues[1].line" sonar-report.json)
done
So in the end, what I want:
I want to iterate over the values and print them to the console, like
alias jq=./jq-win64.exe
for key in $(jq '.issues | keys | .[]' sonar-report.json); do
line=$(jq -r ".issues[$key].line" sonar-report.json)
echo $line
done
so the output should be
15
This is my JSON file as sonar-report.json
{
"issues": [
{
"key": "016B7970D27939AEBD",
"component": "bits-and-bytes:src/main/java/com/catalystone/statusreview/handler/StatusReviewDecisionLedHandler.java",
"line": 15,
"startLine": 15,
"startOffset": 12,
"endLine": 15,
"endOffset": 14,
"message": "Use the \"equals\" method if value comparison was intended.",
"severity": "MAJOR",
"rule": "squid:S4973",
"status": "OPEN",
"isNew": true,
"creationDate": "2019-06-21T15:19:18+0530"
},
{
"key": "AWtqCc-jtovxS8PJjBiP",
"component": "bits-and-bytes:src/test/java/com/catalystone/statusreview/service/StatusReviewInitiationSerivceTest.java",
"message": "Fix failing unit tests on file \"src/test/java/com/catalystone/statusreview/service/StatusReviewInitiationSerivceTest.java\".",
"severity": "MAJOR",
"rule": "common-java:FailedUnitTests",
"status": "OPEN",
"isNew": false,
"creationDate": "2019-06-18T15:32:08+0530"
}
]
}
Please help me, thanks in advance.
This looks to me like an instance of Windows/Unix line-ending incompatibility, indicated in jq bugs 92 (for Cygwin) and 1870 (for MSYS2).
Any of the workarounds indicated in those bug reports should work, but once the fix gets into the release binary (presumably v1.7), the simplest solution is to use the new -b command-line option. (The option is available in recent jq preview builds; see the second bug report listed above):
for key in $(jq -b '.issues | keys | .[]' sonar-report.json); do
line=$(jq -rb ".issues[$key].line" sonar-report.json)
# I added quotes in the next line, because it's better style.
echo "$line"
done
Until the next version of jq is available, or if you don't want to upgrade for some reason, a good workaround is to just remove the CRs by piping the output of jq through tr -d '\r':
for key in $(jq '.issues | keys | .[]' sonar-report.json | tr -d '\r'); do
line=$(jq -r ".issues[$key].line" sonar-report.json | tr -d '\r')
echo "$line"
done
However, as pointed out in a comment by Cyrus, you probably don't need to iterate line-by-line in a shell loop, which is incredibly inefficient since it leads to reparsing the entire JSON input many times. You can use jq itself to iterate, with the much simpler:
jq '.issues[].line' sonar-report.json
which will parse the JSON file just once, and then produce each .line value in the file. (You probably still want to use the -b command-line option or other workaround, depending on what you intend to do with the output.)
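To see the single-pass form in action without the real report, here is a sketch against a two-issue stand-in for sonar-report.json (note the second issue, as in the question's file, has no "line" field, so jq emits null for it):

```shell
#!/bin/bash
# Minimal stand-in for sonar-report.json
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
{"issues": [{"key": "016B7970D27939AEBD", "line": 15}, {"key": "AWtqCc-jtovxS8PJjBiP"}]}
EOF

# One jq invocation, one parse of the file, one line of output per issue
lines=$(jq '.issues[].line' "$tmp")
echo "$lines"
rm -f "$tmp"
```

Compare this with the loop version, which reparses the whole file once per key: for a report with hundreds of issues the difference is substantial.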

JQ error: is not defined at <top-level> when trying to add values to jq template

I have a .jq template in which I want to update values under the samples list, formatted as:
{
"test": "abc",
"p": "1",
"v": "1.0.0",
"samples": [
{
"uptime": $uptime,
"curr_connections": $curr_connections,
"listen_disabled_num": $listen_disabled_num,
"conn_yields": $conn_yields,
"cmd_get": $cmd_get,
"cmd_set": $cmd_set,
"bytes_read": $bytes_read,
"bytes_written": $bytes_writtem,
"get_hits": $get_hits,
"rejected_connections": $rejected_connections,
"limit_maxbytes": $limit_maxbytes,
"cmd_flush": $cmd_flush
}
]
}
My shell script to do this is below. I am basically running a command to pull some memcached stats and want to insert some of the results into the jq template as key/values.
JQ=`cat template.jq`
SAMPLES=(uptime curr_connections listen_disabled_num conn_yields cmd_get cmd_set cmd_flush bytes_read bytes_written get_hits rejected_connections limit_maxbytes)
for metric in ${SAMPLES[*]}
do
KEY=$(echo stats | nc $HOST $PORT | grep $metric | awk '{print $2}')
VALUE=$(echo stats | nc $HOST $PORT | grep $metric | awk '{print $3}')
echo "Using KEY: $KEY with value: $VALUE"
jq -n --argjson $KEY $VALUE -f template.jq
done
Not sure if this is the best way to handle this scenario, but I am getting a ton of errors such as:
jq: error: conn_yields/0 is not defined at <top-level>, line 12:
"conn_yields": $conn_yields,
jq: error: cmd_get/0 is not defined at <top-level>, line 13:
"cmd_get": $cmd_get,
jq: error: cmd_set/0 is not defined at <top-level>, line 14:
"cmd_set": $cmd_set,
If you are going to invoke jq using -f template.jq, then each of the $-variables in template.jq will have to be set separately on the command-line, one by one. In your case, this does not look like a very happy option.
If you are stuck with template.jq as it is, then it will be hard slogging, though there are alternatives besides setting the $-variables on the command line.
Please see https://github.com/stedolan/jq/wiki/Cookbook#using-jq-as-a-template-engine in the jq Cookbook for an alternative to using $-variables. Consider for example the implications of this illustration of "destructuring":
jq -nc '{a:1,b:2} as {a: $a, b:$b} | [$a,$b]'
[1,2]
Another alternative
In your particular case, you could replace all the "$" characters in template.jq with ".", and then pass in a JSON object with the appropriate keys; e.g. change $uptime to .uptime, and then include a key/value pair for uptime.
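A sketch of that second alternative, with a hypothetical two-field cut of the template written inline as a filter string rather than in template.jq, purely for brevity (the real template would keep all twelve fields):

```shell
#!/bin/bash
# $uptime -> .uptime, $curr_connections -> .curr_connections, etc.
filter='{samples: [{uptime: .uptime, curr_connections: .curr_connections}]}'

# One JSON object carries every value; no per-variable --argjson needed
stats='{"uptime": 100, "curr_connections": 5}'
result=$(jq -c "$filter" <<< "$stats")
echo "$result"
```

The shell loop would then only need to build the `stats` object once (e.g. with a second jq call or printf) instead of invoking jq per metric.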

Bash Jq parse json string

I have to call a file and pass it a JSON string as a parameter
(suppose my file is called test.sh); from bash I need to do something like this:
./test.sh "[{\"username\":\"user1\",\"password\":\"pwd1\",\"group\":\"usergroup1\"},{\"username\":\"user2\",\"password\":\"pwd2\",\"group\":\"usergroup2\"},{\"username\":\"user3\",\"password\":\"pwd3\",\"group\":\"usergroup3\"}]"
and the content of test.sh is the following
#!/bin/bash
#read the json
system_user="$1"
printf "$system_user"
accounts=($(jq -s ".[]" <<< $system_user))
printf "$accounts"
for account in "${accounts[@]}"
do
printf "\n\n$account\n\n"
done
the output of -> printf "$system_user" is
[{"username":"user1","password":"pwd1","group":"usergroup1"},{"username":"user2","password":"pwd2","group":"usergroup2"},{"username":"user3","password":"pwd3","group":"usergroup3"}]
but the output of -> printf "$accounts" is something like this
[
[
{
"username":
"user1"
etc. etc. one object for each token :-(
and so on, but what I was expecting is an array of three objects (like you can test on jqplay.org):
{
"username": "user1",
"password": "pwd1",
"group": "usergroup1"
}
{
"username": "user2",
"password": "pwd2",
"group": "usergroup2"
}
{
"username": "user3",
"password": "pwd3",
"group": "usergroup3"
}
In this way I can do a foreach over ${accounts[@]}.
What am I doing wrong?
Thank you
With the -c option, you can print each JSON object on a single line, making it easier to populate the array you want.
$ readarray -t arr < <(jq -c '.[]' <<< "[{\"username\":\"user1\",\"password\":\"pwd1\",\"group\":\"usergroup1\"},{\"username\":\"user2\",\"password\":\"pwd2\",\"group\":\"usergroup2\"},{\"username\":\"user3\",\"password\":\"pwd3\",\"group\":\"usergroup3\"}]")
$ printf "Object: %s\n" "${arr[@]}"
Object: {"username":"user1","password":"pwd1","group":"usergroup1"}
Object: {"username":"user2","password":"pwd2","group":"usergroup2"}
Object: {"username":"user3","password":"pwd3","group":"usergroup3"}
You are mixing up bash arrays and JSON arrays. When you create the accounts array, bash splits the elements on every run of whitespace. That's why you don't get what you expect. You can try the following:
declare -A accounts
while IFS="=" read -r key value
do
accounts[$key]="$value"
done < <(jq -r "to_entries|map(\"\(.key)=\(.value)\")|.[]" <<< $system_user)
for account in "${accounts[@]}"
do
printf "$account\n"
done
(stolen from here: https://stackoverflow.com/a/26717401/328977)
to get the following output:
{"username":"user1","password":"pwd1","group":"usergroup1"}
{"username":"user2","password":"pwd2","group":"usergroup2"}
{"username":"user3","password":"pwd3","group":"usergroup3"}

Best way to parse JSON-like file using awk/sed

I have a file which contents are like the following:
{application_name, [
{settings, [
{generic_1, [
{key_1, "value"},
{key_2, 1},
{key_3, [something, other]}
]},
{generic_2, [
{key_1, "value"},
{key_3, [something, other]}
]},
{{generic_2, specific_1}, [
{key_3, [entirely, different]}
]},
]}
]}
Now I'm looking for a way to parse this using awk or sed (or something else). What I need is to be able to specify a key, and then get the "blockname" returned.
e.g. if I want all settings for key_3 returned as follows:
generic_1 [something, other]
generic_2 [something, other]
specific_1 [entirely, different]
What would be the best way to approach this?
The best solution for how to parse JSON data with sed or awk is... not to do that with sed or awk. They aren't designed for it.
Use a tool that understands JSON like
perl
python
ruby
javascript
jq
Just about anything else
Using anything like sed or awk on this is going to be fragile (at best).
I do agree with Etan that this is a job for other tools.
This is a GNU awk approach (due to the multiple characters in RS), not a complete solution.
awk -v RS="generic_[0-9]" 'NR==1 {f=RT;next} {$1=$1;n=split($0,a,"[][]");if (a[1]~/}/) {split(a[1],b,"[ }]");f=b[2]};printf "%s [",f;for (i=1;i<=n;i++) if (a[i]~/key_3/) print a[i+1]"]";f=RT}' file
generic_1 [something, other]
generic_2 [something, other]
specific_1 [entirely, different]
Or, more readably:
awk -v RS="generic_[0-9]" '
NR==1 {
f=RT
next}
{
$1=$1
n=split($0,a,"[][]")
if (a[1]~/}/) {
split(a[1],b,"[ }]")
f=b[2]}
printf "%s [",f
for (i=1;i<=n;i++)
if (a[i]~/key_3/)
print a[i+1]"]"
f=RT
}' file