How to dynamically parse a JSON object with shell jq

I got a question about shell's jq. So my JSON object is:
{"1543446000": {"name": "John", "company": "foo"}, "1543359600": {"name": "Kate", "company": "bar"}}
The numbers 1543446000 and 1543359600 are UNIX timestamps. How can I select one of the inner JSON objects by its timestamp, using a shell variable?
My shell script so far:
#!/bin/sh
URL="https://pastebin.com/raw/w7awz7kZ"
export DATE=$(date -d "$today 0" +%s)
JSON=$(curl -H "Accept: application/json" "$URL")
JSON=$(echo $JSON | jq --arg date $DATE '.$date')
echo $JSON
This doesn't seem to work. My intention is to select the inner JSON object keyed by one of the timestamps, each of which is midnight of a given day, so that I end up with today's data set.
Any suggestions?
Greets,
Innoberger

You need to use the full syntax for key access, because the dollar sign precludes using the shorter form. The error message actually points you toward this:
$ jq --arg date 1543359600 '.$date' tmp.json
jq: error: syntax error, unexpected '$' (Unix shell quoting issues?) at <top-level>, line 1:
.$date
jq: error: try .["field"] instead of .field for unusually named fields at <top-level>, line 1:
.$date
jq: 2 compile errors
Note the error message 'try .["field"] instead of .field'. You won't need the quotes, though, since .["$date"] would look up a key literally named $date rather than the value of the variable.
$ jq --arg date 1543359600 '.[$date]' tmp.json
{
  "name": "Kate",
  "company": "bar"
}
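Plugged back into the script from the question, the whole thing might look roughly like this (a sketch: it assumes GNU date for the midnight timestamp and that the pastebin URL returns the JSON shown above):
#!/bin/sh
URL="https://pastebin.com/raw/w7awz7kZ"
# Midnight of today as a UNIX timestamp (GNU date syntax, as in the question).
DATE=$(date -d "$(date +%F)" +%s)
JSON=$(curl -s -H "Accept: application/json" "$URL")
# Pass the timestamp in with --arg and look it up with .[$date].
printf '%s\n' "$JSON" | jq --arg date "$DATE" '.[$date]'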

Related

Retrieving value from JSON object in bash using JQ

I have the following script:
#!/bin/bash
CONFIG_RECORDER=`aws configservice describe-configuration-recorders`
NAME=$(jq -r ‘.ConfigurationRecorders[].name’ <<<“$CONFIG_RECORDER”)
echo $NAME
I am trying to retrieve the value for name from the following JSON object:
{
  "ConfigurationRecorders": [{
    "name": "default",
    "roleARN": "arn:aws:iam::xxxxxxxxxxxx:role/Config-Recorder",
    "recordingGroup": {
      "allSupported": true,
      "includeGlobalResourceTypes": true,
      "resourceTypes": []
    }
  }]
}
When running the script, I am getting an error stating that jq could not open the file. That is because I am trying to pass in the result stored in a variable. How can I move past this? Below is the error:
jq: error: syntax error, unexpected INVALID_CHARACTER, expecting $end
(Unix shell quoting issues?) at <top-level>, line 1:
‘.ConfigurationRecorders[].name’
jq: 1 compile error
NAME=$(jq -r '.ConfigurationRecorders[].name' <<<"$CONFIG_RECORDER")
The real culprit is the typographic quotes ‘…’ and “…” in the original script: jq and the shell only understand plain ASCII quotes, which is what the INVALID_CHARACTER error is complaining about. The corrected command above uses straight quotes. Beyond that:
<<< is known as a here-string.
-r outputs raw strings instead of JSON-encoded strings, so the result is printed without quotes.
[] in the jq query is necessary because ConfigurationRecorders is a JSON array.
$( … ) is just another form of command substitution than backticks. I prefer this one as it is more readable to me.
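To see what -r changes, compare the two forms (assuming CONFIG_RECORDER holds the JSON above):
$ jq '.ConfigurationRecorders[].name' <<<"$CONFIG_RECORDER"
"default"
$ jq -r '.ConfigurationRecorders[].name' <<<"$CONFIG_RECORDER"
default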

Replacing variables in a JSON template using JQ

I want to populate a JSON template with the value "Hello Jack", but the "Hello" part should stay inside the template. Is there any way of doing that? I've tried the code below, but it gives me an error:
jq -n --arg person "Jack" '{my_key: "Hello "$person}'
jq: error: syntax error, unexpected '$', expecting '}' (Unix shell quoting issues?) at <top-level>, line 1:
Use string interpolation syntax like so:
jq -n --arg person Jack '{my_key: "Hello \($person)"}'
And to load the template from a file, use the -f switch:
$ cat template.json
{
  "my_key": "Hello \($person)"
}
$ jq -n --arg person Jack -f template.json
{
  "my_key": "Hello Jack"
}
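Note that the \(...) interpolation accepts any jq expression, not just a bare variable; for example (ascii_upcase needs jq 1.5 or later):
$ jq -n --arg person Jack '{my_key: "Hello \($person | ascii_upcase)"}'
{
  "my_key": "Hello JACK"
}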

"Invalid numeric literal" error from jq trying to modify JSON with variable

I want to pass values into a bash script that will change values in a JSON file using jq. I've been working on this for a while now and I can't get past the first set of errors.
Here is my simple json file:
{
  "0000000": {
    "pogo": "AJHVUYKJBOIHKNNLNM"
  },
  "7000000": {
    "pogo": "PPPVUYKJBOIHKNNLNM"
  }
}
Here is my script
#!/bin/bash
#-- pass into script
pgid="${1}"
tpogo="${2}"
file="file.json"
#-- tmp files
tmp_dir="$(mktemp -d -t 'text.XXXXX' || mktemp -d 2>/dev/null)"
tmp_input1="${tmp_dir}/temp_input1.txt"
if [ ! -n "$2" ]; then
{ echo "Check your command as you are missing a variable ..."; echo "Example: script.sh "00000001" "jvkkjbkjbd" ..."; }
exit 1;
fi
jq -r ".${pgid}.pogo = \"${tpogo}\"" "$file" > "$tmp_input1"
cat "$tmp_input1"
rm -rf "$tmp_dir"
Here are the errors:
jq: error: Invalid numeric literal at EOF at line 1, column 9 (while parsing '.0000000.') at <top-level>, line 1:
.0000000.pogo = "XXXXXXX"
jq: error: syntax error, unexpected IDENT, expecting $end (Unix shell quoting issues?) at <top-level>, line 1:
.0000000.pogo = "XXXXXXX"
jq: 2 compile errors
I have tried so many variations from stack and most of them look similar to what I am doing now.
As for the immediate issue: .key works with { "foo": "value" }, but .100 doesn't work with { "100": "value" }; the syntax you're relying on is sugar, only available for a limited subset of keys. .["100"] would work, but generating that by expanding shell variables into strings that are then parsed as code is fragile (jq isn't a side-effecting language as of current releases, but in languages that do support I/O operations, such substitutions can be leveraged for injection attacks). To do things the Right Way, pass your variables out-of-band from your code, and use them for lookups in a manner that doesn't depend on what they contain.
The jq equivalent to awk's -v var="$value" is --arg var "$value", used thusly:
jq --arg pgid "$pgid" \
--arg tpogo "$tpogo" \
'.[$pgid].pogo = $tpogo'
Testing this with your data:
json='{"0000000":{"pogo":"AJHVUYKJBOIHKNNLNM"},"7000000":{"pogo":"PPPVUYKJBOIHKNNLNM"}}'
pgid="0000000"
tpogo="XXXXXXX"
jq --arg pgid "$pgid" --arg tpogo "$tpogo" \
'.[$pgid].pogo = $tpogo' <<<"$json"
...emits as output:
{
  "0000000": {
    "pogo": "XXXXXXX"
  },
  "7000000": {
    "pogo": "PPPVUYKJBOIHKNNLNM"
  }
}
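Folded back into the script from the question, only the jq invocation needs to change (a sketch; the argument check and temp-file handling stay as they were):
jq --arg pgid "$pgid" --arg tpogo "$tpogo" \
   '.[$pgid].pogo = $tpogo' "$file" > "$tmp_input1"
cat "$tmp_input1"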

JQ error: is not defined at <top-level> when trying to add values to jq template

I have a .jq template in which I want to fill in the values under the samples list. It is formatted as:
{
  "test": "abc",
  "p": "1",
  "v": "1.0.0",
  "samples": [
    {
      "uptime": $uptime,
      "curr_connections": $curr_connections,
      "listen_disabled_num": $listen_disabled_num,
      "conn_yields": $conn_yields,
      "cmd_get": $cmd_get,
      "cmd_set": $cmd_set,
      "bytes_read": $bytes_read,
      "bytes_written": $bytes_written,
      "get_hits": $get_hits,
      "rejected_connections": $rejected_connections,
      "limit_maxbytes": $limit_maxbytes,
      "cmd_flush": $cmd_flush
    }
  ]
}
My shell script to do this is below. I am basically running a command to pull some memcached stats and want to insert some of the results into the jq template as key/value pairs.
JQ=`cat template.jq`
SAMPLES=(uptime curr_connections listen_disabled_num conn_yields cmd_get cmd_set cmd_flush bytes_read bytes_written get_hits rejected_connections limit_maxbytes)
for metric in ${SAMPLES[*]}
do
KEY=$(echo stats | nc $HOST $PORT | grep $metric | awk '{print $2}')
VALUE=$(echo stats | nc $HOST $PORT | grep $metric | awk '{print $3}')
echo "Using KEY: $KEY with value: $VALUE"
jq -n --argjson $KEY $VALUE -f template.jq
done
Not sure if this is the best way to handle this scenario, but I am getting a ton of errors such as:
jq: error: conn_yields/0 is not defined at <top-level>, line 12:
"conn_yields": $conn_yields,
jq: error: cmd_get/0 is not defined at <top-level>, line 13:
"cmd_get": $cmd_get,
jq: error: cmd_set/0 is not defined at <top-level>, line 14:
"cmd_set": $cmd_set,
If you are going to invoke jq using -f template.jq, then each of the $-variables in template.jq will have to be set separately on the command-line, one by one. In your case, this does not look like a very happy option.
If you are stuck with template.jq as it is, then it will be hard slogging, though there are alternatives besides setting the $-variables on the command line.
Please see https://github.com/stedolan/jq/wiki/Cookbook#using-jq-as-a-template-engine in the jq Cookbook for an alternative to using $-variables. Consider for example the implications of this illustration of "destructuring":
jq -nc '{a:1,b:2} as {a: $a, b:$b} | [$a,$b]'
[1,2]
Another alternative
In your particular case, you could replace all the "$" characters in template.jq with ".", and then pass in a JSON object with the appropriate keys; e.g. change $uptime to .uptime, and then include a key/value pair for uptime.
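A stripped-down sketch of that second alternative (only two of the metrics shown, and the stats gathered into a single JSON object that is fed to jq as its input):
$ cat template.jq
{
  "test": "abc",
  "samples": [
    {
      "uptime": .uptime,
      "curr_connections": .curr_connections
    }
  ]
}
$ echo '{"uptime": 1234, "curr_connections": 7}' | jq -f template.jq
{
  "test": "abc",
  "samples": [
    {
      "uptime": 1234,
      "curr_connections": 7
    }
  ]
}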

Invalid numeric literal with jq

I have a large amount of JSON from a 3rd party system which I would like to pre-process with jq, but I am having difficulty composing the query. A test case follows:
$ cat test.json
{
  "a": "b",
  "c": "d",
  "e": {
    "1": {
      "f": "g",
      "h": "i"
    }
  }
}
$ cat test.json|jq .e.1.f
jq: error: Invalid numeric literal at EOF at line 1, column 3 (while parsing '.1.') at <top-level>, line 1:
.e.1.f
How would I get "g" as my output here? Or how do I cast that 1 to a "1" so it is handled correctly?
From the jq manual:
You can also look up fields of an object using syntax like .["foo"]
(.foo above is a shorthand version of this, but only for
identifier-like strings).
You also need to quote the filter, and use -r if you want raw output:
jq -r '.e["1"].f' test.json
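With the test.json above, this prints the bare value the question was after:
$ jq -r '.e["1"].f' test.json
g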
I wrote a shell script function that calls the curl command, and pipes it into the jq command.
function getName {
curl http://localhost:123/getname/$1 | jq;
}
export -f getName
When I ran this from the CLI,
getName jarvis
I was getting this response:
parse error: Invalid numeric literal at line 1, column 72
I tried removing the | jq from the curl command, and I got back the result without jq parsing:
<Map><timestamp>1234567890</timestamp><status>404</status><error>Not Found</error><message>....
I first thought that I had a bad character in the curl command, or that I was using the function param $1 wrong.
Then I counted the number of chars in the result string, and I noticed that the 72nd char was the space between "Not" and "Found".
The underlying issue was that I didn't have a method named getname in my Spring REST controller yet, so the response was coming back as a 404 Not Found error page, which is XML rather than JSON, and jq could only respond to it with the parse error above.
I'm new to jq, so maybe there is a way to guard against non-JSON responses like this, but that's for another day.
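One possible guard, sketched here on the assumption that the failure mode is an HTTP error status as in this case, is to let curl fail on HTTP errors so jq never sees the non-JSON body (--fail and --silent are standard curl options):
function getName {
  local body
  # --fail makes curl exit non-zero on HTTP errors such as 404 and suppresses
  # the error body, so nothing non-JSON is ever handed to jq.
  body=$(curl --fail --silent "http://localhost:123/getname/$1") || {
    echo "request for $1 failed" >&2
    return 1
  }
  jq . <<<"$body"
}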