Filter JSON results from a file

I'm trying to write a script that reads every line of a .txt file. Each line is valid JSON with the fields saddr and data. I want to get the byte size of the data field (with wc -c), but can't get it to work.
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo $line |
    jq -r '.data' |
    bytes=$(wc -c)
    if (( $bytes > 200 )); then
        echo $bytes
    fi
done < "testear.txt"
Example testear.txt:
{ "saddr": "157.130.222.66", "data": "9f00032a30000000" }

You're trying to set bytes in a pipeline, with the command:
echo $line | jq -r '.data' | bytes=$(wc -c)
However, each command in a pipeline runs in its own subshell, so variables set there are not visible in the parent shell once the pipeline finishes.
Instead, try this:
bytes=$(echo "$line" | jq -r '.data' | wc -c)
so that you're assigning bytes in the top-level shell itself.
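Putting it together, a minimal sketch of the corrected loop (same file name and 200-byte threshold as in the question; note that wc -c also counts the newline that jq prints, so use jq -j instead of jq -r if you need the exact field length):
#!/bin/bash
# For each JSON line, measure the byte size of the .data field and print it when it exceeds 200.
while IFS='' read -r line || [[ -n "$line" ]]; do
    bytes=$(echo "$line" | jq -r '.data' | wc -c)
    if (( bytes > 200 )); then
        echo "$bytes"
    fi
done < "testear.txt"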

Related

Shell script - unable to get the expected response when looping inside a .sh file, but getting it when defining the variables individually

Fileread.sh
#!/bin/bash
s=ch.qos.logback
e=logback-access
curl -s "https://search.maven.org/solrsearch/select?q=g:$s+AND+a:$e&core=gav&rows=1&wt=json" | jq ".response.docs[].v"`
output:"1.2.11"
This code works perfectly fine. But when I store the s and e values in a .txt file separated by : and then run it, I get nothing in response.
textFile.txt
ch.qos.logback:logback-access
fileread.sh
#!/bin/bash
read -p "Enter file name:" filename
while IFS=':' read -r s e
do
curl -s "https://search.maven.org/solrsearch/select?q=g:${s}+AND+a:${e}&core=gav&rows=1&wt=json" | jq ".response.docs[].v"
done < $filename
I have tried:
xy=$(curl -s "https://search.maven.org/solrsearch/select?q=g:${s}+AND+a:${e}&core=gav&rows=1&wt=json" | jq ".response.docs[].v")
echo "$xy"
xy=$(curl -s "'https://search.maven.org/solrsearch/select?q=g:'${s}'+AND+a:'${e}&core=gav&rows=1&wt=json" | jq ".response.docs[].v")
echo "$xy"
url=`https://search.maven.org/solrsearch/select?q=g:${s}+AND+a:${e}&core=gav&rows=1&wt=json`
echo url
xx=`curl -s "$url" | jq ".response.docs[].v"`
echo $xx
Try this:
#!/bin/bash
echo "Enter file name:"
read filename
IFS=':' read -r s e < "$filename"
echo "$s" "$e"
curl -s "https://search.maven.org/solrsearch/select?q=g:${s}+AND+a:${e}&core=gav&rows=1&wt=json" | jq ".response.docs[].v"
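If you need to process every line of the file rather than just the first, here is a hedged sketch of the full loop: quote the file name, and strip a trailing carriage return in case the .txt file was saved with Windows line endings (a stray \r inside the URL is a common reason such a loop silently returns nothing):
#!/bin/bash
read -p "Enter file name: " filename
while IFS=':' read -r s e; do
    e=${e%$'\r'}   # drop a trailing carriage return, if any
    curl -s "https://search.maven.org/solrsearch/select?q=g:${s}+AND+a:${e}&core=gav&rows=1&wt=json" | jq ".response.docs[].v"
done < "$filename"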

Difficulty using bash to pass the contents of `top` into a JSON file

I want to use a bash script to capture the output of the top command and write it to a JSON file. But I'm having difficulty getting the slashes/escape codes/line breaks into a file that contains a valid JSON object.
Here's what I tried:
#!/bin/bash
message1=$(top -n 1 -o %CPU)
message2=$(top -n 1 -o %CPU | jq -aRs .)
message3=$(top -n 1 -o %CPU | jq -Rs .)
message4=${message1//\\/\\\\/}
echo "{\"message\":\"${message2}\"}" > file.json
But when I look at the file.json, it looks something like this:
{"message":""\u001b[?1h\u001b=\u001b[?25l\u001b[H\u001b[2J\u001b(B\u001b[mtop - 21:34:53 up 55 days, 5:14, 2 users, load average: 0.17, 0.09, 0.03\u001b(B\u001b[m\u001b[39;49m\u001b(B\u001b[m\u001b[39;49m\u001b[K\nTasks:\u001b(B\u001b[m\u001b[39;49m\u001b[1m 129 \u001b(B\u001b[m\u001b[39;49mtotal,\u001b(B\u001b[m\u001b[39;49m\u001b[1m 1 \u001b(B\u001b[m\u001b[39;49mrunning,\u001b(B\u001b[m\u001b[39;49m\u001b[1m 128 \u001b(B\u001b[m\u001b[39;49msleeping,\u001b(B\u001b[m
Each of the other attempts with message1 to message4 all result in various json syntax issues.
Can anyone suggest what I should try next?
You don't need all the bells and whistles of echo and multiple jq invocations; note that running top in batch mode (-b) is also what keeps the terminal escape sequences out of the capture:
top -b -n 1 -o %CPU | jq -aRs '{"message": .}' >file.json
Or pass the output of the top command as an argument variable.
Using --arg to pass arguments to jq:
jq -an --arg msg "$(top -b -n 1 -o %CPU)" '{"message": $msg}' >file.json
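To sanity-check the result, plain jq usage is enough: the command below succeeds only if file.json is valid JSON, and -r prints the captured top output back out.
jq -r '.message' file.json | head -n 5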

Iterate over jq loop: assign key if value.subvalue matches condition

I'm trying to perform some work on keys whose value matches a certain condition. First, the JSON I'm working with looks similar to this when I execute a CLI tool on Linux:
$ ./sacli ClusterQuery
"i-aaaaaaabbbbbbbbbb": {
"sacli_ip": "10.0.52.37",
"sacli_listen_ip": "10.0.52.37",
"sacli_port": "945"
},
"i-ccccccccddddddddd": {
"sacli_ip": "10.0.48.68",
"sacli_listen_ip": "10.0.48.68",
"sacli_port": "945"
}
I would ideally like to loop through these entries, do a check and then perform operations based on the check. So conceptually:
for i in $(./sacli ClusterQuery | jq -r '<some syntax to get each key, value>'); do
    node_id=key
    ip=value.sacli_ip
    ping -c 1 -W 1 $ip
    ping_result=$?
    if [[ $ping_result -eq 1 ]]; then # ping has failed
        ./remove $node_id
    fi
done
I'd avoid your for loop in favor of piping to while, like this:
./sacli ClusterQuery | jq -rc '
keys_unsorted[] as $k | $k, .[$k].sacli_ip' |
while read -r node_id ; do
    read -r ip
    ping -c 1 -W 1 "$ip"
    ping_result=$?
    if [[ $ping_result -eq 1 ]]; then # ping has failed
        echo ./remove "$node_id"
    fi
done
This assumes the keys and sacli_ip values don't have embedded newlines.
You can of course easily add the appropriate select filter(s) inside the jq program.
The -c (compact output) option is there for safety: it guarantees each emitted JSON value stays on a single line, so the paired read calls stay in sync.
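For example, a sketch of such a select filter (the sacli_port field name comes from the sample output above) that restricts the loop to entries whose port is "945"; the while loop that follows stays exactly the same:
./sacli ClusterQuery | jq -rc '
keys_unsorted[] as $k
| select(.[$k].sacli_port == "945")
| $k, .[$k].sacli_ip'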

Using jq to combine json files, getting file list length too long error

Using jq to concat json files in a directory.
The directory contains a few hundred thousand files.
jq -s '.' *.json > output.json
returns an error that the argument list is too long. Is there a way to write this so that it can take in more files?
If jq -s . *.json > output.json produces "argument list too long", you could fix it using zargs in zsh:
$ zargs *.json -- cat | jq -s . > output.json
You could emulate that using find, as shown in chepner's answer:
$ find -maxdepth 1 -name \*.json -exec cat {} + | jq -s . > output.json
"Data in jq is represented as streams of JSON values ... This is a cat-friendly format - you can just join two JSON streams together and get a valid JSON stream.":
$ echo '{"a":1}{"b":2}' | jq -s .
[
{
"a": 1
},
{
"b": 2
}
]
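In a shell without zargs, xargs gives the same effect; a sketch (the -print0/-0 pair keeps file names with spaces intact):
$ find . -maxdepth 1 -name '*.json' -print0 | xargs -0 cat | jq -s . > output.json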
The problem is that the length of a command line is limited, and *.json produces too many arguments for one command line. One workaround is to expand the pattern in a for loop, which does not have the same limits as a command line, because bash can iterate over the result internally rather than having to construct an argument list for an external command:
for f in *.json; do
cat "$f"
done | jq -s '.' > output.json
This is rather inefficient, though, since it requires running cat once for each file. A more efficient solution is to use find to call cat with as many files as possible each time.
find . -name '*.json' -exec cat '{}' + | jq -s '.' > output.json
(You may be able to simply use
find . -name '*.json' -exec jq -s '.' '{}' + > output.json
as well; it may depend on what is in the files and on how multiple calls to jq using the -s option compare to a single call.)
[EDITED to use find]
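As an illustration of that caveat about multiple -s calls: each separate jq -s invocation wraps its own inputs in its own array, so splitting the files across several jq calls would leave several top-level arrays in output.json instead of one:
$ echo '{"a":1}' | jq -cs . ; echo '{"b":2}' | jq -cs .
[{"a":1}]
[{"b":2}]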
One obvious thing to consider would be to process one file at a time, and then "slurp" them:
$ while IFS= read -r f ; do cat "$f" ; done < <(find . -maxdepth 1 -name "*.json") | jq -s .
This however would presumably require a lot of memory. Thus the following may be closer to what you need:
#!/bin/bash
# "slurp" a bunch of files
# Requires a version of jq with 'inputs'.
echo "["
while IFS= read -r f
do
jq -nr 'inputs | (., ",")' "$f"
done < <(find . -maxdepth 1 -name "*.json") | sed '$d'
echo "]"

Verify that a JSON field exists with jq and bash?

I have a script that uses jq to parse a JSON string MESSAGE (read from another application). Meanwhile the JSON has changed and one field has been split in two: file_path is now split into folder and file. The script used to read file_path; now the folder may not be present, so to build the path of the file I have to verify whether the field is there. I have searched for a while on the internet, and managed to come up with:
echo $(echo $MESSAGE | jq .folder -r)$'/'$(echo $MESSAGE | jq .file -r)
if [ $MESSAGE | jq 'has(".folder")' -r ]
then
echo $(echo $MESSAGE | jq .folder -r)$'/'$(echo $MESSAGE | jq .file -r)
else
echo $(echo $MESSAGE | jq .file -r)
fi
where MESSAGE='{"folder":"FLDR","file":"fl"}' or MESSAGE='{"file":"fl"}'
The first line prints FLDR/fl, or null/fl if the folder field is not present. So I thought I would create an if that verifies whether the folder field is present, but it seems I am doing it wrong and cannot figure out what is wrong. The output is
bash: [: missing `]'
jq: ]: No such file or directory
null/fl
I'd do the whole thing in a jq filter:
echo "$MESSAGE" | jq -r '[ .folder, .file ] | join("/")'
In the event that you want to do it with bash (or to learn how to do this sort of thing in bash), two points:
Shell variables should almost always be quoted when they are used (i.e., "$MESSAGE" instead of $MESSAGE). You will run into funny problems if one of the strings in your JSON ever contains a shell metacharacter (such as *) and you forgot to do that; the string will be subject to shell expansion (and that * will be expanded into a list of files in the current working directory).
A shell if accepts as condition a command, and the decision where to branch is made depending on the exit status of that command (true if the exit status is 0, false otherwise). The [ you attempted to use is just a command (an alias for test, see man test) and not special in any way.
So, the goal is to construct a command that exits with 0 if the JSON object has a folder property, non-zero otherwise. jq has a -e option that makes it return 0 if the last output value was not false or null and non-zero otherwise, so we can write
if echo "$MESSAGE" | jq -e 'has("folder")' > /dev/null; then
echo "$MESSAGE" | jq -r '.folder + "/" + .file'
else
echo "$MESSAGE" | jq -r .file
fi
The > /dev/null bit redirects the output from jq to /dev/null (where it is ignored) so that we don't see it on the console.
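If you'd rather keep it to a single jq call, a hedged alternative is to do the branching inside the filter itself: // empty drops the folder entry when it is missing or null, so no "null/" or stray leading slash appears.
echo "$MESSAGE" | jq -r '[.folder // empty, .file] | join("/")'
This prints FLDR/fl for the first sample MESSAGE and fl for the second.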