I have an output.txt file where my script stores some output. I just need to get the value of id, which is in the first line of output.txt, into myscript.sh. Can someone suggest a way to do that?
{"id":"**dad04f6d-4e06-4420-b0bc-cb2dcfee2dcf**","name":"new","url":"https://dev.azure.com/vishalmishra2424/82c93136-470c-4be0-b6da-a8234f49a695/_apis/git/repositories/dad04f6d-4e06-4420-b0bc-cb2dcfee2dcf","project":{"id":"82c93136-470c-4be0-b6da-a8234f49a695","name":"vishalmishra","url":"https://dev.azure.com/vishalmishra2424/_apis/projects/82c93136-470c-4be0-b6da-a8234f49a695","state":"wellFormed","revision":12,"visibility":"public","lastUpdateTime":"2021-04-22T14:24:47.2Z"},"size":0,"remoteUrl":"https://vishalmishra2424#dev.azure.com/vishalmishra2424/vishalmishra/_git/new","sshUrl":"git#ssh.dev.azure.com:v3/vishalmishra2424/vishalmishra/new","webUrl":"https://dev.azure.com/vishalmishra2424/vishalmishra/_git/new","isDisabled":false}
The snippet you posted looks like JSON, and the file utility, which can guess different file types, says so too:
$ file output.txt
output.txt: JSON data
You should use a JSON-aware tool to extract the value of id, for example jq:
$ jq -r '.id' output.txt
dad04f6d-4e06-4420-b0bc-cb2dcfee2dcf
or jshon:
$ jshon -e id < output.txt
"**dad04f6d-4e06-4420-b0bc-cb2dcfee2dcf**"
Related
I want to read environment variables from a .json file:
{
"PASSPHRASE": "$(cat /home/x/secret)",
}
With the following script:
IFS=$'\n'
for s in $(echo $values | jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" $1); do
  export $s
  echo $s
done
unset IFS
But then I got $(cat /home/x/secret) in PASSPHRASE, and cat was not executed. When I execute the line export PASSPHRASE=$(cat /home/x/secret) directly, I get the correct result (the content of the file in the environment variable). What do I have to change in my script to get it working?
When you run export PASSPHRASE=$(cat /home/x/secret) in the shell, it interprets the $() expression, executes the command within, and puts its output into the variable PASSPHRASE.
When you place $() in the JSON file, however, it is read by jq and treated as a normal string, which is the equivalent of doing export PASSPHRASE=\$(cat /home/x/secret) (notice the backslash, which escapes the dollar sign so that it is treated as a literal character instead of starting a command substitution). If you do that instead and then echo the contents of the variable, you will get results similar to running your script.
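A quick way to see the difference at the prompt (a small illustration, not part of the original answer):
export PASSPHRASE=\$(cat /home/x/secret)
echo "$PASSPHRASE"
With the backslash, echo prints the literal string $(cat /home/x/secret); without it, the command substitution runs and the variable holds the file's content.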
If you want to force the shell to interpret the string as a command, you can run it through sh -c <command> instead, for example like this:
test.json:
{
"PASSPHRASE": "cat /home/x/secret"
}
test.sh:
IFS=$'\n'
for s in $(jq -r "to_entries|map(\"\(.value|tostring)\")|.[]" "$1"); do
  echo "$(sh -c "$s")"
done
unset IFS
This prints out the contents of /home/x/secret. It does not solve your problem directly but should give you an idea of how you could change your original code to achieve what you need.
Thanks to Maciej I changed the script and got it working:
IFS=$'\n'
for line in $(jq -r "to_entries|map(\"\(.key)=\(.value|tostring)\")|.[]" "$1"); do
  lineExecuted=$(sh -c "echo $line")
  export "$lineExecuted"
  echo "$lineExecuted"
done
unset IFS
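One thing to keep in mind: variables exported by a script only survive in the shell process that runs it. If you keep the loop above in its own file (here hypothetically called load-env.sh, reading a hypothetical secrets.json), you would need to source it so the exports remain visible in your current shell:
. ./load-env.sh secrets.json
echo "$PASSPHRASE"
The echo should then print the content of /home/x/secret.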
find dir1 | grep .json | python -mjson.tool $1 > /dev/null
I am using the above command, but it doesn't take the files as inputs. What should I do to check all the JSON files in a folder and validate whether each one is proper JSON?
I found this a while back and have been using it:
for j in *.json; do python -mjson.tool "$j" > /dev/null || echo "INVALID $j" >&2; done;
I'm sure it compares similarly to dasho-o's answer, but am including it as another option.
You need to use 'xargs'.
The find/grep pipeline writes the JSON file names to its standard output, but python -mjson.tool does not read file names from standard input. You need to build a command line from those names, and that is what xargs does:
find dir1 | grep .json | xargs -L1 python -mjson.tool > /dev/null
Side notes:
Since 'find' has filtering and execution predicates, a more compact command can be written:
find dir1 -name '*.json' -exec python -mjson.tool '{}' ';'
Also consider using 'jq' as a lightweight alternative to validating via Python.
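For example, a jq-based check could look like this (a sketch, assuming jq is installed; jq's empty filter parses the input, prints nothing, and exits non-zero on invalid JSON):
find dir1 -name '*.json' -exec sh -c 'jq empty "$1" 2>/dev/null || echo "INVALID $1" >&2' _ {} \;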
I want to run tshark for a live capture of my internet traffic. My goal is to save the captures as JSON files in a folder. Right now I'm using:
tshark -P -i 4 -w outfile.pcap
The problem with this command is that the output is one gigantic file that I can't use while tshark is running, and it's not JSON.
The -T json option instructs tshark to output JSON:
tshark -T json -i 4 > outfile.json
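If you also want to avoid one ever-growing file, one possible approach (a sketch, not part of the original answer; the interface number and paths are placeholders) is to let tshark rotate capture files with its ring-buffer option and convert each finished pcap to JSON afterwards:
tshark -i 4 -b duration:60 -w /captures/out.pcap
tshark -r /captures/out_00001_20210422120000.pcap -T json > /captures/out_00001.json
The first command starts a new pcap file roughly every 60 seconds (tshark appends a counter and timestamp to the name, so the generated files will look roughly like the one shown); the second converts one finished file to JSON.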
Is there command-line documentation for jq? I am currently running this command:
%jq% -f JSON.txt -r ".sm_api_content"
It is supposed to read from JSON.txt and output the value of sm_api_content (which is a string).
But I am getting this error:
jq: error: Could not open file .sm_api_content: No such file or directory
Can anyone help me out here?
-f is for specifying a filename to read your "filter" from - the filter in this case being .sm_api_content
It sounds as if you just want to run jq without -f, e.g.
jq -r .sm_api_content JSON.txt
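If you do want to keep the filter in a separate file, the file given to -f must contain the filter, not the data; a small sketch, with filter.jq being a hypothetical file that contains just .sm_api_content:
jq -r -f filter.jq JSON.txt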
I have a 2 MB JSON file that is all on one line, and now I get an error using jq:
$ jq .<nodes.json
parse error: Invalid literal at line 1, column 377140
How do I debug this on the console? To look at the mentioned column, I tried this:
head -c 377139 nodes.json|tail -c 1000
But I cannot find any wrong literal there, so it seems this is not the correct way to reach that position in the file.
How can I debug such a one-liner?
Cut the file into more lines with
cat nodes.json | cut -f 1- -d} --output-delimiter=$'}\n' > /tmp/a.json
and analyse /tmp/a.json with jq again; then you get an error with a line number:
parse error: Invalid literal at line 5995, column 47
Use less -N /tmp/a.json to find that line.
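To look only at the reported line (with a little context), you could also print it directly, for example with sed and the line number from the error above:
sed -n '5990,6000p' /tmp/a.json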
I see you are at a shell prompt, so you could try Perl, which your operating system presumably has pre-installed.
cat nodes.json | json_xs -f json -t json-pretty
This tells the json_xs command line program to parse the file and prettify it.
If you don't have json_xs installed, you could try json_pp (pp is for pure-perl).
If you have neither, you can install the JSON::XS Perl module with this command:
sudo cpanm JSON::XS
[sudo] password for knb:
--> Working on JSON::XS
Fetching http://www.cpan.org/authors/id/M/ML/MLEHMANN/JSON-XS-3.01.tar.gz ... OK
Configuring JSON-XS-3.01 ... OK
Building and testing JSON-XS-3.01 ... OK
Successfully installed JSON-XS-3.01 (upgraded from 2.34)
1 distribution installed
This installs JSON::XS and a few helper scripts, among them json_xs and json_pp.
Then you can run this simple one-liner:
cat dat.json | json_xs -f json -t json-pretty
After misplacing a parenthesis to force a nesting error somewhere in the valid JSON file dat.json, I got this:
cat dat.json | json_xs -f json -t json-pretty
'"' expected, at character offset 1331 (before "{"A_DESC":"density i...") at /usr/local/bin/json_xs line 181, <STDIN> line 1.
Maybe this is more informative than the jq output.