I have JSON content (output.json):
{"project": {"id": "A", "content": [{"name": "XYZ", "location": "Berlin", "comments":""}, {"name": "ABC", "location": "NewYork", "comments": "Hwllo"}, {"name": "DEF", "location": "Paris", "comments": "Success"}]}}
I would like to extract the location key and its value when the name matches, say, ABC, from the above JSON using bash or shell commands.
I tried the command below, which gives me the content within the curly braces, but I'm not sure how to search for a specific key.
cat output.json | grep -o -e "{.*}"
Output expectations:
If name matches ABC, the output should be "location":"NewYork"
Any suggestions on processing further?
For extracting data from JSON you should use jq if you can. According to its authors, "jq is like sed for JSON data" (source).
In your case it should be:
$ jq -r '.project' output.json | jq -r '.content' | jq '.[] | select(.name=="ABC")' | jq -r '.location'
Output will be:
NewYork
To get the output in the format you required, i.e.:
"location":"NewYork"
You can use:
echo "\"location\":$(jq -r '.project' output.json | jq -r '.content' | jq '.[] | select(.name=="ABC")' | jq '.location')"
Before you use jq you need to install it. On Debian and Ubuntu that would be:
$ sudo apt install jq
For other operating systems, check the jq download page.
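On macOS, for example, it would be (assuming Homebrew is installed):
$ brew install jq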
There may be better ways to do it; here is a quick, somewhat twisted way:
cat output.json | sed 's/"name"/\n"name"/g' | grep '"name"' | awk -F',' '{print $2}'
Also add | grep <preferred name> if you need to filter based on name (see the example below).
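For example, to filter for ABC the name filter has to come before the awk step, since the extracted field alone no longer contains the name (a sketch):
cat output.json | sed 's/"name"/\n"name"/g' | grep '"name"' | grep '"ABC"' | awk -F',' '{print $2}'
This prints "location": "NewYork".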
Use Perl
$ perl -0777 -lne ' while(/"name":\s+"ABC",\s+"location":\s+(\S+)/msg) { print "$1\n" } ' output.json
"NewYork",
$ cat output.json
{"project": {"id": "A", "content": [{"name": "XYZ", "location": "Berlin", "comments":""}, {"name": "ABC", "location": "NewYork", "comments": "Hwllo"}, {"name": "DEF", "location": "Paris", "comments": "Success"}]}}
$
Please use a JSON parser for processing JSON.
With Xidel it's as simple as:
xidel -s output.json -e '($json//content)()[name="ABC"]/location'
Alternatively:
xidel -s output.json -e '$json/(.//content)()[name="ABC"]/location'
or in full:
xidel -s output.json -e '$json/project/(content)()[name="ABC"]/location'
The above is XPath notation (example #11, Reading JSON). Dot notation (like jq) is also possible:
xidel -s output.json -e '($json).project.content()[name="ABC"].location'
[edit]
The commands above output NewYork, and I just realized you require the output to be "location":"NewYork". Xidel can do that too:
xidel -s output.json -e '($json//content)()[name="ABC"]/concat("""location"":""",location,"""")'
[/edit]
Related
I have the below JSON string with multiple JSON objects. I would like to extract just the id and name from each object and print it.
{
"users": [
{
"id": "1",
"name": "John Wick",
"location": "USA"
},
{
"id": "2",
"name": "Walter White",
"location": "USA"
}
]
}
I am using the code below to extract the id and name using jq:
for key in $(jq -c '.users | .[]' sample.json); do
id=$(jq -r '.id' <<< "$key");
name=$(jq -r '.name' <<< "$key")
echo $id $name
done
But I am getting parsing errors like below. Can someone help me with this?
parse error: Unfinished string at EOF at line 2, column 0
I tried replacing spaces with a combination of special characters and then replacing the special characters back with spaces. It worked, but I need a better solution than this.
You can use the csv or tsv features of jq - lots of great examples in the man page!
man jq
Try this:
jq -r '.users[] | [.id, .name] | @csv' sample.json
Example output:
$ jq -r '.users[] | [.id, .name] | @csv' sample.json
"1","John Wick"
"2","Walter White"
Or using string interpolation:
$ jq -r '.users[] | [.id, .name] | "\(.[0]) \(.[1])"' sample.json
1 John Wick
2 Walter White
Using two reads to capture the values, we can let jq loop over the objects and output the id and name:
#!/bin/bash
jq -r -c '.users[] | .id, .name' /tmp/input3 | while read -r id && read -r name; do
echo -e "ID: ${id}\t Name: ${name}"
done
Output:
ID: 1 Name: John Wick
ID: 2 Name: Walter White
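A variant that keeps both fields of a record on one line, so a single read captures them, is this sketch using jq's @tsv against the same sample.json:
jq -r '.users[] | [.id, .name] | @tsv' sample.json | while IFS=$'\t' read -r id name; do
    echo -e "ID: ${id}\t Name: ${name}"
done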
Intention: I need to iterate over each item in a JSON array using jq and process each one separately.
The API returns data in the form:
{
"name": [
{
"first": "first",
"class": false,
"type": "B"
},
{
"first": "second",
"class": false,
"type": "B"
},
{
"first": "third",
"class": false,
"type": "A"
}
]
}
and I am able to parse it as:
data=`curl http://some.url`
echo "$data" | jq -rc '.name[] | select(.type=="B") | .class=true'
it returns data as
{"first": "first","class": true, "type": "B"}
{"first": "second", "class": true, "type": "B"}
Now I want to process these two outputs so that I can make a PUT call for each of them. I tried to learn some xargs concepts but could not get it done.
I piped the output to | xargs -n1, but it removed all the quotes from the string.
You could loop through using while read:
while IFS= read -r line
do
# your PUT goes here
# dummy
printf 'PUTting JSON: %s\n' "$line"
done < <(jq -c '.name[] | select(.type == "B") | .class = true' <<< "$data")
If you absolutely need to use xargs, then use a newline character as the delimiter:
jq -c '.name[] | select(.type == "B") | .class = true' <<< "$data" \
    | xargs -d $'\n' printf 'PUTting JSON: %s\n' # dummy
The only part you're missing is a while read loop.
data=$(curl ...)
while IFS= read -r line; do
curl -XPUT -d"$line" http://some_url
done < <(jq -c ... <<<"$data")
...or...
curl ... |
jq -c ... |
xargs -d $'\n' -n 1 curl -XPUT http://some_url -d
Note:
xargs -d $'\n' tells xargs to treat only newlines (and not other whitespace) as separators between items; it also turns off treatment of quotes and backslashes as syntactic rather than literal.
xargs -n 1 tells xargs only to pass one data item to each copy of curl.
Making the last argument to curl be -d means that xargs will place the data immediately after that position.
{
"db_status": {
"sysa": {
"taskname": "AB",
"state": "Running",
"status": "System ATTENTION",
"updated": "0727",
"version": "5"
},
"sysb": {
"taskname": "null",
"state": "Standby",
"status": "System OK",
"updated": "0727",
"version": "6"
}
}
}
The curl command returns a JSON object. I'm trying to get both state values into an array, i.e. Running and Standby. So far I have tried:
curl -s http://localhost:9099/api | grep state | sed 's/"//g' | awk -F ": " '/state/ {print $2}' | tr '\n' ' ' | sed s'/..$//'
FYI you don't need a pipeline of 20 different commands when you're using awk. The command line you provided can be written as just one command:
awk -F'"' '$2=="state"{printf "%s%s", (++c>1?", ":""),$4}'
but all you really need is:
awk -F'"' '$2=="state"{print $4}'
and to save the output in a shell array (assuming your JSON really is always formatted as you show in your question):
$ arr=( $(cat file | awk -F'"' '$2=="state"{print $4}') )
$ echo "${arr[0]}"
Running
$ echo "${arr[1]}"
Standby
Replace cat file with your curl command.
You could use jq
$ curl -s http://localhost:9099/api | jq '.db_status.sysa.state, .db_status.sysb.state'
"Running"
"Standby"
Or if you want all the entries no matter how many
$ curl -s http://localhost:9099/api | jq '.db_status[].state'
"Running"
"Standby"
Alternatively, if you don't have jq (or are just looking for ways to feel pain), you can parse most JSON in bash using ticktick, which is written in 250 lines of bash.
I'd like to extract the "id" key from this single line of JSON.
I believe this can be accomplished with grep, but I am not sure of the correct way.
If there is a better way that does not have dependencies, I would be interested.
Here is my example output:
{
"data": {
"name": "test",
"id": "4dCYd4W9i6gHQHvd",
"domains": ["www.test.domain.com", "test.domain.com"],
"serverid": "bbBdbbHF8PajW221",
"ssl": null,
"runtime": "php5.6",
"sysuserid": "4gm4K3lUerbSPfxz",
"datecreated": 1474597357
},
"actionid": "WXVAAHQDCSILMYTV"
}
If you have a grep that can do Perl compatible regular expressions (PCRE):
$ grep -Po '"id": *\K"[^"]*"' infile.json
"4dCYd4W9i6gHQHvd"
-P enables PCRE
-o retains nothing but the match
"id": * matches "id" and an arbitrary amount of spaces
\K throws away everything to its left ("variable size positive look-behind")
"[^"]*" matches two quotes and all the non-quotes between them
If your grep can't do that, you can use
$ grep -o '"id": *"[^"]*"' infile.json | grep -o '"[^"]*"$'
"4dCYd4W9i6gHQHvd"
This uses grep twice. The result of the first command is "id": "4dCYd4W9i6gHQHvd"; the second command removes everything but a pair of quotes and the non-quotes between them, anchored at the end of the string ($).
But, as pointed out, you shouldn't use grep for this, but a tool that can parse JSON – for example jq:
$ jq '.data.id' infile.json
"4dCYd4W9i6gHQHvd"
This is just a simple filter for the id key in the data object. To get rid of the double quotes, you can use the -r ("raw output") option:
$ jq -r '.data.id' infile.json
4dCYd4W9i6gHQHvd
jq can also neatly pretty print your JSON:
$ jq . infile.json
{
"data": {
"name": "test",
"id": "4dCYd4W9i6gHQHvd",
"domains": [
"www.test.domain.com",
"test.domain.com"
],
"serverid": "bbBdbbHF8PajW221",
"ssl": null,
"runtime": "php5.6",
"sysuserid": "4gm4K3lUerbSPfxz",
"datecreated": 1474597357
},
"actionid": "WXVAAHQDCSILMYTV"
}
Just pipe your data to jq and select by keys:
cat infile.json | jq -r '.data.id'
# 4dCYd4W9i6gHQHvd
Tutorial Here
I found that the best way is to use Python, as it handles JSON natively and is preinstalled on most systems these days, unlike jq:
$ python -c 'import sys, json; print(json.load(sys.stdin)["data"]["id"])' < infile.json
4dCYd4W9i6gHQHvd
No Python, jq, awk, or sed, just GNU grep:
#!/bin/bash
json='{"data": {"name": "test", "id": "4dCYd4W9i6gHQHvd", "domains": ["www.test.domain.com", "test.domain.com"], "serverid": "bbBdbbHF8PajW221", "ssl": null, "runtime": "php5.6", "sysuserid": "4gm4K3lUerbSPfxz", "datecreated": 1474597357}, "actionid": "WXVAAHQDCSILMYTV"}'
echo $json | grep -o '"id": "[^"]*' | grep -o '[^"]*$'
Tested & working here: https://ideone.com/EG7fv7
source: https://brianchildress.co/parse-json-using-grep
$ grep -oP '"id": *"\K[^"]*' infile.json
4dCYd4W9i6gHQHvd
Hopefully this will work for everyone; for me it prints the value without the quotes.
In my Unix shell script, when I execute a curl command, the result is displayed as below, and I am redirecting it to a file:
{"type":"Show","id":"123","title":"name","description":"Funny","channelTitle":"ifood.tv","lastUpdateTimestamp":"2014-04-20T20:34:59","numOfVideos":"15"}
But I want the output to be put in the file in a readable JSON format like below:
{"type":"Show",
"id":"123",
"title":"name",
"description":"Funny",
"channelTitle":"ifood.tv",
"lastUpdateTimestamp":"2014-04-20T20:34:59",
"numOfVideos":"15"}
How do I format the output this way?
A few solutions to choose from:
json: a fast CLI tool for working with JSON. It is a single-file Node.js script with no external dependencies (other than Node.js itself).
$ echo '{"type":"Bar","id":"1","title":"Foo"}' | json
{
"type": "Bar",
"id": "1",
"title": "Foo"
}
Requires:
# npm install -g json
json_pp: command-line utility available on Linux systems for JSON decoding/encoding
echo '{"type":"Bar","id":"1","title":"Foo"}' | json_pp -json_opt pretty,canonical
{
"id" : "1",
"title" : "Foo",
"type" : "Bar"
}
You may want to keep the -json_opt pretty,canonical argument for predictable ordering.
jq: lightweight and flexible command-line JSON processor. It is written in portable C, and it has zero runtime dependencies.
echo '{"type":"Bar","id":"1","title":"Foo"}' | jq '.'
{
"type": "Bar",
"id": "1",
"title": "Foo"
}
The simplest jq program is the expression ., which takes the input and produces it unchanged as output.
For additional jq options check the manual
yq (Python version): command-line YAML/XML/TOML processor, a jq wrapper for YAML, XML and TOML documents
$ echo '{"type":"Bar","id":"1","title":"Foo"}' | yq
{
"type": "Bar",
"id": "1",
"title": "Foo"
}
The Go version of yq doesn't work here.
xidel: command-line tool to download and extract data from HTML/XML pages or JSON APIs, using CSS, XPath 3.0, XQuery 3.0, JSONiq or pattern matching. It can also create new or transformed XML/HTML/JSON documents.
$ echo '{"type":"Bar","id":"1","title":"Foo"}' | xidel -e '$json'
{
"type": "Bar",
"id": "1",
"title": "Foo"
}
With Python:
echo '{"type":"Bar","id":"1","title":"Foo"}' | python -m json.tool
{
"id": "1",
"title": "Foo",
"type": "Bar"
}
With Node.js and bash:
echo '{"type":"Bar","id":"1","title":"Foo"}' | node -e "console.log( JSON.stringify( JSON.parse(require('fs').readFileSync(0) ), 0, 1 ))"
{
"type": "Bar",
"id": "1",
"title": "Foo"
}
I am guessing that you want to prettify the JSON output.
That could be achieved using python:
curl http://localhost:8880/test.json | python -mjson.tool > out.json
This is to add to Gilles' answer. There are many ways to get this done, but personally I prefer something lightweight, easy to remember, and universally available (e.g. it comes with standard LTS installations of your preferred Linux flavor, or is easy to install) on common *nix systems.
Here are the options in their preferred order:
Python json.tool module
echo '{"foo": "lorem", "bar": "ipsum"}' | python -mjson.tool
pros: available almost everywhere; cons: no color coding
jq (may require one time installation)
echo '{"foo": "lorem", "bar": "ipsum"}' | jq
cons: jq needs to be installed; pros: color coding and versatile
json_pp (available in Ubuntu 16.04 LTS)
echo '{"foo": "lorem", "bar": "ipsum"}' | json_pp
For Ruby users
gem install jsonpretty
echo '{"foo": "lorem", "bar": "ipsum"}' | jsonpretty
You can use the json node module:
npm i -g json
then simply append | json after curl.
curl http://localhost:8880/test.json | json
python -m json.tool
curl http://127.0.0.1:5000/people/api.json | python -m json.tool
can also help.
You can install jq and make the query like below:
curl http://localhost:8080/observations/station/221 | jq
I found json_reformat to be very handy. So I just did the following:
curl http://127.0.0.1:5000/people/api.json | json_reformat
that's it!
Motivation: you want to pretty-print the JSON response after a curl request.
Solution: json_pp, a command-line tool that converts between some input and output formats (one of them is JSON). This program was copied from json_xs and modified. The default input format is JSON, and the default output format is JSON with the pretty option.
Synopsis:
json_pp [-v] [-f from_format] [-t to_format] [-json_opt options_to_json1[,options_to_json2[,...]]]
Formula: <someCommand> | json_pp
Example:
Request
curl -X GET https://jsonplaceholder.typicode.com/todos/1 | json_pp
Response
{
"completed" : false,
"id" : 1,
"title" : "delectus aut autem",
"userId" : 1
}
Check out curljson
$ pip install curljson
$ curljson -i <the-json-api-url>
With xidel:
curl <...> | xidel - -se '$json'
xidel can probably retrieve the JSON for you as well.
A lot more features (slice, filter, map, and transform structured data) apart from formatting.
https://stedolan.github.io/jq/