Is there a (Unix) shell script to format JSON in human-readable form?
Basically, I want it to transform the following:
{ "foo": "lorem", "bar": "ipsum" }
... into something like this:
{
"foo": "lorem",
"bar": "ipsum"
}
With Python 2.6+ you can do:
echo '{"foo": "lorem", "bar": "ipsum"}' | python -m json.tool
or, if the JSON is in a file, you can do:
python -m json.tool my_json.json
if the JSON is from an internet source such as an API, you can use
curl http://my_url/ | python -m json.tool
For convenience in all of these cases you can make an alias:
alias prettyjson='python -m json.tool'
For even more convenience (with a bit more typing up front), define shell functions:
prettyjson_s() {
echo "$1" | python -m json.tool
}
prettyjson_f() {
python -m json.tool "$1"
}
prettyjson_w() {
curl "$1" | python -m json.tool
}
for all of the above cases. You can put these in your .bashrc and they will be available in every shell session. Invoke them like prettyjson_s '{"foo": "lorem", "bar": "ipsum"}'.
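The other two functions work the same way; for example (reusing the placeholder file name and URL from above):
prettyjson_f my_json.json
prettyjson_w http://my_url/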
Note that, as @pnd pointed out in the comments below, in Python 3.5+ the JSON object is no longer sorted by default. To sort, add the --sort-keys flag to the end, i.e. ... | python -m json.tool --sort-keys.
You can use jq.
It's very simple to use and it works great! It can handle very large JSON structures, including streams. You can find their tutorials on the jq website.
Usage examples:
$ jq --color-output . file1.json file1.json | less -R
$ command_with_json_output | jq .
$ jq # stdin/"interactive" mode, just enter some JSON
$ jq <<< '{ "foo": "lorem", "bar": "ipsum" }'
{
"bar": "ipsum",
"foo": "lorem"
}
Or use jq to pull out a single field:
$ jq '.foo' <<< '{ "foo": "lorem", "bar": "ipsum" }'
"lorem"
I use the "space" argument of JSON.stringify to pretty-print JSON in JavaScript.
Examples:
// Indent with 4 spaces
JSON.stringify({"foo":"lorem","bar":"ipsum"}, null, 4);
// Indent with tabs
JSON.stringify({"foo":"lorem","bar":"ipsum"}, null, '\t');
From the Unix command-line with Node.js, specifying JSON on the command line:
$ node -e "console.log(JSON.stringify(JSON.parse(process.argv[1]), null, '\t'));" \
'{"foo":"lorem","bar":"ipsum"}'
Returns:
{
"foo": "lorem",
"bar": "ipsum"
}
From the Unix command-line with Node.js, specifying a filename that contains JSON, and using an indent of four spaces:
$ node -e "console.log(JSON.stringify(JSON.parse(require('fs') \
.readFileSync(process.argv[1])), null, 4));" filename.json
Using a pipe:
echo '{"foo": "lorem", "bar": "ipsum"}' | node -e \
"\
s=process.openStdin();\
d=[];\
s.on('data',function(c){\
d.push(c);\
});\
s.on('end',function(){\
console.log(JSON.stringify(JSON.parse(d.join('')),null,2));\
});\
"
I wrote a tool that has one of the best "smart whitespace" formatters available. It produces more readable and less verbose output than most of the other options here.
underscore-cli
This is what "smart whitespace" looks like:
I may be a bit biased, but it's an awesome tool for printing and manipulating JSON data from the command-line. It's super-friendly to use and has extensive command-line help/documentation. It's a Swiss Army knife that I use for 1001 different small tasks that would be surprisingly annoying to do any other way.
Latest use case: Chrome, Dev console, Network tab, export all as a HAR file, then cat site.har | underscore select '.url' --outfmt text | grep mydomain; now I have a chronologically ordered list of all URL fetches made during the loading of my company's site.
Pretty printing is easy:
underscore -i data.json print
Same thing:
cat data.json | underscore print
Same thing, more explicit:
cat data.json | underscore print --outfmt pretty
This tool is my current passion project, so if you have any feature requests, there is a good chance I'll address them.
I usually just do:
echo '{"test":1,"test2":2}' | python -mjson.tool
And to retrieve select data (in this case, "test"'s value):
echo '{"test":1,"test2":2}' | python -c 'import sys,json;data=json.loads(sys.stdin.read()); print data["test"]'
If the JSON data is in a file:
python -mjson.tool filename.json
If you want to do it all in one go with curl on the command line using an authentication token:
curl -X GET -H "Authorization: Token wef4fwef54te4t5teerdfgghrtgdg53" http://testsite/api/ | python -mjson.tool
If you use npm and Node.js, you can do npm install -g json and then pipe the command through json. Do json -h to get all the options. It can also pull out specific fields and colorize the output with -i.
curl -s http://search.twitter.com/search.json?q=node.js | json
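As a rough sketch of pulling out a specific field (the document and the user.name lookup path below are made up for illustration; json takes a dotted lookup path as its argument):
echo '{"user": {"name": "lorem"}}' | json user.name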
Thanks to J.F. Sebastian's very helpful pointers, here's a slightly enhanced script I've come up with:
#!/usr/bin/python
"""
Convert JSON data to human-readable form.
Usage:
prettyJSON.py inputFile [outputFile]
"""
import sys
import simplejson as json
def main(args):
    try:
        if args[1] == '-':
            inputFile = sys.stdin
        else:
            inputFile = open(args[1])
        input = json.load(inputFile)
        inputFile.close()
    except IndexError:
        usage()
        return False
    if len(args) < 3:
        print json.dumps(input, sort_keys = False, indent = 4)
    else:
        outputFile = open(args[2], "w")
        json.dump(input, outputFile, sort_keys = False, indent = 4)
        outputFile.close()
    return True

def usage():
    print __doc__

if __name__ == "__main__":
    sys.exit(not main(sys.argv))
It doesn't get much simpler than using the native jq tool.
For example:
cat xxx | jq .
A simple Bash script for pretty JSON printing:
json_pretty.sh
#!/bin/bash
grep -Eo '"[^"]*" *(: *([0-9]*|"[^"]*")[^{}\["]*|,)?|[^"\]\[\}\{]*|\{|\},?|\[|\],?|[0-9 ]*,?' | awk '{if ($0 ~ /^[}\]]/ ) offset-=4; printf "%*c%s\n", offset, " ", $0; if ($0 ~ /^[{\[]/) offset+=4}'
Example:
cat file.json | json_pretty.sh
With Perl, use the CPAN module JSON::XS. It installs a command line tool json_xs.
Validate:
json_xs -t null < myfile.json
Prettify the JSON file src.json to pretty.json:
< src.json json_xs > pretty.json
If you don't have json_xs, try json_pp. "pp" is for "pure Perl" – the tool is implemented in Perl only, without a binding to an external C library (which is what XS stands for, Perl's "Extension System").
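For example, the same prettifying step with json_pp as a drop-in replacement:
< src.json json_pp > pretty.json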
On *nix, reading from stdin and writing to stdout works better:
#!/usr/bin/env python
"""
Convert JSON data to human-readable form.
(Reads from stdin and writes to stdout)
"""
import sys
try:
    import simplejson as json
except:
    import json
print json.dumps(json.loads(sys.stdin.read()), indent=4)
sys.exit(0)
Put this in a file (I named mine "prettyJSON" after AnC's answer) in your PATH and chmod +x it, and you're good to go.
Here's how I do it:
curl yourUri | json_pp
It shortens the code and gets the job done.
The JSON Ruby Gem is bundled with a shell script to prettify JSON:
sudo gem install json
echo '{ "foo": "bar" }' | prettify_json.rb
Script download: gist.github.com/3738968
$ echo '{ "foo": "lorem", "bar": "ipsum" }' \
> | python -c'import fileinput, json;
> print(json.dumps(json.loads("".join(fileinput.input())),
> sort_keys=True, indent=4))'
{
"bar": "ipsum",
"foo": "lorem"
}
Note: this is not the recommended way to do it.
The same in Perl:
$ cat json.txt \
> | perl -0007 -MJSON -nE'say to_json(from_json($_, {allow_nonref=>1}),
> {pretty=>1})'
{
"bar" : "ipsum",
"foo" : "lorem"
}
Note 2:
If you run
echo '{ "Düsseldorf": "lorem", "bar": "ipsum" }' \
| python -c'import fileinput, json;
print(json.dumps(json.loads("".join(fileinput.input())),
sort_keys=True, indent=4))'
the nicely readable word becomes \u-encoded:
{
"D\u00fcsseldorf": "lorem",
"bar": "ipsum"
}
If the remainder of your pipeline will gracefully handle Unicode and you'd like your JSON to also be human-friendly, simply use ensure_ascii=False:
echo '{ "Düsseldorf": "lorem", "bar": "ipsum" }' \
| python -c'import fileinput, json;
print json.dumps(json.loads("".join(fileinput.input())),
sort_keys=True, indent=4, ensure_ascii=False)'
and you'll get:
{
"Düsseldorf": "lorem",
"bar": "ipsum"
}
Update: I'm using jq now, as suggested in another answer. It's extremely powerful at filtering JSON, but, at its most basic, it's also an awesome way to pretty-print JSON for viewing.
jsonpp is a very nice command line JSON pretty printer.
From the README:
Pretty print web service responses like so:
curl -s -L http://t.co/tYTq5Pu | jsonpp
and make beautiful the files running around on your disk:
jsonpp data/long_malformed.json
If you're on Mac OS X, you can brew install jsonpp. If not, you can simply copy the binary to somewhere in your $PATH.
Try pjson. It has colors!
Install it with pip:
⚡ pip install pjson
And then pipe any JSON content to pjson.
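For example (assuming pjson reads the JSON from standard input, as described):
echo '{"foo": "lorem", "bar": "ipsum"}' | pjson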
Or, with Ruby:
echo '{ "foo": "lorem", "bar": "ipsum" }' | ruby -r json -e 'jj JSON.parse gets'
You can use this simple command to achieve the result:
echo "{ \"foo\": \"lorem\", \"bar\": \"ipsum\" }"|python -m json.tool
I use jshon to do exactly what you're describing. Just run:
echo "$COMPACTED_JSON_TEXT" | jshon
You can also pass arguments to transform the JSON data.
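As a sketch of that (assuming jshon's -e flag to select the foo key and -u to unwrap its string value):
echo '{"foo": "lorem", "bar": "ipsum"}' | jshon -e foo -u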
Check out Jazor. It's a simple command line JSON parser written in Ruby.
gem install jazor
jazor --help
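A minimal sketch, assuming jazor accepts JSON on standard input like the other tools here:
echo '{ "foo": "lorem", "bar": "ipsum" }' | jazor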
JSONLint has an open-source implementation on GitHub that can be used on the command line or included in a Node.js project.
npm install jsonlint -g
and then
jsonlint -p myfile.json
or
curl -s "http://api.twitter.com/1/users/show/user.json" | jsonlint | less
Simply pipe the output to jq . (the identity filter).
Example:
twurl -H ads-api.twitter.com '.......' | jq .
You can simply use standard tools like jq or json_pp.
echo '{ "foo": "lorem", "bar": "ipsum" }' | json_pp
or
echo '{ "foo": "lorem", "bar": "ipsum" }' | jq
will both prettify the output like the following (jq also colorizes it):
{
"foo": "lorem",
"bar": "ipsum"
}
The huge advantage of jq is that it can do a lot more if you'd like to parse and process the JSON.
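For example, beyond pretty-printing you can reshape the document in the same pass (the key names here just follow the example above):
echo '{ "foo": "lorem", "bar": "ipsum" }' | jq '{only_foo: .foo}'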
With Perl, if you install JSON::PP from CPAN you'll get the json_pp command. Stealing the example from B Bycroft you get:
[pdurbin@beamish ~]$ echo '{"foo": "lorem", "bar": "ipsum"}' | json_pp
{
"bar" : "ipsum",
"foo" : "lorem"
}
It's worth mentioning that json_pp comes pre-installed with Ubuntu 12.04 (at least) and Debian in /usr/bin/json_pp
Pygmentize
I combine Python's json.tool with pygmentize:
echo '{"foo": "bar"}' | python -m json.tool | pygmentize -g
There are some alternatives to pygmentize, which are listed in this other answer of mine.
You only need jq.
If jq is not installed, install it first:
sudo apt-get update
sudo apt-get install jq
After installing it, just pipe your JSON to jq:
echo '{ "foo": "lorem", "bar": "ipsum" }' | jq
The output looks like this:
{
"foo": "lorem",
"bar": "ipsum"
}
I recommend using the json_xs command-line utility, which is included in the JSON::XS Perl module. JSON::XS is a Perl module for serializing/deserializing JSON. On a Debian or Ubuntu machine you can install it like this:
sudo apt-get install libjson-xs-perl
It is obviously also available on CPAN.
To use it to format JSON obtained from a URL you can use curl or wget like this:
$ curl -s http://page.that.serves.json.com/json/ | json_xs
or this:
$ wget -q -O - http://page.that.serves.json.com/json/ | json_xs
and to format JSON contained in a file you can do this:
$ json_xs < file-full-of.json
To reformat as YAML, which some people consider to be more human-readable than JSON:
$ json_xs -t yaml < file-full-of.json
jj is super-fast, can handle ginormous JSON documents economically, does not mess with valid JSON numbers, and is easy to use, e.g.
jj -p # for reading from STDIN
or
jj -p -i input.json
As of 2018 it is still quite new, so it may not handle invalid JSON the way you expect, but it is easy to install on major platforms.
bat is a cat clone with syntax highlighting:
Example:
echo '{"bignum":1e1000}' | bat -p -l json
-p will output without headers, and -l will explicitly specify the language.
It has colouring and formatting for JSON and does not have the problems noted in this comment: How can I pretty-print JSON in a shell script?
Install yajl-tools with the command below:
sudo apt-get install yajl-tools
then,
echo '{"foo": "lorem", "bar": "ipsum"}' | json_reformat
I would like to delete a line containing "electron-inspector": "0.1.4", from package.json.
The following is not working:
sed -i '' -e 'electron-inspector' ./package.json
What do I have to change?
You could do this, but sed is not a JSON parser!
sed -i '' -e '/electron-inspector/d' file
(-i '' is required for macOS's sed)
Imagine a simple (inline) case like this:
{"foo": 123, "electron-inspector": "0.1.4", "bar": 42 }
BOOOM! Deleting the matching line would wipe out the entire (single-line) document. So:
The proper way, using jq:
jq -r 'del(.["electron-inspector"])' file.json > _.json && mv _.json file.json
The dash (-) in the key name is a special case for jq, which is why the key has to be addressed as .["electron-inspector"] above.
If instead the key had a simple name (without a dash), the command would be:
jq 'del(.electroninspector)' file.json
Check out jq.
Another solution, using Node.js and... JavaScript:
cat file.json
{"foo": 123, "electron-inspector": "0.1.4", "bar": 42 }
Code:
$ node<<EOF > _.json && mv _.json file.json
var o = $(< file.json);
delete o["electron-inspector"];
console.log(JSON.stringify(o, null, 4));
EOF
Edited file:
{
"foo": 123,
"bar": 42
}
Last but not least, to edit the file in place properly, use sponge:
Instead of > x.json && mv x.json file.json, we can do:
command ...... file.json | sponge file.json
You need moreutils installed.
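Putting it together with the jq command from above:
jq -r 'del(.["electron-inspector"])' file.json | sponge file.json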
I got it to work with:
sed -i '/electron-inspector/d' package.json
Is it possible to change the value of a key in a JSON file from command line?
e.g., in package.json:
Change
{
...
...
"something": "something",
"name": "idan"
...
}
To
{
...
...
"something": "something",
"name": "adar"
...
}
One way to achieve it is by using the "json" npm package, e.g.:
json -I -f package.json -e "this.name='adar'"
Another way is by using the jq CLI, e.g.:
mv package.json temp.json
jq -r '.name |= "adar"' temp.json > package.json
rm temp.json
With the sde CLI utility, you can run:
sde name adar package.json
With xidel:
$ xidel -s --in-place package.json -e '($json).name:="adar"'
$ xidel -s --in-place package.json -e 'map:put($json,"name","adar")'
$ xidel -s --in-place package.json -e 'map:merge(($json,{"name":"adar"}),{"duplicates":"use-last"})'
Another way is to open the file itself in a terminal editor:
pico filename.json
Edit it, save, and exit.
Check that the correct changes were made:
cat filename.json
Using jq to concatenate JSON files in a directory.
The directory contains a few hundred thousand files.
jq -s '.' *.json > output.json
returns an error that the argument list is too long. Is there a way to write this so that it can take in more files?
If jq -s . *.json > output.json produces "argument list too long", you could fix it using zargs in zsh:
$ zargs *.json -- cat | jq -s . > output.json
You can emulate that using find, as shown in @chepner's answer:
$ find -maxdepth 1 -name \*.json -exec cat {} + | jq -s . > output.json
"Data in jq is represented as streams of JSON values ... This is a cat-friendly format - you can just join two JSON streams together and get a valid JSON stream.":
$ echo '{"a":1}{"b":2}' | jq -s .
[
{
"a": 1
},
{
"b": 2
}
]
The problem is that the length of a command line is limited, and *.json produces too many arguments for one command line. One workaround is to expand the pattern in a for loop, which does not have the same limits, because bash can iterate over the result internally rather than having to construct an argument list for an external command:
for f in *.json; do
cat "$f"
done | jq -s '.' > output.json
This is rather inefficient, though, since it requires running cat once for each file. A more efficient solution is to use find to call cat with as many files as possible each time.
find . -name '*.json' -exec cat '{}' + | jq -s '.' > output.json
(You may be able to simply use
find . -name '*.json' -exec jq -s '.' '{}' + > output.json
as well; it may depend on what is in the files and how multiple calls to jq using the -s option compare to a single call.)
[EDITED to use find]
One obvious thing to consider would be to process one file at a time, and then "slurp" them:
$ while IFS= read -r f ; do cat "$f" ; done < <(find . -maxdepth 1 -name "*.json") | jq -s .
This however would presumably require a lot of memory. Thus the following may be closer to what you need:
#!/bin/bash
# "slurp" a bunch of files
# Requires a version of jq with 'inputs'.
echo "["
while IFS= read -r f
do
    jq -nr 'inputs | (., ",")' "$f"
done < <(find . -maxdepth 1 -name "*.json") | sed '$d'
echo "]"