Getting usable JSON from the docker CLI --format option

I'm trying to get usable JSON from the docker CLI; however, it seems it will only produce JSON for individual items, not for the complete result as a whole.
For example, running docker container ls -a --format="{{ json .Names }}" produces:
"hopeful_payne"
"trusting_turing"
"stupefied_morse"
"unruffled_noyce"
"pensive_fermi"
"objective_neumann"
"confident_bhaskara"
"unruffled_cray"
"epic_newton"
"boring_bartik"
"priceless_sinoussi"
"naughty_grothendieck"
"hardcore_bose"
"sad_jones"
"optimistic_napier"
"trusting_stallman"
"xenodochial_dijkstra"
"pedantic_cocks"
The above is not JSON.
How can I produce a result that is, ideally, a JSON array?

I think you cannot do this using docker only.
The command-line's format function is effectively taking each input line (one for each container) and applying the Go template to it. So you need another tool to aggregate the lines into a JSON array.
One way that you can achieve your goal is using the excellent jq tool:
docker container ls --format="{\"name\":\"{{.Names}}\"}" --all | jq --slurp
This generates each container line as a JSON string: {"name": "[VALUE]"} and then uses jq to slurp them into a JSON array.
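With the containers from the question, the slurped result would start like this (remaining entries omitted):
[
  {
    "name": "hopeful_payne"
  },
  {
    "name": "trusting_turing"
  },
  {
    "name": "stupefied_morse"
  }
]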
A challenge with doing this directly in bash is JSON's stricture that the last element in a list can't be followed by a comma. So the following simple bash script generates invalid JSON, and you'd need extra logic to strip the trailing comma (or, better, not emit it for the last element):
echo "[$(for CONTAINER in $(docker container ls --format="{{.Names}}" --all); do echo "{\"name\":\"${CONTAINER}\"},"; done;)]"

What are you trying to do with these JSON responses? It might be easier just to talk directly to the Docker API, which will give you JSON responses directly. E.g., to get a list of containers:
curl --unix-socket /var/run/docker.sock http://localhost/v1.24/containers/json
You can, as DazWilkin suggested, use jq for filtering JSON on the command line. E.g., if we want a list of container names:
curl --unix-socket /var/run/docker.sock http://localhost/v1.24/containers/json |
jq '[.[]|.Names]'
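Note that in the API response each container's Names field is itself an array and the names carry a leading slash, so the filter above yields a list of lists like [["/hopeful_payne"], ...]. If you want a flat array of bare names, something like this should work (a sketch, assuming that response shape):
curl --unix-socket /var/run/docker.sock http://localhost/v1.24/containers/json |
jq '[.[].Names[0] | ltrimstr("/")]'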
You can find the Docker Engine API documentation at https://docs.docker.com/engine/api/.

One way to think of the output is that it's JSONL: http://jsonlines.org/
This Docker output is JSON, per line. Since you asked for a single attribute -- just the name -- you're simply getting a string back. But, notice it's quoted. It's technically JSON. It may make more sense if you update your format to {{ json . }}, which will then output lines that look more like the JSON you're expecting.
However, it's still a JSON document per line, so you'd have to process each line as its own document.
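If you do want a single array, you can combine that format with jq's slurp mode, for example (a sketch):
docker container ls -a --format "{{ json . }}" | jq -s .
which parses each per-line document and collects them all into one JSON array.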

Related

When creating a variable from command output, Bash removes a backslash from the JSON. How do I make it keep both backslashes to maintain valid JSON?

I'm doing the following to capture some ADO JSON data:
iteration="$(az boards iteration team list --team Test --project Test --timeframe current)"
Normally, the output of that command contains a JSON key/value pair like the following:
"path": "Test\\Sprint1"
But after capturing the STDOUT into that iteration variable, if I do
echo "$iteration"
That key/value pair becomes
"path": "Test\Sprint1"
And if I attempt to use jq on that output, it breaks because it's not recognized as valid JSON any longer. I'm very unfamiliar with Bash. How can I get that JSON to remain valid all the way through?
As already commented by markp-fuso:
It looks like your echo command is interpreting the backslashes. You can confirm this by running echo 'a\\b' and looking at the output.
The portable way to deal with such problems is to use printf instead of echo:
printf %s\\n "$iteration"
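A minimal sketch of the difference (exact echo behaviour depends on your shell: bash's builtin echo leaves backslashes alone unless -e or the xpg_echo option is enabled, while POSIX shells such as dash interpret them by default):
s='Test\\Sprint1'    # two literal backslashes
echo "$s"            # may print Test\Sprint1, depending on the shell
printf '%s\n' "$s"   # reliably prints Test\\Sprint1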

jq output into a non json list to use for a loop

I'm sure this has been asked, but I've searched and can't find anything to answer my question...
I'm looking for a simple way to output a JSON list as a plain (non-JSON) list that I can process in a while loop, which means I need to strip the quotes, commas and brackets. I know I could do it with cut, but I'm sure I'm missing an easier way.
Say I'm using an aws command to get a list of resources and then I want to process that list in a while loop...
aws eks list-clusters | jq .clusters | while read cluster; do something; done
Something along these lines.. am I being stupid here?
Use [] to turn the list into a series of individual strings, and --raw-output / -r to output them without quotes:
With this option, if the filter's result is a string then it will be written directly to standard output rather than being formatted as a JSON string with quotes. This can be useful for making jq filters talk to non-JSON-based systems.
aws eks list-clusters | jq -r '.clusters[]' | while read cluster; do something; done
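A slightly fuller sketch of that loop (the echo is a placeholder for whatever you actually run per cluster; -r stops read from mangling any backslashes in the values):
aws eks list-clusters \
  | jq -r '.clusters[]' \
  | while read -r cluster; do
      echo "processing $cluster"   # placeholder for your real command
    done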

Oracle SQLcl: Spool to json, only include content in items array?

I'm making a query via Oracle SQLcl. I am spooling into a .json file.
The correct data is presented from the query, but the format is strange.
Starting off as:
SET ENCODING UTF-8
SET SQLFORMAT JSON
SPOOL content.json
Followed by a query, this produces a JSON file as requested.
However, how do I remove the outer structure, meaning this part:
{"results":[{"columns":[{"name":"ID","type":"NUMBER"},
{"name":"LANGUAGE","type":"VARCHAR2"},{"name":"LOCATION","type":"VARCHAR2"},{"name":"NAME","type":"VARCHAR2"}],"items": [
// Here is the actual data I want to see in the file exclusively
]}]}
I only want to spool everything inside the items array, not including that key itself.
Is it possible to set this as a parameter before querying? Reading the Oracle docs has not yielded any answers, hence asking here.
Here's how I handle this.
After spooling the output to a file, I use jq to recreate the file with only the items:
cat file.json | jq --compact-output --raw-output '.results[0].items' > items.json
Using this tool: https://stedolan.github.io/jq/

Is there a `jq` command line tool or wrapper which lets you interactively explore `jq` similar to `jmespath.terminal`

jq is a lightweight and flexible command-line JSON processor.
https://stedolan.github.io/jq/
Is there a jq command line tool or wrapper which lets you pipe output into it and interactively explore jq, with the JSON input in one pane and your interactively updating result in another pane, similar to jmespath.terminal ?
I'm looking for something similar to the JMESPath Terminal jpterm
"JMESPath exploration tool in the terminal"
https://github.com/jmespath/jmespath.terminal
I found this project jqsh but it's not maintained and it appears to produce a lot of errors when I use it.
https://github.com/bmatsuo/jqsh
I've used https://jqplay.org/ and it's a great web based jq learning tool. However, I want to be able to, in the shell, pipe the json output of a command into an interactive jq which allows me to explore and experiment with jq commands.
Thanks in advance!
I've been using jiq and I'm pretty happy with it.
https://github.com/fiatjaf/jiq
It's jid with jq.
You can drill down interactively by using jq filtering queries.
jiq uses jq internally, and it requires you to have jq in your PATH.
Using the aws cli
aws ec2 describe-regions --region-names us-east-1 us-west-1 | jiq
jiq output
[Filter]> .Regions
{
  "Regions": [
    {
      "Endpoint": "ec2.us-east-1.amazonaws.com",
      "RegionName": "us-east-1"
    },
    {
      "Endpoint": "ec2.us-west-1.amazonaws.com",
      "RegionName": "us-west-1"
    }
  ]
}
https://github.com/simeji/jid
n.b. I'm not clear how strictly it follows jq syntax and feature set
You may have to roll your own.
Of course, jq itself is interactive in the sense that if you invoke it without specifying any JSON input, it will process STDIN interactively.
If you want to feed the same data to multiple programs, you could easily write your own wrapper. Over at github, there's a bash script named jqplay that has a few bells and whistles. For example, if the input command begins with | then the most recent result is used as input.
Example 1
./jqplay -c spark.json
Enter a jq filter (possibly beginning with "|"), or blank line to terminate:
.[0]
{"name":"Paddington","lovesPandas":null,"knows":{"friends":["holden","Sparky"]}}
.[1]
{"name":"Holden"}
| .name
"Holden"
| .[0:1]
"H"
| length
1
.[1].name
"Holden"
Bye.
Example 2
./jqplay -n
Enter a jq filter (possibly beginning and/or ending with "|"), or blank line to terminate:
?
An initial | signifies the filter should be applied to the previous jq
output.
A terminating | causes the next line that does not trigger a special
action to be appended to the current line.
Special action triggers:
:exit # exit this script, also triggered by a blank line
:help # print this help
:input PATHNAME ...
:options OPTIONS
:save PN # save the most recent output in the named file provided
it does not exist
:save! PN # save the most recent output in the named file
:save # save to the file most recently specified by a :save command
:show # print the OPTIONS and PATHNAMEs currently in effect
:! PN # equivalent to the sequence of commands
:save! PN
:input PN
? # print this help
# # ignore this line
1+2
3
:exit
Bye.
If you're using Emacs (or are willing to), then jq-mode allows you to run jq filters interactively on the current JSON document buffer:
https://github.com/ljos/jq-mode
There is a new one: https://github.com/PaulJuliusMartinez/jless
JLess is a command-line JSON viewer designed for reading, exploring, and searching through JSON data.
JLess will pretty print your JSON and apply syntax highlighting.
Expand and collapse Objects and Arrays to grasp the high- and low-level structure of a JSON document. JLess has a large suite of vim-inspired commands that make exploring data a breeze.
JLess supports full text regular-expression based search. Quickly find the data you're looking for in long String values, or jump between values for the same Object key.
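Typical usage is just to point jless at a file or pipe JSON into it (a sketch, assuming jless is installed and on your PATH):
jless file.json
curl -s http://example.org/file.json | jless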

Efficiently get the first record of a JSONL file

Is it possible to efficiently get the first record of a JSONL file without consuming the entire stream / file? One way I have been able to inefficiently do so is the following:
curl -s http://example.org/file.jsonl | jq -s '.[0]'
I realize that head could be used here to extract the first line, but assume that the file may not use a newline as the record separator and may simply be concatenated objects or arrays.
If I'm understanding correctly, the JSONL format is just a stream of JSON objects, which jq handles quite nicely. Since you only want the first item, you can use the input filter to grab it.
I think you could just do this:
$ curl -s http://example.org/file.jsonl | jq -n 'input'
You need the null-input option -n so jq doesn't consume the input immediately; then input reads just one value from the stream. There's no need to go through the rest of the input stream.
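This also works when the records aren't newline-separated at all, since jq parses concatenated JSON values directly. A quick local sketch:
printf '{"a":1}{"b":2}{"c":3}' | jq -n 'input'
which prints only the first object, {"a": 1}.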