I'm trying to transform one big JSON result set into multiple objects.
Input:
{
"results": {
"2019-11-27 00:00:00": [
{
"e": "10814",
"s": "153330",
"t": "164144"
}
],
"2019-11-27 00:15:00": [
{
"e": "11052",
"s": "148692",
"t": "159744"
}
],
"2019-11-27 00:30:00": [
{
"e": "11550",
"s": "152379",
"t": "163929"
}
],
"2019-11-27 00:45:00": [
{
"e": "12640",
"s": "154984",
"t": "167624"
}
]
}
}
This is the output I'm trying to reach:
{"timestamp":"2019-11-27 00:00:00","e":"10814","s":"153330","t":"164144"}
{"timestamp":"2019-11-27 00:15:00","e":"11052","s":"148692","t":"159744"}
{"timestamp":"2019-11-27 00:30:00","e":"11550","s":"152379","t":"163929"}
{"timestamp":"2019-11-27 00:45:00","e":"12640","s":"154984","t":"167624"}
This is what I tried so far:
$ cat input.json | jq -cr '.[] | keys[] as $k | { "timestamp": "\($k)"}'
{"timestamp":"2019-11-27 00:00:00"}
{"timestamp":"2019-11-27 00:15:00"}
{"timestamp":"2019-11-27 00:30:00"}
{"timestamp":"2019-11-27 00:45:00"}
and
$ cat input.json | jq -c '.[] | .[] | .[]'
{"e":"10814","s":"153330","t":"164144"}
{"e":"11052","s":"148692","t":"159744"}
{"e":"11550","s":"152379","t":"163929"}
{"e":"12640","s":"154984","t":"167624"}
I just need a hint to combine these two filters to obtain the result as described above. I'm not sure how to do it. Any ideas?
You were almost there. Just add the objects in those arrays to the objects you create from the keys.
.results | keys_unsorted[] as $k | { timestamp: $k } + .[$k][]
Or using to_entries:
.results
| to_entries[]
| { timestamp: .key } + .value[]
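For example, running the to_entries variant with -c should reproduce the desired output exactly:
$ jq -c '.results | to_entries[] | { timestamp: .key } + .value[]' input.json
{"timestamp":"2019-11-27 00:00:00","e":"10814","s":"153330","t":"164144"}
{"timestamp":"2019-11-27 00:15:00","e":"11052","s":"148692","t":"159744"}
{"timestamp":"2019-11-27 00:30:00","e":"11550","s":"152379","t":"163929"}
{"timestamp":"2019-11-27 00:45:00","e":"12640","s":"154984","t":"167624"}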
Suppose I have the following nested data structure
cat nested.json
[
{
"a": "a",
"b": [
{"c": "c"}
]
},
{
"a": "a",
"b": [
{"c": "c"}
]
}
]
I can flatten it like this
cat nested.json | jq '
[. as $in | reduce paths(scalars) as $path ({};
. + { ($path | map(tostring) | join(".")): $in | getpath($path) }
)]
' > flat.json
cat flat.json
[
{
"0.a": "a",
"0.b.0.c": "c",
"1.a": "a",
"1.b.0.c": "c"
}
]
To reverse the flatten operation with jq I tried this
cat flat.json | jq '
.[0] | reduce to_entries[] as $kv ({};
setpath($kv.key|split("."); $kv.value)
)
'
{
"0": {
"a": "a",
"b": {
"0": {
"c": "c"
}
}
},
"1": {
"a": "a",
"b": {
"0": {
"c": "c"
}
}
}
}
However, I want to convert numbers in the setpath param to create arrays. This doesn't quite work, but I think it's close?
cat flat.json | jq '
def makePath($s): [split(".")[] | if (test("\\d+")) then tonumber else . end];
.[0] | reduce to_entries[] as $kv ({}; setpath(makePath($kv.key); $kv.value))
'
jq: error (at <stdin>:8): split input and separator must be strings
The desired output is the same as the original data in nested.json
Wouldn't it be simpler to do it this way:
Encode your input with
jq '[path(.. | scalars) as $path | {($path | join(".")): getpath($path)}] | add' nested.json
{
"0.a": "a",
"0.b.0.c": "c",
"1.a": "a",
"1.b.0.c": "c"
}
And decode it with
jq 'reduce to_entries[] as $item (null; setpath($item.key / "." | map(tonumber? // .); $item.value))' flat.json
[
{
"a": "a",
"b": [
{
"c": "c"
}
]
},
{
"a": "a",
"b": [
{
"c": "c"
}
]
}
]
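The key trick in the decoder is that dividing a string by a string splits it, and tonumber? // . turns the numeric segments back into array indices. A quick check:
$ jq -cn '"0.b.0.c" / "." | map(tonumber? // .)'
[0,"b",0,"c"]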
However, if you don't care about your special dot notation (e.g. "0.b.0.c") for the encoded keys, you can simply convert the path array into a JSON string instead, which, although uglier, has virtually the same effect. Moreover, it automatically handles input object field names that contain dots (e.g. {"a.b":3}) or look like numbers (e.g. {"42":"Panic!"}).
Using JSON keys, encode your input with
jq '[path(.. | scalars) as $path | {($path | tojson): getpath($path)}] | add' nested.json
{
"[0,\"a\"]": "a",
"[0,\"b\",0,\"c\"]": "c",
"[1,\"a\"]": "a",
"[1,\"b\",0,\"c\"]": "c"
}
And decode it with
jq 'reduce to_entries[] as $item (null; setpath($item.key | fromjson; $item.value))' flat.json
[
{
"a": "a",
"b": [
{
"c": "c"
}
]
},
{
"a": "a",
"b": [
{
"c": "c"
}
]
}
]
I'm trying to use JQ to create the following paths:
/staging-data-0/cassandra/cassandra-client-port
/staging-data-0/cassandra/cassandra-gossip-port
from the following blob of JSON (I've stripped unnecessary bits out):
{
"DebugConfig": {
"ServerPort": 8300,
"Services": [
{
"Checks": [
{
"CheckID": "cassandra-client-port",
"Timeout": "1s"
},
{
"CheckID": "cassandra-gossip-port",
"Timeout": "1s"
}
],
"Name": "cassandra"
},
{
"Checks": [
{
"CheckID": "cockroachdb-tcp",
"Timeout": "1s"
}
],
"Name": "cockroachdb"
}
]
},
"Member": {
"Name": "staging-data-0"
},
"Meta": {
"consul-network-segment": ""
}
}
I'm struggling with the jq manual to generate the paths; so far I can only pull out the last part with
jq '.DebugConfig.Services | map(select(.Name=="cassandra")) | map(.Checks[].CheckID)'
The final path should be /{.Member.Name}/{.DebugConfig.Services.Name}/{.DebugConfig.Services.Checks.CheckID}
Only cassandra
jq -r '{a:.Member.Name, b:.DebugConfig.Services[]} | select(.b.Name=="cassandra") | {a:.a, b:.b.Name, c:.b.Checks[].CheckID} | [.a, .b, .c] | join("/")'
staging-data-0/cassandra/cassandra-client-port
staging-data-0/cassandra/cassandra-gossip-port
Both
jq -r '{a:.Member.Name, b:.DebugConfig.Services[]} | {a:.a, b:.b.Name, c:.b.Checks[].CheckID} | [.a, .b, .c] | join("/")'
staging-data-0/cassandra/cassandra-client-port
staging-data-0/cassandra/cassandra-gossip-port
staging-data-0/cockroachdb/cockroachdb-tcp
With your input, the jq filter:
.DebugConfig.Services[] as $s
| "/\(.Member.Name)/\($s.Name)/\($s.Checks[].CheckID)"
produces:
"/staging-data-0/cassandra/cassandra-client-port"
"/staging-data-0/cassandra/cassandra-gossip-port"
"/staging-data-0/cockroachdb/cockroachdb-tcp"
Since you only want the "cassandra" strings, you just need to interject a "select" filter:
.DebugConfig.Services[] as $s
| "/\(.Member.Name)/\($s.Name)/" +
($s
| select(.Name == "cassandra")
| .Checks[].CheckID)
but it's worth noting how easy it is to process all the "Checks" items.
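With the same invocation, this select-filtered version prints just the cassandra paths (add -r to drop the quotes):
"/staging-data-0/cassandra/cassandra-client-port"
"/staging-data-0/cassandra/cassandra-gossip-port"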
I have the following JSON Data-Structure:
{
"data": [
[
{
"a": "1",
"b": "i"
},
{
"a": "2",
"b": "ii"
},
{
"a": "3",
"b": "iii"
}
],
[
{
"a": "4",
"b": "iv"
},
{
"a": "5",
"b": "v"
},
{
"a": "6",
"b": "vi"
}
]
]
}
And I need to get the following output:
1+2+3 i|ii|iii
4+5+6 iv|v|vi
I tried the following without success:
$ cat data.json | jq -r '.data[] | .[].a | join("+")'
jq: error (at <stdin>:1642): Cannot iterate over string ("1")
And also this, but I don't even have an idea how to solve it:
$ cat data.json | jq -r '.data[] | to_entries | .[]'
Looks like an endless journey for me at this time; if you can help me, I would be very happy. :-)
Should be pretty simple. Join the .a values with "+", join the .b values with "|", collect both strings into an array, and format it as a tab-separated row:
jq -r '.data[] | [ ( map(.a) | join("+") ), ( map(.b) | join("|") ) ] | @tsv'
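Against data.json this prints the desired rows, with the two columns separated by a tab:
1+2+3	i|ii|iii
4+5+6	iv|v|vi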
I have 2 JSON files. I would like to use jq to take the value of "capital" from File 2 and merge it with File 1 for each element where the same "name"-value pair occurs. Otherwise, the element from File 2 should not occur in the output. If there is no "name"-value pair for an element in File 1, it should have empty text for "capital."
File 1:
{
"countries":[
{
"name":"china",
"continent":"asia"
},
{
"name":"france",
"continent":"europe"
}
]
}
File 2:
{
"countries":[
{
"name":"china",
"capital":"beijing"
},
{
"name":"argentina",
"capital":"buenos aires"
}
]
}
Desired result:
{
"countries":[
{
"name":"china",
"continent":"asia",
"capital":"beijing"
},
{
"name":"france",
"continent":"europe",
"capital":""
}
]
}
You could first construct a dictionary from File2, and then perform the update, e.g. like so:
jq --argfile dict File2.json '
($dict.countries | map( {(.name): .capital}) | add) as $capitals
| .countries |= map( .capital = ($capitals[.name] // ""))
' File1.json
From a JSON-esque perspective, it would probably be better to use null for missing values; in that case, you could simplify the above by omitting // "".
Using INDEX/2
If your jq has INDEX/2, then the $capitals dictionary could be constructed using the expression:
INDEX($dict.countries[]; .name) | map_values(.capital)
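For instance, the complete filter might then read (a sketch, assuming a jq with INDEX/2, i.e. 1.6 or later, and the same file layout as above):
jq --argfile dict File2.json '
  (INDEX($dict.countries[]; .name) | map_values(.capital)) as $capitals
  | .countries |= map( .capital = ($capitals[.name] // ""))
' File1.json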
Using INDEX makes the intention clearer, but if efficiency were a major concern, you'd probably be better off using reduce explicitly:
reduce $dict.countries[] as $c ({}; . + ($c | {(.name): .capital}))
One way:
$ jq --slurpfile file2 file2.json '
{ countries:
[ .countries[] |
. as $curr |
$curr + { capital: (($file2[0].countries[] | select(.name == $curr.name) | .capital) // "") }
]
}' file1.json
{
"countries": [
{
"name": "china",
"continent": "asia",
"capital": "beijing"
},
{
"name": "france",
"continent": "europe",
"capital": ""
}
]
}
An alternative:
$ jq -n '{ countries: ([inputs] | map(.countries) | flatten | group_by(.name) |
map(select(.[] | has("continent")) | add | .capital //= ""))
}' file[12].json
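If I've read it correctly, this also reproduces the desired result: group_by sorts by name, argentina is dropped because its group has no "continent" entry, and france gets an empty "capital" via //= "".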
I want to sort this data structure by the object keys (easy with -S) and sort the object values (the arrays) by the 'foo' property.
I can sort them with
jq -S '
. as $in
| keys[]
| . as $k
| $in[$k] | sort_by(.foo)
' < test.json
... but that loses the keys.
I've tried variations of adding | { "\($k)": . }, but then I end up with a list of objects instead of one object. I also tried variations of adding to $in (same problem) or using $in = $in * { ... }, but that gives me syntax errors.
The one solution I did find was to just have the separate objects and then pipe it into jq -s add, but ... I really wanted it to work the other way. :-)
Test data below:
{
"": [
{ "foo": "d" },
{ "foo": "g" },
{ "foo": "f" }
],
"c": [
{ "foo": "abc" },
{ "foo": "def" }
],
"e": [
{ "foo": "xyz" },
{ "foo": "def" }
],
"ab": [
{ "foo": "def" },
{ "foo": "abc" }
]
}
Maybe this?
jq -S '.[] |= sort_by(.foo)'
Output
{
"": [
{
"foo": "d"
},
{
"foo": "f"
},
{
"foo": "g"
}
],
"ab": [
{
"foo": "abc"
},
{
"foo": "def"
}
],
"c": [
{
"foo": "abc"
},
{
"foo": "def"
}
],
"e": [
{
"foo": "def"
},
{
"foo": "xyz"
}
]
}
@user197693 had a great answer. A suggestion I got in a private message elsewhere was to use
jq -S 'with_entries(.value |= sort_by(.foo))'
If for some reason using the -S command-line option is not satisfactory, you can also perform the by-key sort using the to_entries | sort_by(.key) | from_entries idiom. So a complete solution to the problem would be:
.[] |= sort_by(.foo)
| to_entries | sort_by(.key) | from_entries
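Run against the test data above, this single filter (no -S needed) should produce the same sorted output shown earlier:
$ jq '.[] |= sort_by(.foo) | to_entries | sort_by(.key) | from_entries' test.json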