Update: I have experimented with extracting the paths I want to update and using those paths, relative to both the local and returned objects (see below), in a setpath(paths; getpath(paths)) construction. I can now iterate over the $paths array and make the desired updates to the local JSON object.
Using the thermostats2.json file below and a ret.json that differs from it only in one value:
{"location":{"livingroom":{"setpoints":{"day":"25000"}}}}   # "day" is "25000" instead of "23000"
my script now looks like:
. as $obj |
# obtain location keys from $obj as they may have changed locally prior to $retobj being processed
($obj.location | keys? ) as $locs |
# setpoints are fixed in this code as ["day","night","away"]
["day","night","away"] as $setpoints |
# path() is applied to "." here, which is still identical to $obj at this point
[path(.location[($locs[])].setpoints[($setpoints[])])] as $paths |
reduce range(0; $paths|length) as $i
  (.; setpath($paths[$i]; $retobj[0] | getpath($paths[$i])))
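For what it's worth, I believe the same reduce can be written a little more compactly by iterating over the paths themselves instead of indexing into $paths (untested beyond the sample files above):
reduce $paths[] as $p
  (.; setpath($p; $retobj[0] | getpath($p)))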
I don't need the $obj variable at this point, but I have not cleaned that up yet. Please comment if you see problems with this approach, or if it looks like a good solution; I will post an answer to this question if the comments indicate I should.
I have a JSON object that contains several location objects, each of which in turn contains several setpoint objects, among other data. A remote application is given this JSON object and returns updates to the values of the setpoint objects, if required. I would like to update the local JSON object rather than replace it with the returned object.
I do not want to assume the returned object's location keys are identical to those of the local object as the local object may have been maintained while the remote object was being modified.
I have figured out how to extract the location keys from the local file and create an array containing the setpoint keys whose values I am interested in updating. I have also been able to figure out how to reduce the updated values from the returned object into an array.
What I have not figured out is how to iterate over the locations and the setpoints together in order to update the values in the local json object.
I invoke jq with:
# usage : jq --slurpfile retobj ret.json --from-file query.jq thermostats2.json
query.jq contains:
# use $obj as the local object to be updated with values returned in $retobj
# $retobj is not permitted to modify the structure of $obj
. as $obj |
# obtain location keys from $obj as they may have changed locally prior to $retobj being processed
($obj.location | keys? ) as $locs |
# setpoints are fixed in this code as ["day","night","away"]
["day","night","away"] as $setpoints |
reduce $retobj[0].location[($locs[])].setpoints[($setpoints[])] as $item
( []; . + [$item] )
| . as $vals |
$vals
thermostats2.json:
{ "mode":"Home",
"location": {
"livingroom": {
"scale":"Celcius",
"current": {
"valid":"YES",
"reading":"23000",
"time":"000000"
},
"previous": {
"reading":"23000",
"time":"000000"
},
"setpoints": {
"schedule": {
"weekday": {"day":"0600",
"night":"2100"
},
"weekend": {"day":"0630",
"night":"2200"
}
},
"active":"day",
"day":"23000",
"night":"15556",
"away":"12778"
}
},
"familyroom": {
"scale":"Celcius",
"current": {
"valid":"YES",
"reading":"23000",
"time":"000000"
},
"previous": {
"reading":"23000",
"time":"000000"
},
"setpoints": {
"schedule": {
"weekday": {"day":"0600",
"night":"2100"
},
"weekend": {"day":"0630",
"night":"2200"
}
},
"active":"day",
"day":"23000",
"night":"15556",
"away":"12778"
}
},
"28-000005e2fdef": {
"scale":"Celcius",
"current": {
"valid":"YES",
"reading":"23000",
"time":"000000"
},
"previous": {
"reading":"23000",
"time":"000000"
},
"setpoints": {
"schedule": {
"weekday": {"day":"0600",
"night":"2100"
},
"weekend": {"day":"0630",
"night":"2200"
}
},
"active":"day",
"day":"23000",
"night":"15556",
"away":"12778"
}
}
}
}
What I cannot find is any means to set the values for the same objects in $obj, i.e. effectively:
$obj[0].location[($locs[])].setpoints[($setpoints[])] = $vals
I understand that, as a novice, I am not likely choosing the preferred approach for solving this type of problem. I am also struggling with embracing the filter paradigm in some of the built-in functions, particularly foreach.
To recap, my goal is to:
get the relevant object values in $retobj via the location keys derived from the local object and the setpoint keys defined in the filter, and set the same paths in the local object to those values.
Based on my understanding of the question, I believe that using --argfile rather than --slurpfile would simplify things.
The following filter will adjust $retobj based on the contents of thermostats2.json:
. as $in
| reduce ((.location | keys[]) as $loc
          | ("day", "night", "away") as $sp
          | ["location", $loc, "setpoints", $sp]) as $path
    ( $retobj;
      setpath($path; $in | getpath($path)) )
Invocation:
jq --argfile retobj ret.json -f query.jq thermostats2.json
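If you prefer to keep --slurpfile (which, as noted in the question, binds an array of the parsed values), the same filter should work with $retobj[0] as the starting value of the reduce. This variant is a sketch and untested:
jq --slurpfile retobj ret.json -f query.jq thermostats2.json
with query.jq containing:
. as $in
| reduce ((.location | keys[]) as $loc
          | ("day", "night", "away") as $sp
          | ["location", $loc, "setpoints", $sp]) as $path
    ( $retobj[0];
      setpath($path; $in | getpath($path)) )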
I'm having trouble finding this; maybe it's just my search terms.
Basically, I have a series of arrays mapping keyspaces to destination DBs for a large NoSQL migration, so that we can more easily script data movement. I'll include sample JSON below.
It's nested basically like: environment >> { [ target DB ] >> [ list of keyspaces ] }, { [ target DB ] >> [ list of keyspaces ] }
My intent is to update my migration script to more intelligently determine where things go based on which environment is specified, and to require less user input and "figuring things out".
Here's the sample JSON:
{
"Prod": [
{
"prod1": [
"prod_db1",
"prod_db2",
"prod_d31",
"prod_db4"
]
},
{
"prod2": [
"prod_db5",
"prod_db6",
"prod_db7",
"prod_db8"
]
}
]
}
Assuming I'm able to provide the keyspace and environment to the script, and use those as variables in my jq query, is there a way to search for the keyspace and return the value one level up? I.e., I know I can do something like:
#!/bin/bash
ENV="Prod"
jq ".${ENV}[][]" env.json
to just get the DBs in the Prod environment. But if I'm searching for prod_db6, how can I return the value prod2?
Use to_entries to decompose an object into an array of key-value pairs, then IN to search in the value's array, and finally return the key:
jq -r --arg env "Prod" --arg ksp "prod_db6" '
.[$env][] | to_entries[] | select(IN(.value[]; $ksp)).key
' env.json
prod2
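For completeness, a sketch of how this could be wired into the migration script, passing the environment and keyspace in via --arg (the variable names here are just placeholders):
#!/bin/bash
# hypothetical wrapper: ENV and KSP would normally come from the script's arguments
ENV="Prod"
KSP="prod_db6"
TARGET_DB=$(jq -r --arg env "$ENV" --arg ksp "$KSP" '
  .[$env][] | to_entries[] | select(IN(.value[]; $ksp)).key
' env.json)
echo "$TARGET_DB"   # => prod2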
I am new to PowerShell and am having trouble doing something that I imagine is pretty simple, but I have had no luck so far.
I have an array with two objects in it. It looks basically like this:
[
{
"name":"John",
"age":30,
...
...
},
{
"score":null,
"vehicle":"Camaro",
"engine":"V8",
...
...
}
]
My goal is to update the score value in the second object. I have had luck doing so when the key's value is already present as a string, but I don't understand why I am unable to get this to work when the key is present but its value is null (as shown above).
I have tried using a function that I learned about from a previously posted question about something similar:
Set Value of Nested Object Property by Name in PowerShell
The function from that question looks like this:
function SetValue($object, $key, $Value)
{
$p1,$p2 = $key.Split(".")
if($p2) { SetValue -object $object.$p1 -key $p2 -Value $Value }
else { $object.$p1 = $Value }
}
and can be called like this:
SetValue -object $Obj -key $Key -Value $Value
I am not sure why it matters whether the value is null or not. The function can find the key, but it just doesn't do anything if the value is null.
My apologies if this is already out there. PowerShell is just a little different from anything I have worked with before! Any help is greatly appreciated! :)
The object generated by your JSON string is an array of two objects:
$json = @'
[
{
"name":"John",
"age":30,
},
{
"score":null,
"vehicle":"Camaro",
"engine":"V8"
}
]
'@ | ConvertFrom-Json
$json[0] # => Element 0 of the Array
name age
---- ---
John 30
$json[1] # => Element 1 of the Array
score vehicle engine
----- ------- ------
Camaro V8
Updating the value of the score property, assuming the JSON will always have the same arrangement (meaning element 1 of the array will always have that property), would be as simple as:
$json[1].score = 'hello' # => Updating the property value
$json | ConvertTo-Json # => Converting the array back to Json
[
{
"name": "John",
"age": 30
},
{
"score": "hello",
"vehicle": "Camaro",
"engine": "V8"
}
]
On the other hand, if the arrangement of the objects isn't always the same, you could loop over all elements of the array searching for that property and, once found, update it. For example:
# For each element of the array
foreach($object in $json) {
# if this object has a property with name `score`
# assign that `NoteProperty` to the variable `$prop`
if($prop = $object.PSObject.Properties.Item('score')) {
# update the Value of the `NoteProperty`
$prop.Value = 'hello'
}
}
$json | ConvertTo-Json # Convert it back
I would like to "transpose" (not sure that's the right word) JSON elements.
For example, I have a JSON file like this:
{
"name": {
"0": "fred",
"1": "barney"
},
"loudness": {
"0": "extreme",
"1": "not so loud"
}
}
... and I would like to generate a JSON array like this:
[
{
"name": "fred",
"loudness": "extreme"
},
{
"name": "barney",
"loudness": "not so loud"
}
]
My original JSON has many more first level elements than just "name" and "loudness", and many more names, features, etc.
For this simple example I could fully specify the transformation like this:
$ echo '{"name":{"0":"fred","1":"barney"},"loudness":{"0":"extreme","1":"not so loud"}}'| \
> jq '[{"name":.name."0", "loudness":.loudness."0"},{"name":.name."1", "loudness":.loudness."1"}]'
[
{
"name": "fred",
"loudness": "extreme"
},
{
"name": "barney",
"loudness": "not so loud"
}
]
... but this isn't feasible for the original JSON.
How can jq create the desired output while being key-agnostic for my much larger JSON file?
Yes, transpose is an appropriate word, as the following makes explicit.
The following generic helper function makes for a simple solution that is completely agnostic about the key names, both of the enclosing object and the inner objects:
# Input: an array of values
def objectify($keys):
. as $in | reduce range(0;length) as $i ({}; .[$keys[$i]] = $in[$i]);
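For example, with the sample's key names one would expect:
["fred","extreme"] | objectify(["name","loudness"])   # => {"name":"fred","loudness":"extreme"}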
Assuming consistency of the ordering of the inner keys
Assuming the key names in the inner objects are given in a consistent order, a solution can now be obtained as follows:
keys_unsorted as $keys
| [.[] | [.[]]] | transpose
| map(objectify($keys))
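Tracing this on the sample input, the intermediate values would be:
[.[] | [.[]]]   # => [["fred","barney"],["extreme","not so loud"]]
| transpose     # => [["fred","extreme"],["barney","not so loud"]]
and map(objectify($keys)) then turns each row back into an object.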
Without assuming consistency of the ordering of the inner keys
If the ordering of the inner keys cannot be assumed to be consistent, then one approach would be to order them, e.g. using this generic helper function:
def reorder($keys):
. as $in | reduce $keys[] as $k ({}; .[$k] = $in[$k]);
or if you prefer a reduce-free def:
def reorder($keys): [$keys[] as $k | {($k): .[$k]}] | add;
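For instance, with the inner keys supplied explicitly:
{"1":"barney","0":"fred"} | reorder(["0","1"])   # => {"0":"fred","1":"barney"}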
The "main" program above can then be modified as follows:
keys_unsorted as $keys
| (.[$keys[0]]|keys_unsorted) as $inner
| map_values(reorder($inner))
| [.[] | [.[]]] | transpose
| map(objectify($keys))
Caveat
The preceding solution only considers the key names in the first inner object.
Building upon Peak's solution, here is an alternative based on group_by to deal with arbitrary orders of inner keys.
keys_unsorted as $keys
| map(to_entries[])
| group_by(.key)
| map(with_entries(.key = $keys[.key] | .value |= .value))
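For the sample input, the intermediate value after map(to_entries[]) | group_by(.key) would be:
[
  [ {"key":"0","value":"fred"},   {"key":"0","value":"extreme"} ],
  [ {"key":"1","value":"barney"}, {"key":"1","value":"not so loud"} ]
]
with_entries then replaces each group's positional indices with the corresponding outer key from $keys and unwraps the inner {key,value} pairs, yielding the desired objects.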
Using paths is a good idea, as pointed out by Hobbs. You could also do something like this:
[ path(.[][]) as $p | { key: $p[0], value: getpath($p), id: $p[1] } ]
| group_by(.id)
| map(from_entries)
This is a bit hairy, but it works:
. as $data |
reduce paths(scalars) as $p (
[];
setpath(
[ ($p[1] | tonumber), $p[0] ];
( $data | getpath($p) )
)
)
First, capture the top level as $data because . is about to get a new value in the reduce block.
Then, call paths(scalars) which gives a key path to all of the leaf nodes in the input. e.g. for your sample it would give ["name", "0"] then ["name", "1"], then ["loudness", "0"], then ["loudness", "1"].
Run a reduce on each of those paths, starting the reduction with an empty array.
For each path, construct a new path, in the opposite order, with numbers-in-strings turned into real numbers that can be used as array indices, e.g. ["name", "0"] becomes [0, "name"].
Then use getpath to get the value at the old path in $data and setpath to set a value at the new path in . and return it as the next . for the reduce.
At the end, the result will be
[
{
"name": "fred",
"loudness": "extreme"
},
{
"name": "barney",
"loudness": "not so loud"
}
]
If your real data structure is nested more than two levels deep, then you would need to replace [ ($p[1] | tonumber), $p[0] ] with a more appropriate expression to transform the path. Or maybe some of your "values" are objects/arrays that you want to leave alone, in which case you probably need to replace paths(scalars) with something like paths | select(length == 2).
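For the record, a sketch of that last variant, which only transposes the top two levels and copies whatever sits below them unchanged, might look like:
. as $data
| reduce (paths | select(length == 2)) as $p (
    [];
    setpath([($p[1] | tonumber), $p[0]]; $data | getpath($p))
  )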
I have a lookup file that maps IDs from one system onto another:
[
{
"idA": 2547,
"idB": "5d0bf91d191c6554d14572a6"
},
{
"idA": 2549,
"idB": "5b0473f93d4e53db19f8c249"
},
{
"idA": 2550,
"idB": "5d0bfabc8f20917b92ff07dc"
},
...
And I have a data file with values and an ID from one of these systems:
[
{
"idB": "5d0bf91d191c6554d14572a6",
"description": "Description for 5d0bf91d191c6554d14572a6"
},
{
"idB": "5d0bf49e9236c57281811cfc",
"description": "Description for 5d0bf49e9236c57281811cfc"
},
{
"idB": "5d0bfabc8f20917b92ff07dc",
"description": "Description for 5d0bfabc8f20917b92ff07dc"
},
...
I want to produce a new file of the descriptions with their IDs converted to the idA values in the lookup file. I tried this:
jq --slurpfile idmap ids.json 'map( {"description":.description, "id": (.idB as $b|$idmap[][]|select(.idB==$b)|.idA) } )' descriptions.json
But it produces only an empty array.
I have to double-dereference $idmap because slurping a file "binds an array of the parsed JSON values to the given global variable" -- so just doing $idmap[] throws an error, jq: error (at descriptions.json:70): Cannot index array with string "idB".
Can anyone explain what I'm doing wrong here?
Here's a concise and straightforward solution to the stated problem.
For simplicity, we'll begin by constructing a dictionary containing the relevant mapping using INDEX/2:
INDEX($idmap[]; .idB) | map_values(.idA)
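Given the sample ids.json (ignoring the elided entries), this dictionary would be:
{
  "5d0bf91d191c6554d14572a6": 2547,
  "5b0473f93d4e53db19f8c249": 2549,
  "5d0bfabc8f20917b92ff07dc": 2550
}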
Now the task is easy:
(INDEX($idmap[]; .idB) | map_values(.idA)) as $dict
| map( {description, "idA": $dict[.idB] } )
This assumes an invocation that uses --argfile idmap ids.json to avoid
the unwanted "slurping" caused by --slurpfile, but if the latter is used, then you would use $idmap[][] instead as noted in the original question.
With the sample snippets, any "idB" value that has no match in the lookup file (such as "5d0bf49e9236c57281811cfc") simply yields null for "idA".
Variation
If the objects in descriptions.json had other keys that should be retained, then the following variant would probably be a more useful guide:
(INDEX($idmap[]; .idB) | map_values(.idA)) as $dict # or $idmap[][] as above
| map( .idA = $dict[.idB] | del(.idB) )
I have a specific JSON document for which I need to get all keys whose values contain the character /.
JSON
{ "dig": "sha256:d2aae00e4bc6424d8a6ae7639d41cfff8c5aa56fc6f573e64552a62f35b6293e",
"name": "example",
"binding": {
"wf.example.input1": "/path/to/file1",
"wf.example.input2": "hello",
"wf.example.input3":
["/path/to/file3",
"/path/to/file4"],
"wf.example.input4": 44
}
}
I know I can get the paths to all values containing a file path, or an array of file paths, using the query jq 'paths(type == "string" and contains("/"))'. This gives me output like:
[ "binding", "wf.example.input1" ]
[ "binding", "wf.example.input3", 0]
[ "binding", "wf.example.input3", 1 ]
Now that I have all the elements that contain file paths as their values, is there a way to fetch both the key and the value for each match and store them in another JSON document? For example, for the JSON above, the output should be another JSON document containing all the matched paths, something like this:
{ "binding":
{ "wf.example.input1": "/path/to/file1",
"wf.example.input3": [ "/path/to/file3", "/path/to/file4" ]
}
}
The following jq filter will produce the desired output if given input that is very similar to the example, but it is far from robust and glosses over some details that are unclear from the problem description. However, it should be easy enough to modify the filter in accordance with more precise specifications:
. as $in
| reduce paths(type == "string" and test("/")) as $path ({};
($in|getpath($path)) as $x
| if ($path[-1]|type) == "string"
then .[$path[-1]] = $x
else .[$path[-2]|tostring] += [$x]
end )
| {binding: .}
Output:
{
"binding": {
"wf.example.input1": "/path/to/file1",
"wf.example.input3": [
"/path/to/file3",
"/path/to/file4"
]
}
}
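Here, for what it's worth, is a sketch of a variation that keeps the original nesting instead of hard-coding the "binding" wrapper; note that it copies the whole parent array whenever any of its elements matches:
. as $in
| reduce paths(type == "string" and test("/")) as $p ({};
    if ($p[-1]|type) == "number"
    then setpath($p[:-1]; $in | getpath($p[:-1]))   # an array element matched: copy the whole array
    else setpath($p; $in | getpath($p))             # a plain string value matched: copy it at its own path
    end )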