jq - add a new field while updating the whole file - json

I have a JSON file which is constructed in a similar way to this:
[
{
"_id":"1234",
"org":"org1",
"int":
{"url":"http://url.com.uk:1234"}},
{
"_id":"4321",
"org":"org2",
"int":
{"url":"http://url.com.us:4321"}},
...
]
Now I'm "jumping" from one entry to another and checking whether the application behind each URL is working properly. After the check I want to add/update a "status" field. But I can't update the whole file; I'm just getting:
$ jq --arg mod "GOOD" '.[0].int + {stat: $mod}' tmp.json
{
"url": "http://url.com.uk:1234",
"stat": "GOOD"
}
How can I get the whole updated file from the jq command, not just part of it?

If you put your data in data.json and the changes you want to make to
each record into a separate arg.json argument file like
{
"1234": { "int": { "stat": "GOOD" } },
"4321": { "int": { "stat": "BAD", "xxx": "yyy" } }
}
and run jq as
$ jq -M --argfile arg arg.json 'map(. + $arg[._id])' data.json
then it will output the updated data, e.g.
[
{
"_id": "1234",
"org": "org1",
"int": {
"stat": "GOOD"
}
},
{
"_id": "4321",
"org": "org2",
"int": {
"stat": "BAD",
"xxx": "yyy"
}
}
]
Note that + replaces keys. If you want to merge keys instead you can use *, e.g.
$ jq -M --argfile arg arg.json 'map(. * $arg[._id])' data.json
which generates
[
{
"_id": "1234",
"org": "org1",
"int": {
"url": "http://url.com.uk:1234",
"stat": "GOOD"
}
},
{
"_id": "4321",
"org": "org2",
"int": {
"url": "http://url.com.us:4321",
"stat": "BAD",
"xxx": "yyy"
}
}
]
If you want to update the data in place you could use sponge
as described in the answer Manipulate JSON with jq
e.g.
$ jq -M --argfile arg arg.json 'map(. * $arg[._id])' data.json | sponge data.json
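If sponge is not available (it comes from the moreutils package), a plain temporary file achieves the same in-place effect; for example:
# write to a temp file first, then replace data.json only if jq succeeded
tmp=$(mktemp) \
  && jq -M --argfile arg arg.json 'map(. * $arg[._id])' data.json > "$tmp" \
  && mv "$tmp" data.json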

You can iterate over the array entries and reassign int with an update, like:
jq --arg mod "GOOD" '.[] | .int=.int + {stat: $mod}' tmp.json
{
"_id": "1234",
"org": "org1",
"int": {
"url": "http://url.com.uk:1234",
"stat": "GOOD"
}
}
{
"_id": "4321",
"org": "org2",
"int": {
"url": "http://url.com.us:4321",
"stat": "GOOD"
}
}
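Note that .[] streams the array elements, so the output above is a sequence of objects rather than a single array. If you need the result to remain one JSON array (for example, to write it back to the file), a map-based variant of the same update would be:
$ jq --arg mod "GOOD" 'map(.int += {stat: $mod})' tmp.json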

Related

Grep command from json file - Bash scripting

My json file has the below content:
{
"Fruits": {
"counter": 1,
"protocols": [
{
"id": "100",
"name": "lemon",
"category": "citrus"
},
{
"id": "350",
"name": "Orange",
"category": "citrus"
},
{
"id": "150",
"name": "lime",
"category": "citrus"
}
]
}
}
I am expecting an output as below
Fruits:lemon:citrus
Fruits:Orange:citrus
Fruits:lime:citrus
Easy to do with jq:
$ jq -r '.Fruits.protocols[] | "Fruits:\(.name):\(.category)"' input.json
Fruits:lemon:citrus
Fruits:Orange:citrus
Fruits:lime:citrus
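If the top-level key is not always Fruits, a variation that derives the prefix from the input instead of hard-coding it (same output for this sample) could be:
$ jq -r 'to_entries[] | .key as $k | .value.protocols[] | "\($k):\(.name):\(.category)"' input.json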
The jq answer is better. Still posting a Ruby solution (if you cannot use jq), but it is less elegant:
ruby -e '
require "json";
l="";
ARGF.each { |x| l+=x };
obj=JSON.parse(l);
obj["Fruits"]["protocols"].each { |x| puts "Fruits:#{x["name"]}:#{x["category"]}" }
'
Here is the full example:
echo '{"Fruits":{"counter":1,"protocols":[{"id":"100","name":"lemon","category":"citrus"},{"id":"350","name":"Orange","category":"citrus" },{"id":"150","name":"lime","category":"citrus"}]}}' \
| ruby -e 'require "json";l="";ARGF.each { |x| l+=x } ; obj=JSON.parse(l) ; obj["Fruits"]["protocols"].each { |x| puts "Fruits:#{x["name"]}:#{x["category"]}" }'
Output:
Fruits:lemon:citrus
Fruits:Orange:citrus
Fruits:lime:citrus

Using jq find key/value pair based on another key/value pair

I'm pasting some example JSON data here; it needs manipulation to produce the desired output, which is described right after the JSON. I want to use jq to parse out the data.
{
"MetricAlarms": [
{
"EvaluationPeriods": 3,
"ComparisonOperator": "GreaterThanOrEqualToThreshold",
"AlarmActions": [
"Unimportant:Random:alarm:ELK2[10.1.1.2]-Root-Disk-Alert"
],
"AlarmName": "Unimportant:Random:alarm:ELK1[10.1.1.0]-Root-Alert",
"Dimensions": [
{
"Name": "path",
"Value": "/"
},
{
"Name": "InstanceType",
"Value": "m5.2xlarge"
},
{
"Name": "fstype",
"Value": "ext4"
}
],
"DatapointsToAlarm": 3,
"MetricName": "disk_used_percent"
},
{
"EvaluationPeriods": 3,
"ComparisonOperator": "GreaterThanOrEqualToThreshold",
"AlarmActions": [
"Unimportant:Random:alarm:ELK2[10.1.1.2]"
],
"AlarmName": "Unimportant:Random:alarm:ELK2[10.1.1.2]",
"Dimensions": [
{
"Name": "path",
"Value": "/"
},
{
"Name": "InstanceType",
"Value": "r5.2xlarge"
},
{
"Name": "fstype",
"Value": "ext4"
}
],
"DatapointsToAlarm": 3,
"MetricName": "disk_used_percent"
}
]
}
So when I pass some key1 & value1 pair as a parameter, such as "Name": "InstanceType", to jq (probably using cat | jq), the expected output should be as below:
m5.2xlarge
r5.2xlarge
A generic approach: search the input recursively for a key-value pair ($sk: $sv) and extract the value of another key ($pv) from the objects found:
jq -r --arg sk Name \
--arg sv InstanceType \
--arg pv Value \
'.. | objects | select(contains({($sk): $sv})) | .[$pv]' file
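For this particular structure you could also address the path directly rather than searching recursively; a simpler, non-generic variant:
$ jq -r '.MetricAlarms[].Dimensions[] | select(.Name == "InstanceType") | .Value' file
m5.2xlarge
r5.2xlarge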

Add or Update a field in one JSON file from another JSON file based on matching field

I have two JSON files, a.json and b.json. The content of a.json is a JSON object and b.json contains an array. I want to add/update a status field in each of the mappings in a.json by retrieving the value from the b.json file.
a.json:
{
"title": 25886,
"data": {
"request": {
"c": 46369,
"t1": 1562050127.376641
},
},
"rs": {
"mappings": {
"12345": {
"id": "12345",
"name": "test",
"customer_id": "11228",
},
"45678": {
"id": "45678",
"name": "abc",
"customer_id": "11206",
}
}
}}
b.json:
[
{
"status": "pending",
"extra": {
"name": "test"
},
"enabled": true,
"id": "12345"
},
{
"status": "not_started",
"extra": {
"name": "abc"
},
"enabled": true,
"id": "45678"
}
]
Below is my expected output:
{
"title": 25886,
"data": {
"request": {
"c": 46369,
"t1": 1562050127.376641
},
},
"rs": {
"mappings": {
"12345": {
"id": "12345",
"name": "test",
"customer_id": "11228",
"status":"pending"
},
"45678": {
"id": "45678",
"name": "abc",
"customer_id": "11206",
"status":"not_started"
}
}
}}
In this expected output, each mapping has a status field whose value is retrieved from b.json based on a matching id value. How can I do this using jq?
For the purposes of this problem, b.json essentially defines a dictionary, so for simplicity, efficiency and perhaps elegance,
it makes sense to start by using the builtin function INDEX to create the relevant dictionary:
INDEX( $b[] | {id, status}; .id )
This assumes an invocation of jq along the lines of:
jq --argfile b b.json -f update.jq a.json
(Yes, I know --argfile has been deprecated. Feel free to choose another way to set $b to the contents of b.json.)
Now, to perform the update, it will be simplest to use the "update" operator, |=, in conjunction with map_values. (Feel free to check the jq manual :-)
Putting everything together:
INDEX( $b[] | {id, status}; .id ) as $dict
| .rs.mappings |= map_values( .status = $dict[.id].status )
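If you'd rather not use the deprecated --argfile, one alternative is --slurpfile, which binds $b to a one-element array containing the contents of b.json, so the dictionary is built from $b[0][] instead of $b[]:
jq --slurpfile b b.json -f update.jq a.json
with update.jq adjusted to:
INDEX( $b[0][] | {id, status}; .id ) as $dict
| .rs.mappings |= map_values( .status = $dict[.id].status )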

insert a json file into json

I'd like to know a quick way to insert one JSON document into another.
$ cat source.json
{
"AWSEBDockerrunVersion": 2,
"containerDefinitions": [
{
"environment": [
{
"name": "SERVICE_MANIFEST",
"value": ""
},
{
"name": "SERVICE_PORT",
"value": "4321"
}
]
}
]
}
The SERVICE_MANIFEST value should be the content of another JSON file:
$ cat service_manifest.json
{
"connections": {
"port": "1234"
},
"name": "foo"
}
I tried to do it with a jq command:
cat service_manifest.json |jq --arg SERVICE_MANIFEST - < source.json
But it doesn't seem to work.
Any ideas? The final result should still be a valid JSON file:
{
"AWSEBDockerrunVersion": 2,
"containerDefinitions": [
{
"environment": [
{
"name": "SERVICE_MANIFEST",
"value": {
"connections": {
"port": "1234"
},
"name": "foo"
}
},
...
]
}
],
...
}
Update:
Thanks, here is the command I ran based on your sample.
$ jq --slurpfile sm service_manifest.json '.containerDefinitions[].environment[] |= (select(.name=="SERVICE_MANIFEST").value=$sm)' source.json
But the value ends up wrapped in an array, not a plain object.
{
"AWSEBDockerrunVersion": 2,
"containerDefinitions": [
{
"environment": [
{
"name": "SERVICE_MANIFEST",
"value": [
{
"connections": {
"port": "1234"
},
"name": "foo"
}
]
},
{
"name": "SERVICE_PORT",
"value": "4321"
}
]
}
]
}
You can try this jq command:
jq --slurpfile sm SERVICE_MANIFEST '.containerDefinitions[].environment[] |= (select(.name=="SERVICE_MANIFEST").value=$sm[])' file
--slurpfile reads the JSON content of the file into an array and binds it to the variable $sm.
The filter updates the elements of .containerDefinitions[].environment, setting .value to the file's content only on the element whose name is SERVICE_MANIFEST. The $sm[] unwraps the enclosing array that --slurpfile creates, which is why your own attempt with plain $sm ended up with the value wrapped in an array.
A simple solution would use --argfile and avoid select:
< source.json jq --argfile sm service_manifest.json '
.containerDefinitions[0].environment[0].value = $sm '
Or if you want only to update the object(s) with .name == "SERVICE_MANIFEST" you could use the filter:
.containerDefinitions[].environment
|= map(if .name == "SERVICE_MANIFEST"
then .value = $sm
else . end)
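If you want to avoid the deprecated --argfile here as well, a --slurpfile sketch of the same filter works, with $sm[0] unwrapping the one-element array that --slurpfile produces:
< source.json jq --slurpfile sm service_manifest.json '
.containerDefinitions[].environment
|= map(if .name == "SERVICE_MANIFEST"
then .value = $sm[0]
else . end)'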
Variations
There is no need for any "--arg"-style parameter at all, as illustrated by the following:
jq -s '.[1] as $sm
| .[0] | .containerDefinitions[0].environment[0].value = $sm
' source.json service_manifest.json

Making multiple changes to one JSON template

I'm currently using jq with the 1pass CLI to try to save randomly generated passwords into a secure note. I'm having an issue with setting the fields.
These are two of my variables. I have 8 total I need to set.
section0_section_uuid="Section_0"
section1_section_uuid="Section_1"
And here are my commands to manipulate the template. I first read it in, change the first title, then save it to $template. I then pass $template into jq
template=$(cat template.json | jq --arg uuid "$section0_section_uuid" '.sections[0].title=$uuid')
template=$($template | jq --arg uuid "$section1_section_uuid" '.sections[1].title=$uuid')
echo $template
I get "file name too long." I don't think I'm passing the modified template variable in correctly. I need to do 7 more modifications to the template.json file.
Edit:
Here's the full template I'm trying to manipulate. There are 12 total changes I have to make to the template. 10 of the 12 are random numbers that I will generate; the remaining 2 will be generated usernames.
{
"fields": [],
"sections": [
{
"fields": [
{
"k": "concealed",
"n": "[CHANGE_ME]",
"t": "ROOT_USER_PASS",
"v": "[CHANGE_ME]"
},
{
"k": "concealed",
"n": "[CHANGE_ME]",
"t": "DEV_USER_PASS",
"v": "[CHANGE_ME]"
}
],
"name": "Section_[CHANGE_ME]",
"title": "Container SSH"
},
{
"fields": [
{
"k": "string",
"n": "[CHANGE_ME]",
"t": "placeholdertext",
"v": "[CHANGE_ME_LETTERS]"
},
{
"k": "string",
"n": "[CHANGE_ME]",
"t": "placeholdertext",
"v": "[CHANGE_ME_LETTERS]"
},
{
"k": "concealed",
"n": "[CHANGE_ME]",
"t": "placeholdertext",
"v": "[CHANGE_ME]"
}
],
"name": "Section_[CHANGE_ME]",
"title": "MySQL"
}
]
}
Why not make your template an actual jq filter, rather than a JSON blob to modify?
The contents of template.jq would be
{
sections: [
{ title: $t1 },
{ title: $t2 },
{ title: $t3 },
{ title: $t4 },
{ title: $t5 },
{ title: $t6 },
{ title: $t7 },
{ title: $t8 }
]
}
Then your command would simply be
$ jq -n --arg t1 foo --arg t2 bar ... -f template.jq
{
"sections": [
{
"title": "foo"
},
{
"title": "bar"
},
...
]
}
One benefit of doing it this way is that you can't accidentally forget a value; jq can only process the filter if you provide definitions for all 8 variables.
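A related option, if you would rather not spell out eight --arg flags and your jq is 1.6 or later, is to pass the titles as positional arguments and build the sections from $ARGS.positional. A sketch (template-positional.jq is just an illustrative file name):
# template-positional.jq: one section per positional argument
{ sections: [ $ARGS.positional[] | { title: . } ] }
and the invocation becomes
$ jq -n -f template-positional.jq --args "$section0_section_uuid" "$section1_section_uuid"
The trade-off is that nothing forces you to supply exactly eight values, so you lose the safety check that named variables give you.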
Here is a solution which uses jq to build a valid JSON array from bash variables and then uses that array in a second jq invocation to substitute the variables into the template at the corresponding positions:
#!/bin/bash
# example template
template='{
"sections": [
{ "title": "x" },
{ "title": "y" }
]
}'
# bash variables
section0_section_uuid="Section_0"
section1_section_uuid="Section_1"
# put into json array with jq
uuids=$(jq -MRn '[inputs]' <<EOF
$section0_section_uuid
$section1_section_uuid
EOF)
# substitute json array into template
jq -M --argjson uuids "$uuids" '
reduce ($uuids|keys[]) as $k (.; .sections[$k].title = $uuids[$k])
' <<< "$template"
Sample Output
{
"sections": [
{
"title": "Section_0"
},
{
"title": "Section_1"
}
]
}
Here is a solution to a portion of the revised problem. It works by replacing leaf values in the template with corresponding values from an object constructed from bash variables and passed to jq via --argjson. It should be straightforward to generalize it to the complete template, assuming more suitable names are chosen for the replacement values than [CHANGE_ME] and [CHANGE_ME_LETTERS].
#!/bin/bash
# example template
template='{
"fields": [],
"sections": [ {
"fields": [ {
"k": "concealed",
"n": "[SSH_ROOT_USER_N]",
"t": "ROOT_USER_PASS",
"v": "[SSH_ROOT_USER_V]"
} ]
} ]
}'
# bash variables
SSH_ROOT_USER_N="abcd"
SSH_ROOT_USER_V="efgh"
# put into json object with jq
vars=$(jq -M . <<EOF
{
"[SSH_ROOT_USER_N]": "$SSH_ROOT_USER_N",
"[SSH_ROOT_USER_V]": "$SSH_ROOT_USER_V"
}
EOF)
# substitute variables into template
jq -M --argjson vars "$vars" '
reduce (tostream|select(length==2)) as [$p,$v] (
{}
; setpath($p;if $v|type!="string" then $v else $vars[$v]//$v end)
)
' <<< "$template"
Sample Output
{
"fields": [],
"sections": [
{
"fields": [
{
"k": "concealed",
"n": "abcd",
"t": "ROOT_USER_PASS",
"v": "efgh"
}
]
}
]
}
The following approach to the problem is similar to #jq170727's (in particular, the jq program is agnostic both about the number of "section_uuid" variables, and the names of the template variables), but only one invocation of jq is required (rather than three).
The other significant difference is that reduce is used to avoid the penalties associated with using tostream. A minor difference is that inputs is used to avoid reading in the "section_uuid" variable values all at once.
Note: The fillin function defined below should be sufficient for basic templating.
In the following, the "template" file is assumed to be named template.json.
template.jq
# input: a JSON entity defining a template;
# vars: a JSON object defining TEMPLATEVARIABLE-VALUE pairs
def fillin(vars):
reduce paths as $p (.;
getpath($p) as $v
| if $v|type == "string" and vars[$v]
then setpath($p; vars[$v])
else .
end);
reduce inputs as $line ({i:0, value:$template};
(.value.sections[.i].title |= $line)
| .i +=1)
| .value
| fillin($vars)
The script
#!/bin/bash
### Set the bash variables - as many as needed
section0_section_uuid="Section_0"
section1_section_uuid="Section_1"
ROOT_USER_PASS=RUP
DEV_USER_PASS=DUP
### Preparations for calling jq
vars=$(cat<<EOF
{
"ROOT_USER_PASS": "$ROOT_USER_PASS",
"DEV_USER_PASS": "$DEV_USER_PASS"
}
EOF
)
cat << EOF | jq -nR --argfile template template.json --argjson vars "$vars" -f template.jq
$section0_section_uuid
$section1_section_uuid
EOF
Output
With the (expanded) example template, the output is:
{
"fields": [],
"sections": [
{
"fields": [
{
"k": "concealed",
"n": "[CHANGE_ME]",
"t": "RUP",
"v": "[CHANGE_ME]"
},
{
"k": "concealed",
"n": "[CHANGE_ME]",
"t": "DUP",
"v": "[CHANGE_ME]"
}
],
"name": "Section_[CHANGE_ME]",
"title": "Section_0"
},
{
"fields": [
{
"k": "string",
"n": "[CHANGE_ME]",
"t": "placeholdertext",
"v": "[CHANGE_ME_LETTERS]"
},
{
"k": "string",
"n": "[CHANGE_ME]",
"t": "placeholdertext",
"v": "[CHANGE_ME_LETTERS]"
},
{
"k": "concealed",
"n": "[CHANGE_ME]",
"t": "placeholdertext",
"v": "[CHANGE_ME]"
}
],
"name": "Section_[CHANGE_ME]",
"title": "Section_1"
}
]
}
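As an aside, since fillin is self-contained it can also be used on its own for simple substitutions. A minimal sketch (the values passed via --argjson are just illustrative):
jq --argjson vars '{"ROOT_USER_PASS": "RUP", "DEV_USER_PASS": "DUP"}' '
def fillin(vars):
reduce paths as $p (.;
getpath($p) as $v
| if $v|type == "string" and vars[$v]
then setpath($p; vars[$v])
else .
end);
fillin($vars)
' template.json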