Grep command from json file - Bash scripting - json

My json file has the below content:
{
  "Fruits": {
    "counter": 1,
    "protocols": [
      {
        "id": "100",
        "name": "lemon",
        "category": "citrus"
      },
      {
        "id": "350",
        "name": "Orange",
        "category": "citrus"
      },
      {
        "id": "150",
        "name": "lime",
        "category": "citrus"
      }
    ]
  }
}
I am expecting output as below:
Fruits:lemon:citrus
Fruits:Orange:citrus
Fruits:lime:citrus

Easy to do with jq:
$ jq -r '.Fruits.protocols[] | "Fruits:\(.name):\(.category)"' input.json
Fruits:lemon:citrus
Fruits:Orange:citrus
Fruits:lime:citrus

The jq answer is better. Still posting a Ruby solution (if you cannot use jq), but it is less elegant:
ruby -e '
  require "json"
  obj = JSON.parse(ARGF.read)
  obj["Fruits"]["protocols"].each { |x| puts "Fruits:#{x["name"]}:#{x["category"]}" }
'
Here is the full example:
echo '{"Fruits":{"counter":1,"protocols":[{"id":"100","name":"lemon","category":"citrus"},{"id":"350","name":"Orange","category":"citrus"},{"id":"150","name":"lime","category":"citrus"}]}}' \
| ruby -e 'require "json"; obj = JSON.parse(ARGF.read); obj["Fruits"]["protocols"].each { |x| puts "Fruits:#{x["name"]}:#{x["category"]}" }'
Output:
Fruits:lemon:citrus
Fruits:Orange:citrus
Fruits:lime:citrus

Related

Using jq to fetch key-value pairs from a json file

I am trying to get key#value pairs from the JSON file below using jq:
{
  "STUFF_RELATED1": "STUFF_RELATED1",
  "STUFF_RELATED2": "STUFF_RELATED2",
  "THINGS": {
    "THING_2": {
      "details": {
        "stuff_branch": "user/dev"
      },
      "repository": "path/to/repo",
      "branch": "master",
      "revision": "dsfkes4s34jlis4jsj4lis4sli3"
    },
    "THING_1": {
      "details": {
        "stuff_branch": "master"
      },
      "repository": "path/to/repo",
      "branch": "master",
      "revision": "dsfkes4s34jlis4jsj4lis4sli3"
    }
  },
  "STUFF": {
    "revision": "4u324i324iy32g",
    "branch": "master"
  }
}
The key#value pair should look like this:
THING_1#dsfkes4s34jlis4jsj4lis4sli3
Currently I have tried this on my own:
jq -r ' .THINGS | keys[] as $k | "($k)#(.[$k].revision)" ' things.json
But it does not give the result that I really want. :( Thanks in advance!
You need to escape the parentheses with a backslash so that jq treats them as string interpolation:
jq -r ' .THINGS | keys[] as $k | "\($k)#\(.[$k].revision)" ' things.json
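To see the escaped interpolation at work, here is a trimmed-down sample (keys kept, revisions shortened to the made-up values r1/r2 for illustration). Note that keys[] emits keys in sorted order, so THING_1 is printed before THING_2:

```shell
# Trimmed sample of the THINGS object piped through the fixed filter;
# keys[] yields the keys sorted, hence THING_1 first.
echo '{"THINGS":{"THING_2":{"revision":"r2"},"THING_1":{"revision":"r1"}}}' |
  jq -r '.THINGS | keys[] as $k | "\($k)#\(.[$k].revision)"'
# THING_1#r1
# THING_2#r2
```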

Convert timestamps in JSON string with Bash commands

I have the following string as below:
{
  "name": {
    "value": "Demo"
  },
  "activity": {
    "value": "CLOSED",
    "timestamp": "2020-11-19T10:58:17.534+0000"
  },
  "state": {
    "value": "OK",
    "timestamp": "2020-11-19T10:58:17.570+0000"
  },
  "lastErrorCode": {
    "value": "NO_MESSAGE",
    "timestamp": "2020-11-19T10:58:17.570+0000"
  }
}
How can I convert all timestamps in that string to another format (and timezone)? For example (on the command line):
echo '2020-11-19T10:58:17.534+0000' | xargs date +'%Y-%m-%d %H:%M:%S' -d
results in:
2020-11-19 11:58:17
That string of yours is actually JSON. Please use a tool that supports JSON and can do dateTime conversions, like xidel.
First of all, the value of the timestamp keys is not a valid dateTime. You'd have to change the timezone offset 0000 to 00:00 to make it a valid one:
$ xidel -s input.json -e '
  for $x in $json//timestamp return
    dateTime(replace($x,"0000","00:00"))
'
2020-11-19T10:58:17.534Z
2020-11-19T10:58:17.57Z
2020-11-19T10:58:17.57Z
Then to change the timezone to +01:00 use adjust-dateTime-to-timezone():
$ xidel -s input.json -e '
  for $x in $json//timestamp return
    adjust-dateTime-to-timezone(
      dateTime(replace($x,"0000","00:00")),
      duration("PT1H")
    )
'
2020-11-19T11:58:17.534+01:00
2020-11-19T11:58:17.57+01:00
2020-11-19T11:58:17.57+01:00
(You can remove duration("PT1H") if your timezone already is +01:00)
Finally to customize your output use format-dateTime():
$ xidel -s input.json -e '
  for $x in $json//timestamp return
    format-dateTime(
      adjust-dateTime-to-timezone(
        dateTime(replace($x,"0000","00:00")),
        duration("PT1H")
      ),
      "[Y]-[M01]-[D01] [H01]:[m01]:[s01]"
    )
'
2020-11-19 11:58:17
2020-11-19 11:58:17
2020-11-19 11:58:17
If instead you want to update the JSON with these customized dateTimes... that can be done, but requires a more advanced recursive function:
$ xidel -s input.json --xquery '
  declare function local:change-timestamp($a){
    if (exists($a)) then
      if ($a instance of map(*)) then
        map:merge(
          map:keys($a) ! map{
            .:if (.="timestamp") then
              format-dateTime(
                adjust-dateTime-to-timezone(
                  dateTime(replace($a(.),"0000","00:00")),
                  duration("PT1H")
                ),
                "[Y]-[M01]-[D01] [H01]:[m01]:[s01]"
              )
            else
              local:change-timestamp($a(.))
          }
        )
      else
        $a
    else
      ()
  };
  local:change-timestamp($json)
'
{
  "name": {
    "value": "Demo"
  },
  "activity": {
    "value": "CLOSED",
    "timestamp": "2020-11-19 11:58:17"
  },
  "state": {
    "value": "OK",
    "timestamp": "2020-11-19 11:58:17"
  },
  "lastErrorCode": {
    "value": "NO_MESSAGE",
    "timestamp": "2020-11-19 11:58:17"
  }
}
Also check the xidel playground.
Should be possible using regex. Something like this (the greedy match finds the last remaining timestamp first, and the loop repeats until none are left; note that POSIX ERE has no lazy .*? quantifier, so plain .* is used):
str='{
  "name": {
    "value": "Demo"
  },
  "activity": {
    "value": "CLOSED",
    "timestamp": "2020-11-19T10:58:17.534+0000"
  },
  "state": {
    "value": "OK",
    "timestamp": "2020-11-19T10:58:17.570+0000"
  },
  "lastErrorCode": {
    "value": "NO_MESSAGE",
    "timestamp": "2020-11-19T10:58:17.570+0000"
  }
}'
re='(.*-[0-9]+-[0-9]+)T([0-9:]+)(\.[0-9+]+)(.*)'
while [[ $str =~ $re ]]; do
    str="${BASH_REMATCH[1]} ${BASH_REMATCH[2]}${BASH_REMATCH[4]}"
done
echo "$str"
Returns:
{
  "name": {
    "value": "Demo"
  },
  "activity": {
    "value": "CLOSED",
    "timestamp": "2020-11-19 10:58:17"
  },
  "state": {
    "value": "OK",
    "timestamp": "2020-11-19 10:58:17"
  },
  "lastErrorCode": {
    "value": "NO_MESSAGE",
    "timestamp": "2020-11-19 10:58:17"
  }
}
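As an aside: if jq is available, its date builtins can also do the reformatting. A sketch that strips the fractional seconds and offset before parsing; unlike the xidel answer it keeps the times in UTC rather than shifting the timezone, and jq's strptime relies on the underlying C library, so behavior can vary by platform (a trimmed sample is inlined to keep it self-contained):

```shell
# Reformat every "timestamp" field anywhere in the document:
# drop ".534+0000", parse the remainder, and re-emit it.
echo '{"activity":{"value":"CLOSED","timestamp":"2020-11-19T10:58:17.534+0000"}}' |
  jq '(.. | objects | select(has("timestamp")) | .timestamp) |=
        (sub("\\.[0-9]+\\+0000$"; "")
         | strptime("%Y-%m-%dT%H:%M:%S")
         | strftime("%Y-%m-%d %H:%M:%S"))'
```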

insert a json file into json

I'd like to know a quick way to insert one JSON file into another.
$ cat source.json
{
  "AWSEBDockerrunVersion": 2,
  "containerDefinitions": [
    {
      "environment": [
        {
          "name": "SERVICE_MANIFEST",
          "value": ""
        },
        {
          "name": "SERVICE_PORT",
          "value": "4321"
        }
      ]
    }
  ]
}
The SERVICE_MANIFEST value should be the content of another JSON file:
$ cat service_manifest.json
{
  "connections": {
    "port": "1234"
  },
  "name": "foo"
}
I tried to do it with a jq command:
cat service_manifest.json |jq --arg SERVICE_MANIFEST - < source.json
But it doesn't seem to work.
Any ideas? The final result should still be a valid JSON file:
{
  "AWSEBDockerrunVersion": 2,
  "containerDefinitions": [
    {
      "environment": [
        {
          "name": "SERVICE_MANIFEST",
          "value": {
            "connections": {
              "port": "1234"
            },
            "name": "foo"
          }
        },
        ...
      ]
    }
  ],
  ...
}
Update:
Thanks, here is the command I ran based on your sample.
$ jq --slurpfile sm service_manifest.json '.containerDefinitions[].environment[] |= (select(.name=="SERVICE_MANIFEST").value=$sm)' source.json
But the resulting value is an array, not a plain object.
{
  "AWSEBDockerrunVersion": 2,
  "containerDefinitions": [
    {
      "environment": [
        {
          "name": "SERVICE_MANIFEST",
          "value": [
            {
              "connections": {
                "port": "1234"
              },
              "name": "foo"
            }
          ]
        },
        {
          "name": "SERVICE_PORT",
          "value": "4321"
        }
      ]
    }
  ]
}
You can try this jq command:
jq --slurpfile sm service_manifest.json '.containerDefinitions[].environment[] |= (select(.name=="SERVICE_MANIFEST").value=$sm[])' source.json
--slurpfile reads the file and binds its parsed contents, wrapped in an array, to the variable $sm.
The filter updates the elements of .containerDefinitions[].environment[], setting .value only on the element whose name is SERVICE_MANIFEST; $sm[] unwraps the slurped array, so the value becomes the plain object instead of a one-element array.
A simple solution would use --argfile (deprecated in recent jq releases, but still accepted) and avoid select:
< source.json jq --argfile sm service_manifest.json '
.containerDefinitions[0].environment[0].value = $sm '
Or if you want only to update the object(s) with .name == "SERVICE_MANIFEST" you could use the filter:
.containerDefinitions[].environment
|= map(if .name == "SERVICE_MANIFEST"
then .value = $sm
else . end)
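The same merge-style filter can also be run with --slurpfile, which wraps the file's contents in an array, hence $sm[0]. A self-contained sketch with the question's two files written inline:

```shell
# Write the question's two inputs, then update only the entry
# whose name is SERVICE_MANIFEST. $sm[0] unwraps the slurped array.
cat > service_manifest.json <<'EOF'
{"connections": {"port": "1234"}, "name": "foo"}
EOF
cat > source.json <<'EOF'
{"AWSEBDockerrunVersion": 2,
 "containerDefinitions": [{"environment": [
   {"name": "SERVICE_MANIFEST", "value": ""},
   {"name": "SERVICE_PORT", "value": "4321"}]}]}
EOF
jq --slurpfile sm service_manifest.json '
  .containerDefinitions[].environment
  |= map(if .name == "SERVICE_MANIFEST"
         then .value = $sm[0]
         else . end)' source.json
```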
Variations
There is no need for any "--arg"-style parameter at all, as illustrated by the following:
jq -s '.[1] as $sm
| .[0] | .containerDefinitions[0].environment[0].value = $sm
' source.json service_manifest.json

Generate file and replace strings in template file in the generated files

I have a JSON file:
{
  "header": {
    "uuid": "c578592a-a751-4993-9060-53f488597e59",
    "timestamp": 1522938800,
    "productionDateTime": "2018-04-05T14:33:20.000+00:00",
    "producers": {
      "operator": null,
      "application": {
        "com.example": {
          "id": {
            "string": "1"
          },
          "name": {
            "string": "Test"
          }
        }
      },
      "station": null,
      "equipment": null,
      "partner": null,
      "flow": null
    },
    "correlationContext": {
      "array": [{
        "correlationId": "98440498-3104-479e-9c99-f4449ba8f4b6",
        "correlationDateTime": {
          "string": "2018-04-05T14:30:39.000+00:00"
        }
      }]
    }
  }
}
I would like to create n files that copy the JSON file but with a random correlationId and correlationDateTime.
Do you have any tips or suggestions? Many thanks in advance!
Using jq and leaving the generation of appropriate random values as an exercise for the reader:
$ jq --arg cid "foo" --arg cdt "bar" '
.header.correlationContext.array[0]={
correlationId: $cid,
correlationDateTime: {string: $cdt}
}' tmp.json
Here you go:
#!/bin/bash
dt=$(date "+%Y%m%d-%H%M%S")
# How you generate the id and date is up to you; if you have a better
# option, go for it. The uuid is generated inside the loop so each
# file gets a different one.
i=0
while [ $i -le 20 ]; do
    uid=$(python -c 'import sys, uuid; sys.stdout.write(uuid.uuid4().hex)')
    sed "/correlationId/s/98440498-3104-479e-9c99-f4449ba8f4b6/$uid/;s/2018-04-05T14:30:39.000+00:00/$dt/" yourJsonFile.json > testcase$i.json
    i=$((i+1))
done

jq - add new field with updating whole file

I have a JSON file which is constructed in a similar way:
[
  {
    "_id": "1234",
    "org": "org1",
    "int": {
      "url": "http://url.com.uk:1234"
    }
  },
  {
    "_id": "4321",
    "org": "org2",
    "int": {
      "url": "http://url.com.us:4321"
    }
  },
  ...
]
Now I'm "jumping" from one entry to another and checking whether the application behind the URL is working properly. After the check I want to add/update a "stat" field. But I can't update the whole file; I'm just getting:
$ jq --arg mod "GOOD" '.[0].int + {stat: $mod}' tmp.json
{
  "url": "http://url.com.uk:1234",
  "stat": "GOOD"
}
How can I get the whole updated file with a jq command, not just part of it?
If you put your data in data.json and the changes you want to make to
each record into a separate arg.json argument file like
{
  "1234": { "int": { "stat": "GOOD" } },
  "4321": { "int": { "stat": "BAD", "xxx": "yyy" } }
}
and run jq as
$ jq -M --argfile arg arg.json 'map(. + $arg[._id])' data.json
then it will output the updated data, e.g.
[
  {
    "_id": "1234",
    "org": "org1",
    "int": {
      "stat": "GOOD"
    }
  },
  {
    "_id": "4321",
    "org": "org2",
    "int": {
      "stat": "BAD",
      "xxx": "yyy"
    }
  }
]
Note that + replaces keys. If you want to merge keys you can use *, e.g.
$ jq -M --argfile arg arg.json 'map(. * $arg[._id])' data.json
which generates
[
  {
    "_id": "1234",
    "org": "org1",
    "int": {
      "url": "http://url.com.uk:1234",
      "stat": "GOOD"
    }
  },
  {
    "_id": "4321",
    "org": "org2",
    "int": {
      "url": "http://url.com.us:4321",
      "stat": "BAD",
      "xxx": "yyy"
    }
  }
]
If you want to update the data in place you could use sponge
as described in the answer Manipulate JSON with jq
e.g.
$ jq -M --argfile arg arg.json 'map(. * $arg[._id])' data.json | sponge data.json
You can iterate over the array and reassign int with an update operation (note that this emits a stream of objects rather than an array), like:
jq --arg mod "GOOD" '.[] | .int=.int + {stat: $mod}' tmp.json
{
  "_id": "1234",
  "org": "org1",
  "int": {
    "url": "http://url.com.uk:1234",
    "stat": "GOOD"
  }
}
{
  "_id": "4321",
  "org": "org2",
  "int": {
    "url": "http://url.com.us:4321",
    "stat": "GOOD"
  }
}
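To keep the result as a single array (and thus a drop-in replacement for the whole file) instead of a stream of objects, wrap the update in map() and merge with += so the existing url survives. A sketch with the question's data inlined:

```shell
# map() preserves the array wrapper; += merges {stat: ...} into int
# instead of replacing it.
echo '[{"_id":"1234","org":"org1","int":{"url":"http://url.com.uk:1234"}},
       {"_id":"4321","org":"org2","int":{"url":"http://url.com.us:4321"}}]' |
  jq --arg mod "GOOD" 'map(.int += {stat: $mod})'
```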