Convert timestamps in a JSON string with Bash commands

I have the following string as below:
{
"name": {
"value": "Demo"
},
"activity": {
"value": "CLOSED",
"timestamp": "2020-11-19T10:58:17.534+0000"
},
"state": {
"value": "OK",
"timestamp": "2020-11-19T10:58:17.570+0000"
},
"lastErrorCode": {
"value": "NO_MESSAGE",
"timestamp": "2020-11-19T10:58:17.570+0000"
}
}
How can I convert all timestamps in that string to another format (timezone)? For example, on the command line:
echo '2020-11-19T10:58:17.534+0000' | xargs date +'%Y-%m-%d %H:%M:%S' -d
results in:
2020-11-19 11:58:17

That string of yours is actually JSON. Please use a tool that supports JSON and can do dateTime conversions, like xidel.
First of all, the value of the timestamp keys is not a valid dateTime. You'd have to change the timezone offset 0000 to 00:00 to make it valid:
$ xidel -s input.json -e '
  for $x in $json//timestamp return
    dateTime(replace($x,"0000","00:00"))
'
2020-11-19T10:58:17.534Z
2020-11-19T10:58:17.57Z
2020-11-19T10:58:17.57Z
Then, to change the timezone to +01:00, use adjust-dateTime-to-timezone():
$ xidel -s input.json -e '
  for $x in $json//timestamp return
    adjust-dateTime-to-timezone(
      dateTime(replace($x,"0000","00:00")),
      duration("PT1H")
    )
'
2020-11-19T11:58:17.534+01:00
2020-11-19T11:58:17.57+01:00
2020-11-19T11:58:17.57+01:00
(You can omit duration("PT1H") if your implicit (system) timezone already is +01:00; called with a single argument, adjust-dateTime-to-timezone() adjusts to the implicit timezone.)
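For reference, a sketch of that one-argument form, whose output depends on your system's timezone:
$ xidel -s input.json -e '
  for $x in $json//timestamp return
    adjust-dateTime-to-timezone(dateTime(replace($x,"0000","00:00")))
'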
Finally, to customize the output, use format-dateTime():
$ xidel -s input.json -e '
  for $x in $json//timestamp return
    format-dateTime(
      adjust-dateTime-to-timezone(
        dateTime(replace($x,"0000","00:00")),
        duration("PT1H")
      ),
      "[Y]-[M01]-[D01] [H01]:[m01]:[s01]"
    )
'
2020-11-19 11:58:17
2020-11-19 11:58:17
2020-11-19 11:58:17
If instead you want to update the JSON with these customized dateTimes... that can be done, but requires a more advanced recursive function:
$ xidel -s input.json --xquery '
  declare function local:change-timestamp($a){
    if (exists($a)) then
      if ($a instance of map(*)) then
        map:merge(
          map:keys($a) ! map{
            .:if (.="timestamp") then
              format-dateTime(
                adjust-dateTime-to-timezone(
                  dateTime(replace($a(.),"0000","00:00")),
                  duration("PT1H")
                ),
                "[Y]-[M01]-[D01] [H01]:[m01]:[s01]"
              )
            else
              local:change-timestamp($a(.))
          }
        )
      else
        $a
    else
      ()
  };
  local:change-timestamp($json)
'
{
"name": {
"value": "Demo"
},
"activity": {
"value": "CLOSED",
"timestamp": "2020-11-19 11:58:17"
},
"state": {
"value": "OK",
"timestamp": "2020-11-19 11:58:17"
},
"lastErrorCode": {
"value": "NO_MESSAGE",
"timestamp": "2020-11-19 11:58:17"
}
}
Also check the xidel playground.
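If you'd rather stay with jq, a rough equivalent of the recursive rewrite above can be sketched with its walk() builtin. A sketch, assuming jq 1.6+ (for walk), offsets that are always literally +0000, and a hard-coded one-hour shift:
$ jq 'walk(
    if type == "object" and has("timestamp") then
      .timestamp |= (sub("\\.[0-9]+\\+0000$"; "Z")   # drop fraction and offset
        | strptime("%Y-%m-%dT%H:%M:%SZ")             # parse as UTC
        | mktime + 3600                              # shift by one hour
        | gmtime
        | strftime("%Y-%m-%d %H:%M:%S"))
    else . end
  )' input.json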

Should be possible using Bash's built-in regex matching. Note that this only reformats the timestamps; unlike the date call in the question, it does not shift them into another timezone. Something like this:
str='{
"name": {
"value": "Demo"
},
"activity": {
"value": "CLOSED",
"timestamp": "2020-11-19T10:58:17.534+0000"
},
"state": {
"value": "OK",
"timestamp": "2020-11-19T10:58:17.570+0000"
},
"lastErrorCode": {
"value": "NO_MESSAGE",
"timestamp": "2020-11-19T10:58:17.570+0000"
}
}'
# POSIX ERE (used by [[ =~ ]]) has no lazy quantifiers, so a greedy .* is
# used; each pass rewrites the last remaining match, dropping the "T"
# separator plus the fractional seconds and offset (e.g. ".534+0000").
re='(.*-[0-9]+-[0-9]+)T([0-9:]+)(\.[0-9+]+)(.*)'
while [[ $str =~ $re ]]; do
    str="${BASH_REMATCH[1]} ${BASH_REMATCH[2]}${BASH_REMATCH[4]}"
done
echo "$str"
Returns:
{
"name": {
"value": "Demo"
},
"activity": {
"value": "CLOSED",
"timestamp": "2020-11-19 10:58:17"
},
"state": {
"value": "OK",
"timestamp": "2020-11-19 10:58:17"
},
"lastErrorCode": {
"value": "NO_MESSAGE",
"timestamp": "2020-11-19 10:58:17"
}
}
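If the timezone shift from the question is also needed, the same loop can hand each matched timestamp to date instead of doing plain string surgery. A sketch, assuming GNU date; the target zone Europe/Berlin is an assumption:
re='(.*)"([0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9:]+\.[0-9]+\+[0-9]{4})"(.*)'
while [[ $str =~ $re ]]; do
    # Convert the last remaining ISO timestamp; the rewritten value no
    # longer matches the pattern, so the loop terminates by itself.
    ts=$(TZ=Europe/Berlin date -d "${BASH_REMATCH[2]}" '+%Y-%m-%d %H:%M:%S')
    str="${BASH_REMATCH[1]}\"$ts\"${BASH_REMATCH[3]}"
done
echo "$str"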

Related

Using jq to fetch and show key value with quotes

I have a file that looks as below:
{
"Job": {
"Name": "sample_job",
"Description": "",
"Role": "arn:aws:iam::00000000000:role/sample_role",
"CreatedOn": "2021-10-21T23:35:23.660000-03:00",
"LastModifiedOn": "2021-10-21T23:45:41.771000-03:00",
"ExecutionProperty": {
"MaxConcurrentRuns": 1
},
"Command": {
"Name": "glueetl",
"ScriptLocation": "s3://aws-sample-s3/scripts/sample.py",
"PythonVersion": "3"
},
"DefaultArguments": {
"--TempDir": "s3://aws-sample-s3/temporary/",
"--class": "GlueApp",
"--enable-continuous-cloudwatch-log": "true",
"--enable-glue-datacatalog": "true",
"--enable-metrics": "true",
"--enable-spark-ui": "true",
"--job-bookmark-option": "job-bookmark-enable",
"--job-insights-byo-rules": "",
"--job-language": "python",
"--spark-event-logs-path": "s3://aws-sample-s3/logs"
},
"MaxRetries": 0,
"AllocatedCapacity": 100,
"Timeout": 2880,
"MaxCapacity": 100.0,
"WorkerType": "G.1X",
"NumberOfWorkers": 100,
"GlueVersion": "2.0"
}
}
I want to get the key/value pairs for "Name", "--enable-continuous-cloudwatch-log" and "--enable-metrics". So, I need to show the info like this:
"Name" "sample_job"
"--enable-continuous-cloudwatch-log" ""
"--enable-metrics" ""
UPDATE
Following the tips from @Inian and @0stone0, I came close to it:
jq -r '(.Job ) + (.Job.DefaultArguments | { "--enable-continuous-cloudwatch-log", "--enable-metrics"}) | to_entries[] | "\"\(.key)\" \"\(.value)\""'
This extracts the values I need, but it also shows all the other key/value pairs.
Since your JSON isn't valid, I've converted it into:
{
"Job": {
"Name": "sample_job",
"Role": "sample_role_job"
},
"DefaultArguments": {
"--enable-continuous-cloudwatch-log": "test_1",
"--enable-metrics": ""
},
"Timeout": 2880,
"NumberOfWorkers": 10
}
Using the following filter:
"Name \(.Job.Name)\n--enable-continuous-cloudwatch-log \(.DefaultArguments."--enable-continuous-cloudwatch-log")\n--enable-metrics \(.DefaultArguments."--enable-metrics")"
We use string interpolation to show the desired output:
Name sample_job
--enable-continuous-cloudwatch-log test_1
--enable-metrics
jq --raw-output '"Name \(.Job.Name)\n--enable-continuous-cloudwatch-log \(.DefaultArguments."--enable-continuous-cloudwatch-log")\n--enable-metrics \(.DefaultArguments."--enable-metrics")"'
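To get the exact quoted layout from the question against the original file, the attempt from the UPDATE only needs the Job part narrowed down to Name. A sketch, with input.json assumed to hold the original document (where the two argument values are "true" rather than empty):
jq -r '({Name: .Job.Name}
        + (.Job.DefaultArguments
           | {"--enable-continuous-cloudwatch-log", "--enable-metrics"}))
       | to_entries[]
       | "\"\(.key)\" \"\(.value)\""' input.json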

Grep command from json file - Bash scripting

My json file has the below content:
{
"Fruits": {
"counter": 1,
"protocols": [
{
"id": "100",
"name": "lemon",
"category": "citrus"
},
{
"id": "350",
"name": "Orange",
"category": "citrus"
},
{
"id": "150",
"name": "lime",
"category": "citrus"
}
]
}
}
I am expecting an output as below
Fruits:lemon:citrus
Fruits:Orange:citrus
Fruits:lime:citrus
Easy to do with jq:
$ jq -r '.Fruits.protocols[] | "Fruits:\(.name):\(.category)"' input.json
Fruits:lemon:citrus
Fruits:Orange:citrus
Fruits:lime:citrus
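If the top-level key is not always Fruits, a sketch that derives the prefix from the input instead of hard-coding it:
$ jq -r 'to_entries[] | .key as $k | .value.protocols[] | "\($k):\(.name):\(.category)"' input.json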
The jq answer is better. Still posting a Ruby solution (if you cannot use jq), but it is less elegant:
ruby -e '
require "json"
# Slurp the whole input (file arguments or stdin) and parse it
obj = JSON.parse(ARGF.read)
obj["Fruits"]["protocols"].each do |x|
  puts "Fruits:#{x["name"]}:#{x["category"]}"
end
'
Here is the full example:
echo '{"Fruits":{"counter":1,"protocols":[{"id":"100","name":"lemon","category":"citrus"},{"id":"350","name":"Orange","category":"citrus" },{"id":"150","name":"lime","category":"citrus"}]}}' \
| ruby -e 'require "json";l="";ARGF.each { |x| l+=x } ; obj=JSON.parse(l) ; obj["Fruits"]["protocols"].each { |x| puts "Fruits:#{x["name"]}:#{x["category"]}" }'
Output:
Fruits:lemon:citrus
Fruits:Orange:citrus
Fruits:lime:citrus

Generate file and replace strings in template file in the generated files

I have a JSON file:
{
"header": {
"uuid": "c578592a-a751-4993-9060-53f488597e59",
"timestamp": 1522938800,
"productionDateTime": "2018-04-05T14:33:20.000+00:00",
"producers": {
"operator": null,
"application": {
"com.example": {
"id": {
"string": "1"
},
"name": {
"string": "Test"
}
}
},
"station": null,
"equipment": null,
"partner": null,
"flow": null
},
"correlationContext": {
"array": [{
"correlationId": "98440498-3104-479e-9c99-f4449ba8f4b6",
"correlationDateTime": {
"string": "2018-04-05T14:30:39.000+00:00"
}
}]
}
}
}
I would like to create n files that copy the JSON file but with a random correlationId and correlationDateTime.
Do you have any tips or suggestions? Many thanks in advance!
Using jq and leaving the generation of appropriate random values as an exercise for the reader:
$ jq --arg cid "foo" --arg cdt "bar" '
.header.correlationContext.array[0]={
correlationId: $cid,
correlationDateTime: {string: $cdt}
}' tmp.json
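To actually produce n files with fresh values, that call can be wrapped in a loop. A sketch for five files, assuming uuidgen and GNU date are available and the input lives in template.json; all of these names are placeholders:
for i in $(seq 1 5); do
    jq --arg cid "$(uuidgen)" \
       --arg cdt "$(date -u '+%Y-%m-%dT%H:%M:%S.000+00:00')" \
       '.header.correlationContext.array[0] = {
            correlationId: $cid,
            correlationDateTime: {string: $cdt}
        }' template.json > "testcase$i.json"
done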
Here you go:
#!/bin/bash
dt=$(date "+%Y%m%d-%H%M%S")
# How you generate these two values is up to you; if you have a better option, go for it
i=0
while [ $i -le 20 ]
do
    uid=$(python -c 'import uuid; print(uuid.uuid4().hex)')   # fresh UUID per file
    sed "/correlationId/s/98440498-3104-479e-9c99-f4449ba8f4b6/$uid/;s/2018-04-05T14:30:39.000+00:00/$dt/" yourJsonFile.json > testcase$i.json
    i=$((i+1))
done

JQ: How do I replace keys and values based on regex match?

I have two questions:
How can I use jq to search for "name" fields that start with an underscore (like _RDS_PASSWORD) and remove the leading underscore (so it becomes RDS_PASSWORD)?
How can I use jq to find "name" fields that start with an underscore (like _RDS_PASSWORD) and pass their value (cGFzc3dvcmQK) to be decoded via base64? (ex: "cGFzc3dvcmQK" | base64 --decode)
Input:
[
{
"name": "RDS_DB_NAME",
"value": "rds_db_name"
},
{
"name": "RDS_HOSTNAME",
"value": "rds_hostname"
},
{
"name": "RDS_PORT",
"value": "1234"
},
{
"name": "RDS_USERNAME",
"value": "rds_username"
},
{
"name": "_RDS_PASSWORD",
"value": "cGFzc3dvcmQK"
}
]
Desired output:
[
{
"name": "RDS_DB_NAME",
"value": "rds_db_name"
},
{
"name": "RDS_HOSTNAME",
"value": "rds_hostname"
},
{
"name": "RDS_PORT",
"value": "1234"
},
{
"name": "RDS_USERNAME",
"value": "rds_username"
},
{
"name": "RDS_PASSWORD",
"value": "password"
}
]
Q1
walk( if type=="object" and has("name") and .name[0:1] == "_"
then .name |= .[1:]
else .
end)
If your jq does not have walk/1, you can either upgrade to a version of jq more recent than 1.5, or include its def, which can be found at https://github.com/stedolan/jq/blob/master/src/builtin.jq
Q2
.. | objects | select(has("name") and .name[0:1] == "_") | .value
If you are certain that the encoded string was a UTF-8 string, you could use jq's #base64d; otherwise, invoke jq with the -r option and pipe the results to a decoder as you indicated you planned to do.
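Putting both parts together into one filter that produces the desired output (a sketch assuming jq 1.6+ for @base64d; rtrimstr strips the trailing newline that cGFzc3dvcmQK decodes with):
jq 'map(if .name[0:1] == "_"
        then {name: .name[1:], value: (.value | @base64d | rtrimstr("\n"))}
        else . end)' input.json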

jq - add new field with updating whole file

I have a JSON file which is constructed in a similar way:
[
  {
    "_id": "1234",
    "org": "org1",
    "int": {"url": "http://url.com.uk:1234"}
  },
  {
    "_id": "4321",
    "org": "org2",
    "int": {"url": "http://url.com.us:4321"}
  },
  ...
]
Now I'm "jumping" from one entry to another and checking whether the application behind each URL is working properly. After the check I want to add/update a field "status". But I can't update the whole file; I'm just getting:
$ jq --arg mod "GOOD" '.[0].int + {stat: $mod}' tmp.json
{
"url": "http://url.com.uk:1234",
"stat": "GOOD"
}
How can I get the whole updated file from a jq command, not just part of it?
If you put your data in data.json and the changes you want to make to
each record into a separate arg.json argument file like
{
"1234": { "int": { "stat": "GOOD" } },
"4321": { "int": { "stat": "BAD", "xxx": "yyy" } }
}
and run jq as
$ jq -M --argfile arg arg.json 'map(. + $arg[._id])' data.json
then it will output the updated data, e.g.
[
{
"_id": "1234",
"org": "org1",
"int": {
"stat": "GOOD"
}
},
{
"_id": "4321",
"org": "org2",
"int": {
"stat": "BAD",
"xxx": "yyy"
}
}
]
Note that + replaces the value of each key outright. If you want to merge nested keys instead, you can use *, e.g.
$ jq -M --argfile arg arg.json 'map(. * $arg[._id])' data.json
which generates
[
{
"_id": "1234",
"org": "org1",
"int": {
"url": "http://url.com.uk:1234",
"stat": "GOOD"
}
},
{
"_id": "4321",
"org": "org2",
"int": {
"url": "http://url.com.us:4321",
"stat": "BAD",
"xxx": "yyy"
}
}
]
If you want to update the data in place, you could use sponge, as described in the answer to Manipulate JSON with jq, e.g.
$ jq -M --argfile arg arg.json 'map(. * $arg[._id])' data.json | sponge data.json
You can iterate over the array and reassign int with an update operation, like:
jq --arg mod "GOOD" '.[] | .int=.int + {stat: $mod}' tmp.json
{
"_id": "1234",
"org": "org1",
"int": {
"url": "http://url.com.uk:1234",
"stat": "GOOD"
}
}
{
"_id": "4321",
"org": "org2",
"int": {
"url": "http://url.com.us:4321",
"stat": "GOOD"
}
}
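Note that .[] streams the objects one by one, so the output above is no longer a single JSON array. To keep the array structure, and optionally write it back in place with sponge as shown earlier, a sketch:
jq --arg mod "GOOD" 'map(.int += {stat: $mod})' tmp.json | sponge tmp.json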