Implement a condition on a JMESPath query - json

I have this query, which brings me the expiration date of a Key Vault secret; I need to get only the dates that are less than 30 days away:
az keyvault secret show \
  --vault-name "$keyvault" \
  --name "$secret" \
  --query "attributes.expires" -o tsv
How can I implement something like
--query "attributes.expires < 30 days" -o tsv

You won't be able to do the date computation in JMESPath, but you can perfectly well let PowerShell do it with the Get-Date cmdlet:
(Get-Date).AddDays(30) | Get-Date -Format yyyy-MM-ddTHH:mm:ssK
For the JMESPath query part, you have to know that a filter can only be applied to an array, not to an object, which is what you seem to have.
You can overcome this, though, by using the to_array function on your object and then using a pipe expression to get only the first item of the resulting array: | [0].
So, if the unfiltered JSON returned by the Azure client looks like
{
  "attributes": {
    "expires": "2022-07-30T00:00:00+00:00"
  }
}
then this should be your query:
--query "to_array(attributes)[?
  expires < '$((Get-Date).AddDays(30) | Get-Date -Format yyyy-MM-ddTHH:mm:ssK)'
] | [0].expires"
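If you are working in bash rather than PowerShell, the same idea can be expressed with GNU date computing the cutoff; a minimal sketch, assuming GNU date is available and that $keyvault and $secret are already set:
# Compute an ISO 8601 timestamp 30 days from now (GNU date syntax).
cutoff=$(date -u -d '+30 days' '+%Y-%m-%dT%H:%M:%S%:z')
az keyvault secret show \
  --vault-name "$keyvault" \
  --name "$secret" \
  --query "to_array(attributes)[?expires < '$cutoff'] | [0].expires" \
  -o tsv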


How to insert JSON as a string into another JSON

I am writing a script (a bash script for an Azure pipeline) and I need to combine JSON from different variables. For example, I have:
TYPE='car'
COLOR='blue'
ADDITIONAL_PARAMS='{"something": "big", "etc":"small"}'
So, as you can see, I have several string variables and one which contains JSON.
I need to combine these variables in this format (and I can't :( ):
some_script --extr-vars --extra_vars '{"var_type": "'$TYPE'", "var_color": "'$COLOR'", "var_additional_data": "'$ADDITIONAL_PARAMS'"}'
But this combination is not working; I get a string something like:
some_script --extr-vars --extra_vars '{"var_type": "car", "var_color": "blue", "var_additional_data": " {"something": "big", "etc":"small"} "}'
which is not correct, valid JSON.
How can I combine existing JSON (already formatted with double quotes ") with other variables? I am using bash / console / the yq utility (to convert YAML to JSON).
Use jq to generate the JSON. (You can probably do this in one step with yq, but I'm not as familiar with that tool.)
ev=$(jq --arg t "$TYPE" \
        --arg c "$COLOR" \
        --argjson ap "$ADDITIONAL_PARAMS" \
        -n '{var_type: $t, var_color: $c, var_additional_data: $ap}')
some_script --extr-vars --extra_vars "$ev"
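With the sample values above, $ev ends up holding valid JSON along these lines (jq pretty-prints by default; note that the nested object is embedded as real JSON, not as a quoted string):
{
  "var_type": "car",
  "var_color": "blue",
  "var_additional_data": {
    "something": "big",
    "etc": "small"
  }
}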

Update JSON file parameters daily using Powershell script

I have a JSON file that contains a query I am using to pull audit history data from Oracle. I need to be able to automate changing the fromDate and toDate parameters daily. I was thinking of creating a PowerShell script and using Get-Date -Format "yyyy-MM-dd" as a variable, but I'm not sure this is the best approach. My JSON file (Oracle.json) looks like this:
{
  "fromDate": "2019-04-18",
  "toDate": "2019-04-18",
  "product": "OPSS",
  "eventType": "RoleMembershipAdd"
}
I am then using curl to make the POST request and output the data to flat file:
curl.exe -i --user username:password -X POST -H "Content-Type: application/json" -d "@C:\Oracle.json" https://someurl.com/fscmRestApi/fndAuditRESTService/audittrail/getaudithistory >> C:\Oracle.txt
What would be the best way to make the dates dynamic so I can have the script run daily and pull from that day without having to change the dates manually?
I'd use a here string and directly insert the date:
$Json = #"
{
"fromDate": "$(get-date -format "yyyy-MM-dd")",
"toDate": "$(get-date -format "yyyy-MM-dd")",
"product": "OPSS",
"eventType": "RoleMembershipAdd"
}
"# | ConvertFrom-Json | ConvertTo-Json -Compress
> $Json
"fromDate":"2019-04-23","toDate":"2019-04-23","product":"OPSS","eventType":"RoleMembershipAdd"}

jq: extract value based on different (calculated) value

I am trying to filter down a very large json file (AWS output from aws rds describe-db-snapshots) into just a list of snapshots for deletion.
The final list of snapshots should be older than 60 days. I can discern their age via their SnapshotCreateTime, but I need their DBSnapshotIdentifier value to be able to delete them.
Greatly stripped down for SO purposes, below is the input.json file.
{
    "Engine": "postgres",
    "SnapshotCreateTime": "2017-08-22T16:35:42.302Z",
    "AvailabilityZone": "us-east-1b",
    "DBSnapshotIdentifier": "alex2-20170822-0108-bkup",
    "AllocatedStorage": 5
}
{
    "Engine": "postgres",
    "SnapshotCreateTime": "2017-06-02T16:35:42.302Z",
    "AvailabilityZone": "us-east-1a",
    "DBSnapshotIdentifier": "alex-dbs-16opfr84gq4h9-snapshot-rtsmdbinstance-fr84gq4h9",
    "AllocatedStorage": 5
}
{
    "Engine": "postgres",
    "SnapshotCreateTime": "2017-04-22T16:35:42.302Z",
    "AvailabilityZone": "us-east-1a",
    "DBSnapshotIdentifier": "alex3-20170422-update",
    "AllocatedStorage": 5
}
I know about select but from what I can tell it can't handle the math needed for the time comparison in a one-liner. I figured I'd need to branch out to bash, so I've been messing with the following (clunky) workaround. It's not working, but I figured I'd include it as proof of effort.
THEN=$(date -d '-60 days' +'%Y%m%d')
while IFS= read -r i
do
    awsDate=$(jq -r '.SnapshotCreateTime' <<< "$i")  # get time
    snapDate=$(date -d "$awsDate" +'%Y%m%d')         # convert to correct format
    if [ "$snapDate" -gt "$THEN" ]                   # compare times
    then
        :                                            # something to copy the ID
    fi
done < input.json
In this case I'd be looking for an output of
alex-dbs-16opfr84gq4h9-snapshot-rtsmdbinstance-fr84gq4h9
alex3-20170422-update
Here is an all-jq solution (i.e. one that does not depend on calling the date command). You might like to try a variation, e.g. passing some form of the date in, using one of the command-line options such as --arg.
jq currently does not quite understand the SnapshotCreateTime format; that's where the call to sub comes in:
def ago(days): now - (days*24*3600);
select(.SnapshotCreateTime | sub("\\.[0-9]*";"") < (ago(60) | todate))
| .DBSnapshotIdentifier
After fixing the sample input so that it is valid JSON, the output would be:
"alex-dbs-16opfr84gq4h9-snapshot-rtsmdbinstance-fr84gq4h9"
"alex3-20170422-update"
To strip the quotation marks, use the -r command-line option.
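The --arg variation mentioned above could look like this; a sketch, assuming GNU date is available to compute the cutoff in the shell:
cutoff=$(date -u -d '-60 days' +'%Y-%m-%dT%H:%M:%SZ')
jq -r --arg cutoff "$cutoff" '
  select(.SnapshotCreateTime | sub("\\.[0-9]*"; "") < $cutoff)
  | .DBSnapshotIdentifier' input.json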
Here is a solution which defines a filter function which uses select, sub, fromdate and now.
def too_old:
select( .SnapshotCreateTime
| sub("[.][0-9]+Z";"Z") # remove fractional seconds
| fromdate # convert to unix time
| now - . # convert to age in seconds
| . > (86400 * 60) # true if older than 60 days in seconds
)
;
too_old
| .DBSnapshotIdentifier
If you place this in a file filter.jq and run jq with the -r option e.g
jq -M -r -f filter.jq input.json
it will produce the output you requested:
alex-dbs-16opfr84gq4h9-snapshot-rtsmdbinstance-fr84gq4h9
alex3-20170422-update
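Since the stated goal is deletion, the matching identifiers can then be fed back to the AWS CLI. A hypothetical sketch (delete-db-snapshot is the real subcommand; verify the list before running this against live snapshots):
jq -M -r -f filter.jq input.json |
while IFS= read -r id
do
    # Delete each snapshot that matched the 60-day filter.
    aws rds delete-db-snapshot --db-snapshot-identifier "$id"
done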

How to select a date range from a JSON string by using jq?

I have a JSON string like this (macOS):
[{
    "id": 3624,
    "created_at": "2016-10-21T20:51:16.000+08:00"
},
{
    "id": 3625,
    "created_at": "2016-10-22T08:09:16.000+08:00"
},
{
    "id": 3626,
    "created_at": "2016-10-23T09:19:55.000+08:00"
}]
I want to select entries whose "created_at" falls between "2016-10-21" and "2016-10-22".
I want to get a result like this:
[{
    "id": 3624,
    "created_at": "2016-10-21T20:51:16.000+08:00"
},
{
    "id": 3625,
    "created_at": "2016-10-22T08:09:16.000+08:00"
}]
Can someone point me in the right direction?
The problem is solved.
Now I use this code to select dates down to the minute; I hope it's useful for others:
jq --arg s '2016-10-26T18:16' --arg e '2016-10-27T20:24' '
  [($s, $e) | strptime("%Y-%m-%dT%H:%M") | mktime] as $r
  | map(select(
      (.created_at[:19] | strptime("%Y-%m-%dT%H:%M:%S") | mktime) as $d
      | $d >= $r[0] and $d <= $r[1]))' <<< "$requestJson"
For a more robust solution, it would be better to parse the dates to get their components and compare those components. The closest you can get is to use strptime/1 to parse the date, which returns an array of its components. Then compare the components to check whether the date is in range.
The array that strptime returns contains these components:
year (%Y)
month (%m)
date (%d)
hours (%H)
minutes (%M)
seconds (%S)
day of week (%w)
day of year (%j)
Since you're only comparing the dates, the comparisons should only look at the first 3 components.
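A quick way to see what that three-component slice looks like (note that jq months are 0-based, so October is 9):
$ jq -nc '"2016-10-21" | strptime("%Y-%m-%d")[0:3]'
[2016,9,21]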
$ jq --arg s '2016-10-21' --arg e '2016-10-22' '
[($s, $e) | strptime("%Y-%m-%d")[0:3]] as $r
| map(select(
(.created_at[:19] | strptime("%Y-%m-%dT%H:%M:%S")[0:3]) as $d
| $d >= $r[0] and $d <= $r[1]
))
' input.json
Since you're running on a Mac, I'd expect these functions to be available in your build of jq. You may have to adjust the format of the dates for it to work as expected.
Using command-line JSON parser jq, as requested:
Note: Jeff Mercado's helpful answer demonstrates a lot of great advanced jq techniques, but for the specific problem at hand I believe that the text-based approach in this answer is much simpler while still being flexible enough.
#!/usr/bin/env bash
# Create variable with sample input.
IFS= read -r -d '' json <<'EOF'
[
  {
    "id": 3624,
    "created_at": "2016-10-21T20:51:16.000+08:00"
  },
  {
    "id": 3625,
    "created_at": "2016-10-22T08:09:16.000+08:00"
  },
  {
    "id": 3626,
    "created_at": "2016-10-23T09:19:55.000+08:00"
  }
]
EOF
# Use `jq` to select the objects in the array whose .created_at
# property value falls between "2016-10-21T20:51" and "2016-10-22T08:09"
# and return them as an array (effectively a sub-array of the input).
# (To solve the problem as originally stated, simply pass "2016-10-21"
# and "2016-10-22" instead.)
jq --arg s '2016-10-21T20:51' --arg e '2016-10-22T08:09' '
map(select(.created_at | . >= $s and . <= $e + "z"))
' <<<"$json"
Arguments --arg s '2016-10-21T20:51' and --arg e '2016-10-22T08:09' define variables $s (start of date+time range) and $e (end of date+time range) respectively, for use inside the jq script.
Function map() applies the enclosed expression to all the elements of the input array and outputs the results as an array, too.
Function select() accepts a filtering expression: every input object is evaluated against the enclosed expression, and the input object is only passed out if the expression evaluates to a "truthy" value.
Expression .created_at | . >= $s and . <= $e + "z" accesses each input object's created_at property and sends its value to the comparison expression, which performs lexical comparison, which - due to the formatting of the date+time strings - amounts to chronological comparison.
Note the trailing "z" appended to the range endpoint, to ensure that it matches all date+time strings in the JSON string that prefix-match the endpoint; e.g., endpoint 2016-10-22T08:09 should match 2016-10-22T08:09:01 as well as 2016-10-22T08:59.
This lexical approach allows you to specify as many components from the beginning as desired in order to narrow or widen the date range; e.g. --arg s '2016-10-01' --arg e '2016-10-31' would match all entries for the entire month of October 2016.
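For instance, the whole-month variant just mentioned would be invoked like this (same filter, only the endpoints change):
jq --arg s '2016-10-01' --arg e '2016-10-31' '
map(select(.created_at | . >= $s and . <= $e + "z"))
' <<<"$json"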

jq dates and unix timestamps

So I have data with a bunch of Unix timestamp values (in milliseconds). Something like this:
{
    "id": "f6922fd5-4f97-4113-820e-b45eba0ae236",
    "published_at": 1461624333859,
    "tracking_id": "a85d5ed5-5efa-461b-aae0-beb2098c0ff7",
}, {
    "id": "835d412f-5162-440c-937b-7276f22c4eb9",
    "published_at": 1461625249934,
    "tracking_id": "86472ba2-ce5f-400f-b42a-5a0ac155c42c",
}, {
    "id": "bc2efcac-67a0-4855-856a-f31ce5e4618e",
    "published_at": 1461625253393,
    "tracking_id": "c005398f-07f8-4a37-b96d-9ab019d586c2",
}
And very often we need to search for rows within a certain date. Is it possible to query with jq, providing human-readable dates, e.g. 2016-04-25? I also wonder if the other way around is possible: making jq show published_at values in human-readable form.
For example this works:
$ echo 1461624333 | jq 'todate'
"2016-04-25T22:45:33Z"
although it has to be in seconds, not milliseconds
Sure! Your provided input is not valid JSON, but I'm going to assume the trailing commas on those objects are removed and the objects are wrapped in an array, which would be the root object of the JSON document.
First, we can transform the millisecond-precision UNIX dates into second-precision, which is what jq's date functions expect, and then convert that to the human-readable dates you expect:
.[].published_at |= (. / 1000 | strftime("%Y-%m-%d"))
Then, we select only those elements whose dates match:
map(select(.published_at == $date))
Lastly, we put it all together, taking the $date variable from the command-line:
jq --arg date "2016-04-25" '.[].published_at |= (. / 1000 | strftime("%Y-%m-%d")) | map(select(.published_at == $date))' stuff.json
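To sanity-check the millisecond-to-second step on its own, the first timestamp from the sample can be run through it directly (floor drops the fractional seconds, since todate wants an integer):
$ echo 1461624333859 | jq '. / 1000 | floor | todate'
"2016-04-25T22:45:33Z"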
jq 1.5 has standard time-and-date functions such as strftime, as documented in the online manual. However, support for TZ is extremely limited and/or unreliable, as illustrated here:
$ echo $TZ
$ jq -n '123 | strftime("%B %d %Y %I:%M%p %Z")'
"January 01 1970 12:02AM EST"
$ TZ='Asia/Kolkata' jq -n '123 | strftime("%B %d %Y %I:%M%p %Z")'
"January 01 1970 12:02AM IST"
strflocaltime
If your jq has strflocaltime:
TZ=Asia/Kolkata jq -n '123|strflocaltime("%Y-%m-%dT%H:%M:%S %Z")'
"1970-01-01T05:32:03 IST"