Let's say I have the following JSON file:
{
"1.1.1.1": {
"history_ban": [
"2021-05-02 14:30",
"2022-01-01 12:00"
],
"history_unban": [
"2021-05-09 14:30",
"2022-01-08 12:00"
]
},
"2.2.2.2": {
"history_ban": [
"2022-01-16 07:00"
],
"history_unban": []
},
"3.3.3.3": {
"history_ban": [
"2022-01-15 22:40"
]
}
}
My goal is to get all the keys where:
Max history_ban date is smaller than "2022-01-16 09:00"
Max history_unban date is empty/non-existent or smaller than the max history_ban date
I believe I have the majority of the query working as intended, but the 'compare max unban with max ban' part is not working. My current (not working) query is as follows:
to_entries[] | select((.value.history_ban != null) and (.value.history_ban | max < "2022-01-16 09:00") and ((.value.history_unban | length == 0 ) or (.value.history_unban | max < .value.history_ban | max))) | .key
I know my error is within (.value.history_unban | max < .value.history_ban | max) because, if I replace it with (.value.history_unban | max < "somedate") I get a working query.
The error I get is
jq: error (at <stdin>:22): Cannot index array with string "value"
exit status 5
What do I need to do to select/compare these two max values?
Just to be sure, my expected result in this example is
"2.2.2.2"
"3.3.3.3"
You could use the alternative operator // to introduce another constraint that applies if history_unban is present and not empty.
jq -r '
to_entries[] | select(.value
| (.history_ban | max) as $maxban
| $maxban < "2022-01-16 09:00"
and (.history_unban | length == 0 // $maxban > max)
).key
'
2.2.2.2
3.3.3.3
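The same selection can also be written with an explicit or in place of //; a sketch under the same assumptions, with the data in a hypothetical bans.json:
jq -r --arg cutoff "2022-01-16 09:00" '
  to_entries[]
  | select(
      (.value.history_ban | max) as $maxban
      | $maxban < $cutoff
        and ((.value.history_unban // []) | length == 0 or max < $maxban)
    )
  | .key
' bans.json
Here .history_unban // [] normalizes the missing key of "3.3.3.3" to an empty array before testing it, and the query prints the same two keys.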
I have a JSON array, and another text file that contains a list of values.
[
{
"key": "foo",
"detail": "bar"
},
...
]
I need to filter the array elements to only those that have a "key" value that is found in the list of values.
The list of values is a text file containing a single item per-line.
foo
baz
Is this possible to do using jq?
You can use the following:
jq --rawfile to_keep_file to_keep.txt '
( [ $to_keep_file | match(".+"; "g").string | { (.): true } ] | add ) as $to_keep_lkup |
map(select($to_keep_lkup[.key]))
' to_filter.json
or
(
jq -sR . to_keep.txt
cat to_filter.json
) | jq -n '
( [ input | match(".+"; "g").string | { (.): true } ] | add ) as $to_keep_lkup |
inputs | map(select($to_keep_lkup[.key]))
'
The former requires jq v1.6, the first version to provide --rawfile.
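If you would rather skip building a lookup object, a variant using split and IN also works (IN needs jq 1.5, --rawfile still needs 1.6); a sketch reusing the same file names:
jq --rawfile to_keep_file to_keep.txt '
  ($to_keep_file | split("\n") - [""]) as $keys
  | map(select(.key | IN($keys[])))
' to_filter.json
Note that IN rescans $keys for every element, so the object lookup above remains the better choice for large lists.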
Given the following JSON
{
"tags": [
{
"key": "env",
"value": "foo"
},
{
"key": "env",
"value": "bar"
}
]
}
I am trying to find the first tag whose key is env. I have this:
.tags[] | select (.key=="env") |.[0]
but that gives me the error Cannot index object with number.
Use first(expr) to provide an expression that satisfies your use case.
first(.tags[]? | select(.key == "env") .value)
You could wrap the results of your query in an array and then pick the first one:
[.tags[] | select(.key=="env")] | .[0]
jq -r 'first( .tags[] | select(.key=="env") ).value'
.tags[] flattens the array into a stream of values. You're applying .[0] to each of the values, not a filtered array. To filter an array, you'd use
.tags | map(select(...)) | .[0]
or
.tags | map(select(...)) | first
map(...) is a shorthand for [ .[] | ... ], so the above is equivalent to
.tags | [ .[] | select(...) ] | first
and
[ .tags[] | select(...) ] | first
Finally, [ ... ] | first can be written as first(...).
first( .tags[] | select(...) )
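Putting it together against the sample document (the file name input.json is assumed here):
$ jq -r 'first(.tags[] | select(.key == "env")).value' input.json
foo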
I am working with a nested JSON file. The issue is that the keys of the nested JSON are dates and are not known beforehand, so I am unable to apply the Table.ExpandRecordColumn method to it.
Each row has a unique refId and looks like this:
{
"refId" : "XYZ",
"snapshotIndexes" : {
"19-07-2021" : {
"url": "abc1",
"value": "123"
},
"20-07-2021" : {
"url": "abc2",
"value": "567"
}
}
}
I finally want a table with these columns:
refId | date       | url  | value
XYZ   | 19-07-2021 | abc1 | 123
XYZ   | 20-07-2021 | abc2 | 567
PQR   | 07-05-2021 | srt  | 999
In the new table, refId and date will together make a unique entry.
(Power BI screenshot omitted: the snapshotIndexes column shows nested Record values.)
I was able to solve it by using Record.ToTable on each row to convert the record to a table and then applying Table.ExpandTableColumn:
let
    Source = DocumentDB.Contents("sourceurl"),
    Collections = Source{[id="dbid"]}[Collections],
    SourceTable = Collections{[db_id="dbid",id="PartnerOfferSnapshots"]}[Documents],
    ExpandedDocument = Table.ExpandRecordColumn(SourceTable, "Document", {"refId", "snapshotIndexes"}, {"Document.refId", "Document.snapshotIndexes"}),
    TransformColumns = Table.TransformColumns(ExpandedDocument, {"Document.snapshotIndexes", each Table.ExpandRecordColumn(Record.ToTable(_), "Value", {"url","id","images"}, {"url","id","images"})}),
    ExpandedTable = Table.ExpandTableColumn(TransformColumns, "Document.snapshotIndexes", {"Name","url","id","images"}, {"Document.dates","Document.url","Document.id","Document.images"})
in
    ExpandedTable
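As a quick cross-check outside Power BI, the same flattening can be sketched with jq (assuming one document shaped like the sample above, saved as row.json):
$ jq -r '.refId as $id | .snapshotIndexes | to_entries[] | [$id, .key, .value.url, .value.value] | @tsv' row.json
XYZ	19-07-2021	abc1	123
XYZ	20-07-2021	abc2	567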
I have a json with records like this:
[
{"number":1},
{"number":3}
]
and want to select (filter) a record with a max or min value of the field "number".
I can get a min or max value of "number" like this:
$ echo '[{"number":1},{"number":3}]' | jq ' [ .[].number ] | min'
1
and I can output booleans:
$ echo '[{"number":1},{"number":3}]' | jq '.[].number==([ .[].number ] | min)'
true
false
but when I try to put that together with select, it fails:
$ echo '[{"number":1},{"number":3}]' | jq 'map(select(.[].number==([ .[].number ] | min)))'
jq: error (at <stdin>:1): Cannot index number with string "number"
I feel like I am close, but stuck. What am I doing wrong?
Thanks in advance!
([ .[].number ] | min) as $m | map(select(.number == $m))
See https://jqplay.org/s/bUwtNrfAE-
first
To retrieve the first minimal item:
([ .[].number ] | min) as $m | first(.[] | select(.number == $m))
min_by, minimal_by, etc
jq has the built-ins max_by and min_by, as documented at
https://stedolan.github.io/jq/manual/#Builtinoperatorsandfunctions
For a definition of maximal_by, see the jq cookbook at https://github.com/stedolan/jq/wiki/Cookbook#find-the-maximal-elements-of-an-array-or-stream. That section also has stream-oriented definitions.
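Applied to the sample, the builtin returns the whole record directly:
$ echo '[{"number":1},{"number":3}]' | jq -c 'min_by(.number)'
{"number":1}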
I want to read the values from JSON and need to create a new JSON, so is there any way we can save the JSON into a table with columns in Oracle? That would help to perform calculations on it; the calculation is too complex.
Here is a JSON sample (the full JSON has many more hashes):
{
"agri_Expense": {
"input": 6000,
"max": 7500,
"check": 7500
},
"income3": {
"Hiring_income": 239750
},
"Operational_Cost1": [
{
"Field_input3": 10000,
"Minimum": "0.05",
"Check_Input": 26750,
"Tractor_Cost": "Maintenance"
}
]
}
You do not need PL/SQL, and can do it entirely in SQL.
I want to read the values from json [...] so is there any way that
we can save json in table and columns in oracle
Yes, use SQL to create a table:
CREATE TABLE table_name ( json_column CLOB CHECK ( json_column IS JSON ) )
and then INSERT the value there:
INSERT INTO table_name ( json_column ) VALUES (
'{'
|| '"agri_Expense": {"input": 6000,"max": 7500,"check": 7500},'
|| '"income3": {"Hiring_income": 239750},'
|| '"Operational_Cost1": [{"Field_input3": 10000,"Minimum": "0.05","Check_Input": 26750,"Tractor_Cost": "Maintenance"}]'
|| '}'
)
then, if you want individual values, SELECT using JSON_TABLE:
SELECT j.*
FROM table_name t
CROSS JOIN JSON_TABLE(
t.json_column,
'$'
COLUMNS (
agri_expense_input NUMBER PATH '$.agri_Expense.input',
agri_expense_max NUMBER PATH '$.agri_Expense.max',
agri_expense_check NUMBER PATH '$.agri_Expense.check',
income3_hiring_income NUMBER PATH '$.income3.Hiring_income',
NESTED PATH '$.Operational_Cost1[*]'
COLUMNS (
oc1_field_input3 NUMBER PATH '$.Field_input3',
oc1_minimum NUMBER PATH '$.Minimum',
oc1_check_input NUMBER PATH '$.Check_Input'
)
)
) j
Which outputs:
AGRI_EXPENSE_INPUT | AGRI_EXPENSE_MAX | AGRI_EXPENSE_CHECK | INCOME3_HIRING_INCOME | OC1_FIELD_INPUT3 | OC1_MINIMUM | OC1_CHECK_INPUT
-----------------: | ---------------: | -----------------: | --------------------: | ---------------: | ----------: | --------------:
6000 | 7500 | 7500 | 239750 | 10000 | .05 | 26750
db<>fiddle here
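Since the rest of this page leans on jq, the same scalar values can also be pulled out on the command line as a sanity check (a sketch, assuming the document is saved as sample.json):
$ jq -r '[.agri_Expense.input, .agri_Expense.max, .agri_Expense.check, .income3.Hiring_income, (.Operational_Cost1[] | .Field_input3, .Minimum, .Check_Input)] | @csv' sample.json
6000,7500,7500,239750,10000,"0.05",26750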