I have the following table:
CREATE TABLE api_data (
id bigserial NOT NULL PRIMARY KEY,
content JSONB NOT NULL
);
Now I insert an array like this into the content column:
[{ "id": 44, "name": "address One", "petId": 1234 },
{ "id": 45, "name": "address One", "petId": 1234 },
{ "id": 46, "name": "address One", "petId": 1111 }]
What I want next is to get exactly the objects that have the "petId" set to a given value.
I figured I could do
select content
from api_data
WHERE content @> '[{"petId":1234}]'
But that returns the whole array.
Another thing I found is this query:
select val
from api_data
JOIN LATERAL jsonb_array_elements(content) obj(val) ON obj.val->>'petId' = '1234'
WHERE content @> '[{"petId":1234}]'
Which returns the object I am looking for, but three times, matching the number of elements in the array.
What I actually need is a result like this:
[{ "id": 44, "name": "address One", "petId": 1234 },
{ "id": 45, "name": "address One", "petId": 1234 }]
If you are using Postgres 12 or later, you can use a JSON path expression:
select jsonb_path_query_array(content, '$[*] ? (@.petId == 1234)') as content
from api_data
where content @> '[{"petId":1234}]';
If you are using an older version, you need to unnest and aggregate manually:
select (select jsonb_agg(e)
from jsonb_array_elements(d.content) as t(e)
where t.e @> '{"petId":1234}') as content
from api_data d
where d.content @> '[{"petId":1234}]'
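To sanity-check this against the sample array from the question (a quick sketch, reusing the api_data table defined above):
INSERT INTO api_data (content) VALUES
('[{"id": 44, "name": "address One", "petId": 1234},
   {"id": 45, "name": "address One", "petId": 1234},
   {"id": 46, "name": "address One", "petId": 1111}]');

SELECT jsonb_path_query_array(content, '$[*] ? (@.petId == 1234)') AS content
FROM api_data
WHERE content @> '[{"petId":1234}]';
Both variants should then produce the expected result:
[{"id": 44, "name": "address One", "petId": 1234}, {"id": 45, "name": "address One", "petId": 1234}]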
Related
I have a JSON column "jobs" that looks like this:
[
{
"id": "1",
"done": "100",
"target": "100",
"startDate": "123123132",
"lastAction": "123123132",
"status": "0"
},
{
"id": "2",
"done": "10",
"target": "20",
"startDate": "2312321",
"lastAction": "2312321",
"status": "1"
}
]
I want to filter the array by object key values. For example, to find all items that have target > done, status != 0, and lastAction is yesterday, and get a response like this:
[
{
"id": "1",
"done": "19",
"target": "100",
"startDate": "123123132",
"lastAction": "123123132",
"status": "0"
}
]
I know I can extract the data with JSON_TABLE() to do the filtering, but then I don't get the original objects back (unless I rebuild them), and the solution is not dynamic.
Can this kind of array filtering really be done in MySQL?
SELECT JSON_PRETTY(JSON_EXTRACT(jobs.jobs, CONCAT('$[', j.rownum-1, ']'))) AS object
FROM jobs
CROSS JOIN JSON_TABLE(
jobs.jobs, '$[*]' COLUMNS(
rownum for ordinality,
done int path '$.done',
target int path '$.target',
status int path '$.status'
)
) as j
WHERE j.target > j.done AND j.status != 0;
You also mentioned a condition on lastAction, but the example values you gave are not valid dates. The example above demonstrates the technique; a rough sketch of the lastAction condition follows below, in case those values are meant to be Unix timestamps.
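Assuming lastAction holds a Unix epoch in seconds (only a guess, since the sample values aren't valid dates), the extra condition could look roughly like this:
SELECT JSON_PRETTY(JSON_EXTRACT(jobs.jobs, CONCAT('$[', j.rownum-1, ']'))) AS object
FROM jobs
CROSS JOIN JSON_TABLE(
    jobs.jobs, '$[*]' COLUMNS(
        rownum FOR ORDINALITY,
        done int PATH '$.done',
        target int PATH '$.target',
        status int PATH '$.status',
        lastAction bigint PATH '$.lastAction'  -- assumed to be epoch seconds
    )
) AS j
WHERE j.target > j.done
  AND j.status != 0
  AND DATE(FROM_UNIXTIME(j.lastAction)) = CURDATE() - INTERVAL 1 DAY;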
Yes, it is possible to do it using the JSON_EXTRACT and JSON_SEARCH functions.
Let's say your table is named tbl_Jobs and the jobs column is of type JSON.
SELECT * FROM tbl_Jobs
WHERE JSON_EXTRACT(jobs, "$[*].target") = JSON_EXTRACT(jobs, "$[*].done")
AND JSON_EXTRACT(jobs, "$[*].status") != 0
AND JSON_SEARCH(jobs, 'one', DATE_SUB(CURDATE(), INTERVAL 1 DAY), NULL, "$[*].lastAction") IS NOT NULL
For example, I have a table:
CREATE TABLE fruit(id bigint, data jsonb);
and a row for example is:
1,
{
"type": "pinapple",
"store1": {
"first_added": "<some timestamp>",
"price": "10",
"store_id": "1",
"comments": "some comments..."
},
"store2": {
"first_added": "<some timestamp>",
"price": "11",
"store_id": "2",
"comments": "some comments..."
},
.... more stores
}
In case of an update, I have the fruit id and the store data:
1,
"store1": {
"price": "12",
"store_id": "1",
"comments": "some comments...V2"
}
I want to update the entire store object in the fruit entry (here store1), except for the first_added field.
Any idea how I can accomplish it via JSONB operators or functions?
Thanks
You can use
UPDATE fruit
SET data = data || jsonb_set($1::jsonb, '{store1,first_added}', data#>'{store1,first_added}')
WHERE id = 1;
(online demo)
where the parameter $1 is set to the value {"store1": {"price": "12", "store_id": "1", "comments": "some comments...V2"}}.
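If you want to try it without bind parameters, the same statement with that sample payload inlined is:
UPDATE fruit
SET data = data || jsonb_set(
        '{"store1": {"price": "12", "store_id": "1", "comments": "some comments...V2"}}'::jsonb,
        '{store1,first_added}',
        data #> '{store1,first_added}')
WHERE id = 1;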
Or if you need the store key to be dynamic, pass it as a second parameter ($2) and use
UPDATE fruit
SET data = jsonb_set(data, ARRAY[$2::text], jsonb_set($1::jsonb, '{first_added}', data->$2->'first_added'))
WHERE id = 1;
(online demo)
You can use the jsonb_build_object function to rebuild the store1 object, copying the existing first_added value from the stored data, and the jsonb_set function to write the rebuilt object back into data, so the rest of the data (first_added, the other stores, ...) is kept:
update fruit
set data = jsonb_set(data, '{store1}', jsonb_build_object('first_added', data->'store1'->'first_added', 'price', '12', 'store_id', '1', 'comments', 'some comments...V2'))
where id = 1;
Demo in DBfiddle
Let's say I have this JSON and I would like to retrieve all the age values where the name equals Chris inside the Array key.
{
"Array": [
{
"age": "65",
"name": "Chris"
},
{
"age": "20",
"name": "Mark"
},
{
"age": "23",
"name": "Chris"
}
]
}
That JSON is stored in a JSON column in my database.
So I would like to get an age column that contains the ages 65 and 23, because both of those entries are named Chris.
Use the json_each() table-valued function to extract all the names and ages from the JSON array of each row of the table, and the json_extract() function to filter the rows for 'Chris' and get his age:
SELECT json_extract(j.value, '$.name') name,
json_extract(j.value, '$.age') age
FROM tablename t JOIN json_each(t.col, '$.Array') j
WHERE json_extract(j.value, '$.name') = 'Chris';
Change col to the name of the json column.
See the demo.
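With the sample JSON from the question, this should return:
name  | age
------+----
Chris | 65
Chris | 23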
SELECT JSON_query([json], '$') from mytable
Returns fine the contents of [json] field
SELECT JSON_query([json], '$.Guid') from mytable
Returns null
SELECT JSON_query([json], '$.Guid[1]') from mytable
Returns null
I've also now tried:
SELECT JSON_query([json], '$[1].Guid')
SELECT JSON_query([json], '$[2].Guid')
SELECT JSON_query([json], '$[3].Guid')
SELECT JSON_query([json], '$[4].Guid')
and they all return null
So I'm stuck on figuring out how to create the path to get to the info. Maybe SQL Server's JSON_QUERY can't handle the null as the first array element?
Below is the string that is stored inside of the [json] field in the database.
[
null,
{
"Round": 1,
"Guid": "15f4fe9d-403c-4820-8e35-8a8c8d78c33b",
"Team": "2",
"PlayerNumber": "78"
},
{
"Round": 1,
"Guid": "8e91596b-cc33-4ce7-bfc0-ac3d1dc5eb67",
"Team": "2",
"PlayerNumber": "54"
},
{
"Round": 1,
"Guid": "f53cd74b-ed5f-47b3-aab5-2f3790f3cd34",
"Team": "1",
"PlayerNumber": "23"
},
{
"Round": 1,
"Guid": "30297678-f2cf-4b95-a789-a25947a4d4e6",
"Team": "1",
"PlayerNumber": "11"
}
]
You need to follow the comments below your question. I'll just summarize them:
Probably the most appropriate approach in your case is to use OPENJSON() with an explicit schema (the WITH clause).
JSON_QUERY() extracts a JSON object or a JSON array from a JSON string; if the path points to a scalar JSON value, the function returns NULL in lax mode and an error in strict mode. The stored JSON is an array, so it has no $.Guid key at the top level, which is why SELECT JSON_query([json], '$.Guid') FROM mytable returns NULL. Paths like '$[1].Guid' do exist, but they point to scalar strings, so JSON_QUERY() returns NULL for them as well; JSON_VALUE() is the function for scalars.
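You can see the difference directly against the stored JSON (table and column names as in your question):
SELECT JSON_QUERY([json], '$[1].Guid') AS via_json_query,  -- NULL, Guid is a scalar
       JSON_VALUE([json], '$[1].Guid') AS via_json_value   -- should be 15f4fe9d-403c-4820-8e35-8a8c8d78c33b
FROM mytable;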
The following statements provide a working solution to your problem:
Table:
SELECT *
INTO Data
FROM (VALUES
(N'[
null,
{
"Round": 1,
"Guid": "15f4fe9d-403c-4820-8e35-8a8c8d78c33b",
"Team": "2",
"PlayerNumber": "78",
"TheProblem": "doesn''t"
},
{
"Round": 1,
"Guid": "8e91596b-cc33-4ce7-bfc0-ac3d1dc5eb67",
"Team": "2",
"PlayerNumber": "54"
},
{
"Round": 1,
"Guid": "f53cd74b-ed5f-47b3-aab5-2f3790f3cd34",
"Team": "1",
"PlayerNumber": "23"
},
{
"Round": 1,
"Guid": "30297678-f2cf-4b95-a789-a25947a4d4e6",
"Team": "1",
"PlayerNumber": "11"
}
]')
) v (Json)
Statements:
SELECT j.Guid
FROM Data d
OUTER APPLY OPENJSON(d.Json) WITH (
Guid uniqueidentifier '$.Guid',
Round int '$.Round',
Team nvarchar(1) '$.Team',
PlayerNumber nvarchar(2) '$.PlayerNumber'
) j
SELECT JSON_VALUE(j.[value], '$.Guid')
FROM Data d
OUTER APPLY OPENJSON(d.Json) j
Result:
Guid
------------------------------------
15f4fe9d-403c-4820-8e35-8a8c8d78c33b
8e91596b-cc33-4ce7-bfc0-ac3d1dc5eb67
f53cd74b-ed5f-47b3-aab5-2f3790f3cd34
30297678-f2cf-4b95-a789-a25947a4d4e6
I have a table (log_table), and in this table there is a nested array JSON field (activities). Using this activities field, I want to normalize my rows.
log_table:
- id:long
- activities:json
- date:timestamp
example activities field:
[
{
"actionType":"NOTIFICATION",
"items":null
},
{
"actionType":"MUTATION",
"items":[
{
"id":387015007,
"name":"epic",
"value":{
"currency":"USD",
"amount":1.76
}
},
{
"id":386521039,
"name":"test",
"value":{
"currency":"USD",
"amount":1.76
}
}
]
}
]
This is the query I've tried:
select
*
from
log_table l,
json_array_elements(l.activities) elems,
json_array_elements(elems->'items') obj;
With this query, I get the error below:
ERROR: cannot call json_array_elements on a scalar
Any suggestions?
The lack of items would better be represented as [null] rather than a bare null, because json_array_elements() cannot be called on a scalar. You can work around it with a CASE expression, e.g.:
select elems->>'actionType' as action_type, obj
from log_table l
cross join jsonb_array_elements(l.activities::jsonb) elems
cross join jsonb_array_elements(case elems->'items' when 'null' then '[null]' else elems->'items' end) obj
action_type | obj
--------------+---------------------------------------------------------------------------------
NOTIFICATION | null
MUTATION | {"id": 387015007, "name": "epic", "value": {"amount": 1.76, "currency": "USD"}}
MUTATION | {"id": 386521039, "name": "test", "value": {"amount": 1.76, "currency": "USD"}}
(3 rows)