Expand PostgreSQL nested array JSON field

I have a table (log_table), and in this table there is a nested-array JSON field (activities). Using this activities field, I want to normalize my rows.
log_table:
- id:long
- activities:json
- date:timestamp
example activities field:
[
{
"actionType":"NOTIFICATION",
"items":null
},
{
"actionType":"MUTATION",
"items":[
{
"id":387015007,
"name":"epic",
"value":{
"currency":"USD",
"amount":1.76
}
},
{
"id":386521039,
"name":"test",
"value":{
"currency":"USD",
"amount":1.76
}
}
]
}
]
As a query, I've tried:
select
*
from
log_table l,
json_array_elements(l.activities) elems,
json_array_elements(elems->'items') obj;
With this query, I got the error below:
ERROR: cannot call json_array_elements on a scalar
Is there any suggestion?

The lack of items should be marked as [null], not null. You can use a CASE expression to correct this, e.g.:
select elems->>'actionType' as action_type, obj
from log_table l
cross join jsonb_array_elements(l.activities::jsonb) elems
cross join jsonb_array_elements(case elems->'items' when 'null' then '[null]' else elems->'items' end) obj
action_type | obj
--------------+---------------------------------------------------------------------------------
NOTIFICATION | null
MUTATION | {"id": 387015007, "name": "epic", "value": {"amount": 1.76, "currency": "USD"}}
MUTATION | {"id": 386521039, "name": "test", "value": {"amount": 1.76, "currency": "USD"}}
(3 rows)
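If you would rather not rewrite the missing items, a LEFT JOIN LATERAL keeps the rows whose items is null without the CASE trick. This is only a sketch, assuming the same table and column names as above:
select elems ->> 'actionType' as action_type, obj
from log_table l
cross join lateral jsonb_array_elements(l.activities::jsonb) as elems
-- nullif() turns a JSON null into a SQL NULL; the set-returning function then yields
-- no rows for it, and the LEFT JOIN keeps the NOTIFICATION row with obj = NULL
left join lateral jsonb_array_elements(nullif(elems -> 'items', 'null'::jsonb)) as obj on true;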

Related

How to get a specific object in a JSON array in MySQL?

I have a JSON column "jobs" that looks like this:
[
{
"id": "1",
"done": "100",
"target": "100",
"startDate": "123123132",
"lastAction": "123123132",
"status": "0"
},
{
"id": "2",
"done": "10",
"target": "20",
"startDate": "2312321",
"lastAction": "2312321",
"status": "1"
}
]
I want to filter the array by object key values. For example: find all items that have target > done, status != 0, and lastAction yesterday, to get a response like this:
[
{
"id": "1",
"done": "19",
"target": "100",
"startDate": "123123132",
"lastAction": "123123132",
"status": "0"
}
]
I know I can extract the data with JSON_TABLE() to do the filtering, but then I don't get the original object back (unless I recreate it), and the solution is not dynamic.
Can this kind of array filtering really be done in MySQL?
You can use JSON_TABLE() with FOR ORDINALITY to find the positions of the matching elements and then pull the original objects back out with JSON_EXTRACT():
SELECT JSON_PRETTY(JSON_EXTRACT(jobs.jobs, CONCAT('$[', j.rownum-1, ']'))) AS object
FROM jobs
CROSS JOIN JSON_TABLE(
jobs.jobs, '$[*]' COLUMNS(
rownum for ordinality,
done int path '$.done',
target int path '$.target',
status int path '$.status'
)
) as j
WHERE j.target > j.done AND j.status != 0;
You also mentioned a condition on lastAction, but the example values you gave are not valid dates, so I'll leave that enhancement to you. The example above demonstrates the technique.
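For reference, here is a minimal, hypothetical setup to try the query above (table and column both named jobs, as in the query, with the sample data trimmed to the filtered fields):
-- demo table: a single JSON column holding the jobs array
CREATE TABLE jobs (jobs JSON);
INSERT INTO jobs VALUES ('[{"id": "1", "done": "100", "target": "100", "status": "0"},
                           {"id": "2", "done": "10", "target": "20", "status": "1"}]');
With the filter target > done AND status != 0, only the second element matches, and the query returns it as the original JSON object.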
Yes, it is possible to do this using the JSON_EXTRACT and JSON_SEARCH functions.
Let's say your table is named tbl_Jobs and the jobs column is of type JSON.
SELECT * FROM tbl_Jobs
WHERE JSON_EXTRACT(jobs, "$[*].target") = JSON_EXTRACT(jobs, "$[*].done")
AND JSON_EXTRACT(jobs, "$[*].status") != 0
AND JSON_SEARCH(jobs, 'one', DATE_SUB(CURDATE(), INTERVAL 1 DAY), NULL, "$[*].lastAction") IS NOT NULL

Aggregate JSON in PostgreSQL

I have a json column with entries that look like this:
{
"pages": "64",
"stats": {
"1": { "200": "55", "400": "4" },
"2": { "200": "1" },
"3": { "200": "1", "404": "13" },
}
}
The 'stats' are collections (of various sizes) mapping HTTP status codes to counts.
I would like to aggregate the stats into two calculated columns - one for the total number of 200 responses and the other for the total number of responses (including 200s).
You can use two lateral joins to unnest the inner objects, then do conditional aggregation:
select
sum(z.cnt::int) no_responses,
sum(z.cnt::int) filter(where z.code::int = 200) no_200_responses
from mytable t
cross join lateral jsonb_each(t.data -> 'stats') as x(kx, obj)
cross join lateral jsonb_each_text(x.obj) as z(code, cnt)
Demo on DB Fiddle:
no_responses | no_200_responses
-----------: | ---------------:
74 | 57
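If you need the totals per row rather than over the whole table, the same query can be grouped. This is a sketch, assuming the table has an id primary key and the JSON lives in a jsonb column named data, as above:
select t.id,
       sum(z.cnt::int) as no_responses,
       sum(z.cnt::int) filter (where z.code::int = 200) as no_200_responses
from mytable t
cross join lateral jsonb_each(t.data -> 'stats') as x(kx, obj)
cross join lateral jsonb_each_text(x.obj) as z(code, cnt)
group by t.id;  -- one row of totals per source row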

How to flatten json into table

I have just realised that on my AWS Aurora Postgres cluster, functions using temp tables are not friendly with read replicas, so I need to do a rewrite (using CTEs). Anyway: how do I take a JSON object with nested arrays like the one below and flatten it into a table?
{
"data": [
{
"groupName": "TeamA",
"groupCode": "12",
"subGroupCodes": [
"11"
]
},
{
"groupName": "TeamB",
"groupCode": "13",
"subGroupCodes": [
"15", "22"
]
}
]
}
I would like the output table to be:
groupName  groupCode  subGroupCodes
TeamA      12         11
TeamB      13         15
TeamB      13         22
I know I can get most of the way there with:
SELECT j."groupCode" as int, j."groupName" as pupilgroup_name
FROM json_to_recordset(p_in_filters->'data') j ("groupName" varchar(50), "groupCode" int)
But I still need to get the subGroupCodes as well, unpacking the array and joining the codes to the correct parent groupCode.
You need to unnest the outer array first, and then apply another unnest to get the subgroup codes:
with data (j) as (
values ('{
"data": [
{
"groupName": "TeamA",
"groupCode": "12",
"subGroupCodes": [
"11"
]
},
{
"groupName": "TeamB",
"groupCode": "13",
"subGroupCodes": [
"15", "22"
]
}
]
}'::jsonb)
)
select g ->> 'groupName' as group_name,
g ->> 'groupCode' as code,
sg.*
from data d
cross join lateral jsonb_array_elements(d.j -> 'data') as e(g)
cross join lateral jsonb_array_elements_text(g -> 'subGroupCodes') as sg(subgroup_code)
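Applied directly to the function parameter from your attempt, the same idea can be written with json_to_recordset plus one lateral unnest. This is only a sketch, assuming p_in_filters is of type json as in your snippet:
SELECT j."groupName" AS pupilgroup_name,
       j."groupCode" AS pupilgroup_code,
       sg.subgroup_code
FROM json_to_recordset(p_in_filters -> 'data')
       AS j ("groupName" varchar(50), "groupCode" int, "subGroupCodes" json)
-- unnest the subGroupCodes array, one row per code, joined to its parent group
CROSS JOIN LATERAL json_array_elements_text(j."subGroupCodes") AS sg(subgroup_code);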

Parsing JSON in Postgres

I have the following JSON that I'd like to parse inside a PostgreSQL function.
{
"people": [
{
"person_name": "Person#1",
"jobs": [
{
"job_title": "Job#1"
},
{
"job_name": "Job#2"
}
]
}
]
}
I need to know how to pull out the person_name, and then loop through the jobs and pull out the job_title. This is as far as I've been able to get.
select ('{"people":[{"person_name":"Person#1","jobs":[{"job_title":"Job#1"},
{"job_name":"Job#2"}]}]}')::json -> 'people';
https://www.db-fiddle.com/f/vcgya7WtVdvj8q5ck5TqgX/0
Assuming that job_name in your post should be job_title, I expanded your test data to:
{
"people": [{
"person_name": "Person#1",
"jobs": [{
"job_title": "Job#11"
},
{
"job_title": "Job#12"
}]
},
{
"person_name": "Person#2",
"jobs": [{
"job_title": "Job#21"
},
{
"job_title": "Job#22"
},
{
"job_title": "Job#23"
}]
}]
}
Query:
SELECT
person -> 'person_name' as person_name, -- B
json_array_elements(person -> 'jobs') -> 'job_title' as job_title -- C
FROM (
SELECT
json_array_elements(json_data -> 'people') as person -- A
FROM (
SELECT (
'{"people":[ '
|| '{"person_name":"Person#1","jobs":[{"job_title":"Job#11"}, {"job_title":"Job#12"}]}, '
|| '{"person_name":"Person#2","jobs":[{"job_title":"Job#21"}, {"job_title":"Job#22"}, {"job_title":"Job#23"}]} '
|| ']}'
)::json as json_data
)s
)s
A: Getting the people array; json_array_elements expands all array elements into one row per element
B: Getting person_name from the array elements
C: Expanding the jobs array elements into one row per element and getting the job_title
Result:
person_name job_title
----------- ---------
"Person#1" "Job#11"
"Person#1" "Job#12"
"Person#2" "Job#21"
"Person#2" "Job#22"
"Person#2" "Job#23"

Want to sum an inner element of JSON arrays using N1QL (Couchbase)

When I run the query below:
SELECT * FROM myBucket WHERE ANY x IN transactions SATISFIES x.type in [0,4] END;
Result:
{
"_type": "Company",
"created": "2015-12-01T18:30:00.000Z",
"transactions": [
{
"amount": "96.5",
"date": "2016-01-03T18:30:00.000Z",
"type": 0
},
{
"amount": "483.7",
"date": "2016-01-10T18:30:00.000Z",
"type": 0
}
]
}
I get multiple JSON documents like this. When I try to sum the amounts:
SELECT sum(transactions[*].amount) FROM Inheritx WHERE ANY x IN transactions SATISFIES x.type in [0,4] END;
Result:
[
{
"$1": null
}
]
Now I want the sum of all these amounts. How can I do it?
transactions[*].amount returns an array, so you first need the array function ARRAY_SUM, and then use SUM() as below.
SELECT sum(ARRAY_SUM(transactions[*].amount)) FROM Inheritx WHERE ANY x IN transactions SATISFIES x.type in [0,4] END;
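Note that in the sample document the amounts are stored as strings; if that is the case in your bucket, a variation that converts them first might look like this (only a sketch, using an array comprehension with TONUMBER; adjust to your data):
-- convert each amount to a number before summing, and only sum the matching types
SELECT SUM(ARRAY_SUM(ARRAY TONUMBER(t.amount) FOR t IN transactions WHEN t.type IN [0,4] END)) AS total
FROM Inheritx
WHERE ANY x IN transactions SATISFIES x.type IN [0,4] END;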