I'm new to postgres and I am having trouble finding an example of how to query the following:
{
"Skill": {
"Technical": [
{ "Name": "C#",
"Rating": 4,
"Last Used": "2014-08-21"
},
{ "Name": "ruby",
"Rating": 4,
"Last Used": "2014-08-21"
}
],
"Product": [
{ "Name": "MDM",
"Rating": 4,
"Last Used": "2014-08-21"
},
{ "Name": "UDM",
"Rating": 5,
"Last Used": "2014-08-21"
}
]
}
}
In short, I am struggling to understand how to query through maps without having to name each key explicitly.
I have a query that does the following, though it seems a bit much to have to do...
Select 'Technical' as SkillType
, json_array_elements(ResourceDocument->'Skill'->'Technical')->>'Name' as SkillName
, json_array_elements(ResourceDocument->'Skill'->'Technical')->>'Rating' as Rating
, json_array_elements(ResourceDocument->'Skill'->'Technical')->>'Last Used' as LastUsed
FROM testdepot.Resource
UNION ALL
Select 'Product' as SkillType
, json_array_elements(ResourceDocument->'Skill'->'Product')->>'Name' as SkillName
, json_array_elements(ResourceDocument->'Skill'->'Product')->>'Rating' as Rating
, json_array_elements(ResourceDocument->'Skill'->'Product')->>'Last Used' as LastUsed
FROM testdepot.Resource
I am trying to find a way to do this in one query that covers all keys of the map, in this case Product and Technical.
Something like:
Select 'Product' as SkillType
, json_array_elements(ResourceDocument->'Skill'->*)->>'Name' as SkillName
, json_array_elements(ResourceDocument->'Skill'->*)->>'Rating' as Rating
, json_array_elements(ResourceDocument->'Skill'->*)->>'Last Used' as LastUsed
FROM testdepot.Resource
You can wrap a call to json_object_keys in a sub-query to first get the keys inside "Skill", and then use json_array_elements in the outer query over the result:
SELECT SkillType
, json_array_elements(ResourceDocument->'Skill'->SkillType)->>'Name' AS SkillName
, json_array_elements(ResourceDocument->'Skill'->SkillType)->>'Rating' AS Rating
, json_array_elements(ResourceDocument->'Skill'->SkillType)->>'Last Used' AS LastUsed
FROM (
SELECT json_object_keys(resourcedocument->'Skill') AS SkillType, ResourceDocument
FROM Resource
) t;
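If you want to avoid repeating the set-returning json_array_elements call in the select list, a variant that moves both set-returning functions into the FROM clause with LATERAL might look like this (a sketch against the table and column names from the question, not tested):

-- one output row per (skill type, array element) combination
SELECT k.SkillType
     , e.elem->>'Name'      AS SkillName
     , e.elem->>'Rating'    AS Rating
     , e.elem->>'Last Used' AS LastUsed
FROM testdepot.Resource r
     -- each key under "Skill" becomes a row: 'Technical', 'Product', ...
   , LATERAL json_object_keys(r.ResourceDocument->'Skill') AS k(SkillType)
     -- each element of the array stored under that key becomes a row
   , LATERAL json_array_elements(r.ResourceDocument->'Skill'->k.SkillType) AS e(elem);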
Related
I have a JSON column "jobs" that looks like this:
[
{
"id": "1",
"done": "100",
"target": "100",
"startDate": "123123132",
"lastAction": "123123132",
"status": "0"
},
{
"id": "2",
"done": "10",
"target": "20",
"startDate": "2312321",
"lastAction": "2312321",
"status": "1"
}
]
I want to filter the array by object key values. For example: to find all items that have target > done, status != 0, and lastAction is yesterday, and get a response like this:
[
{
"id": "1",
"done": "19",
"target": "100",
"startDate": "123123132",
"lastAction": "123123132",
"status": "0"
}
]
I know I can extract the data to a JSON_TABLE() to do the filtering, but then I don't get the original objects back (unless I rebuild them) and the solution is not dynamic.
Can this kind of array filtering really be done in MySQL?
SELECT JSON_PRETTY(JSON_EXTRACT(jobs.jobs, CONCAT('$[', j.rownum-1, ']'))) AS object
FROM jobs
CROSS JOIN JSON_TABLE(
jobs.jobs, '$[*]' COLUMNS(
rownum for ordinality,
done int path '$.done',
target int path '$.target',
status int path '$.status'
)
) as j
WHERE j.target > j.done AND j.status != 0;
You also mentioned a condition on lastAction, but the example values you gave are not valid dates, so I'll leave that enhancement to you. The example above demonstrates the technique.
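If lastAction were stored as a Unix epoch in seconds (just an assumption; the sample values above are not real timestamps), the condition could be added to the same JSON_TABLE, roughly like this sketch:

SELECT JSON_PRETTY(JSON_EXTRACT(jobs.jobs, CONCAT('$[', j.rownum-1, ']'))) AS object
FROM jobs
CROSS JOIN JSON_TABLE(
  jobs.jobs, '$[*]' COLUMNS(
    rownum FOR ORDINALITY,
    done INT PATH '$.done',
    target INT PATH '$.target',
    status INT PATH '$.status',
    lastAction BIGINT PATH '$.lastAction'  -- assumed to hold epoch seconds
  )
) AS j
WHERE j.target > j.done
  AND j.status != 0
  -- "yesterday": the epoch value converted to a date equals the day before today
  AND DATE(FROM_UNIXTIME(j.lastAction)) = CURDATE() - INTERVAL 1 DAY;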
Yes, it is possible using the JSON_EXTRACT and JSON_SEARCH functions.
Let's say your table is named tbl_Jobs and the jobs column is of type JSON.
SELECT * FROM tbl_Jobs
WHERE JSON_EXTRACT(jobs, "$[*].target") = JSON_EXTRACT(jobs, "$[*].done")
AND JSON_EXTRACT(jobs, "$[*].status") != 0
AND JSON_SEARCH(jobs, 'one', DATE_SUB(CURDATE(), INTERVAL 1 DAY), NULL, "$[*].lastAction") IS NOT NULL
I'm new to JSONB and I am wondering if the following would be possible with a single query:
I have a lot of tables that look like this:
ID (INT) | members (JSONB)
All the tables have only one row.
Example for two tables:
table1:
id: 1
data:
[
{
"computer": "12.12.12.12",
"tag": "dog"
},
{
"computer": "1.1.1.1",
"tag": "cat"
},
{
"computer": "2.2.2.2",
"tag": "cow"
}
]
table2:
id: 1
data:
[
{
"IP address": "12.12.12.12",
"name": "Beni",
"address": "Rome"
},
{
"IP address": "1.1.1.1",
"name": "Jone",
"address": "Madrid"
}
]
The result should be rows like this:

computer    | tag | name
------------|-----|-----
12.12.12.12 | dog | Beni
1.1.1.1     | cat | Jone
Thanks!
Convert the JSON arrays into sets of records using the jsonb_to_recordset function and then join them (as if they were relational tables).
with table1 (id,members) as (
values (1,'[{"computer": "12.12.12.12","tag": "dog"},{"computer": "1.1.1.1","tag": "cat"},{"computer": "2.2.2.2","tag": "cow"}]'::jsonb)
), table2 (id,members) as (
values (1,'[{"IP address": "12.12.12.12","name": "Beni", "address": "Rome"},{"IP address": "1.1.1.1","name": "Jone", "address": "Madrid"}]'::jsonb)
)
select t1.computer, t1.tag, t2.name
from jsonb_to_recordset((select members from table1 where id=1)) as t1(computer text,tag text)
join jsonb_to_recordset((select members from table2 where id=1)) as t2("IP address" text,name text)
on t1.computer = t2."IP address"
db fiddle
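If the tables really exist (rather than the inline values above), roughly the same join can be written with LATERAL directly against them; this sketch assumes the jsonb column is named members as in the CTEs above and that id pairs up the two single-row tables:

select t1e.computer, t1e.tag, t2e.name
from table1 t1
     -- explode table1's jsonb array into rows with typed columns
     cross join lateral jsonb_to_recordset(t1.members) as t1e(computer text, tag text)
     join table2 t2 on t2.id = t1.id
     -- explode table2's jsonb array the same way
     cross join lateral jsonb_to_recordset(t2.members) as t2e("IP address" text, name text)
where t2e."IP address" = t1e.computer;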
To get values out of a jsonb array of objects you have to explode them somehow.
Another way, with jsonb_array_elements:
with _m as (
select
jsonb_array_elements(members.data) as data
from members
),
_m2 as (
select
jsonb_array_elements(members2.data) as data
from members2
)
select
_m.data->>'computer' as computer,
_m.data->>'tag' as tag,
_m2.data->>'name' as name
from _m
left join _m2 on _m2.data->>'IP address' = _m.data->>'computer'
https://www.db-fiddle.com/f/68iC5TzLKbzkLZ8gFWYiLz/0
I am trying to understand the "new" MySQL JSON field.
I have this table:
id (int-11, not_null, auto_inc)
customer_id (int-11, not null)
labels (json)
With this data:
id: 1
customer_id: 1
labels: [{"isnew": "no", "tagname": "FOO", "category": "CAT_1", "isdeleted": "no"}, {"isnew": "yes", "tagname": "BAR", "category": "CAT_2", "isdeleted": "no"}]
JSON beautified:
[
{
"tagname": "FOO",
"category": "CAT_1",
"isnew": "no",
"isdeleted": "no"
},
{
"tagname": "BAR",
"category": "CAT_2",
"isnew": "yes",
"isdeleted": "no"
}
]
And now I want to SELECT all the customers (by customer_id) in the table that have a specific category and a specific tagname.
I tried this one:
SELECT * FROM labels_customers_json
WHERE JSON_SEARCH(labels, 'all', 'BAR') IS NOT NULL
But this is not what I want; this one searches every JSON attribute.
I have seen some examples of JSON_EXTRACT:
SELECT * FROM `e_store`.`products`
WHERE
`category_id` = 1
AND JSON_EXTRACT(`attributes` , '$.ports.usb') > 0
AND JSON_EXTRACT(`attributes` , '$.ports.hdmi') > 0;
SELECT c, c->"$.id", g, n
FROM jemp
WHERE JSON_EXTRACT(c, "$.id") > 1
ORDER BY c->"$.name";
So I tried this
SELECT * FROM labels_customers_json
WHERE JSON_EXTRACT(labels, '$.tagname') = 'BAR'
SELECT labels, JSON_EXTRACT(labels, "$.customer_id"), customer_id
FROM labels_customers_json
WHERE JSON_EXTRACT(labels, "$.customer_id") > 0
You could probably try using SELECT * FROM labels_customers_json WHERE JSON_SEARCH(labels, 'all', "BAR", NULL, "$[*].tagname") IS NOT NULL, although I cannot say if that is the best way to perform this query.
You can use JSON_SEARCH to search on a specific path too. So you can use the following query:
SELECT *
FROM labels_customers_json
WHERE JSON_SEARCH(labels, 'all', 'BAR', NULL, '$[*].tagname') IS NOT NULL
You can also use JSON_EXTRACT and JSON_CONTAINS together:
SELECT *
FROM labels_customers_json
WHERE JSON_CONTAINS(JSON_EXTRACT(labels, '$[*].tagname'), '["BAR"]') > 0;
You can also use only JSON_CONTAINS to check:
SELECT *
FROM labels_customers_json
WHERE JSON_CONTAINS(labels, '{"tagname":"BAR"}') > 0;
demos: https://www.db-fiddle.com/f/rufrThAQPfXHrK9YyibFSm/2
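Since the question also asks to match a specific category together with the tagname, a hedged variant of the last query requires both keys inside the same array element:

SELECT *
FROM labels_customers_json
-- matches rows whose labels array has an element with BOTH tagname = 'BAR' AND category = 'CAT_2'
WHERE JSON_CONTAINS(labels, '{"tagname": "BAR", "category": "CAT_2"}') > 0;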
How to get aggregate SUM(amount) from "refunds" array in postgres json select
Following is my data schema and structure:
Table Name: transactions
Column name: data
{
"id": "tran_6ac25129951962e99f28fa488993",
"amount": 1200,
"origin_amount": 3900,
"status": "partial_refunded",
"description": "Subscription#sub_a67d59efb2bcbf73485a ",
"livemode": false,
"refunds": [
{
"id": "refund_ee4192ffb6d2caa490a1",
"amount": 1200,
"status": "refunded",
"created_at": 1426412340,
"updated_at": 1426412340,
},
{
"id": "refund_0e4a34e4ee7281d369df",
"amount": 1500,
"status": "refunded",
"created_at": 1426412353,
"updated_at": 1426412353,
}
]
}
Output should be: 1200 + 1500 = 2700

| total |
|-------|
| 2700  |
Please provide a general solution, not one with static data.
This should work on 9.3+
WITH x AS( SELECT
'{
"id": "tran_6ac25129951962e99f28fa488993",
"amount": 1200,
"origin_amount": 3900,
"status": "partial_refunded",
"description": "Subscription#sub_a67d59efb2bcbf73485a ",
"livemode": false,
"refunds": [
{
"id": "refund_ee4192ffb6d2caa490a1",
"amount": 1200,
"status": "refunded",
"created_at": 1426412340,
"updated_at": 1426412340
},
{
"id": "refund_0e4a34e4ee7281d369df",
"amount": 1500,
"status": "refunded",
"created_at": 1426412353,
"updated_at": 1426412353
}
]
}'::json as y),
refunds AS(
SELECT json_array_elements(y->'refunds') as j FROM x)
SELECT sum((j->>'amount')::int) FROM refunds;
WITH AllRefunds AS ( SELECT jsonb_array_elements(data->'refunds') AS refund FROM transactions)
SELECT SUM( CAST ( refund ->> 'amount' AS INTEGER )) FROM AllRefunds;
If you need to know how the query is built:
1.
WITH AllRefunds AS ( SELECT jsonb_array_elements(data->'refunds') FROM transactions)
SELECT * FROM AllRefunds;
This selects all elements of the refunds array (accessed via ->) found in the transactions table as JSON objects and stores them in the CTE AllRefunds. This result consists of only one unnamed column.
2.
WITH AllRefunds AS ( SELECT jsonb_array_elements(data->'refunds') AS refund FROM transactions)
SELECT * FROM AllRefunds;
Here the added (second) AS renames the previously unnamed column inside AllRefunds to refund.
3.
WITH AllRefunds AS ( SELECT jsonb_array_elements(data->'refunds') AS refund FROM transactions)
SELECT SUM( CAST ( refund ->> 'amount' AS INTEGER )) FROM AllRefunds;
Our array entries are JSON objects, so we extract the amount field as text with ->>, cast it to INTEGER, and SUM up all entries.
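If transactions holds more than one row and you want one total per transaction rather than a single grand total, a sketch (assuming the column is jsonb as in the query above, and that data->>'id' identifies each transaction) could group like this:

SELECT t.data->>'id'                      AS transaction_id,
       SUM((r.value->>'amount')::integer) AS total
FROM transactions t
     -- one row per refund object inside each transaction's refunds array
     CROSS JOIN LATERAL jsonb_array_elements(t.data->'refunds') AS r(value)
GROUP BY t.data->>'id';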
Here is my JSON stored in a CLOB column:
select upJSON from myLocations;
{"values":[
{"nameValuePairs":{"upJSON":"{\"mResults\":[0.0,0.0],\"mProvider\":\"fused\",\"mDistance\":0.0,\"mAltitude\":0.0}","id":"1","updated":"2015-03-30 20:28:51"}},
{"nameValuePairs":{"upJSON":"{\"mResults\":[0.0,0.0],\"mProvider\":\"FINDME\",\"mDistance\":0.0,\"mAltitude\":22.2}","id":"2","updated":"2015-03-30 20:28:53"}},
{"nameValuePairs":{"upJSON":"{\"mResults\":[0.0,0.0],\"mProvider\":\"fused\",\"mDistance\":0.0,\"mAltitude\":0.0}","id":"3","updated":"2015-03-30 20:28:55"}},
{"nameValuePairs":{"upJSON":"{\"mResults\":[0.0,0.0],\"mProvider\":\"fused\",\"mDistance\":0.0,\"mAltitude\":0.0}","id":"4","updated":"2015-03-30 20:28:57"}}
]}
(I have inserted newlines for clarity)
Please: what is the SQL (or PL/SQL) needed to select just the value of mProvider, mAltitude, and the id from the 2nd "nameValuePairs" (= "FINDME", 22.2, and "2" in the example above)?
Since you're using 12c you have access to the native JSON parsing (as long as your CLOB column has an is json check constraint).
Some good background is available at:
https://docs.oracle.com/database/121/ADXDB/json.htm#ADXDB6371
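For reference, a minimal sketch of such a check constraint on the question's table (assuming the CLOB column upJSON on myLocations):

ALTER TABLE myLocations
  ADD CONSTRAINT upjson_is_json CHECK (upJSON IS JSON);
-- with this in place the column is recognized as JSON, so dot-notation and the JSON functions can treat it as such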
If your JSON looks something like this:
{
"values": [
{
"nameValuePairs": {
"upJSON": {
"mResults": [
"0.0",
"0.0"
],
"mProvider": "fused",
"mDistance": "0.0",
"mAltitude": "22.2"
},
"id": "1",
"updated": "2015-03-30 20:28:51"
}
},
...
...
However, when I put your snippet from the question into JSONLint, it returns:
{
"values": [
{
"nameValuePairs": {
"upJSON": "{\"mResults\":[0.0,0.0],\"mProvider\":\"fused\",\"mDistance\":0.0,\"mAltitude\":0.0}",
"id": "1",
"updated": "2015-03-30 20:28:51"
}
},
{
"nameValuePairs": {
"upJSON": "{\"mResults\":[0.0,0.0],\"mProvider\":\"FINDME\",\"mDistance\":0.0,\"mAltitude\":22.2}",
"id": "2",
"updated": "2015-03-30 20:28:53"
}
},
Something like the following might get you started:
select
  ml.upJSON.values
from
  myLocations ml
where
  json_value(ml.upJSON, '$.values[1].nameValuePairs.id' returning varchar2 error on error) = '2';
If you want to limit the query to a single ID, you'll need to add a full-text or function-based index to the JSON column.
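As a hedged illustration of the function-based option (the index name and the fixed array position are only an example; because the ids sit inside the values array, a single json_value expression can only cover one position):

CREATE INDEX myLocations_first_id_ix ON myLocations (
  -- indexes the id of the first nameValuePairs entry only
  json_value(upJSON, '$.values[0].nameValuePairs.id' RETURNING VARCHAR2(20))
);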
https://odieweblog.wordpress.com/2015/04/12/json_table-chaining/#comment-1025
with tmp as (
SELECT /*+ no_merge */ d.*
FROM ulocations ul,
json_table(ul.upjson, '$'
columns(
NESTED PATH '$.values[*].nameValuePairs'
COLUMNS (
updated VARCHAR2(19 CHAR) PATH '$.updated'
, id varchar2(9 char) path '$.id'
, upJSON VARCHAR2(2000 CHAR) PATH '$.upJSON'
)) ) d
--where d.id = '0'
)
select t.updated
, t.id
, jt2.*
from tmp t
, json_table(t.upJSON, '$'
columns mProvider varchar2(10) path '$.mProvider'
, mAltitude number path '$.mAltitude'
) jt2
;