Querying an element inside a collection on a JSON field - Postgres

I have the following JSON structure in my Postgres database. The table is named "customers" and the column that contains the JSON is named "data":
{
  "customerId": 1,
  "something": "...",
  "list": [{ "nestedId": 1, "attribute": "a" }, { "nestedId": 2, "attribute": "b" }]
}
I'm trying to query all customers that have an element inside the field "list" with nestedId = 1.
I accomplished that, poorly, through this query:
SELECT data FROM customers a, jsonb_array_elements(data->'list') e WHERE (e->>'nestedId')::int = 1
I say poorly because, since jsonb_array_elements is in the FROM clause, it is not used as a filter, resulting in a seq scan.
I tried something like:
SELECT data FROM customers where data->'list' @> '{"nestedId": 1, "attribute": "a"}'::jsonb
But it does not return anything. I imagine that is because the "list" field is seen as a whole array and not as its individual records.
Any ideas how to perform that query filtering nestedId on the WHERE condition?

Try this query:
SELECT data FROM customers where data->'list' @> '[{"nestedId": 1}]';
This query will work in Postgres 9.4+.
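
To avoid the sequential scan that motivated the question, a GIN index over the list array should let the containment test above use an index instead. This is only a sketch (it assumes data is a jsonb column; the index name is made up):
-- jsonb_path_ops indexes support only the @> operator, but are compact and fast for it.
CREATE INDEX customers_data_list_idx
ON customers USING gin ((data->'list') jsonb_path_ops);

SELECT data
FROM customers
WHERE data->'list' @> '[{"nestedId": 1}]';
Because the WHERE clause uses the same expression the index was built on (data->'list'), the planner can switch to an index scan once the table is large enough to make it worthwhile.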

Related

json_set returns desired result (in SQLITE Browser) but does not update the table

This is my table:
CREATE TABLE orders(
id integer primary key,
order_uid text unique,
created_at text,
updated_at text,
created_by text,
updated_by text,
client text,
phone text,
device text,
items json,
comments text
)
'items' is a list of dictionaries - valid json.
This is what 'items' looks like:
[
{
"am_pm": "AM",
"brand_": "EEE",
"quantity": 8,
"code": "1-936331-67-5",
"delivery_date": "2020-04-19",
"supplier": "XXX",
"part_uid": "645039eb-82f4-4eed-b5f9-115b09679c66",
"name": "WWWWWW",
"price": 657,
"status": "Not delivered"
},
{
"am_pm": "AM",
"brand_": "DDDDDDD",
...
},
...
]
This is what I'm running (in the 'Execute SQL' tab in sqlitebrowser v3.11.2, SQLite version 3.31.1). It looks like it returns the desired results, but they are not reflected in the actual table; it doesn't update it:
select json_set(value, "$.am_pm", "Tequilla") from orders, json_each(orders.items, '$')
where orders.id=2 and json_extract(value, '$.part_uid') = '35f81391-392b-4d5d-94b4-a5639bba8591'
I also ran
update orders
set items = (select json_set(orders.items, '$.am_pm', "Tequilla") from orders, json_each(orders.items, '$'))
where orders.id=2
The result was that it deleted the list of dicts and replaced it with a single dict, with the 'am_pm' field updated.
What is the correct sql statement, so I can update a single (or several) object/s in 'items'?
After much fiddling, and posting on the SQLite forums as well, the optimal solution seems to be:
update orders
set items = (select json_set(items, fullkey||'.brand_', 'Teq')
from orders, json_each(items, '$')
where json_extract(value, '$.supplier') = 'XXX' and orders.id = 1)
where orders.id = 1
This will only update a single item in the json array, even if multiple items meet the criteria.
It would be helpful if someone more experienced in SQLite could come up with a solution for updating multiple elements in a JSON array at once.
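
One possible way to touch every matching element at once (a sketch, not from the original thread; it assumes the same orders table and patches '$.brand_' wherever the supplier is 'XXX') is to rebuild the whole array with json_group_array():
update orders
set items = (
  select json_group_array(
           json(case
                  when json_extract(value, '$.supplier') = 'XXX'
                    then json_set(value, '$.brand_', 'Teq')  -- patch matching elements
                  else value                                 -- keep the rest unchanged
                end))
  from json_each(orders.items)
)
where orders.id = 1;
The json() wrapper makes sure each element is re-aggregated as JSON rather than as a quoted string, so the column stays an array of objects.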

Count JSON column in Snowflake

I have a table called HISTORY in Snowflake with a column called RECORD of the VARIANT datatype. This column contains JSON data, and I would like to add a new column to the HISTORY table that counts the JSON columns (values) for each row of the HISTORY table. Please help.
The JSON data starts like:
{"prizes":
[ {"year":"2018",
"category":"physics",
"laureates":[ {"id":"960","firstname":"Arthur","surname":"Ashkin"}
, { "id":"961","firstname":"G\u00e9rard","surname":"Mourou" }
]
},
...
]
}
First flatten the data to the lowest level you need (laureates), and then filter on the "year" element, which is one level above the laureates element. You can also filter on the lowest-level columns if you need to.
select
count(*)
from NobelPrizeJson
, lateral flatten(INPUT=>json:prizes) prizes
, lateral flatten(INPUT=>prizes.value:laureates) laureates
where prizes.value:year::int > 2010;
This is posted at:
https://community.snowflake.com/s/question/0D50Z00008xAQSY/i-have-a-query-that-counts-the-number-of-objects-inside-a-large-json-document-and-now-i-need-to-filter-on-only-objects-with-a-specific-keyvalue-pair-inside-those-objects-how-can-i-filter
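
The query above returns a single total across the whole document. If you need the count per HISTORY row instead (as the question asks), a grouped variant of the same double flatten should work. This is only a sketch: it assumes a hypothetical ID column identifying each row, so substitute your table's real key.
select h.id,
count(l.value) as laureate_count   -- number of laureate objects in this row's RECORD
from HISTORY h
, lateral flatten(INPUT=>h.record:prizes) p
, lateral flatten(INPUT=>p.value:laureates) l
group by h.id;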

Json data search from MySQL table

My MySQL table contains a JSON column, academics.
The table values look like this:
id academics
--------------------------------------------------------
100 ["CBSE-Afternoon-12-A","CBSE-Morning-12-B"]
200 ["CBSE-Afternoon-12-C","CBSE-Morning-12-D"]
300 ["CBSE-Afternoon-12-E","CBSE-Afternoon-12-F"]
I have to find the id from the above table based on the search key:
CBSE-Morning-12 & CBSE-Afternoon-12
I have tried the query below:
SELECT id
FROM ACADEMIC_TABLE
WHERE JSON_SEARCH(academics, 'all', 'CBSE-Morning-12%') IS NOT NULL
It returns ids 100 and 200 correctly.
But I need to search with two keywords as a LIKE-style condition on the JSON
([CBSE-Morning-12 & CBSE-Afternoon-12]) and return ids 100, 200, 300.
Please help me.
Looking at the sample data, you need rows whose value contains either the first or the second pattern. If so, then you must use OR:
SELECT id
FROM ACADEMIC_TABLE
WHERE JSON_SEARCH(academics, 'one', 'CBSE-Morning-12%') IS NOT NULL
OR JSON_SEARCH(academics, 'one', 'CBSE-Afternoon-12%') IS NOT NULL;
https://dbfiddle.uk/?rdbms=mysql_8.0&fiddle=bdc058111adf7d4200a1471c9873e94c
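
If instead you need only the ids where both patterns appear in the same row (with the sample data above that would be 100 and 200), the same approach with AND rather than OR should work; a sketch:
SELECT id
FROM ACADEMIC_TABLE
WHERE JSON_SEARCH(academics, 'one', 'CBSE-Morning-12%') IS NOT NULL
AND JSON_SEARCH(academics, 'one', 'CBSE-Afternoon-12%') IS NOT NULL;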

mySQL JSON : search array of objects where property value in list

I have a JSON column, manifest, containing an array of objects.
I need to return all table rows where any of the objects in their array have a slide_id that is present in a sub select.
The structure of the JSON field is:
{
  "matrix": [
    {
      "row": 1,
      "col": 1,
      "slide_id": 1
    },
    {
      "row": 1,
      "col": 2,
      "slide_id": 5
    }
  ]
}
So I want to run something like this....
SELECT id FROM presentation WHERE manifest->'$.matrix[*].slide_id' IN ( (SELECT id from slides WHERE date_deleted IS NOT NULL) );
But this doesn't work as manifest->'$.matrix[*].slide_id' returns a JSON array for each row.
I have managed to get this to work, but it's amazingly slow, as it scans the whole table:
SELECT
p.id
FROM
(
SELECT id,
manifest->'$.matrix[*].slide_id' as slide_ids
FROM `presentation`
) p
INNER JOIN `pp_slides` s
ON JSON_CONTAINS(p.slide_ids, CAST(s.id as json), '$')
WHERE s.date_deleted IS NOT NULL
If I filter it down to an individual presentation ID, then it's not too bad, but it still takes 700 ms for a presentation with a couple of hundred slides in it. Is there a cleaner way to do this?
I suppose the best way would be to refactor it to store the matrix as a relational table....
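
Short of that refactor, a somewhat cleaner variant of the join (a sketch, not from the original post, assuming MySQL 8.0.17+ where MEMBER OF is available, and keeping the pp_slides name from the working query) drops the derived table and the CAST ... AS JSON:
SELECT DISTINCT p.id        -- DISTINCT: a presentation can match several deleted slides
FROM presentation p
JOIN pp_slides s
ON s.id MEMBER OF (p.manifest->'$.matrix[*].slide_id')
WHERE s.date_deleted IS NOT NULL;
This still examines every presentation row; whether a multi-valued index (CAST(... AS UNSIGNED ARRAY)) can be built over a wildcard path such as '$.matrix[*].slide_id' is worth checking in the MySQL documentation before relying on it for the speed-up.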

How to create an empty JSON object in postgresql?

Datamodel
A person is represented in the database as a meta table row with a name and with multiple attributes, which are stored in the data table as key-value pairs (key and value are in separate columns).
Simplified data-model
Now there is a query to retrieve all users (name) with all their attributes (data). The attributes are returned as a JSON object in a separate column. Here is an example:
name    | data
--------+------------------------------------
Florian | { "age":25 }
Markus  | { "age":25, "color":"blue" }
Thomas  | {}
The SQL command looks like this:
SELECT
name,
json_object_agg(d.key, d.value) AS data
FROM meta AS m
JOIN (
SELECT d.fk_id, d.key, d.value AS value FROM data AS d
) AS d
ON d.fk_id = m.id
GROUP BY m.name;
Problem
Now the problem I am facing is that users like Thomas, who do not have any attributes stored in the key-value table, are not returned by my select statement. This is because it uses only a JOIN and no LEFT OUTER JOIN.
If I use a LEFT OUTER JOIN, then I run into the problem that json_object_agg tries to aggregate NULL values and dies with an error.
Approaches
1. Return empty list of keys and values
So I tried to check if the key-column of a user is NULL and return an empty array so json_object_agg would just create an empty JSON object.
But there is not really a function to create an empty array in SQL. The nearest thing I found was this:
select '{}'::text[];
In combination with COALESCE the query looks like this:
json_object_agg(COALESCE(d.key, '{}'::text[]), COALESCE(d.value, '{}'::text[])) AS data
But if I try to use this I get the following error:
ERROR: COALESCE types text and text[] cannot be matched
LINE 10: json_object_agg(COALESCE(d.key, '{}'::text[]), COALES...
^
Query failed
PostgreSQL said: COALESCE types text and text[] cannot be matched
So it looks like that at runtime d.key is a single value and not an array.
2. Split up JSON creation and return empty list
So I tried to take json_object_agg and replace it with json_object which does not aggregate the keys for me:
json_object(COALESCE(array_agg(d.key), '{}'::text[]), COALESCE(array_agg(d.value), '{}'::text[])) AS data
But there I get the error "null value not allowed for object key". So the COALESCE does not take effect: with a LEFT OUTER JOIN the aggregated array is not NULL, it just contains a NULL element.
Question
So, is there a function to check if a joined column is empty, and if yes return just a simple JSON object?
Or is there any other solution which would solve my problem?
Use a left join with coalesce(), with '{}'::json as the default value.
select name, coalesce(d.data, '{}'::json) as data
from meta m
left join (
select fk_id, json_object_agg(d.key, d.value) as data
from data d
group by 1
) d
on m.id = d.fk_id;
name | data
---------+------------------------------------
Florian | { "age" : "25" }
Marcus | { "age" : "25", "color" : "blue" }
Thomas | {}
(3 rows)
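
An alternative sketch, not part of the original answer, keeps the LEFT JOIN at the top level and uses the aggregate FILTER clause (PostgreSQL 9.4+) to skip the NULL keys produced for users without attributes:
select m.name,
coalesce(
json_object_agg(d.key, d.value) filter (where d.key is not null),
'{}'::json
) as data
from meta m
left join data d on d.fk_id = m.id
group by m.name;
When every joined key for a user is NULL, the filtered aggregate returns NULL and the coalesce falls back to '{}'::json, which matches the output shown above.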