I have raw data in JSON as follows:
{
"id": 1,
"tags": [{
"category": "location",
"values": ["website", "browser"]
},{
"category": "campaign",
"values": ["christmas_email"]
}]
},
{
"id": 2,
"tags": [{
"category": "location",
"values": ["website", "browser", "chrome"]
}]
},
{
"id": 3,
"tags": [{
"category": "location",
"values": ["website", "web_view"]
}]
}
The tag categories and their values are generated dynamically and are not known beforehand. I need to load this data into an RDBMS table and query it later. The queries may look like this:
Extract all rows where location has the values "website" and "browser". This query should return the rows with id 1 and 2.
I need some help modelling this as a table schema that supports such queries. I was thinking of tables like:
Table 1: MAIN
Columns: ID, TAG_LIST_ID
Row1: 1 TL1
Row2: 2 TL2
Row3: 3 TL3
Table 2: TAGS
Columns: TAG_ID, TAG_CATEGORY, TAG_VALUE
Row1: TID1 location website
Row2: TID2 location browser
Row3: TID3 location chrome
Row4: TID4 location web_view
Row5: TID5 campaign christmas_email
Table 3: TAG_MAPPING
Columns: TAG_MAPPING_ID, TAG_LIST_ID, TAG_ID
Row1: TMID1 TL1 TID1
Row2: TMID2 TL1 TID2
Row3: TMID3 TL1 TID5
Row4: TMID4 TL2 TID1
Row5: TMID5 TL2 TID2
Row6: TMID6 TL2 TID3
Row7: TMID7 TL3 TID1
Row8: TMID8 TL3 TID4
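For reference, a minimal DDL sketch of that layout (assuming MySQL, with plain integer surrogate keys standing in for the symbolic TL*/TID* identifiers above; table and column names are the ones from the listing):

CREATE TABLE MAIN (
    ID          INT PRIMARY KEY,
    TAG_LIST_ID INT NOT NULL
);

CREATE TABLE TAGS (
    TAG_ID       INT PRIMARY KEY,
    TAG_CATEGORY VARCHAR(100) NOT NULL,
    TAG_VALUE    VARCHAR(100) NOT NULL
);

CREATE TABLE TAG_MAPPING (
    TAG_MAPPING_ID INT PRIMARY KEY,
    TAG_LIST_ID    INT NOT NULL,
    TAG_ID         INT NOT NULL,
    FOREIGN KEY (TAG_ID) REFERENCES TAGS (TAG_ID)
);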
Now to query all rows where location has values "website" and "browser", I could write
SELECT * from MAIN m, TAGS t, TAG_MAPPING tm
WHERE m.TAG_LIST_ID=tm.TAG_LIST_ID AND
tm.TAG_ID = t.TAG_ID AND
t.TAG_CATEGORY = "location" AND
(t.TAG_VALUE="website" OR t.TAG_VALUE="browser")
However, this returns all three rows; changing the OR to AND returns no rows. What is the right way to design the schema?
Any pointers appreciated.
Just replace the OR with IN and add a counter:
SELECT tm.TAG_LIST_ID, count(1) AS cnt
FROM MAIN m, TAGS t, TAG_MAPPING tm
WHERE tm.TAG_LIST_ID = m.TAG_LIST_ID
  AND tm.TAG_ID = t.TAG_ID
  AND t.TAG_CATEGORY = 'location'
  AND t.TAG_VALUE IN ('website', 'browser')
GROUP BY tm.TAG_LIST_ID
HAVING count(1) > 1 -- greater than 1 because you are looking for 2 words; this value changes with the number of words
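To get the full MAIN rows back (ids 1 and 2 in the example) rather than just the TAG_LIST_IDs, a sketch (not part of the original answer) that wraps the grouped query in a derived table:

SELECT m.*
FROM MAIN m
JOIN (
    SELECT tm.TAG_LIST_ID
    FROM TAG_MAPPING tm
    JOIN TAGS t ON t.TAG_ID = tm.TAG_ID
    WHERE t.TAG_CATEGORY = 'location'
      AND t.TAG_VALUE IN ('website', 'browser')
    GROUP BY tm.TAG_LIST_ID
    HAVING count(DISTINCT t.TAG_VALUE) = 2 -- one match per searched value
) hits ON hits.TAG_LIST_ID = m.TAG_LIST_ID;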
I'm new to JSONB and I am wondering if the following would be possible with a single query:
I have a lot of tables that look like this:
ID (INT) | members (JSONB)
All the tables have only one row.
Example for 2 tables:
table1:
id: 1
data:
[
{
"computer": "12.12.12.12",
"tag": "dog"
},
{
"computer": "1.1.1.1",
"tag": "cat"
},
{
"computer": "2.2.2.2",
"tag": "cow"
}
]
table2:
id: 1
data:
[
{
"IP address": "12.12.12.12",
"name": "Beni",
"address": "Rome"
},
{
"IP address": "1.1.1.1",
"name": "Jone",
"address": "Madrid"
}
]
The result should be rows like this:
computer    | tag | name
12.12.12.12 | dog | Beni
1.1.1.1     | cat | Jone
Thanks !
Convert the JSON arrays into sets of records using the jsonb_to_recordset function and then join them (as if they were relational tables).
with table1 (id,members) as (
values (1,'[{"computer": "12.12.12.12","tag": "dog"},{"computer": "1.1.1.1","tag": "cat"},{"computer": "2.2.2.2","tag": "cow"}]'::jsonb)
), table2 (id,members) as (
values (1,'[{"IP address": "12.12.12.12","name": "Beni", "address": "Rome"},{"IP address": "1.1.1.1","name": "Jone", "address": "Madrid"}]'::jsonb)
)
select t1.computer, t1.tag, t2.name
from jsonb_to_recordset((select members from table1 where id=1)) as t1(computer text,tag text)
join jsonb_to_recordset((select members from table2 where id=1)) as t2("IP address" text,name text)
on t1.computer = t2."IP address"
db fiddle
To get values out of a jsonb array of objects you have to explode them somehow.
Another way, with jsonb_array_elements:
with _m as (
select
jsonb_array_elements(members.data) as data
from members
),
_m2 as (
select
jsonb_array_elements(members2.data) as data
from members2
)
select
_m.data->>'computer' as computer,
_m.data->>'tag' as tag,
_m2.data->>'name' as name
from _m
left join _m2 on _m2.data->>'IP address' = _m.data->>'computer'
https://www.db-fiddle.com/f/68iC5TzLKbzkLZ8gFWYiLz/0
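For completeness, a minimal setup under which the query above runs (the members/members2 table names and the data jsonb column are assumptions taken from the query itself, not stated in the question):

CREATE TABLE members  (id int, data jsonb);
CREATE TABLE members2 (id int, data jsonb);

INSERT INTO members VALUES
  (1, '[{"computer": "12.12.12.12", "tag": "dog"},
        {"computer": "1.1.1.1", "tag": "cat"},
        {"computer": "2.2.2.2", "tag": "cow"}]');

INSERT INTO members2 VALUES
  (1, '[{"IP address": "12.12.12.12", "name": "Beni", "address": "Rome"},
        {"IP address": "1.1.1.1", "name": "Jone", "address": "Madrid"}]');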
I have a list of objects. Each object contains several properties. Now I want to write a SELECT statement that gives me a list of the values of a single property. The simplified list looks like this:
[
[
{
"day": "2021-10-01",
"entries": [
{
"name": "Start of competition",
"startTimeDelta": "08:30:00"
}
]
},
{
"day": "2021-10-02",
"entries": [
{
"name": "Start of competition",
"startTimeDelta": "03:30:00"
}
]
},
{
"day": "2021-10-03",
"entries": [
{
"name": "Start of competition"
}
]
}
]
]
The working SELECT is currently:
SELECT
JSON_EXTRACT(column, '$.days[*].entries[0].startTimeDelta') AS list
FROM table
The returned result is
[
"08:30:00",
"03:30:00"
]
But what I want to get (and also expected) is:
[
"08:30:00",
"03:30:00",
null
]
What can I do or how can I change the SELECT statement so that I also get NULL values in the list?
SELECT startTimeDelta
FROM test
CROSS JOIN JSON_TABLE(val,
'$[*][*].entries[*]' COLUMNS (startTimeDelta TIME PATH '$.startTimeDelta')) jsontable
https://dbfiddle.uk/?rdbms=mysql_8.0&fiddle=491f0f978d200a8a8522e3200509460e
Do you also have a working idea for MySQL < 8? – Lars
What is max amount of objects in the array on the 2nd level? – Akina
Well it's usually less than 10 – Lars
SELECT JSON_EXTRACT(val, CONCAT('$[0][', num, '].entries[0].startTimeDelta')) startTimeDelta
FROM test
-- up to 4 - increase if needed
CROSS JOIN (SELECT 0 num UNION SELECT 1 UNION SELECT 2 UNION SELECT 3) nums
WHERE JSON_EXTRACT(val, CONCAT('$[0][', num, '].entries[0]')) IS NOT NULL;
https://www.db-fiddle.com/f/xnCCSTGQXevcpfPH1GAbUo/0
I have a hierarchical table in a Postgres database, e.g. category. The structure is simple, like this:
id | parent_id | name
1  | null      | A
2  | null      | B
3  | 1         | A1
4  | 3         | A1a
5  | 3         | A1b
6  | 2         | B1
7  | 2         | B2
What I need to get from this table is a recursive, deep tree structure like this:
[
{
"id": 1,
"name": "A",
"children": [
{
"id": 3,
"name": "A1",
"children": [
{
"id": 4,
"name": "A1a",
"children": []
},
{
"id": 5,
"name": "A1b",
"children": []
}
]
}
]
},
{
"id": 2,
"name": "B",
"children": [
{
"id": 6,
"name": "B1",
"children": []
},
{
"id": 7,
"name": "B2",
"children": []
}
]
}
]
Is this possible with unknown depth, using a combination of WITH RECURSIVE and json_build_array() or some other solution?
I found an answer to this question in this excellent blog post here, as I was wondering how to generalise over this problem in jOOQ. It would be useful if jOOQ could materialise arbitrary recursive object trees in a generic way: https://github.com/jOOQ/jOOQ/issues/12341
In the meantime, use this SQL statement, which was inspired by the above blog post, with a few modifications. Translate to jOOQ if you must, though you might as well store this as a view:
WITH RECURSIVE
d1 (id, parent_id, name) as (
values
(1, null, 'A'),
(2, null, 'B'),
(3, 1, 'A1'),
(4, 3, 'A1a'),
(5, 3, 'A1b'),
(6, 2, 'B1'),
(7, 2, 'B2')
),
d2 AS (
SELECT d1.*, 0 AS level
FROM d1
WHERE parent_id IS NULL
UNION ALL
SELECT d1.*, d2.level + 1
FROM d1
JOIN d2 ON d2.id = d1.parent_id
),
d3 AS (
SELECT d2.*, jsonb_build_array() children
FROM d2
WHERE level = (SELECT max(level) FROM d2)
UNION (
SELECT (branch_parent).*, jsonb_agg(branch_child)
FROM (
SELECT
branch_parent,
to_jsonb(branch_child) - 'level' - 'parent_id' AS branch_child
FROM d2 branch_parent
JOIN d3 branch_child ON branch_child.parent_id = branch_parent.id
) branch
GROUP BY branch.branch_parent
UNION
SELECT d2.*, jsonb_build_array()
FROM d2
WHERE d2.id NOT IN (
SELECT parent_id FROM d2 WHERE parent_id IS NOT NULL
)
)
)
SELECT jsonb_pretty(jsonb_agg(to_jsonb(d3) - 'level' - 'parent_id')) AS tree
FROM d3
WHERE level = 0;
dbfiddle. Again, read the linked blog post for an explanation of how this works.
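If you do store it as a view against the real table, the inline d1 values simply drop out. A sketch (category_tree is an assumed view name, and category is assumed to be the real table with the id/parent_id/name columns from the question):

CREATE VIEW category_tree AS
WITH RECURSIVE
d2 AS (
  SELECT c.*, 0 AS level
  FROM category c
  WHERE parent_id IS NULL
  UNION ALL
  SELECT c.*, d2.level + 1
  FROM category c
  JOIN d2 ON d2.id = c.parent_id
),
d3 AS (
  SELECT d2.*, jsonb_build_array() AS children
  FROM d2
  WHERE level = (SELECT max(level) FROM d2)
  UNION (
    SELECT (branch_parent).*, jsonb_agg(branch_child)
    FROM (
      SELECT
        branch_parent,
        to_jsonb(branch_child) - 'level' - 'parent_id' AS branch_child
      FROM d2 branch_parent
      JOIN d3 branch_child ON branch_child.parent_id = branch_parent.id
    ) branch
    GROUP BY branch.branch_parent
    UNION
    SELECT d2.*, jsonb_build_array()
    FROM d2
    WHERE d2.id NOT IN (
      SELECT parent_id FROM d2 WHERE parent_id IS NOT NULL
    )
  )
)
SELECT jsonb_pretty(jsonb_agg(to_jsonb(d3) - 'level' - 'parent_id')) AS tree
FROM d3
WHERE level = 0;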
I have an object saved in a table column using JSON.stringify, and it looks like this:
[{
"id": 1,
"uID": 10
}, {
"id": 2,
"uID": 10
}, {
"id": 3,
"uID": 94
}]
I need a query that checks whether a given column contains certain values, e.g.:
uID = 10 and id = 2 should return the row
uID = 10 and id = 5 should not return it
uID = 10 and id = 2, uID = 94 and id = 0 should not return it
(because uID = 94 with id = 0 is not there)
Unless you are querying programmatically where you can parse the JSON and then do the logic, I would recommend something like this:
SELECT * FROM Table WHERE Column LIKE '%"id": 1,"uID": 10%'
The LIKE keyword allows us to use wildcards (%) but still do an exact text match for what we define.
Add a wildcard in the middle, but note that order matters with this strategy.
SELECT * FROM Table WHERE Column LIKE '%"id": 1%"id": 2%'
I need it to work backwards too :] E.g. I have:
[{
"id": 1,
"uID": 10
}, {
"id": 2,
"uID": 55
}, {
"id": 3,
"uID": 94
}]
SELECT * FROM Table WHERE Column LIKE '%"uID": 55%"uID": 94%' <- will be working
SELECT * FROM Table WHERE Column LIKE '%"uID": 94%"uID": 55%' <- will be not working
Does not work "back"
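A hedged workaround for the ordering problem (not part of the answer above; Table and Column are the same placeholder names used there): combine one LIKE per fragment with AND instead of putting both fragments into a single pattern, so each fragment only has to appear somewhere, in any order:

SELECT * FROM Table
WHERE Column LIKE '%"uID": 55%'
  AND Column LIKE '%"uID": 94%'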
I have data like this:
[
{"name": "pratha", "email": "p#g.com", "sub": { "id": 1 } },
{"name": "john", "email": "x#x.com", "sub": { "id": 5 } },
{"name": "pratha", "email": "c#d.com", "sub": { "id": 2 } }
]
This is my query to get unique and latest emails:
SELECT DISTINCT ON (jae.e->>'name')
jae.e->>'name' as name,
jae.e->>'email' as email
FROM survey_results sr
CROSS JOIN LATERAL jsonb_array_elements(sr.data_field) jae (e)
ORDER BY jae.e->>'name', jae.e->'sub'->>'id' desc
The problem is that when I add count(*) to the select, all counts are equal.
I want to get unique results with DISTINCT and count their occurrences. So in this case, pratha should be 2 and john should be 1, with their data (not just the counts).
How can I achieve this with PostgreSQL?
See here: https://dbfiddle.uk/?rdbms=postgres_11&fiddle=f5c640958c3e4d594287632d0f4a835f
Do you need this?
SELECT DISTINCT ON (jj->>'name')
       jj->>'name',
       jj->>'email',
       count(*) OVER (PARTITION BY jj->>'name')
FROM survey_results
JOIN LATERAL jsonb_array_elements(data_field) j(jj) ON true
ORDER BY jj->>'name', jj->'sub'->>'id' DESC
https://dbfiddle.uk/?rdbms=postgres_11&fiddle=5f07b7bcb0001ebe32aa2f1338d9d0f0