Expand JSON with unknown keys to rows with MySQL JSON_TABLE

I have a MySQL 8.0.22 JSON column containing objects with keys that aren't known in advance:
'{"x": 1, "y": 2, "z": 3}'
'{"e": 4, "k": 5}'
I want to use JSON_TABLE to expand these values into multiple rows containing key/value pairs:
| key | value |
| x | 1 |
| y | 2 |
| z | 3 |
| e | 4 |
| k | 5 |
The difficulty of course is that the keys aren't known a priori. The best thing I've come up with is...
SET @json_doc = '{"x": 1, "y": 2, "z": 3}';

SELECT a.seq, b.k, a.v
FROM
  JSON_TABLE(
    @json_doc,
    "$.*"
    COLUMNS(
      seq FOR ORDINALITY,
      v INT PATH "$"
    )
  ) AS a,
  JSON_TABLE(
    JSON_KEYS(@json_doc),
    "$[*]"
    COLUMNS(
      seq FOR ORDINALITY,
      k CHAR(1) PATH "$"
    )
  ) AS b
WHERE a.seq = b.seq;
This feels strange because it uses two JSON_TABLE calls, does a cross join on the values and keys, then keeps the ones that align. I'd like to find a simpler query like this...
SELECT a.seq, a.k, a.v
FROM
  JSON_TABLE(
    @json_doc,
    "$.*"
    COLUMNS(
      seq FOR ORDINALITY,
      k CHAR(1) PATH "?",  -- <-- what do I put here to find each key?
      v INT PATH "$"
    )
  ) AS a;
I know this problem can probably be solved with CTEs or a numbers table and JSON_EXTRACT, but I'd like to find something performant and readable if possible.
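For reference, here is a minimal sketch of the CTE/JSON_EXTRACT route mentioned above, using a recursive CTE in place of a numbers table. It is only a baseline for comparison, not a recommendation:
-- Minimal sketch: enumerate key positions with a recursive CTE, then pull
-- each key from JSON_KEYS() and its value from the document.
SET @json_doc = '{"x": 1, "y": 2, "z": 3}';

WITH RECURSIVE nums AS (
  SELECT 0 AS i
  UNION ALL
  SELECT i + 1 FROM nums WHERE i + 1 < JSON_LENGTH(@json_doc)
)
SELECT
  JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(@json_doc), CONCAT('$[', i, ']'))) AS `key`,
  JSON_EXTRACT(
    @json_doc,
    CONCAT('$."',
           JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(@json_doc), CONCAT('$[', i, ']'))),
           '"')
  ) AS `value`
FROM nums;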

You can enumerate the rows by using the ROW_NUMBER() window function while determining the keys through JSON_KEYS(), and then extract the respective key from the resulting array by using JSON_EXTRACT(), such as:
WITH k AS
(
  SELECT *,
         ROW_NUMBER() OVER (PARTITION BY `jsdata` ORDER BY value DESC) AS rn,
         JSON_KEYS(`jsdata`) AS jk
  FROM `tab` AS t
  JOIN JSON_TABLE(`jsdata`, '$.*' COLUMNS (value INT PATH '$')) j
)
SELECT JSON_UNQUOTE(JSON_EXTRACT(jk, CONCAT('$[', rn - 1, ']'))) AS "key",
       value
FROM k
or use the following, more straightforward query:
SELECT JSON_UNQUOTE(
         JSON_EXTRACT(JSON_KEYS(`jsdata`),
                      CONCAT('$[',
                             ROW_NUMBER() OVER (PARTITION BY `jsdata` ORDER BY value DESC) - 1,
                             ']'))
       ) AS "key", value
FROM `tab` AS t
JOIN JSON_TABLE(`jsdata`, '$.*' COLUMNS (value INT PATH '$')) j
Demo

Try doing JSON_EXTRACT directly after you get the JSON_KEYS as rows:
WITH j AS (
  SELECT CAST('{"a": 1, "b": "-1", "c": null}' AS JSON) o UNION ALL
  SELECT CAST('{"x": 2, "y": "-2", "z": null}' AS JSON)
)
SELECT k, JSON_EXTRACT(j.o, CONCAT('$."', jt.k, '"')) v
FROM j
   , JSON_TABLE(JSON_KEYS(o), '$[*]' COLUMNS (k VARCHAR(200) PATH '$')) jt;
The answer by Barbaros can solve your problem with the demo data you provided, but it may not give you what you want if your JSON objects have the same value under different keys.
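For completeness, the same pattern applied directly to the single document from the question might look like this (a direct adaptation of the query above; the VARCHAR(64) key width is an arbitrary assumption):
SET @json_doc = '{"x": 1, "y": 2, "z": 3}';

-- one row per key from JSON_KEYS(), value looked up by that key
SELECT jt.k,
       JSON_EXTRACT(@json_doc, CONCAT('$."', jt.k, '"')) AS v
FROM JSON_TABLE(JSON_KEYS(@json_doc), '$[*]'
       COLUMNS (k VARCHAR(64) PATH '$')) jt;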


MariaDB/MySQL - Convert keys and values from json object into rows, using JSON_TABLE

Using MariaDB 10.6. In the following example, I try to convert the entries of the JSON object into table rows:
SELECT *
FROM JSON_TABLE('{
    "1": [1, 123.25],
    "10": [2, 110.5],
    "100": [3, 105.75]
  }', '$.*' COLUMNS (
    col1 decimal(13,2) PATH '$',
    col2 int PATH '$[0]',
    col3 decimal(17,2) PATH '$[1]'
  )) table1
The result is:
| col1 | col2 | col3 |
| NULL | 1 | 123.25 |
| NULL | 2 | 110.50 |
| NULL | 3 | 105.75 |
Is there any way to fill "col1" with the property keys ("1", "10", "100")?
I guess there is some keyword to reference the key, but I can't find any information on this in the docs for MariaDB or MySQL.
I already made a routine that creates a temporary table by looping over the output of JSON_KEYS, but it would be more elegant if I could use JSON_TABLE for this job.
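For context, a routine of the kind described above could look roughly like the sketch below (hypothetical names such as expand_json and tmp_rows; the actual routine is not shown in the question):
DELIMITER //
CREATE PROCEDURE expand_json(IN p_json LONGTEXT)
BEGIN
  DECLARE i INT DEFAULT 0;
  DECLARE n INT;
  DECLARE k VARCHAR(20);

  SET n = JSON_LENGTH(JSON_KEYS(p_json));

  DROP TEMPORARY TABLE IF EXISTS tmp_rows;
  CREATE TEMPORARY TABLE tmp_rows (
    col1 varchar(20),
    col2 int,
    col3 decimal(17,2)
  );

  -- one INSERT per key returned by JSON_KEYS()
  WHILE i < n DO
    SET k = JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(p_json), CONCAT('$[', i, ']')));
    INSERT INTO tmp_rows
    VALUES (k,
            JSON_EXTRACT(p_json, CONCAT('$."', k, '"[0]')),
            JSON_EXTRACT(p_json, CONCAT('$."', k, '"[1]')));
    SET i = i + 1;
  END WHILE;

  SELECT * FROM tmp_rows;
END //
DELIMITER ;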
This is another way to do it using CROSS JOIN, JSON_TABLE & JSON_KEYS:
JSON_KEYS(json) gives us ["1", "10", "100"]
CROSS JOIN is used to generate multiple rows from ["1", "10", "100"]
WITH data AS
(
  SELECT '{
    "1": [1, 123.25],
    "10": [2, 110.5],
    "100": [3, 105.75]
  }' AS json
)
SELECT k.`key`, c.col2, c.col3
FROM data
CROSS JOIN JSON_TABLE(
  JSON_KEYS(json),
  '$[*]' COLUMNS(
    rowid FOR ORDINALITY,
    `key` TEXT PATH '$'
  )
) k
INNER JOIN
  (SELECT cols.*
   FROM data,
        JSON_TABLE(
          json,
          '$.*' COLUMNS(
            rowid FOR ORDINALITY,
            col2 int PATH '$[0]',
            col3 decimal(17, 2) PATH '$[1]'
          )
        ) AS cols) AS c
  ON c.rowid = k.rowid;
demo here
Here's one way to do it without routines:
extract your JSON values using JSON_TABLE, alongside a row number using FOR ORDINALITY
extract your keys using JSON_KEYS
for each record, extract the i-th key corresponding to the i-th value, as given by the row number, using JSON_EXTRACT
SELECT JSON_EXTRACT(JSON_KEYS(@json),
                    CONCAT('$[', table1.rowid - 1, ']')) AS col1,
       table1.col2,
       table1.col3
FROM JSON_TABLE(@json, '$.*' COLUMNS (
       rowid FOR ORDINALITY,
       col2 int PATH '$[0]',
       col3 decimal(17,2) PATH '$[1]'
     )) table1
Output:
| col1 | col2 | col3 |
| "1" | 1 | 123.25 |
| "10" | 2 | 110.50 |
| "100" | 3 | 105.75 |
Check the demo here.
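If the surrounding double quotes in col1 are unwanted, wrapping the key lookup in JSON_UNQUOTE strips them; this is only a minor variation on the query above:
SELECT JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(@json),
                                 CONCAT('$[', table1.rowid - 1, ']'))) AS col1,
       table1.col2,
       table1.col3
FROM JSON_TABLE(@json, '$.*' COLUMNS (
       rowid FOR ORDINALITY,
       col2 int PATH '$[0]',
       col3 decimal(17,2) PATH '$[1]'
     )) table1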
Answering my own question:
Unfortunately, there apparently isn't any native option to reference the key names in a JSON object with JSON_TABLE (yet), and the two workarounds currently posted are great.
I ended up using a mixture from both:
SET @json = '{ "1": [1, 123.25], "10": [2, 110.5], "100": [3, 105.75] }';

SELECT
  col1,
  JSON_EXTRACT(@json, CONCAT('$."', col1, '"[0]')) col2,
  JSON_EXTRACT(@json, CONCAT('$."', col1, '"[1]')) col3
FROM JSON_TABLE(JSON_KEYS(@json), '$[*]' COLUMNS (col1 varchar(20) PATH '$')) t1;

Using JSON_TABLE to convert List into Rows

I have a database column (named "product_parents") formatted as a JSON object that contains the following data:
'["A", "B", "G", "H", "C", "E", "P", "R"]'
I want to use JSON_Table to create separate rows for each item.
Ideally I would get something like:
|product_parent|
|A|
|B|
|C|
|...|
|P|
|R|
I've tried
SELECT *
FROM pmctb.products AS p,
     JSON_TABLE (p.product_parents, '$[*]'
       COLUMNS (
         pp_id FOR ORDINALITY,
         pp_pn VARCHAR(255) PATH '$.header')
     ) AS pp
WHERE product_uid = "310-000574"
($.header was just an attempt, since there is no column header.) But that just returns the table and the ordinality, and gives me NULLs for pp_pn.
Any help would be appreciated. Thx
It looks like this does the trick with MySQL 8+:
create table products (product_parents json);
insert into products values ('["A", "B", "G", "H", "C", "E", "P", "R"]');

select pp
from products,
     JSON_TABLE(
       products.product_parents,
       '$[*]' columns (pp varchar(255) path '$')
     ) t;
and the result is:
pp
A
B
G
H
C
E
P
R
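Adapted back to the table and filter from the question (same names as in the question, so treat this as an untested sketch), the fix is simply to use '$' as the column path:
SELECT pp.pp_id, pp.pp_pn
FROM pmctb.products AS p,
     JSON_TABLE(p.product_parents, '$[*]'
       COLUMNS (
         pp_id FOR ORDINALITY,
         pp_pn VARCHAR(255) PATH '$'   -- '$' refers to the array element itself
       )
     ) AS pp
WHERE p.product_uid = "310-000574";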

Nested SELECT statements and reading in nested JSON file in SQL Server

The discussed problem has been partly solved here:
Read in nested JSON file in SQL Server
but now the JSON file was extended with more objects of different format.
DECLARE @json nvarchar(max)
SELECT @json =
N'{
  "Model": {
    "Energy-X/A": {
      "x": 1,
      "y": 2,
      "z": 3
    },
    "Energy-X/B": {
      "x": 4,
      "y": 5,
      "z": 6
    }
  },
  "Energy":
  {
    "Energy-X/A": [
      [
        100.123456, null
      ],
      [
        101.123456, null
      ]
    ],
    "Energy-X/B": [
      [
        102.123456, null
      ],
      [
        103.123456, null
      ]
    ]
  }
}'
select * from openjson(@json, '$.Model')
with (x [int] '$."Energy-X/A".x',
      y [int] '$."Energy-X/A".y',
      z [int] '$."Energy-X/A".z',
      x [int] '$."Energy-X/B".x',
      y [int] '$."Energy-X/B".y',
      z [int] '$."Energy-X/B".z'
);
select commaDelimited.* from openjson (@json)
with (energyXA nvarchar(max) '$.Energy."Energy-X/A"' as json,
      energyXB nvarchar(max) '$.Energy."Energy-X/B"' as json
) as energy
cross apply (
  select
    (select string_agg(isnull(value, 'null'), ',') from openjson(energyXA, '$[0]')),
    (select string_agg(isnull(value, 'null'), ',') from openjson(energyXB, '$[0]'))
  union all
  select
    (select string_agg(isnull(value, 'null'), ',') from openjson(energyXA, '$[1]')),
    (select string_agg(isnull(value, 'null'), ',') from openjson(energyXB, '$[1]'))
) commaDelimited ([Energy-X/A], [Energy-X/B]);
The solution works and the values can be extracted, but now I want to combine both SELECT statements into one query and construct a correlated subquery. The columns should appear where "Energy-X/A" and "Energy-X/B" match, like:
| Energy-X/A | Energy-X/A | x | y | z |
| 100.123456, null | 101.123456, null | 1 | 2 | 3 |

| Energy-X/B | Energy-X/B | x | y | z |
| 102.123456, null | 103.123456, null | 4 | 5 | 6 |
or, even better, the output would combine the values of Energy-X/A and Energy-X/B into one separate column each (using a delimiter such as a semicolon):
| Energy-X/A | x | y | z |
| 100.123456, null ; 101.123456, null | 1 | 2 | 3 |

| Energy-X/B | x | y | z |
| 102.123456, null ; 103.123456, null | 1 | 2 | 3 |
I am grateful for any help!
Since you changed your expected results significantly, I've completely re-written your query.
Start by unpivoting the A and B values into separate rows using a (values) table and json_query.
Then read those columns using openjson.
In the case of Energy, you also need two levels of aggregation in order to get your second expected result.
select
    commaDelimited.*,
    model.*
from (values
    (json_query(@json, '$.Model."Energy-X/A"'), json_query(@json, '$.Energy."Energy-X/A"')),
    (json_query(@json, '$.Model."Energy-X/B"'), json_query(@json, '$.Energy."Energy-X/B"'))
) j(model, energy)
outer apply openjson(j.model)
with (
    x int,
    y int,
    z int
) model
outer apply (
    select
        Energy = string_agg(c.Energy, ' ; ')
    from openjson(j.energy) energy
    cross apply (
        select
            Energy = string_agg(isnull(Xinner.value, 'null'), ', ')
        from openjson(energy.value) Xinner
    ) c
) commaDelimited;
db<>fiddle
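For reference, a key-agnostic variation is also possible by reading the keys of $.Model with openjson and joining each one to the matching key under $.Energy. This is only a sketch, assuming every key under $.Model also exists under $.Energy; it is not part of the original answer:
SELECT
    m.[key],
    model.x, model.y, model.z,
    agg.Energy
FROM openjson(@json, '$.Model') m
CROSS APPLY openjson(m.value)
     WITH (x int, y int, z int) model
INNER JOIN openjson(@json, '$.Energy') e
        ON e.[key] = m.[key]
CROSS APPLY (
    -- outer level: one row per inner array, joined with ' ; '
    SELECT string_agg(c.Energy, ' ; ') AS Energy
    FROM openjson(e.value) arr
    CROSS APPLY (
        -- inner level: the numbers of one array, joined with ', '
        SELECT string_agg(isnull(innerVal.value, 'null'), ', ') AS Energy
        FROM openjson(arr.value) innerVal
    ) c
) agg;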

How to do a select in Oracle 12c for a nested JSON field containing backslashes

I have a JSON field in a table as below; I am unable to query the "day" from it:
{"FID":54,"header_json":"{\"date\":{\"day\":2,\"month\":6,\"year\":2020},\"amt\":10,\"count\":1}"}
SQL tried:
select jt.*
from order_json o,
     json_table(o.order_json, '$.header_json.date[*]'
       columns ("day" varchar2(2) path '$.day')) jt;
That's pretty easy: as you can see, header_json is just a string, not a usual nested JSON object. So you need to get this quoted string and parse it as JSON again:
select *
from
  (
    select --+ no_merge
           jh.*
    from order_json o,
         json_table(o.order_json, '$.header_json[*]'
           columns (
             header_json varchar2(200) path '$')
         ) jh
  ) headers,
  json_table(headers.header_json, '$.date[*]'
    columns (
      "day" varchar2(2) path '$.day')
  ) j
;
Full example with sample data:
-- sample data:
with order_json(order_json) as (
  select
    '{"FID":54,"header_json":"{\"date\":{\"day\":2,\"month\":6,\"year\":2020},\"amt\":10,\"count\":1}"}'
  from dual
)
-- main query
select *
from
  (
    select --+ no_merge
           jh.*
    from order_json o,
         json_table(o.order_json, '$.header_json[*]'
           columns (
             header_json varchar2(200) path '$')
         ) jh
  ) headers,
  json_table(headers.header_json, '$.date[*]'
    columns (
      "day" varchar2(2) path '$.day')
  ) j
;
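A possibly shorter variant (a sketch, assuming json_value's default VARCHAR2(4000) return size is large enough for header_json) extracts the embedded string with json_value and feeds it straight into json_table:
select j."day"
from order_json o,
     json_table(
       json_value(o.order_json, '$.header_json'),  -- returns the embedded string with the \" escapes resolved
       '$.date[*]'
       columns ("day" varchar2(2) path '$.day')
     ) j;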

MySQL 8 search JSON key by value in array

I've got a MySQL table with a JSON field, where I store data in the following format.
{
  "fields": {
    "1": {
      "s": "y"
    },
    "2": {
      "s": "n"
    }
  }
}
I need to obtain the keys in fields, e.g. 1 or 2 given the value of s.
Example query:
create table mytable ( mycol json );
insert into mytable set mycol = '{"fields": {"1": {"s": "y"},"2": {"s": "n"}}}';

select j.* from mytable, JSON_TABLE(mycol,
  '$.fields.*' COLUMNS (
    json_key VARCHAR(10) PATH '$',
    s VARCHAR(10) PATH '$.s'
  )
) AS j where j.s = 'y';
gives:
# json_key, s
null, y
I would expect to get
# json_key, s
1, y
Is it possible to get that data somehow?
I don't need the results in row/table format. I would be happy to get a comma-separated list of IDs (json_keys) meeting my criterion.
EDIT:
I was also thinking about getting the paths using JSON_SEARCH and passing them to JSON_EXTRACT; this was achieved here: Combining JSON_SEARCH and JSON_EXTRACT get me: "Invalid JSON path expression."
Unfortunately, the difference is that I would need to use JSON_SEARCH in 'all' mode, as I need all results. In that mode JSON_SEARCH returns a list of paths, whereas JSON_EXTRACT accepts a list of path arguments.
Try FOR ORDINALITY (see 12.17.6 JSON Table Functions); this column type enumerates rows in the COLUMNS clause:
SELECT
  JSON_UNQUOTE(
    JSON_EXTRACT(
      JSON_KEYS(`mycol` ->> '$.fields'),
      CONCAT('$[', `j`.`row` - 1, ']')
    )
  ) `json_key`,
  `j`.`s`
FROM
  `mytable`,
  JSON_TABLE(
    `mycol`,
    '$.fields.*' COLUMNS (
      `row` FOR ORDINALITY,
      `s` VARCHAR(10) PATH '$.s'
    )
  ) `j`
WHERE
  `j`.`s` = 'y';
See dbfiddle.
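Since a comma-separated list of matching keys would also be acceptable, the same query can be wrapped in GROUP_CONCAT. The sketch below collapses the matches across all rows of `mytable`; group by a row identifier instead if a per-row list is needed:
SELECT
  GROUP_CONCAT(
    JSON_UNQUOTE(
      JSON_EXTRACT(
        JSON_KEYS(`mycol` ->> '$.fields'),
        CONCAT('$[', `j`.`row` - 1, ']')
      )
    )
  ) AS `json_keys`
FROM
  `mytable`,
  JSON_TABLE(
    `mycol`,
    '$.fields.*' COLUMNS (
      `row` FOR ORDINALITY,
      `s` VARCHAR(10) PATH '$.s'
    )
  ) `j`
WHERE
  `j`.`s` = 'y';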