Converting a mysql dict column using JSON_Table - mysql

I have a MySQL table "prop" with a column "details" that contains a dict field.
fnum details
55 '{"a":"3"},{"b":"2"},{"d":"1"}'
I have tried to convert this to a table using this:
SELECT p.fnum, deets.*
FROM prop p
JOIN JSON_TABLE( p.details,
'$[*]'
COLUMNS (
idx FOR ORDINALITY,
a varChar(10) PATH '$.a',
b varchar(20) PATH '$.b'
d varchar(45) PATH '$.d',
)
) deets
I have tried various paths including $.*. I am expecting the following:
fnum a b d
55 3 2 1
also if I have 2 rows such as
fnum details
55 '{"a":"3"},{"b":"2"},{"d":"1"}'
56 '{"c":"car"}'
should generate the following
fnum a b d c
55 3 2 1 null
56 null null null car

Your details data is not valid JSON, because it doesn't have [ ] delimiting the array.
Demo:
mysql> create table prop (fnum int, details json);
mysql> insert into prop select 55, '{"a":"3"},{"b":"2"},{"d":"1"}';
ERROR 3140 (22032): Invalid JSON text: "The document root must
not be followed by other values." at position 9 in value for column
'prop.details'.
mysql> insert into prop select 55, '[{"a":"3"},{"b":"2"},{"d":"1"}]';
Query OK, 1 row affected (0.00 sec)
Records: 1 Duplicates: 0 Warnings: 0
It's worth using the JSON data type instead of storing JSON in a text column, because the JSON data type ensures that the document is valid JSON format. It must be valid JSON to use the JSON_TABLE() function or any other JSON function.
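If details is currently a TEXT column, converting it in place is one option (a minimal sketch; the ALTER will fail if any stored value is not valid JSON):
ALTER TABLE prop MODIFY details JSON;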
Also your query has some syntax mistakes with respect to commas:
SELECT p.fnum, deets.*
FROM prop p
JOIN JSON_TABLE( p.details,
'$[*]'
COLUMNS (
idx FOR ORDINALITY,
a varChar(10) PATH '$.a',
b varchar(20) PATH '$.b' <-- missing comma
d varchar(45) PATH '$.d', <-- extra comma
)
) deets
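With the commas fixed (and the stored value wrapped in [ ] so it is a valid JSON array), a corrected sketch of the query looks like this:
SELECT p.fnum, deets.*
FROM prop p
JOIN JSON_TABLE( p.details,
'$[*]'
COLUMNS (
idx FOR ORDINALITY,
a varchar(10) PATH '$.a',
b varchar(20) PATH '$.b',
d varchar(45) PATH '$.d'
)
) deets
Note that with the path '$[*]' each element of the array becomes its own row, so the a, b and d values for fnum 55 will land on three separate rows unless you adjust the paths or aggregate the result.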

Related

Postgresql - How to query nested array elements

I have JSON array data in PostgreSQL 13 table. I want to query this table to see all the nested array data in the output. I tried the below query, but it's not giving the expected output.
select
json_data::json -> 'Rows' -> 0 -> 'Values' ->> 0 as Lid
,json_data::json -> 'Rows' -> 0 -> 'Values' ->> 1 as L2LicenseId
,json_data::json -> 'Rows' -> 1 -> 'Values' ->> 0 as Lid
,json_data::json -> 'Rows' -> 1 -> 'Values' ->> 1 as L2LicenseId
from test;
Can someone please help me?
Sample Data
CREATE TABLE IF NOT EXISTS test
(
json_data text
);
INSERT INTO test (json_data) VALUES ('{"Origin":"api","Topic":"licenses","Timestamp":"2023-02-07T12:46:42.2568898+00:00","Columns":["LId","L2LicenseId","SfdcAccountId","SfdcLineItemId","SL","Quantity","StartDate","EndDate","DisplayName","ProductPrimaryKey"],"Schema":["string","string","string","string","string","int32","datetime","datetime","string","string"],"Rows":[{"Values":["1234","123456","ACC_","PurchaseT","SKU-0000","1","2023-01-09T00:00:00.0000000","2024-01-08T00:00:00.0000000","Automation with 5 users","lc11dev.my-dev.com"]},{"Values":["8967","8967-e567","fihikelo","Addon_00000490_2nd_GB","SKU-0490","3","2023-01-01T00:00:00.0000000","2023-01-22T00:00:00.0000000","Automation, Data 5GB","mygreattest01311433.my-dev.com"]}]}')
Expected Output
DB FIDDLE
You can do it using the PostgreSQL function jsonb_array_elements (or json_array_elements). This function expands a JSON array into a set of rows.
select
a2.value -> 'Values' ->> 0 as Lid,
a2.value -> 'Values' ->> 1 as L2LicenseId
from test a1
cross join jsonb_array_elements(a1.json_data::jsonb->'Rows') a2
-- Result:
 lid  | l2licenseid
------+-------------
 1234 | 123456
 8967 | 8967-e567
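The same pattern extends to the other positions listed in the "Columns" array of the sample data, for example (a sketch; the indexes follow that "Columns" list):
select
a2.value -> 'Values' ->> 0 as Lid,
a2.value -> 'Values' ->> 1 as L2LicenseId,
a2.value -> 'Values' ->> 2 as SfdcAccountId, -- index 2 in "Columns"
a2.value -> 'Values' ->> 5 as Quantity       -- index 5 in "Columns"
from test a1
cross join jsonb_array_elements(a1.json_data::jsonb->'Rows') a2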

MySQL 8 update with Replace() or Regex

How can I update data using REPLACE() or a regex-like method, from
id | jdata
---------------
01 | {"name1":["number","2"]}
02 | {"val1":["number","12"],"val2":["number","22"]}
to
id | jdata
---------------
01 | {"name1":2 }
02 | {"val1": 12,"val2":22 }
I need to make a proper json entry for numbers and replace an array with a number from that array. Column "jdata" can have any number of similar attributes from the example. Something similar to this would do:
UPDATE table SET jdata = REPLACE(jdata, '["number","%d"]', %d);
Two ways:
The long, more clumsy way, using JSON_ARRAY:
UPDATE table1,
(
SELECT
id,
JSON_EXTRACT(jdata, "$.name1[0]") as A,
JSON_EXTRACT(jdata, "$.name1[1]") as B,
JSON_EXTRACT(jdata, "$.val1[0]") as C,
JSON_EXTRACT(jdata, "$.val1[1]") as D,
JSON_EXTRACT(jdata, "$.val2[0]") as E,
JSON_EXTRACT(jdata, "$.val2[1]") as F
FROM table1
) x
SET jdata = CASE WHEN table1.id=1 THEN JSON_ARRAY("name1",x.B)
ELSE JSON_ARRAY("val1",x.D,"val2",F) END
WHERE x.id=table1.id;
Or using JSON_REPLACE:
update table1
set jdata = JSON_REPLACE(jdata, "$.name1",JSON_EXTRACT(jdata,"$.name1[1]"))
where id=1;
update table1
set jdata = JSON_REPLACE(jdata, "$.val1",JSON_EXTRACT(jdata,"$.val1[1]"),
"$.val2",JSON_EXTRACT(jdata,"$.val2[1]"))
where id=2;
see: DBFIDDLE for both options
EDIT: To go deeper into the JSON generically, you can start with the query below and build a new JSON message from its output, without the "number" element:
WITH RECURSIVE cte1 as (
select 0 as x
union all
select x+1 from cte1 where x<10
)
select
id,
x,
JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(jdata),CONCAT("$[",x,"]"))) j,
JSON_EXTRACT(jdata,CONCAT("$.",JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(jdata),CONCAT("$[",x,"]"))))) v,
JSON_UNQUOTE(JSON_EXTRACT(jdata,CONCAT("$.",JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(jdata),CONCAT("$[",x,"]"))),"[0]"))) v1,
JSON_UNQUOTE(JSON_EXTRACT(jdata,CONCAT("$.",JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(jdata),CONCAT("$[",x,"]"))),"[1]"))) v2
from table1
cross join cte1
where x<JSON_DEPTH(jdata)
and not JSON_EXTRACT(JSON_KEYS(jdata),CONCAT("$[",x,"]")) is null
order by id,x;
output:
id  x  j      v                 v1      v2
--  -  -----  ----------------  ------  --
1   0  name1  ["number", "2"]   number  2
2   0  val1   ["number", "12"]  number  12
2   1  val2   ["number", "22"]  number  22
This should take care of JSON messages which also contain values like val3, val4, etc., up to a maximum depth, which is currently fixed to 10 in cte1.
EDIT2: When you only need to remove the "number" tag from the JSON message, you can also repeat this UPDATE until all "number" tags are removed (you can repeat it in a stored procedure; I am not going to write the stored procedure for you 😉):
update
table1,
( WITH RECURSIVE cte1 as (
select 0 as x
union all
select x+1 from cte1 where x<10
) select * from cte1 )x
set jdata = JSON_REMOVE(table1.jdata, CONCAT("$.",JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(jdata),CONCAT("$[",x,"]"))),"[0]"))
where JSON_UNQUOTE(JSON_EXTRACT(jdata,CONCAT("$.",JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(jdata),CONCAT("$[",x,"]"))),"[0]"))) = "number"
An example, where I run the update 2 times, is in this DBFIDDLE.
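That said, a minimal sketch of such a loop (the procedure name is made up, and it is untested) could look like this:
DELIMITER //
CREATE PROCEDURE remove_number_tags()
BEGIN
  DECLARE affected INT DEFAULT 1;
  WHILE affected > 0 DO
    UPDATE
      table1,
      ( WITH RECURSIVE cte1 as (
          select 0 as x
          union all
          select x+1 from cte1 where x<10
        ) select * from cte1 ) x
    SET jdata = JSON_REMOVE(table1.jdata, CONCAT("$.",JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(jdata),CONCAT("$[",x,"]"))),"[0]"))
    WHERE JSON_UNQUOTE(JSON_EXTRACT(jdata,CONCAT("$.",JSON_UNQUOTE(JSON_EXTRACT(JSON_KEYS(jdata),CONCAT("$[",x,"]"))),"[0]"))) = "number";
    -- ROW_COUNT() is the number of rows changed by the UPDATE above; stop once nothing was removed
    SET affected = ROW_COUNT();
  END WHILE;
END//
DELIMITER ;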

Postgres select value by key from json in a list

Given the following:
create table test (
id int,
status text
);
insert into test values
(1,'[]'),
(2,'[{"A":"d","B":"c"}]'),
(3,'[{"A":"g","B":"f"}]');
Is it possible to return?
id A B
1 null null
2 d c
3 g f
I am attempting something like this:
select id,
status::json ->> 0 #> "A" from test
Try this to address your specific example:
SELECT id, (status :: json)#>>'{0,A}' AS A, (status :: json)#>>'{0,B}' AS B
FROM test
see the result
see the manual:
jsonb #>> text[] → text
Extracts JSON sub-object at the specified path as text.
'{"a": {"b": ["foo","bar"]}}'::json #>> '{a,b,1}' → bar
This does it:
SELECT id,
(status::json->0)->>'A' as A,  -- quote the key with single quotes; ->> returns text
(status::json->0)->>'B' as B
FROM test;

MySQL LOAD DATA - Avoid convert string to zero when integer column

I try to trigger an error when I load a string into integer column with LOAD DATA.
The string value in file (aaa) become "0" in table.
My table:
CREATE TABLE `test1` (
a INT(11) DEFAULT NULL,
b INT(11) DEFAULT NULL,
c VARCHAR(45) DEFAULT NULL,
d VARCHAR(45) DEFAULT NULL
)
My loader:
LOAD DATA LOCAL INFILE 'file.txt'
INTO TABLE `test1`
FIELDS TERMINATED BY ';'
IGNORE 1 LINES (a,b,c,d)
My data file:
a;b;c;d
aaa;11;aa;z
2;bbb;bb;x
3;33;cc;w
4;44;dd;y
And the result in the table :
a b c d
-------------
0 11 aa z
2 0 bb x
3 33 cc w
4 44 dd y
You can see that "aaa" became "0", and "bbb" did too.
I would like the file records to be rejected.
I tried setting the sql_mode to STRICT_ALL_TABLES, but it had no effect:
set sql_mode = STRICT_ALL_TABLES;
Thank you!
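A likely reason strict mode seems to have no effect here: with LOAD DATA LOCAL, data-conversion errors are downgraded to warnings (the same as if IGNORE were specified), so the statement cannot reject rows. A rough sketch that strict mode can abort is a server-side load without LOCAL (assuming the server itself can read the file and secure_file_priv allows it):
set sql_mode = 'STRICT_ALL_TABLES';
LOAD DATA INFILE 'file.txt' -- no LOCAL: conversion errors now abort the statement
INTO TABLE `test1`
FIELDS TERMINATED BY ';'
IGNORE 1 LINES (a,b,c,d)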

Compare JSON values in MariaDB

How can I compare two JSON values in MariaDB? Two values such as {"b": 1, "a": 2} and {"a": 2, "b": 1} should be equal. Does MariaDB contain function to reorder elements of a JSON value?
If you expect to need this (uncommon) kind of comparison, build the JSON in some canonical way before storing it. The obvious way for a simple JSON like yours is to alphabetize the keys. How to do that will depend on the "encode" library you are using for JSON.
Just use JSON_EXTRACT; JSON_EXTRACT doesn't care about the position of a key within a JSON string.
Query
SELECT
JSON_EXTRACT(@json_string_1, '$.a') AS a1
, JSON_EXTRACT(@json_string_2, '$.a') AS a2
, JSON_EXTRACT(@json_string_1, '$.b') AS b1
, JSON_EXTRACT(@json_string_2, '$.b') AS b2
FROM (
SELECT
@json_string_1 := '{"b":1,"a":2}'
, @json_string_2 := '{"a":2,"b":1}'
)
AS
json_strings
Result
a1 a2 b1 b2
------ ------ ------ --------
2 2 1 1
Now use this result as a derived table so we can check whether a1 is equal to a2 and b1 is equal to b2.
Query
SELECT
1 AS json_equal
FROM (
SELECT
JSON_EXTRACT(@json_string_1, '$.a') AS a1
, JSON_EXTRACT(@json_string_2, '$.a') AS a2
, JSON_EXTRACT(@json_string_1, '$.b') AS b1
, JSON_EXTRACT(@json_string_2, '$.b') AS b2
FROM (
SELECT
@json_string_1 := '{"b":1,"a":2}'
, @json_string_2 := '{"a":2,"b":1}'
)
AS
json_strings
)
AS json_data
WHERE
json_data.a1 = json_data.a2
AND
json_data.b1 = json_data.b2
Result
json_equal
------------
1
Disclaimer: I work for MariaDB
See my answer at https://dba.stackexchange.com/a/300235/208895 for an example of how to use JSON_EQUALS, available as of 10.7.
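A minimal example of that function (available as of MariaDB 10.7), using the values from the question:
-- Returns 1: key order does not matter to JSON_EQUALS
SELECT JSON_EQUALS('{"b": 1, "a": 2}', '{"a": 2, "b": 1}') AS json_equal;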