I have the following JSON:
{
  "params" : {
    "A" : 200.5,
    "B" : 70.2
  }
}
And the following table:
CREATE TABLE `params` (
`param` varchar(255),
`value` float
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
Is there a way, with a single INSERT query and without resorting to a WHILE loop, to insert all the parameters into the table like this:
+-------+-------+
| param | value |
+-------+-------+
| A     | 200.5 |
| B     |  70.2 |
+-------+-------+
If you are running MySQL 8.0, you can use json_keys() to dynamically extract the keys of the JSON sub-object as a JSON array, and then use json_table() to turn that array into rows. You can then extract the corresponding values.
Consider:
insert into `params`
with t as (select '{"params": { "A": 200.5, "B": 70.2 } }' js)
select x.k, json_extract(js, concat('$.params.', x.k)) v
from
t
cross join json_table(
json_keys(js->"$.params"),
"$[*]" columns(k varchar(255) path "$")
) as x
Demo on DB Fiddle
Content of the table after running the query:
+-------+-------+
| param | value |
+-------+-------+
| A     | 200.5 |
| B     |  70.2 |
+-------+-------+
We can insert multiple rows with a single query using INSERT INTO ... VALUES; we just need to construct that query from your JSON object:
INSERT INTO params(param,value)
VALUES('A',200.5), ('B', 70.2);
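If the keys are known in advance, a similar single-statement insert can also be produced on MySQL 5.7, which lacks json_table(). A minimal sketch, assuming a user variable @js holds the document (the hard-coded keys A and B are an assumption, not dynamic discovery):
-- assumption: keys A and B are known up front
SET @js = '{"params": {"A": 200.5, "B": 70.2}}';
INSERT INTO params (param, value)
SELECT 'A', JSON_UNQUOTE(JSON_EXTRACT(@js, '$.params.A'))
UNION ALL
SELECT 'B', JSON_UNQUOTE(JSON_EXTRACT(@js, '$.params.B'));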
I have a json object in my postgres db which looks like the one given below:
{"Actor":[{"personName":"Shashi Kapoor","characterName":"Prem"},{"personName":"Sharmila Tagore","characterName":"Preeti"},{"personName":"Shatrughan Sinha","characterName":"Dr. Amar"]}
Edited (editor's note: the original is kept above because it is invalid JSON; here is the fixed version):
{
"Actor":[
{
"personName":"Shashi Kapoor",
"characterName":"Prem"
},
{
"personName":"Sharmila Tagore",
"characterName":"Preeti"
},
{
"personName":"Shatrughan Sinha",
"characterName":"Dr. Amar"
}
]
}
The column is named xyz, and each row has a corresponding content_id.
I need to retrieve the content_ids that have an Actor with personName = 'Sharmila Tagore'.
I tried many queries; these two seemed the most promising, but neither returned what I expected:
SELECT content_id
FROM content_table
WHERE cast_and_crew #>> '{Actor,personName}' = '"C. R. Simha"'
and:
SELECT cast_and_crew ->> 'content_id' AS content_id
FROM content_table
WHERE cast_and_crew ->> 'Actor' -> 'personName' = 'C. R. Simha'
You should use jsonb_array_elements() to search in a nested jsonb array:
select content_id, value
from content_table,
lateral jsonb_array_elements(cast_and_crew->'Actor');
content_id | value
------------+-----------------------------------------------------------------
1 | {"personName": "Shashi Kapoor", "characterName": "Prem"}
1 | {"personName": "Sharmila Tagore", "characterName": "Preeti"}
1 | {"personName": "Shatrughan Sinha", "characterName": "Dr. Amar"}
(3 rows)
The value column is of type jsonb, so you can use the ->> operator on it:
select content_id, value
from content_table,
lateral jsonb_array_elements(cast_and_crew->'Actor')
where value->>'personName' = 'Sharmila Tagore';
content_id | value
------------+--------------------------------------------------------------
1 | {"personName": "Sharmila Tagore", "characterName": "Preeti"}
(1 row)
Note, if you are using json (not jsonb) use json_array_elements() of course.
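Alternatively, a sketch using the containment operator @> (assuming the column is jsonb), which expresses the same filter and can be backed by a GIN index on cast_and_crew:
-- containment matches any array element that has this key/value pair
SELECT content_id
FROM content_table
WHERE cast_and_crew->'Actor' @> '[{"personName": "Sharmila Tagore"}]';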
I want to parse a JSON string that is in the CLOB column from table Tests_1, and insert it into another table (Test_2).
How can I do this in PL/SQL without using any JSON library?
create table Tests_1
(
value CLOB
)
create table Test_2 (a date,b date,c number,d number, e number)
INSERT INTO Tests_1
(value)
VALUES
('{
"a":"01/01/2015",
"b":"31/12/2015",
"c":"11111111111",
"d":"1111111111",
"e":"1234567890"
}');
Depending on the version of the RDBMS you are using (the question mentions 11.0.4, although no such release exists), you have at least two choices, apart from writing a parser yourself:
First one: for Oracle 11.1.0.7 and up, install Apex 5 and use the apex_json package:
-- here I have 12.1.0.1 version with version 5 of apex installed
column ora_version format a21;
column apex_version format a21;
select (select version from v$instance) as ora_version
, (select version_no from apex_release) as apex_version
from dual;
--drop table test_2;
/* our test table */
create table test_2(
c_a date,
c_b date,
c_c number,
c_d number,
c_e number
);
select * from test_2;
declare
l_json_doc clob;
begin
dbms_output.put_line('Parsing json...');
l_json_doc := '{"a":"01/01/2015","b":"31/12/2015",
"c":"11111111111","d":"1111111111",
"e":"1234567890"}';
apex_json.parse(l_json_doc);
insert into test_2(c_a, c_b, c_c, c_d, c_e)
values(apex_json.get_date(p_path=>'a', p_format=>'dd/mm/yyyy'),
apex_json.get_date(p_path=>'b', p_format=>'dd/mm/yyyy'),
to_number(apex_json.get_varchar2(p_path=>'c')),
to_number(apex_json.get_varchar2(p_path=>'d')),
to_number(apex_json.get_varchar2(p_path=>'e')));
commit;
dbms_output.put_line('Done!');
end;
/
column c_c format 99999999999;
select to_char(c_a, 'dd/mm/yyyy') as c_a
, to_char(c_b, 'dd/mm/yyyy') as c_b
, c_c
, c_d
, c_e
from test_2;
Result:
ORA_VERSION APEX_VERSION
--------------------- ---------------------
12.1.0.1.0 5.0.2.00.07
1 row selected.
Table created.
no rows selected.
Parsing json...
Done!
PL/SQL procedure successfully completed.
C_A C_B C_C C_D C_E
---------- ---------- ------------ ---------- ----------
01/01/2015 31/12/2015 11111111111 1111111111 1234567890
1 row selected.
Second one: use the open-source PL/JSON library. I had never used it before, so I'm taking this opportunity to try it out. It's quite similar to apex_json.
declare
l_json json; --json object
l_json_doc clob;
begin
dbms_output.put_line('Parsing json...');
-- parsing is done upon object instantiation
l_json_doc := '{"a":"01/01/2015","b":"31/12/2015",
"c":"11111111111","d":"1111111111",
"e":"1234567890"}';
l_json := json(l_json_doc);
insert into test_2(c_a, c_b, c_c, c_d, c_e)
values(to_date(l_json.get('a').get_string, 'dd-mm-yyyy'),
to_date(l_json.get('b').get_string, 'dd-mm-yyyy'),
to_number(l_json.get('c').get_string),
to_number(l_json.get('d').get_string),
to_number(l_json.get('e').get_string));
commit;
dbms_output.put_line('Done!');
end;
/
column c_c format 99999999999;
select to_char(c_a, 'dd/mm/yyyy') as c_a
, to_char(c_b, 'dd/mm/yyyy') as c_b
, c_c
, c_d
, c_e
from test_2;
Result:
C_A C_B C_C C_D C_E
---------- ---------- ------------ ---------- ----------
01/01/2015 31/12/2015 11111111111 1111111111 1234567890
01/01/2015 31/12/2015 11111111111 1111111111 1234567890
2 rows selected.
The introduction of json_table() in the 12.1.0.2 release makes JSON parsing a bit simpler (shown here just for the sake of demonstration):
insert into test_2
select to_date(c_a, 'dd-mm-yyyy')
, to_date(c_b, 'dd-mm-yyyy')
, c_c
, c_d
, c_e
from json_table('{"a":"01/01/2015",
"b":"31/12/2015",
"c":"11111111111",
"d":"1111111111",
"e":"1234567890"}'
, '$'
columns (
c_a varchar2(21) path '$.a',
c_b varchar2(21) path '$.b',
c_c varchar2(21) path '$.c',
c_d varchar2(21) path '$.d',
c_e varchar2(21) path '$.e'
)) ;
Result:
select *
from test_2;
C_A C_B C_C C_D C_E
----------- ----------- ---------- ---------- ----------
1/1/2015 12/31/2015 1111111111 1111111111 1234567890
Oracle 12c supports JSON
if you have an existing table simply do
ALTER TABLE table1 ADD CONSTRAINT constraint_name CHECK (your_column IS json);
SELECT t.your_column.id FROM table1 t;
Note that the table alias (t here) is required for the JSON dot-notation syntax to work.
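If you'd rather not add the IS JSON constraint, a sketch using JSON_VALUE to extract scalar values (reusing the table1/your_column names above):
-- JSON_VALUE does not require an IS JSON constraint on the column
SELECT JSON_VALUE(your_column, '$.id') AS id
FROM table1;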
Or a complete example:
CREATE TABLE json_documents (
id RAW(16) NOT NULL,
data CLOB,
CONSTRAINT json_documents_pk PRIMARY KEY (id),
CONSTRAINT json_documents_json_chk CHECK (data IS JSON)
);
INSERT INTO json_documents (id, data)
VALUES (SYS_GUID(),
'{
"FirstName" : "John",
"LastName" : "Doe",
"Job" : "Clerk",
"Address" : {
"Street" : "99 My Street",
"City" : "My City",
"Country" : "UK",
"Postcode" : "A12 34B"
},
"ContactDetails" : {
"Email" : "john.doe#example.com",
"Phone" : "44 123 123456",
"Twitter" : "#johndoe"
},
"DateOfBirth" : "01-JAN-1980",
"Active" : true
}');
SELECT a.data.FirstName,
a.data.LastName,
a.data.Address.Postcode AS Postcode,
a.data.ContactDetails.Email AS Email
FROM json_documents a;
FIRSTNAME       LASTNAME        POSTCODE   EMAIL
--------------- --------------- ---------- -------------------------
John            Doe             A12 34B    john.doe@example.com
1 row selected.
More info
https://oracle-base.com/articles/12c/json-support-in-oracle-database-12cr1
https://docs.oracle.com/database/122/ADJSN/using-PLSQL-object-types-for-JSON.htm#ADJSN-GUID-F0561593-D0B9-44EA-9C8C-ACB6AA9474EE
Since you specified you don't want to use any JSON library, and if the format is fixed, you could coerce the value into something you can parse as XML: strip the curly braces, replace the colons with equals signs, and remove the double quotes from the first part of each name/value pair:
select regexp_replace(regexp_replace(value, '(^{|}$)'),
'^"(.*)":(".*")($|,)', '\1=\2', 1, 0, 'm')
from tests_1;
REGEXP_REPLACE(REGEXP_REPLACE(VALUE,'(^{|}$)'),'^"(.*)":(".*")($|,)','\1=\2',1,0
--------------------------------------------------------------------------------
a="01/01/2015"
b="31/12/2015"
c="11111111111"
d="1111111111"
e="1234567890"
which you can use as the attributes of a dummy XML node; convert that to XMLType and you can use XMLTable to extract the attributes:
select x.a, x.b, x.c, x.d, x.e
from tests_1 t
cross join xmltable('/tmp'
passing xmltype('<tmp ' ||regexp_replace(regexp_replace(value, '(^{|}$)'),
'^"(.*)":(".*")($|,)', '\1=\2', 1, 0, 'm') || ' />')
columns a varchar2(10) path '#a',
b varchar2(10) path '#b',
c number path '#c',
d number path '#d',
e number path '#e'
) x;
A B C D E
---------- ---------- ------------- ------------- -------------
01/01/2015 31/12/2015 11111111111 1111111111 1234567890
Then you can convert the strings to dates during insert:
insert into test_2 (a, b, c, d, e)
select to_date(x.a, 'DD/MM/YYYY'), to_date(x.b, 'DD/MM/YYYY'), x.c, x.d, x.e
from tests_1 t
cross join xmltable('/tmp'
passing xmltype('<tmp ' || regexp_replace(regexp_replace(value, '(^{|}$)'),
'^"(.*)":(".*")($|,)', '\1=\2', 1, 0, 'm') || ' />')
columns a varchar2(10) path '#a',
b varchar2(10) path '#b',
c number path '#c',
d number path '#d',
e number path '#e'
) x;
select * from test_2;
A B C D E
---------- ---------- ------------- ------------- -------------
2015-01-01 2015-12-31 11111111111 1111111111 1234567890
That will cope with some of the name/value pairs not being there, and you'll get nulls if that happens.
If all the pairs will always be there you could just tokenize the string and pull out the relevant parts:
select to_date(regexp_substr(value, '[^"]+', 1, 4), 'DD/MM/YYYY') as a,
to_date(regexp_substr(value, '[^"]+', 1, 8), 'DD/MM/YYYY') as b,
to_number(regexp_substr(value, '[^"]+', 1, 12)) as c,
to_number(regexp_substr(value, '[^"]+', 1, 16)) as d,
to_number(regexp_substr(value, '[^"]+', 1, 20)) as e
from tests_1;
A B C D E
---------- ---------- ------------- ------------- -------------
2015-01-01 2015-12-31 11111111111 1111111111 1234567890
From Oracle 18c you could use the TREAT (... AS JSON) operator:
SQL Enhancements for JSON
You can specify that a given SQL expression returns JSON data, using TREAT (... AS JSON).
TREAT (... AS JSON) lets you specify that the return value from a given SQL expression is to be treated as JSON data. Such expressions can include PL/SQL function calls and columns specified by a SQL WITH clause. New data-guide views make it easy to access path and type information for JSON fields, which is recorded for index-backed data guides. Returning generated and queried JSON data in LOB instances widens the scope of the use of relational data.
This operator provides a way to inform the database that the content of a VARCHAR2, BLOB, CLOB should be treated as containing JSON. This enables a number of useful features, including the ability to use "Simplified Syntax" on database objects that do not have an "IS JSON" constraint.
And in your example:
create table Test_1(val CLOB);
create table Test_2(a date,b date,c number,d number, e number);
INSERT INTO Test_1(val)
VALUES('{
"a":"01/01/2015",
"b":"31/12/2015",
"c":"11111111111",
"d":"1111111111",
"e":"1234567890"
}');
INSERT INTO Test_2(a,b,c,d,e)
SELECT sub.val_as_json.a,
sub.val_as_json.b,
sub.val_as_json.c,
sub.val_as_json.d,
sub.val_as_json.e
FROM (SELECT TREAT(val as JSON) val_as_json
FROM Test_1) sub;
COMMIT;
db<>fiddle demo
Let's say I have a JSON column named data in some MySQL table, and this column is a single array. So, for example, data may contain:
[1,2,3,4,5]
Now I want to select all rows which have a data column where one of its array elements is greater than 2. Is this possible?
I tried the following, but it seems to be always true regardless of the values in the array:
SELECT * from my_table
WHERE JSON_EXTRACT(data, '$[*]') > 2;
You may search an array of integers as follows:
JSON_CONTAINS('[1,2,3,4,5]','7','$') Returns: 0
JSON_CONTAINS('[1,2,3,4,5]','1','$') Returns: 1
You may search an array of strings as follows:
JSON_CONTAINS('["a","2","c","4","x"]','"x"','$') Returns: 1
JSON_CONTAINS('["1","2","3","4","5"]','"7"','$') Returns: 0
Note: JSON_CONTAINS returns either 1 or 0
In your case you may search using a query like so:
SELECT * from my_table
WHERE JSON_CONTAINS(data, '2', '$');
SELECT JSON_SEARCH('["1","2","3","4","5"]', 'one', "2") is not null
is true
SELECT JSON_SEARCH('["1","2","3","4","5"]', 'one', "6") is not null
is false
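Applied to the question's table, a sketch (assuming the array elements are stored as strings, since JSON_SEARCH only matches string values):
-- rows whose array contains the string "2"
SELECT *
FROM my_table
WHERE JSON_SEARCH(data, 'one', '2') IS NOT NULL;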
Since MySQL 8 there is a new function called JSON_TABLE.
CREATE TABLE my_table (id INT, data JSON);
INSERT INTO my_table VALUES
(1, "[1,2,3,4,5]"),
(2, "[0,1,2]"),
(3, "[3,4,-10]"),
(4, "[-1,-2,0]");
SELECT DISTINCT my_table.*
FROM my_table, JSON_TABLE(data, "$[*]" COLUMNS(nr INT PATH '$')) as ids
WHERE ids.nr > 2;
+------+-----------------+
| id | data |
+------+-----------------+
| 1 | [1, 2, 3, 4, 5] |
| 3 | [3, 4, -10] |
+------+-----------------+
2 rows in set (0.00 sec)
I use a combination of JSON_EXTRACT and JSON_CONTAINS (MariaDB):
SELECT * FROM table WHERE JSON_CONTAINS(JSON_EXTRACT(json_field, '$[*].id'), 11, '$');
I don't know if a solution was already found, but with MariaDB there is a way to search for a path in an array. For example, in the array [{"id":1}, {"id":2}], I want to find the path whose id equals 2.
SELECT JSON_SEARCH(name_field, 'one', 2, null, '$[*].id')
FROM name_table
The result is:
"$[1].id"
The asterisk indicates that the entire array is searched.
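To filter rows rather than just return the path, the same call can be tested for NULL (a sketch reusing the name_field/name_table names above):
SELECT *
FROM name_table
WHERE JSON_SEARCH(name_field, 'one', 2, null, '$[*].id') IS NOT NULL;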
This example works for me with MySQL 5.7 and above:
SET @j = '{"a": [ "8428341ffffffff", "8428343ffffffff", "8428345ffffffff", "8428347ffffffff","8428349ffffffff", "842834bffffffff", "842834dffffffff"], "b": 2, "c": {"d": 4}}';
select JSON_CONTAINS(JSON_EXTRACT(@j, '$.a'), '"8428341ffffffff"', '$'); -- returns 1
Notice the double quotes around the search keyword: '"8428341ffffffff"'.
A possible way is to deal with the problem as string matching. Convert the JSON to string and match.
Or you can use JSON_CONTAINS.
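A sketch of that string-matching idea, for completeness only (it is fragile: without careful delimiters, '%2%' also matches 12, 25, and so on, so the JSON functions shown elsewhere are preferable):
-- crude text matching; shown only to illustrate the suggestion above
SELECT *
FROM my_table
WHERE CAST(data AS CHAR) LIKE '%2%';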
You can use JSON_EXTRACT to search and select data:
SELECT data, data->"$.id" AS selectdata
FROM my_table
WHERE JSON_EXTRACT(data, "$.id") = '123'
#ORDER BY data->"$.name"
LIMIT 10;
SET @doc = '[{"SongLabels": [{"SongLabelId": "111", "SongLabelName": "Funk"}, {"SongLabelId": "222", "SongLabelName": "RnB"}], "SongLabelCategoryId": "test11", "SongLabelCategoryName": "曲风"}]';
SELECT *, JSON_SEARCH(@doc, 'one', '%un%', null, '$[*].SongLabels[*].SongLabelName') FROM t_music_song_label_relation;
result: "$[0].SongLabels[0].SongLabelName"
SELECT song_label_content->'$[*].SongLabels[*].SongLabelName' FROM t_music_song_label_relation;
result: ["Funk", "RnB"]
I had a similar problem and solved it by searching via a function:
create function searchGT(threshold int, d JSON)
returns int
begin
  set @i = 0;
  -- walk the array until an element exceeds the threshold
  while @i < json_length(d) do
    if json_extract(d, CONCAT('$[', @i, ']')) > threshold then
      return json_extract(d, CONCAT('$[', @i, ']'));
    end if;
    set @i = @i + 1;
  end while;
  return null;
end;
select searchGT(3, CAST('[1,10,20]' AS JSON));
This seems to be possible with the JSON_TABLE function. It's available in MySQL version 8.0 and MariaDB version 10.6.
With this test setup
CREATE TEMPORARY TABLE mytable
WITH data(a,json) AS (VALUES ('a','[1]'),
('b','[1,2]'),
('c','[1,2,3]'),
('d','[1,2,3,4]'))
SELECT * from data;
we get the following table
+---+-----------+
| a | json |
+---+-----------+
| a | [1] |
| b | [1,2] |
| c | [1,2,3] |
| d | [1,2,3,4] |
+---+-----------+
It's possible to select every row from mytable which has a value greater than 2 in the json array with this query:
SELECT * FROM mytable
WHERE TRUE IN (SELECT val > 2
FROM JSON_TABLE(json,'$[*]'
columns (val INT(1) path '$')
) as json
)
Returns:
+---+-----------+
| a | json |
+---+-----------+
| c | [1,2,3] |
| d | [1,2,3,4] |
+---+-----------+
I have a table like this:
+--------+--------------------+
| ID | Attribute |
+--------+--------------------+
| 1 |"color" => "red" |
+--------+--------------------+
| 1 |"color" => "green" |
+--------+--------------------+
| 1 |"shape" => "square" |
+--------+--------------------+
| 2 |"color" => "blue" |
+--------+--------------------+
| 2 |"color" => "black" |
+--------+--------------------+
| 2 |"flavor" => "sweat" |
+--------+--------------------+
| 2 |"flavor" => "salty" |
+--------+--------------------+
And I want to run some postgres query that gets a result table like this:
+--------+------------------------------------------------------+
| ID | Attribute |
+--------+------------------------------------------------------+
| 1 |"color" => "red, green", "shape" => "square" |
+--------+------------------------------------------------------+
| 2 |"color" => "blue, black", "flavor" => "sweat, salty" |
+--------+------------------------------------------------------+
The attribute column can be in either hstore or json format. I wrote it as hstore for the example, but if we cannot achieve this in hstore, I would change the column to json.
I know that hstore does not support one key mapping to multiple values; when I tried some merge methods, only one value was kept for each key. For json, I didn't find anything that supports merging multiple values like this either. I think this could be done by a function that merges the values for the same key into a string/text and adds it back to the key/value pair, but I'm stuck on implementing it.
Note: if implement this in some function, ideally any key such as color, shape should not appear in the function since keys can be expanded dynamically.
Does anyone have any idea about this? Any advice or brainstorm might help. Thank you!
Just a note before anything else: in your desired output I would use proper json rather than that lookalike format. So a correct output, according to me, would be:
+--------+----------------------------------------------------------------------+
| ID | Attribute |
+--------+----------------------------------------------------------------------+
| 1 | '{"color":["red","green"], "flavor":[], "shape":["square"]}' |
+--------+----------------------------------------------------------------------+
| 2 | '{"color":["blue","black"], "flavor":["sweat","salty"], "shape":[]}' |
+--------+----------------------------------------------------------------------+
A PL/pgSQL function which parses the json attributes and executes a dynamic query would do the job, something like that:
CREATE OR REPLACE FUNCTION merge_rows(PAR_table regclass) RETURNS TABLE (
id integer,
attributes json
) AS $$
DECLARE
ARR_attributes text[];
VAR_attribute text;
ARR_query_parts text[];
BEGIN
-- Get JSON attributes names
EXECUTE format('SELECT array_agg(name ORDER BY name) AS name FROM (SELECT DISTINCT json_object_keys(attribute) AS name FROM %s) AS s', PAR_table) INTO ARR_attributes;
-- Write json_build_object() query part
FOREACH VAR_attribute IN ARRAY ARR_attributes LOOP
ARR_query_parts := array_append(ARR_query_parts, format('%L, array_remove(array_agg(l.%s), null)', VAR_attribute, VAR_attribute));
END LOOP;
-- Return dynamic query
RETURN QUERY EXECUTE format('
SELECT t.id, json_build_object(%s) AS attributes
FROM %s AS t,
LATERAL json_to_record(t.attribute) AS l(%s)
GROUP BY t.id;',
array_to_string(ARR_query_parts, ', '), PAR_table, array_to_string(ARR_attributes, ' text, ') || ' text');
END;
$$ LANGUAGE plpgsql;
I've tested it and it seems to work; it returns a json object like the one shown above. Here is my test code:
CREATE TABLE mytable (
id integer NOT NULL,
attribute json NOT NULL
);
INSERT INTO mytable (id, attribute) VALUES
(1, '{"color":"red"}'),
(1, '{"color":"green"}'),
(1, '{"shape":"square"}'),
(2, '{"color":"blue"}'),
(2, '{"color" :"black"}'),
(2, '{"flavor":"sweat"}'),
(2, '{"flavor":"salty"}');
SELECT * FROM merge_rows('mytable');
Of course you can pass the id and attribute column names as parameters as well and maybe refine the function a bit, this is just to give you an idea.
EDIT: If you're on 9.4, please consider using the jsonb datatype instead; it's much better and gives you room for improvements. You would just need to change the json_* functions to their jsonb_* equivalents.
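As an illustration, on 9.5+ (which adds jsonb_object_agg() and jsonb_agg()) the whole merge can even be sketched without PL/pgSQL, assuming the attribute column is (or is cast to) jsonb; note this variant simply omits keys that never occur for an id instead of emitting empty arrays:
-- sketch only: aggregates values per (id, key), then packs keys per id
SELECT id,
       jsonb_object_agg(key, vals) AS attributes
FROM (
  SELECT t.id, x.key, jsonb_agg(x.value) AS vals
  FROM mytable t,
       LATERAL jsonb_each(t.attribute::jsonb) x
  GROUP BY t.id, x.key
) s
GROUP BY id;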
If you just want this for display purposes, this might be enough:
select id, string_agg(key||' => '||vals, ', ')
from (
select t.id, x.key, string_agg(value, ',') vals
from t
join lateral each(t.attributes) x on true
group by id, key
) t
group by id;
If you are on a version before 9.3 (which introduced LATERAL), you can't use the lateral join:
select id, string_agg(key||' => '||vals, ', ')
from (
select id, key, string_agg(val, ',') as vals
from (
select t.id, skeys(t.attributes) as key, svals(t.attributes) as val
from t
) t1
group by id, key
) t2
group by id;
This will return:
id | string_agg
---+-------------------------------------------
1 | color => red,green, shape => square
2 | color => blue,black, flavor => sweat,salty
SQLFiddle: http://sqlfiddle.com/#!15/98caa/2