I have a table in Postgres with a JSONB field.
The JSONB field contains JSON documents like:
{
"id": "adf59079-4921-4abc-a262-1dc8c2b1ccc7",
"lastname": "LOBATOS",
"firstname": "Leslie",
"birth_date": "1988-01-26",
"gender": 3,
"contacts": {
"phoneList": [
{
"fullNumber": "0671234567",
"verifyStateId": 1
},
{
"fullNumber": "0671234588",
"verifyStateId": 0
}
]
}
}
I need to select the following data set (in SQL notation):
SELECT id, lastname, fullNumber FROM <JSONB-field>
WHERE fullNumber LIKE '067%' and verifyStateId = 1
Please help me write this query.
You can use a JSON path expression to filter out the needed rows:
where the_column @? '$.contacts.phoneList[*] ? (@.fullNumber like_regex "^067" && @.verifyStateId == 1)'
To actually get the fullNumber you need to repeat the JSON path in order to extract the array element in question:
select id,
the_column ->> 'lastname',
jsonb_path_query_first(the_column,
'$.contacts.phoneList[*] ? (@.fullNumber like_regex "^067" && @.verifyStateId == 1)'
) ->> 'fullNumber' as fullnumber
from the_table
where the_column @? '$.contacts.phoneList[*] ? (@.fullNumber like_regex "^067" && @.verifyStateId == 1)'
The WHERE condition can potentially make use of a GIN index on the_column to improve performance.
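Such an index could be created like this (a sketch, assuming the table and column names used above; the index name is made up). Both the default jsonb_ops and the smaller jsonb_path_ops operator classes support the @? operator:
create index idx_the_table_the_column
on the_table using gin (the_column jsonb_path_ops);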
If there is no such index or performance isn't that important, you can avoid repeating the JSON path by using a derived table:
select *
from (
select id,
the_column ->> 'lastname',
jsonb_path_query_first(the_column, '$.contacts.phoneList[*] ? (@.fullNumber like_regex "^067" && @.verifyStateId == 1)') ->> 'fullNumber' as fullnumber
from the_table
) t
where fullnumber is not null
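Another option (a sketch, using the same assumed names) is a lateral cross join against the set-returning jsonb_path_query(): since the function produces no row when nothing matches the filter, non-matching table rows drop out without an explicit WHERE clause:
select t.id,
t.the_column ->> 'lastname' as lastname,
p.phone ->> 'fullNumber' as fullnumber
from the_table t
cross join lateral jsonb_path_query(t.the_column,
'$.contacts.phoneList[*] ? (@.fullNumber like_regex "^067" && @.verifyStateId == 1)'
) as p(phone)
Note that, unlike the variants above, this returns one row per matching phone number if several match.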
You can use the following query:
with unnested as (
select
fld->>'id' id, fld->>'lastname' lastname,
jsonb_array_elements(fld->'contacts'->'phoneList')
from tbl
) select id, lastname, jsonb_array_elements->>'fullNumber' from unnested;
PostgreSQL fiddle
+======================================+==========+============+
| id | lastname | ?column? |
+======================================+==========+============+
| adf59079-4921-4abc-a262-1dc8c2b1ccc7 | LOBATOS | 0671234567 |
+--------------------------------------+----------+------------+
| adf59079-4921-4abc-a262-1dc8c2b1ccc7 | LOBATOS | 0671234588 |
+--------------------------------------+----------+------------+
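If you also need the filter from the question (numbers starting with 067 and verifyStateId = 1), a possible extension of the same approach, with the same assumed table and column names, is:
with unnested as (
select
fld->>'id' id, fld->>'lastname' lastname,
jsonb_array_elements(fld->'contacts'->'phoneList') phone
from tbl
)
select id, lastname, phone->>'fullNumber' fullnumber
from unnested
where phone->>'fullNumber' like '067%'
and (phone->>'verifyStateId')::int = 1;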
demo
WITH cte AS (
SELECT
jsonb_path_query(data, '$.contacts.phoneList[*].verifyStateId')::text AS verifyStateId,
jsonb_path_query(data, '$.id')::text AS id,
jsonb_path_query(data, '$.lastname')::text AS lastname,
jsonb_path_query(data, '$.contacts.phoneList[*].fullNumber')::text AS fullnumber
FROM
extract_jsonb
)
SELECT
*
FROM
cte
WHERE
verifyStateId = '1'::text
AND fullnumber ILIKE '"067%'::text;
Since the jsonb values are cast to text, strings keep their surrounding double quotes, which is why the first character of fullnumber is ".
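One way to avoid the stray " is to extract the values with the ->> operator instead of casting whole jsonb values to text; a rough sketch with the same assumed table and column names:
SELECT *
FROM (
SELECT
data ->> 'id' AS id,
data ->> 'lastname' AS lastname,
phone ->> 'fullNumber' AS fullnumber,
phone ->> 'verifyStateId' AS verifystateid
FROM
extract_jsonb,
jsonb_path_query(data, '$.contacts.phoneList[*]') AS phone
) t
WHERE
verifystateid = '1'
AND fullnumber LIKE '067%';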
Related
I have a column which contains a JSON value of different lengths
["The Cherries:2.50","Draw:3.25","Swansea Jacks:2.87"]
I want to split them and store them as JSON like so:
[
{
name: "The Cherries",
odds: 2.50
},
{
name: "Draw",
odds: 3.25
},
{
name: "Swansea",
odds: 2.87
},
]
What I'm doing right now is looping and splitting them in the UI, which seems quite heavy for the client. I want to parse and split them all in a single query.
If you are running MySQL 8.0, you can use json_table() to split the original array to rows, and then build new objects and aggregate them with json_arrayagg().
We need a primary key column (or set of columns) so we can properly aggregate the generated rows; I assumed id:
select
t.id,
json_arrayagg(json_object(
'name', substring(j.val, 1, locate(':', j.val) - 1),
'odds', substring(j.val, locate(':', j.val) + 1)
)) new_js
from mytable t
cross join json_table(t.js, '$[*]' columns (val varchar(500) path '$')) as j
group by t.id
Demo on DB Fiddle
Sample data:
id | js
-: | :-------------------------------------------------------
1 | ["The Cherries:2.50", "Draw:3.25", "Swansea Jacks:2.87"]
Query results:
id | new_js
-: | :----------------------------------------------------------------------------------------------------------------------
1 | [{"name": "The Cherries", "odds": "2.50"}, {"name": "Draw", "odds": "3.25"}, {"name": "Swansea Jacks", "odds": "2.87"}]
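If you want odds to end up as numbers rather than strings in the generated JSON (the result above stores them as strings), one possible tweak, still assuming the same table and id column, is to cast the extracted substring:
select
t.id,
json_arrayagg(json_object(
'name', substring(j.val, 1, locate(':', j.val) - 1),
'odds', cast(substring(j.val, locate(':', j.val) + 1) as decimal(10, 2))
)) new_js
from mytable t
cross join json_table(t.js, '$[*]' columns (val varchar(500) path '$')) as j
group by t.id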
You can use json_table to create rows from the json object.
Just replace table_name with your table name and json with the column that contains the JSON:
SELECT json_arrayagg(json_object('name',SUBSTRING_INDEX(person, ':', 1) ,'odds',SUBSTRING_INDEX(person, ':', -1) ))
FROM table_name,
JSON_TABLE(json, '$[*]' COLUMNS (person VARCHAR(40) PATH '$')) people;
Here is a DB Fiddle you can refer to:
https://dbfiddle.uk/?rdbms=mysql_8.0&fiddle=801de9f067e89a48d45ef9a5bd2d094a
I have a JSON object in my Postgres DB which looks like the one given below:
{"Actor":[{"personName":"Shashi Kapoor","characterName":"Prem"},{"personName":"Sharmila Tagore","characterName":"Preeti"},{"personName":"Shatrughan Sinha","characterName":"Dr. Amar"]}
Edited (editor's note: the original above is left as-is because it is invalid JSON; the edit below fixes it):
{
"Actor":[
{
"personName":"Shashi Kapoor",
"characterName":"Prem"
},
{
"personName":"Sharmila Tagore",
"characterName":"Preeti"
},
{
"personName":"Shatrughan Sinha",
"characterName":"Dr. Amar"
}
]
}
The name of the column is xyz and I have a corresponding content_id.
I need to retrieve the content_ids that have Actor & personName = Sharmila Tagore.
I tried many queries; these two seemed the most likely to work, but I still didn't get the result:
SELECT content_id
FROM content_table
WHERE cast_and_crew #>> '{Actor,personName}' = '"C. R. Simha"'
and
SELECT cast_and_crew ->> 'content_id' AS content_id
FROM content_table
WHERE cast_and_crew ->> 'Actor' -> 'personName' = 'C. R. Simha'
You should use jsonb_array_elements() to search in a nested jsonb array:
select content_id, value
from content_table,
lateral jsonb_array_elements(cast_and_crew->'Actor');
content_id | value
------------+-----------------------------------------------------------------
1 | {"personName": "Shashi Kapoor", "characterName": "Prem"}
1 | {"personName": "Sharmila Tagore", "characterName": "Preeti"}
1 | {"personName": "Shatrughan Sinha", "characterName": "Dr. Amar"}
(3 rows)
Column value is of the type jsonb so you can use ->> operator for it:
select content_id, value
from content_table,
lateral jsonb_array_elements(cast_and_crew->'Actor')
where value->>'personName' = 'Sharmila Tagore';
content_id | value
------------+--------------------------------------------------------------
1 | {"personName": "Sharmila Tagore", "characterName": "Preeti"}
(1 row)
Note, if you are using json (not jsonb) use json_array_elements() of course.
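For a json column the same query would look like this (a sketch, just swapping the function name, same assumed table and column names):
select content_id, value
from content_table,
lateral json_array_elements(cast_and_crew->'Actor')
where value->>'personName' = 'Sharmila Tagore';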
Let's say I have a JSON column named data in some MySQL table, and this column is a single array. So, for example, data may contain:
[1,2,3,4,5]
Now I want to select all rows which have a data column where one of its array elements is greater than 2. Is this possible?
I tried the following, but it seems to always be true regardless of the values in the array:
SELECT * from my_table
WHERE JSON_EXTRACT(data, '$[*]') > 2;
You may search an array of integers as follows:
JSON_CONTAINS('[1,2,3,4,5]','7','$') Returns: 0
JSON_CONTAINS('[1,2,3,4,5]','1','$') Returns: 1
You may search an array of strings as follows:
JSON_CONTAINS('["a","2","c","4","x"]','"x"','$') Returns: 1
JSON_CONTAINS('["1","2","3","4","5"]','"7"','$') Returns: 0
Note: JSON_CONTAINS returns either 1 or 0
In your case you may search using a query like so:
SELECT * from my_table
WHERE JSON_CONTAINS(data, '2', '$');
SELECT JSON_SEARCH('["1","2","3","4","5"]', 'one', "2") is not null
is true
SELECT JSON_SEARCH('["1","2","3","4","5"]', 'one', "6") is not null
is false
Since MySQL 8 there is a new function called JSON_TABLE.
CREATE TABLE my_table (id INT, data JSON);
INSERT INTO my_table VALUES
(1, "[1,2,3,4,5]"),
(2, "[0,1,2]"),
(3, "[3,4,-10]"),
(4, "[-1,-2,0]");
SELECT DISTINCT my_table.*
FROM my_table, JSON_TABLE(data, "$[*]" COLUMNS(nr INT PATH '$')) as ids
WHERE ids.nr > 2;
+------+-----------------+
| id | data |
+------+-----------------+
| 1 | [1, 2, 3, 4, 5] |
| 3 | [3, 4, -10] |
+------+-----------------+
2 rows in set (0.00 sec)
I use a combination of JSON_EXTRACT and JSON_CONTAINS (MariaDB):
SELECT * FROM table WHERE JSON_CONTAINS(JSON_EXTRACT(json_field, '$[*].id'), 11, '$');
I don't know if a solution was already found.
With MariaDB I found a way to search for a path in an array. For example, in the array [{"id":1}, {"id":2}], I want to find the path whose id equals 2.
SELECT JSON_SEARCH(name_field, 'one', 2, null, '$[*].id')
FROM name_table
The result is:
"$[1].id"
The asterisk indicates searching the entire array.
This example works for me with MySQL 5.7 and above:
SET @j = '{"a": [ "8428341ffffffff", "8428343ffffffff", "8428345ffffffff", "8428347ffffffff","8428349ffffffff", "842834bffffffff", "842834dffffffff"], "b": 2, "c": {"d": 4}}';
select JSON_CONTAINS(JSON_EXTRACT(@j, '$.a'), '"8428341ffffffff"', '$') => returns 1
Notice the " around the search keyword: '"8428341ffffffff"'.
A possible way is to treat the problem as string matching: convert the JSON to a string and match against it.
Or you can use JSON_CONTAINS.
You can use JSON_EXTRACT to search and select data:
SELECT data, data->"$.id" AS selectdata
FROM my_table
WHERE JSON_EXTRACT(data, "$.id") = '123'
# ORDER BY data->"$.name"
LIMIT 10;
SET @doc = '[{"SongLabels": [{"SongLabelId": "111", "SongLabelName": "Funk"}, {"SongLabelId": "222", "SongLabelName": "RnB"}], "SongLabelCategoryId": "test11", "SongLabelCategoryName": "曲风"}]';
SELECT *, JSON_SEARCH(@doc, 'one', '%un%', null, '$[*].SongLabels[*].SongLabelName') FROM t_music_song_label_relation;
result: "$[0].SongLabels[0].SongLabelName"
SELECT song_label_content->'$[*].SongLabels[*].SongLabelName' FROM t_music_song_label_relation;
result: ["Funk", "RnB"]
I had a similar problem and solved it with a search function:
create function searchGT(threshold int, d JSON)
returns int
begin
  set @i = 0;
  while @i < json_length(d) do
    if json_extract(d, CONCAT('$[', @i, ']')) > threshold then
      return json_extract(d, CONCAT('$[', @i, ']'));
    end if;
    set @i = @i + 1;
  end while;
  return null;
end;
select searchGT(3, CAST('[1,10,20]' AS JSON));
This seems to be possible with the JSON_TABLE function. It's available in MySQL version 8.0 or MariaDB version 10.6.
With this test setup
CREATE TEMPORARY TABLE mytable
WITH data(a,json) AS (VALUES ('a','[1]'),
('b','[1,2]'),
('c','[1,2,3]'),
('d','[1,2,3,4]'))
SELECT * from data;
we get the following table
+---+-----------+
| a | json |
+---+-----------+
| a | [1] |
| b | [1,2] |
| c | [1,2,3] |
| d | [1,2,3,4] |
+---+-----------+
It's possible to select every row from mytable which has a value greater than 2 in the JSON array with this query:
SELECT * FROM mytable
WHERE TRUE IN (SELECT val > 2
FROM JSON_TABLE(json,'$[*]'
columns (val INT(1) path '$')
) as json
)
Returns:
+---+-----------+
| a | json |
+---+-----------+
| c | [1,2,3] |
| d | [1,2,3,4] |
+---+-----------+
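The TRUE IN (...) construct can also be phrased as an EXISTS check, which some may find easier to read; a sketch under the same assumptions about table and column names:
SELECT * FROM mytable
WHERE EXISTS (SELECT 1
              FROM JSON_TABLE(json, '$[*]'
                       columns (val INT(1) path '$')
                   ) as jt
              WHERE jt.val > 2)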
I have a JSONB column called metrics in a table events. It stores various metrics as a flat hash, e.g.
{"m1": 123, "m2": 122.3, "m3": 32}
I would like to extract all the values stored in that column. Is it possible? I have found a function jsonb_object_keys(jsonb), but I failed to find anything similar for values.
Use jsonb_each() for this purpose:
WITH json_test(data) AS ( VALUES
('{"m1": 123, "m2": 122.3, "m3": 32}'::JSONB)
)
SELECT element.value
FROM json_test jt, jsonb_each(jt.data) as element;
Output:
value
-------
123
122.3
32
(3 rows)
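Applied to the table and column from the question, a minimal sketch would be:
SELECT element.value
FROM events, jsonb_each(metrics) AS element;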
Use jsonb_each() in a lateral join:
with val as (
select '{"m1": 123, "m2": 122.3, "m3": 32}'::jsonb js
)
select key, value
from val,
lateral jsonb_each(js);
key | value
-----+-------
m1 | 123
m2 | 122.3
m3 | 32
(3 rows)
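If you want the values as text rather than jsonb, there is also jsonb_each_text(); the same example with that function:
with val as (
select '{"m1": 123, "m2": 122.3, "m3": 32}'::jsonb js
)
select key, value
from val,
lateral jsonb_each_text(js);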
Using json_each you can extract the values with:
SELECT value FROM json_each('{"m1": 123, "m2": 122.3, "m3": 32}')
Output
value
-----
123
122.3
32
I have a table like this:
+--------+--------------------+
| ID | Attribute |
+--------+--------------------+
| 1 |"color" => "red" |
+--------+--------------------+
| 1 |"color" => "green" |
+--------+--------------------+
| 1 |"shape" => "square" |
+--------+--------------------+
| 2 |"color" => "blue" |
+--------+--------------------+
| 2 |"color" => "black" |
+--------+--------------------+
| 2 |"flavor" => "sweat" |
+--------+--------------------+
| 2 |"flavor" => "salty" |
+--------+--------------------+
And I want to run some postgres query that get a result table like this:
+--------+------------------------------------------------------+
| ID | Attribute |
+--------+------------------------------------------------------+
| 1 |"color" => "red, green", "shape" => "square" |
+--------+------------------------------------------------------+
| 2 |"color" => "blue, black", "flavor" => "sweat, salty" |
+--------+------------------------------------------------------+
The attribute column can be in either hstore or json format. I wrote it as hstore for the example, but if this cannot be achieved with hstore and can be with json, I would change the column to json.
I know that hstore does not support one key with multiple values; when I tried some merge methods, only one value was kept for each key. For json I didn't find anything that supports this kind of multi-value merge either. I think this could be done by a function that merges the values for the same key into a string/text and adds it back to the key/value pair, but I'm stuck implementing it.
Note: if this is implemented in a function, ideally no specific key such as color or shape should appear in the function body, since keys can be added dynamically.
Does anyone have any idea about this? Any advice or brainstorm might help. Thank you!
Just a note before anything else: in your desired output I would use proper json and not that kind of lookalike. So a correct output, according to me, would be:
+--------+----------------------------------------------------------------------+
| ID | Attribute |
+--------+----------------------------------------------------------------------+
| 1 | '{"color":["red","green"], "flavor":[], "shape":["square"]}' |
+--------+----------------------------------------------------------------------+
| 2 | '{"color":["blue","black"], "flavor":["sweat","salty"], "shape":[]}' |
+--------+----------------------------------------------------------------------+
A PL/pgSQL function which parses the json attributes and executes a dynamic query would do the job, something like that:
CREATE OR REPLACE FUNCTION merge_rows(PAR_table regclass) RETURNS TABLE (
id integer,
attributes json
) AS $$
DECLARE
ARR_attributes text[];
VAR_attribute text;
ARR_query_parts text[];
BEGIN
-- Get JSON attributes names
EXECUTE format('SELECT array_agg(name ORDER BY name) AS name FROM (SELECT DISTINCT json_object_keys(attribute) AS name FROM %s) AS s', PAR_table) INTO ARR_attributes;
-- Write json_build_object() query part
FOREACH VAR_attribute IN ARRAY ARR_attributes LOOP
ARR_query_parts := array_append(ARR_query_parts, format('%L, array_remove(array_agg(l.%s), null)', VAR_attribute, VAR_attribute));
END LOOP;
-- Return dynamic query
RETURN QUERY EXECUTE format('
SELECT t.id, json_build_object(%s) AS attributes
FROM %s AS t,
LATERAL json_to_record(t.attribute) AS l(%s)
GROUP BY t.id;',
array_to_string(ARR_query_parts, ', '), PAR_table, array_to_string(ARR_attributes, ' text, ') || ' text');
END;
$$ LANGUAGE plpgsql;
I've tested it and it seems to work; it returns a json object with an array of values for each key. Here is my test code:
CREATE TABLE mytable (
id integer NOT NULL,
attribute json NOT NULL
);
INSERT INTO mytable (id, attribute) VALUES
(1, '{"color":"red"}'),
(1, '{"color":"green"}'),
(1, '{"shape":"square"}'),
(2, '{"color":"blue"}'),
(2, '{"color" :"black"}'),
(2, '{"flavor":"sweat"}'),
(2, '{"flavor":"salty"}');
SELECT * FROM merge_rows('mytable');
Of course you can pass the id and attribute column names as parameters as well and maybe refine the function a bit, this is just to give you an idea.
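For reference, for the sample data above the function ends up generating a statement roughly equivalent to this static query (keys written out by hand, purely to illustrate what the dynamic SQL does):
SELECT t.id,
       json_build_object(
           'color',  array_remove(array_agg(l.color),  null),
           'flavor', array_remove(array_agg(l.flavor), null),
           'shape',  array_remove(array_agg(l.shape),  null)
       ) AS attributes
FROM mytable AS t,
     LATERAL json_to_record(t.attribute) AS l(color text, flavor text, shape text)
GROUP BY t.id;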
EDIT: If you're on 9.4, please consider using the jsonb datatype; it's much better and gives you room for improvements. You would just need to change the json_* functions to their jsonb_* equivalents.
If you just want this for display purposes, this might be enough:
select id, string_agg(key||' => '||vals, ', ')
from (
select t.id, x.key, string_agg(value, ',') vals
from t
join lateral each(t.attributes) x on true
group by id, key
) t
group by id;
If you are not on 9.4, you can't use the lateral join:
select id, string_agg(key||' => '||vals, ', ')
from (
select id, key, string_agg(val, ',') as vals
from (
select t.id, skeys(t.attributes) as key, svals(t.attributes) as val
from t
) t1
group by id, key
) t2
group by id;
This will return:
id | string_agg
---+-------------------------------------------
1 | color => red,green, shape => square
2 | color => blue,black, flavor => sweat,salty
SQLFiddle: http://sqlfiddle.com/#!15/98caa/2