Use JSON_EXTRACT, JSON_SET, JSON_REPLACE, JSON_INSERT on a JSON array - MySQL

I have a JSON-type column in MySQL named names, and it holds a simple JSON array (not key/value pairs). I couldn't find any examples of using JSON_EXTRACT, JSON_SET, JSON_REPLACE, or JSON_INSERT on a plain JSON array field.
I know there are other ways to manipulate a JSON array in a JSON column, but is it possible to use these functions on a JSON array?
For example, if the names field contains ["A","B","C"], how can I use these functions to perform an update, an insert and a delete on this JSON?
Update
The query must execute from a PHP script.

The functions you refer to all work exactly as described in the manual; that is to say, JSON_SET will insert a value or replace it if it already exists, JSON_INSERT will insert only if the value doesn't already exist, and JSON_REPLACE will only replace a pre-existing value. You can use JSON_ARRAY_INSERT and JSON_ARRAY_APPEND to add values to a JSON array more easily.
-- extract second element
select json_extract('["A", "B", "C"]', '$[1]')
-- "B"
-- replace second element
select json_set('["A", "B", "C"]', '$[1]', 'D')
-- ["A", "D", "C"]
-- insert fourth element
select json_set('["A", "B", "C"]', '$[3]', 'E')
-- ["A", "B", "C", "E"]
-- attempt to insert second element fails as it already exists
select json_insert('["A", "B", "C"]', '$[1]', 'F')
-- ["A", "B", "C"]
-- use json_array_insert to insert a new second element and move the other elements right
select json_array_insert('["A", "B", "C"]', '$[1]', 'F')
-- ["A", "F", "B", "C"]
-- insert fourth element
select json_insert('["A", "B", "C"]', '$[3]', 'F')
-- ["A", "B", "C", "F"]
-- or use json_array_append to add an element at the end
select json_array_append('["A", "B", "C"]', '$', 'F')
-- ["A", "B", "C", "F"]
-- replace second element
select json_replace('["A", "B", "C"]', '$[1]', 'G')
-- ["A", "G", "C"]
-- attempt to replace non-existing element fails
select json_replace('["A", "B", "C"]', '$[3]', 'G')
-- ["A", "B", "C"]
Demo on dbfiddle
To use these functions on a column in a table, simply replace the ["A", "B", "C"] in the above calls with the column name, for example:
create table test (j json);
insert into test values ('["A", "B", "C"]');
select json_array_insert(j, '$[1]', 'F')
from test
-- ["A", "F", "B", "C"]
Demo on dbfiddle
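Since the question also asks about persisting changes (e.g. from a PHP script) and about deleting elements, here is a minimal sketch against the same test table, using JSON_REMOVE for deletion (JSON_REMOVE is not covered above; the element paths are illustrative):
-- replace the second element in place
update test set j = json_set(j, '$[1]', 'D');
-- j is now ["A", "D", "C"]
-- delete the second element
update test set j = json_remove(j, '$[1]');
-- j is now ["A", "C"]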

I think I found the solution.
For a JSON array, it's not possible to use JSON_EXTRACT, JSON_SET, JSON_REPLACE, or JSON_INSERT by array value; you have to know the position of each value in the array (in my opinion that's a weakness).
For example, to select the 2nd value you can use $[1],
but to insert values you can use
JSON_ARRAY_APPEND and JSON_ARRAY_INSERT.
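If you don't know an element's position up front, one workaround (just a sketch, not from the answers above) is to look the path up with JSON_SEARCH and feed it to JSON_REMOVE, JSON_SET or JSON_REPLACE:
-- find the path of "B" and remove it; JSON_SEARCH returns NULL if the value is not found
select json_remove('["A", "B", "C"]', json_unquote(json_search('["A", "B", "C"]', 'one', 'B')));
-- ["A", "C"]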

Related

Export SQLite table which contains JSON column without double escaping

Let an SQLite table be built like this:
create table t(id, j json);
insert into t values (1, '{"name": "bob"}');
insert into t values (2, '{"name": "alice", "age":20, "hobbies":[ "a", "b", "c"] }');
What's the easiest way to export the whole table as valid JSON, without JSON strings being escaped?
sqlite> .mode json
sqlite> select id, j from t;
[{"id":1,"j":"{\"name\": \"bob\"}"},
{"id":2,"j":"{\"name\": \"alice\", \"age\":20, \"hobbies\":[ \"a\", \"b\", \"c\"] }"}] -- WRONG!
The JSON column may vary, and I couldn't do it with the json_extract function. I'm expecting parsable JSON:
[{"id":1,"j":{"name": "bob"}},
{"id":2,"j":{"name": "alice", "age":20, "hobbies":[ "a", "b", "c"] }}]
Use the json() function to remove the escaping from column j and json_group_array() to aggregate the rows:
select json_group_array(json_object('id', id, 'j', json(j))) result
from t
See the demo.
Results:
result
[{"id":1,"j":{"name":"bob"}},{"id":2,"j":{"name":"alice","age":20,"hobbies":["a","b","c"]}}]

How to declare a JSON string in postgres?

There are many examples of JSON parsing in Postgres which pull data from a table. I have a raw JSON string handy and would like to practice using the JSON functions and operators. Is it possible to do this without using tables? Or, what is the most straightforward way to declare it as a variable? Something like...
# Declare
foojson = "{'a':'foo', 'b':'bar'}"
# Use
jsonb_array_elements(foojson) -> 'a'
Basically I'd like the last line to print to console or be wrappable in a SELECT statement so I can rapidly "play" with some of these operators.
You can pass it directly to the function
select '{"a": "foo", "b": "bar"}'::jsonb ->> 'a';
select *
from jsonb_each('{"a": "foo", "b": "bar"}');
select *
from jsonb_array_elements('[{"a": "foo"}, {"b": "bar"}]');
Or if you want to pretend it's part of a table:
with data (json_value) as (
values
('{"a": "foo", "b": "bar"}'::jsonb),
('{"foo": 42, "x": 100}')
)
select e.*
from data d
cross join jsonb_each(d.json_value) as e;
with data (json_value) as (
values
('{"a": 1, "b": "x"}'::jsonb),
('{"a": 42, "b": "y"}')
)
select d.json_value ->> 'a',
d.json_value ->> 'b'
from data d;
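If you specifically want something variable-like for interactive experimentation, a psql-only sketch with a client-side variable (the name foojson is arbitrary):
\set foojson '{"a": "foo", "b": "bar"}'
select :'foojson'::jsonb ->> 'a';
-- foo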

Select from mysql JSON field where array starts with subarray

I'd like to select rows in a MySQL table, filtering on the start of an array contained in a JSON field in the row.
CREATE TABLE test (id int, c1 json);
INSERT INTO test VALUES
(1, '{ "path": ["a", "b", "c"] }'),
(2, '{ "path": ["a", "b"] }'),
(3, '{ "path": ["a", "b", "d"] }'),
(4, '{ "path": ["a"] }'),
(5, '{ "path": ["e", "a", "b"] }')
(6, '{ "path": ["a", "e", "b"] }');
So with the setup above, I would like to search for paths starting with ["a", "b"] and get 1, 2, and 3.
4, 5 and 6 would not be returned as they do not start with the path ["a", "b"].
select c1 from test where json_contains(c1, '["a", "b"]', '$.path'); is as close as I've found, but it fails because json_contains does not try to match the items contiguously or from the start of the array.
You can check the first two array positions explicitly:
select id from test where json_extract(c1, "$.path[0]") = 'a' and json_extract(c1, "$.path[1]") = 'b';
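Equivalently, with the inline JSON path operator (the same position-based check; assumes MySQL 5.7.13+, where col->>path is shorthand for JSON_UNQUOTE(JSON_EXTRACT(col, path))):
select id
from test
where c1->>'$.path[0]' = 'a'
  and c1->>'$.path[1]' = 'b';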

Merging JSONB values in PostgreSQL?

Using the || operator yields the following result:
select '{"a":{"b":2}}'::jsonb || '{"a":{"c":3}}'::jsonb ;
?column?
-----------------
{"a": {"c": 3}}
(1 row)
I would like to be able to achieve the following result (?? is just a placeholder for the operator):
select '{"a":{"b":2}}'::jsonb ?? '{"a":{"c":3}}'::jsonb ;
?column?
-----------------
{"a": {"b": 2, "c": 3}}
(1 row)
So, you can see the top-level a key has its child values "merged" such that the result contains both b and c.
How do you "deep" merge two JSONB values in Postgres?
Is this possible, and if so, how?
A more complex test case:
select '{"a":{"b":{"c":3},"z":true}}'::jsonb ?? '{"a":{"b":{"d":4},"z":false}}'::jsonb ;
?column?
-----------------
{"a": {"b": {"c": 3, "d": 4}, "z": false}}
(1 row)
Another test case where a primitive "merges over" an object:
select '{"a":{"b":{"c":3},"z":true}}'::jsonb ?? '{"a":{"b":false,"z":false}}'::jsonb ;
?column?
-----------------
{"a": {"b": false, "z": false}}
(1 row)
You can merge the unnested elements using jsonb_each() on both values. Doing this in a non-trivial query may be uncomfortable, so I would prefer a custom function like this one:
create or replace function jsonb_my_merge(a jsonb, b jsonb)
returns jsonb language sql as $$
select
  jsonb_object_agg(
    coalesce(ka, kb),
    case
      when va isnull then vb
      when vb isnull then va
      else va || vb
    end
  )
from jsonb_each(a) e1(ka, va)
full join jsonb_each(b) e2(kb, vb) on ka = kb
$$;
Use:
select jsonb_my_merge(
'{"a":{"b":2}, "d": {"e": 10}, "x": 1}'::jsonb,
'{"a":{"c":3}, "d": {"f": 11}, "y": 2}'::jsonb
)
jsonb_my_merge
------------------------------------------------------------------
{"a": {"b": 2, "c": 3}, "d": {"e": 10, "f": 11}, "x": 1, "y": 2}
(1 row)
You can slightly modify the function using recursion to get a solution working on any level of nesting:
create or replace function jsonb_recursive_merge(a jsonb, b jsonb)
returns jsonb language sql as $$
select
  jsonb_object_agg(
    coalesce(ka, kb),
    case
      when va isnull then vb
      when vb isnull then va
      when jsonb_typeof(va) <> 'object' then va || vb
      else jsonb_recursive_merge(va, vb)
    end
  )
from jsonb_each(a) e1(ka, va)
full join jsonb_each(b) e2(kb, vb) on ka = kb
$$;
Examples:
select jsonb_recursive_merge(
'{"a":{"b":{"c":3},"x":5}}'::jsonb,
'{"a":{"b":{"d":4},"y":6}}'::jsonb);
jsonb_recursive_merge
------------------------------------------------
{"a": {"b": {"c": 3, "d": 4}, "x": 5, "y": 6}}
(1 row)
select jsonb_recursive_merge(
'{"a":{"b":{"c":{"d":{"e":1}}}}}'::jsonb,
'{"a":{"b":{"c":{"d":{"f":2}}}}}'::jsonb)
jsonb_recursive_merge
----------------------------------------------
{"a": {"b": {"c": {"d": {"e": 1, "f": 2}}}}}
(1 row)
Finally, the variant of the function with changes proposed by OP (see comments below):
create or replace function jsonb_recursive_merge(a jsonb, b jsonb)
returns jsonb language sql as $$
select
  jsonb_object_agg(
    coalesce(ka, kb),
    case
      when va isnull then vb
      when vb isnull then va
      when jsonb_typeof(va) <> 'object' or jsonb_typeof(vb) <> 'object' then vb
      else jsonb_recursive_merge(va, vb)
    end
  )
from jsonb_each(a) e1(ka, va)
full join jsonb_each(b) e2(kb, vb) on ka = kb
$$;
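For instance, re-running the question's third test case with this variant shows the primitive from the second argument winning over the object:
select jsonb_recursive_merge(
  '{"a":{"b":{"c":3},"z":true}}'::jsonb,
  '{"a":{"b":false,"z":false}}'::jsonb);
-- {"a": {"b": false, "z": false}}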
This kind of "deep merge" can be interpreted quite differently, depending on your use case. For completeness, my intuition usually dictates the following rules:
object + object: Every property survives from each object, which is not in the other object (JSON's null value is considered to be in the object, if it's explicitly mentioned). When a property is in both objects, the merge continues recursively with the same rules (this point is usually agreed on).
array + array: The result is the concatenation of the two arrays.
array + primitive/object: the result is the first array, with the second JSON value appended to it.
any other cases: The result is the second JSON value (so f.ex. primitives or incompatible types override each other).
create or replace function jsonb_merge_deep(jsonb, jsonb)
returns jsonb
language sql
immutable
as $func$
select case jsonb_typeof($1)
  when 'object' then case jsonb_typeof($2)
    when 'object' then (
      select jsonb_object_agg(k, case
               when e2.v is null then e1.v
               when e1.v is null then e2.v
               else jsonb_merge_deep(e1.v, e2.v)
             end)
      from jsonb_each($1) e1(k, v)
      full join jsonb_each($2) e2(k, v) using (k)
    )
    else $2
  end
  when 'array' then $1 || $2
  else $2
end
$func$;
This function's added bonus is that it can be called with literally any types of JSON values: it always produces a result and never complains about the JSON value types.
http://rextester.com/FAC95623
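For instance (just exercising the function above; the values are arbitrary):
select jsonb_merge_deep('{"a": {"b": [1, 2], "z": true}}'::jsonb,
                        '{"a": {"b": [3], "c": 5}}'::jsonb);
-- {"a": {"b": [1, 2, 3], "c": 5, "z": true}}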
Since PostgreSQL 9.5 you can use the jsonb_set function:
'{a,c}' is the target path; if it does not exist, it will be created.
'{"a":{"c":3}}'::jsonb #> '{a,c}' extracts the value of c.
The new value is added if create_missing is true (the default is true).
Here is the documentation: jsonb functions.
select jsonb_set('{"a":{"b":2}}', '{a,c}','{"a":{"c":3}}'::jsonb#>'{a,c}' )
Result: {"a":{"c":3,"b":2}}
Merging more attributes at once:
with jsonb_paths(main_part,missing_part) as (
values ('{"a":{"b":2}}','{"a":{"c":3,"d":4}}')
)
select jsonb_object_agg(t.k,t.v||t2.v)
from jsonb_paths,
jsonb_each(main_part::jsonb) t(k,v),
jsonb_each(missing_part::jsonb) t2(k,v);
result: {"a":{"c":3,"b":2,"d":4}}
As #lightSouls says, since PostgreSQL 9.5 you can use the jsonb_set() function... but you must learn how to use it!
jsonb_set can merge or destroy...
Supposing j := '{"a":{"x":1},"b":2}'::jsonb.
jsonb_set(j, '{a,y}', '1'::jsonb); will merge the object {"y":1} into the object {"x":1}. Result: {"a": {"x": 1, "y": 1}, "b": 2}
jsonb_set(j, '{a}', '{"x":1}'::jsonb); will destroy: it replaces the whole old object with the new one. Result: {"a": {"x": 1}, "b": 2}
Combining (merging :-D) the answers from #klin and #pozs and the comment from #Arman Khubezhov, while also actually merging arrays instead of concatenating them (which otherwise produced duplicates), I came up with the following function:
create or replace function jsonb_merge_deep(jsonb, jsonb)
returns jsonb
language sql
immutable
as $func$
select case jsonb_typeof($1)
  when 'object' then
    case jsonb_typeof($2)
      when 'object' then (
        select jsonb_object_agg(k,
                 case
                   when e2.v is null then e1.v
                   when e1.v is null then e2.v
                   else jsonb_merge_deep(e1.v, e2.v)
                 end
               )
        from jsonb_each($1) e1(k, v)
        full join jsonb_each($2) e2(k, v) using (k)
      )
      else COALESCE($2, $1)
    end
  when 'array' then
    (
      SELECT jsonb_agg(items.val)
      FROM (
        SELECT jsonb_array_elements($1) AS val
        UNION
        SELECT jsonb_array_elements($2) AS val
      ) AS items
    )
  else $2
end
$func$;
Based on the comment from #Arman Khubezhov, I enhanced the case where either $1 or $2 is null with:
else COALESCE($2, $1)
And added a real merge (no duplicates) of the two arrays' values with:
when 'array' then
(
SELECT jsonb_agg(items.val)
FROM (
SELECT jsonb_array_elements($1) AS val
UNION
SELECT jsonb_array_elements($2) AS val
) AS items
)
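A quick check of the deduplicating array merge (just a usage sketch of the function above):
select jsonb_merge_deep('{"t": [1, 2, 3]}'::jsonb, '{"t": [2, 4]}'::jsonb);
-- e.g. {"t": [1, 2, 3, 4]}; duplicates collapse, but UNION does not guarantee element order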
I'd be glad if someone can come up with enhanced code for this one - maybe an existing PostgreSQL function I am not aware of?
Pros: no data loss when combining two JSONB values or updating a JSONB field in an UPDATE query like:
UPDATE my_table
SET my_jsonb_field = jsonb_merge_deep(my_jsonb_field, '{ "a": { "aa" : { "aaa" : [6, 4, 7] } } }'::jsonb)
Cons: removing a key/value or array value requires a dedicated query.
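For example, dropping a nested key or an array element then needs its own statement, e.g. with the #- path-removal operator (the path shown is illustrative):
UPDATE my_table
SET my_jsonb_field = my_jsonb_field #- '{a,aa,aaa}';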

How can I merge records inside two JSON arrays?

I have two Postgres SQL queries returning JSON arrays:
q1:
[
{"id": 1, "a": "text1a", "b": "text1b"},
{"id": 2, "a": "text2a", "b": "text2b"},
{"id": 2, "a": "text3a", "b": "text3b"},
...
]
q2:
[
{"id": 1, "percent": 12.50},
{"id": 2, "percent": 75.00},
{"id": 3, "percent": 12.50}
...
]
I want the result to be a union of both arrays, with elements that share an id merged together:
[
{"id": 1, "a": "text1a", "b": "text1b", "percent": 12.50},
{"id": 2, "a": "text2a", "b": "text2b", "percent": 75.00},
{"id": 3, "a": "text3a", "b": "text3b", "percent": 12.50},
...
]
How can this be done with SQL in Postgres 9.4?
Assuming data type jsonb and that you want to merge records of each JSON array that share the same 'id' value.
Postgres 9.5
makes it simpler with the new concatenation operator || for jsonb values:
SELECT json_agg(elem1 || elem2) AS result
FROM (
SELECT elem1->>'id' AS id, elem1
FROM (
SELECT '[
{"id":1, "percent":12.50},
{"id":2, "percent":75.00},
{"id":3, "percent":12.50}
]'::jsonb AS js
) t, jsonb_array_elements(t.js) elem1
) t1
FULL JOIN (
SELECT elem2->>'id' AS id, elem2
FROM (
SELECT '[
{"id": 1, "a": "text1a", "b": "text1b", "percent":12.50},
{"id": 2, "a": "text2a", "b": "text2b", "percent":75.00},
{"id": 3, "a": "text3a", "b": "text3b", "percent":12.50}]'::jsonb AS js
) t, jsonb_array_elements(t.js) elem2
) t2 USING (id);
The FULL [OUTER] JOIN makes sure you don't lose records without a match in the other array.
The type jsonb has the convenient property of keeping only the latest value for each key in a record. Hence, the duplicate 'id' key in the result is merged automatically.
The Postgres 9.5 manual also advises:
Note: The || operator concatenates the elements at the top level of
each of its operands. It does not operate recursively. For example, if
both operands are objects with a common key field name, the value of
the field in the result will just be the value from the right hand operand.
Postgres 9.4
is a bit less convenient. My idea is to extract the array elements, then extract all key/value pairs, UNION both results, aggregate into a single new jsonb value per id value, and finally aggregate into a single array.
SELECT json_agg(j) -- ::jsonb
FROM (
SELECT json_object_agg(key, value)::jsonb AS j
FROM (
SELECT elem->>'id' AS id, x.*
FROM (
SELECT '[
{"id":1, "percent":12.50},
{"id":2, "percent":75.00},
{"id":3, "percent":12.50}]'::jsonb AS js
) t, jsonb_array_elements(t.js) elem, jsonb_each(elem) x
UNION ALL -- or UNION, see below
SELECT elem->>'id' AS id, x.*
FROM (
SELECT '[
{"id": 1, "a": "text1a", "b": "text1b", "percent":12.50},
{"id": 2, "a": "text2a", "b": "text2b", "percent":75.00},
{"id": 3, "a": "text3a", "b": "text3b", "percent":12.50}]'::jsonb AS js
) t, jsonb_array_elements(t.js) elem, jsonb_each(elem) x
) t
GROUP BY id
) t;
The cast to jsonb removes duplicate keys. Alternatively you could use UNION to fold duplicates (for instance if you want json as result). Test which is faster for your case.
Related:
How to turn json array into postgres array?
Merging Concatenating JSON(B) columns in query
For any single jsonb element, this use of the concatenation operator || works well for me, together with jsonb_strip_nulls and another trick to get the result back as plain jsonb (not an array).
select jsonb_array_elements(jsonb_strip_nulls(jsonb_agg(
'{
"a" : "unchanged value",
"b" : "old value",
"d" : "delete me"
}'::jsonb
|| -- The concat operator works as merge on jsonb, the right operand takes precedence
-- NOTE: it only works one JSON level deep
'{
"b" : "NEW value",
"c" : "NEW field",
"d" : null
}'::jsonb
)));
This gives the result
{"a": "unchanged value", "b": "NEW value", "c": "NEW field"}
which is properly typed jsonb