PostgreSQL 9.4 - Invalid input syntax when converting to JSONB - json

I've made a simple function to update a jsonb with new values:
CREATE OR REPLACE FUNCTION jsonupdate(
    IN "pJson" jsonb, IN "pNewValues" jsonb)
  RETURNS jsonb AS
$BODY$
DECLARE
  jsonreturn jsonb;
BEGIN
  jsonreturn := (SELECT json_object_agg(keyval.key, keyval.value::jsonb)
                 FROM (SELECT key,
                              CASE WHEN "pNewValues" ? key THEN
                                (SELECT "pNewValues" ->> key)
                              ELSE
                                value
                              END
                       FROM jsonb_each_text("pJson")) keyval);
  RETURN jsonreturn;
END; $BODY$
  LANGUAGE plpgsql IMMUTABLE
  COST 100;
Sample inputs and outputs:
IN: SELECT jsonupdate('{"a" : "1", "b" : "2"}', '{"a": "3"}');
OUT: {"a": 3, "b": 2}
IN: SELECT jsonupdate('{"a" : "3", "b" : { "c": "text", "d": 1 }}', '{"b": { "c": "another text" }}');
OUT: {"a": 3, "b": {"c": "another text"}}
IN: SELECT jsonupdate('{"a" : "1", "b" : "2", "c": 3, "d": 4}', '{"a": "5", "d": 6}');
OUT: {"a": 5, "b": 2, "c": 3, "d": 6}
The problem happens when using inputs like these:
SELECT jsonupdate('{"a" : "1", "b" : ""}', '{"a": "5"}')
SELECT jsonupdate('{"a" : "1", "b" : "2"}', '{"a": "."}')
SELECT jsonupdate('{"a" : "1", "b" : "2"}', '{"a": ""}')
Each of them gives me an error:
ERROR: invalid input syntax for type json
DETAIL: The input string ended unexpectedly.
CONTEXT: JSON data, line 1:
What's wrong here?

You should use the jsonb_each() function (instead of jsonb_each_text()), and the -> operator (instead of ->>):
CREATE OR REPLACE FUNCTION jsonupdate(IN "pJson" jsonb, IN "pNewValues" jsonb)
  RETURNS jsonb
  LANGUAGE sql
  IMMUTABLE AS
$BODY$
SELECT json_object_agg(key, CASE
                              WHEN "pNewValues" ? key THEN "pNewValues" -> key
                              ELSE value
                            END)::jsonb
FROM jsonb_each("pJson")
$BODY$;
jsonb_each_text() and the ->> operator convert any non-string JSON value to its string representation. Converting those back to JSON will modify your data in a way you probably don't want.
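This is also why your failing inputs break: the stripped-down text value is no longer valid JSON (or changes type) when cast back; for example:
SELECT ''::jsonb;
-- ERROR:  invalid input syntax for type json
-- DETAIL:  The input string ended unexpectedly.
SELECT '2'::jsonb;    -- becomes the number 2, not the string "2"
SELECT '"2"'::jsonb;  -- only a quoted value stays a JSON string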
But I have to admit, what you are trying to achieve is almost exactly what the || (concatenation) operator does (available since 9.5). I.e.
SELECT jsonb '{"a" : "1", "b" : "2"}' || jsonb '{"a": "3"}'
will give you your desired output. The only difference between || and your function is when pNewValues contains key(s), which are not in pJson: || will append those too, while your function does not append them (it only modifies existing ones).
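For instance (assuming PostgreSQL 9.5+ for || and the corrected jsonupdate() from above), a key that exists only in pNewValues shows the difference:
SELECT jsonb '{"a" : "1", "b" : "2"}' || jsonb '{"a": "3", "c": "9"}';
-- {"a": "3", "b": "2", "c": "9"}
SELECT jsonupdate('{"a" : "1", "b" : "2"}', '{"a": "3", "c": "9"}');
-- {"a": "3", "b": "2"}   -- "c" is not appended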
Update: for simulating the || operator on 9.4, you can use the following function:
CREATE OR REPLACE FUNCTION jsonb_merge_objects(jsonb, jsonb)
  RETURNS jsonb
  LANGUAGE sql
  IMMUTABLE AS
$func$
SELECT json_object_agg(key, COALESCE(b.value, a.value))::jsonb
FROM jsonb_each($1) a
FULL JOIN jsonb_each($2) b USING (key)
$func$;
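A quick sanity check, assuming the function above has been created:
SELECT jsonb_merge_objects('{"a" : "1", "b" : "2"}'::jsonb,
                           '{"a" : "3", "c" : "9"}'::jsonb);
-- {"a": "3", "b": "2", "c": "9"}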

Related

Use MySQL 8 json_table to dynamically extract JSON keys and nested value

I've been attempting to use MySQL 8's JSON_TABLE to extract the root keys and then their nested values. The problem is the root keys are dynamic and the nested key/value pairs might not exist.
JSON:
{
  "Foo": {
    "A": 3
  },
  "Bar": {
    "A": 1,
    "B": 368
  },
  "Biz": {
    "C": 2,
    "D": 10
  }
}
In this JSON the root keys "Foo", "Bar", and "Biz" are dynamic, and for each of their objects I want to extract the "A" key's value, which may or may not exist. For example, for the above JSON the query should return this result set:
json_key | a_value
---------+---------
Foo      | 3
Bar      | 1
Biz      | null
I've been trying something along these lines, but no luck (it just returns one row of nulls):
select * from json_table('{"Foo": {"A": 3}, "Bar": {"A": 1, "B": 368}, "Biz": {"C": 2}}',
    '$' COLUMNS(
        json_key varchar(255) path '$.*',
        sub_value integer path '$.*.A'
    )
) as i;
In the worst case I can try to restructure the JSON but it's already in the database so I'm hoping to leverage MySQL's JSON capability. Any ideas?
SELECT json_key,
       JSON_EXTRACT(@json_value, CONCAT('$.', json_key, '.A')) a_value
FROM JSON_TABLE(JSON_KEYS(@json_value),
                '$[*]' COLUMNS (json_key VARCHAR(255) PATH '$')) keystable
https://dbfiddle.uk/?rdbms=mysql_8.0&fiddle=dd9c01e77d57206d587dd2d17340bc02
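For reference, a minimal self-contained way to run this, assuming the question's JSON has been loaded into the @json_value user variable used above:
SET @json_value = '{"Foo": {"A": 3}, "Bar": {"A": 1, "B": 368}, "Biz": {"C": 2, "D": 10}}';

SELECT json_key,
       JSON_EXTRACT(@json_value, CONCAT('$.', json_key, '.A')) AS a_value
FROM JSON_TABLE(JSON_KEYS(@json_value),
                '$[*]' COLUMNS (json_key VARCHAR(255) PATH '$')) keystable;
-- one row per root key, with its "A" value (NULL for "Biz")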

How to declare a JSON string in postgres?

There are many examples of JSON parsing in Postgres which pull data from a table. I have a raw JSON string handy and would like to practice using JSON functions and operators. Is it possible to do this without using tables? Or ... what is the most straightforward way to declare it as a variable? Something like...
# Declare
foojson = "{'a':'foo', 'b':'bar'}"
# Use
jsonb_array_elements(foojson) -> 'a'
Basically I'd like the last line to print to console or be wrappable in a SELECT statement so I can rapidly "play" with some of these operators.
You can pass it directly to the function
select '{"a": "foo", "b": "bar"}'::jsonb ->> 'a';
select *
from jsonb_each('{"a": "foo", "b": "bar"}');
select *
from jsonb_array_elements('[{"a": "foo"}, {"b": "bar"}]');
Or, if you want to pretend it's part of a table:
with data (json_value) as (
values
('{"a": "foo", "b": "bar"}'::jsonb),
('{"foo": 42, "x": 100}')
)
select e.*
from data d
cross join jsonb_each(d.json_value) as e;
with data (json_value) as (
values
('{"a": 1, "b": "x"}'::jsonb),
('{"a": 42, "b": "y"}')
)
select d.json_value ->> 'a',
d.json_value ->> 'b'
from data d;

Merging JSONB values in PostgreSQL?

Using the || operator yields the following result:
select '{"a":{"b":2}}'::jsonb || '{"a":{"c":3}}'::jsonb ;
?column?
-----------------
{"a": {"c": 3}}
(1 row)
I would like to be able to do achieve the following result (?? just a placeholder for the operator):
select '{"a":{"b":2}}'::jsonb ?? '{"a":{"c":3}}'::jsonb ;
?column?
-----------------
{"a": {"b": 2, "c": 3}}
(1 row)
So, you can see the top-level a key has its child values "merged" such that the result contains both b and c.
How do you "deep" merge two JSONB values in Postgres?
Is this possible, if so how?
A more complex test case:
select '{"a":{"b":{"c":3},"z":true}}'::jsonb ?? '{"a":{"b":{"d":4},"z":false}}'::jsonb ;
?column?
-----------------
{"a": {"b": {"c": 3, "d": 4}, "z": false}}
(1 row)
Another test case, where a primitive "merges over" an object:
select '{"a":{"b":{"c":3},"z":true}}'::jsonb ?? '{"a":{"b":false,"z":false}}'::jsonb ;
?column?
-----------------
{"a": {"b": false, "z": false}}
(1 row)
You should merge unnested elements using jsonb_each() for both values. Doing this in a non-trivial query may be uncomfortable, so I would prefer a custom function like this one:
create or replace function jsonb_my_merge(a jsonb, b jsonb)
returns jsonb language sql as $$
select
jsonb_object_agg(
coalesce(ka, kb),
case
when va isnull then vb
when vb isnull then va
else va || vb
end
)
from jsonb_each(a) e1(ka, va)
full join jsonb_each(b) e2(kb, vb) on ka = kb
$$;
Use:
select jsonb_my_merge(
'{"a":{"b":2}, "d": {"e": 10}, "x": 1}'::jsonb,
'{"a":{"c":3}, "d": {"f": 11}, "y": 2}'::jsonb
)
jsonb_my_merge
------------------------------------------------------------------
{"a": {"b": 2, "c": 3}, "d": {"e": 10, "f": 11}, "x": 1, "y": 2}
(1 row)
You can slightly modify the function using recursion to get a solution working on any level of nesting:
create or replace function jsonb_recursive_merge(a jsonb, b jsonb)
returns jsonb language sql as $$
select
jsonb_object_agg(
coalesce(ka, kb),
case
when va isnull then vb
when vb isnull then va
when jsonb_typeof(va) <> 'object' then va || vb
else jsonb_recursive_merge(va, vb)
end
)
from jsonb_each(a) e1(ka, va)
full join jsonb_each(b) e2(kb, vb) on ka = kb
$$;
Examples:
select jsonb_recursive_merge(
'{"a":{"b":{"c":3},"x":5}}'::jsonb,
'{"a":{"b":{"d":4},"y":6}}'::jsonb);
jsonb_recursive_merge
------------------------------------------------
{"a": {"b": {"c": 3, "d": 4}, "x": 5, "y": 6}}
(1 row)
select jsonb_recursive_merge(
'{"a":{"b":{"c":{"d":{"e":1}}}}}'::jsonb,
'{"a":{"b":{"c":{"d":{"f":2}}}}}'::jsonb)
jsonb_recursive_merge
----------------------------------------------
{"a": {"b": {"c": {"d": {"e": 1, "f": 2}}}}}
(1 row)
Finally, the variant of the function with changes proposed by OP (see comments below):
create or replace function jsonb_recursive_merge(a jsonb, b jsonb)
returns jsonb language sql as $$
select
jsonb_object_agg(
coalesce(ka, kb),
case
when va isnull then vb
when vb isnull then va
when jsonb_typeof(va) <> 'object' or jsonb_typeof(vb) <> 'object' then vb
else jsonb_recursive_merge(va, vb) end
)
from jsonb_each(a) e1(ka, va)
full join jsonb_each(b) e2(kb, vb) on ka = kb
$$;
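A quick check of this variant against the question's last test case:
select jsonb_recursive_merge(
  '{"a":{"b":{"c":3},"z":true}}'::jsonb,
  '{"a":{"b":false,"z":false}}'::jsonb);
-- {"a": {"b": false, "z": false}}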
This kind of "deep merge" can be interpreted quite differently, depending on your use case. For completeness, my intuition usually dictates the following rules:
object + object: Every property survives from each object, which is not in the other object (JSON's null value is considered to be in the object, if it's explicitly mentioned). When a property is in both objects, the merge continues recursively with the same rules (this point is usually agreed on).
array + array: The result is the concatenation of the two arrays.
array + primitive/object: the result is the first array, with the second JSON value appended to it.
any other cases: The result is the second JSON value (so f.ex. primitives or incompatible types override each other).
create or replace function jsonb_merge_deep(jsonb, jsonb)
returns jsonb
language sql
immutable
as $func$
select case jsonb_typeof($1)
when 'object' then case jsonb_typeof($2)
when 'object' then (
select jsonb_object_agg(k, case
when e2.v is null then e1.v
when e1.v is null then e2.v
else jsonb_merge_deep(e1.v, e2.v)
end)
from jsonb_each($1) e1(k, v)
full join jsonb_each($2) e2(k, v) using (k)
)
else $2
end
when 'array' then $1 || $2
else $2
end
$func$;
An added bonus of this function is that it can be called with literally any type of JSON value: it always produces a result and never complains about JSON value types.
http://rextester.com/FAC95623
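For example (assuming the jsonb_merge_deep() function above has been created), a single call exercises the object, array and primitive rules at once:
select jsonb_merge_deep('{"o": {"x": 1}, "arr": [1, 2], "p": 1}'::jsonb,
                        '{"o": {"y": 2}, "arr": [3], "p": "two"}'::jsonb);
-- {"o": {"x": 1, "y": 2}, "p": "two", "arr": [1, 2, 3]}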
Since PostgreSQL 9.5 you can use the jsonb_set() function:
'{a,c}' is the path to look into; if it is not there, it will be created.
'{"a":{"c":3}}'::jsonb #> '{a,c}' gets the value of c.
new_value is added if create_missing is true (the default is true).
Here is the documentation: jsonb functions
select jsonb_set('{"a":{"b":2}}', '{a,c}','{"a":{"c":3}}'::jsonb#>'{a,c}' )
Result: {"a":{"c":3,"b":2}}
Merging more attributes at once:
with jsonb_paths(main_part,missing_part) as (
values ('{"a":{"b":2}}','{"a":{"c":3,"d":4}}')
)
select jsonb_object_agg(t.k,t.v||t2.v)
from jsonb_paths,
jsonb_each(main_part::jsonb) t(k,v),
jsonb_each(missing_part::jsonb) t2(k,v);
result: {"a":{"c":3,"b":2,"d":4}}
As @lightSouls says, since PostgreSQL 9.5 you can use the jsonb_set() function... But you must learn how to use it!
jsonb_set can merge or destroy...
Supposing j:='{"a":{"x":1},"b":2}'::jsonb.
jsonb_set(j, '{a,y}', '1'::jsonb); will merge object {"y":1} with object {"x":1}. Result: {"a": {"x": 1, "y": 1}, "b": 2}
jsonb_set(j, '{a}', '{"x":1}'::jsonb); will destroy! It replaces the full old object with the new one. Result: {"a": {"x": 1}, "b": 2}
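In runnable form, with the j literal spelled out (same results as above):
select jsonb_set('{"a":{"x":1},"b":2}'::jsonb, '{a,y}', '1'::jsonb);
-- {"a": {"x": 1, "y": 1}, "b": 2}
select jsonb_set('{"a":{"x":1},"b":2}'::jsonb, '{a}', '{"x":1}'::jsonb);
-- {"a": {"x": 1}, "b": 2}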
Combining (merging :-D) the answers from @klin and @pozs and the comment from @Arman Khubezhov, while also actually merging arrays instead of concatenating them (which otherwise resulted in duplicates), I came up with the following function:
create or replace function jsonb_merge_deep(jsonb, jsonb)
returns jsonb
language sql
immutable
as $func$
select case jsonb_typeof($1)
when 'object' then
case jsonb_typeof($2)
when 'object' then (
select jsonb_object_agg(k,
case
when e2.v is null then e1.v
when e1.v is null then e2.v
else jsonb_merge_deep(e1.v, e2.v)
end
)
from jsonb_each($1) e1(k, v)
full join jsonb_each($2) e2(k, v) using (k)
)
else COALESCE($2, $1)
end
when 'array' then
(
SELECT jsonb_agg(items.val)
FROM (
SELECT jsonb_array_elements($1) AS val
UNION
SELECT jsonb_array_elements($2) AS val
) AS items
)
else $2
end
$func$;
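For example (assuming the function above; the element order of the merged array is not guaranteed, because UNION deduplicates without preserving order):
SELECT jsonb_merge_deep('{"tags": [1, 2, 3]}'::jsonb,
                        '{"tags": [2, 4]}'::jsonb);
-- {"tags": [...]}   -- contains 1, 2, 3 and 4, each only once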
Based on the comment from @Arman Khubezhov, I enhanced the case when either $1 or $2 is null with:
else COALESCE($2, $1)
And added a real merge (no duplicates) of the two arrays' values with:
when 'array' then
(
SELECT jsonb_agg(items.val)
FROM (
SELECT jsonb_array_elements($1) AS val
UNION
SELECT jsonb_array_elements($2) AS val
) AS items
)
Glad if someone can come up with enhanced code for this one - maybe an existing PostgreSQL function I am not aware of?
Pros: no data loss when combining two JSONB values or updating a JSONB field in an UPDATE query like:
UPDATE my_table
SET my_jsonb_field = jsonb_merge_deep(my_jsonb_field, '{ "a": { "aa" : { "aaa" : [6, 4, 7] } } }'::jsonb)
Cons: removing a key/value or array value requires a dedicated query.
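For example (PostgreSQL 9.5+, reusing the my_table / my_jsonb_field names from the example above), removing a nested key has to be its own statement:
UPDATE my_table
SET my_jsonb_field = my_jsonb_field #- '{a,aa,aaa}';  -- drop the nested "aaa" key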

How to update jsonb string with PostgreSQL?

I'm using PostgreSQL 9.4.5. I'd like to update a jsonb column.
My table is structured this way:
CREATE TABLE my_table (
gid serial PRIMARY KEY,
"data" jsonb
);
JSON strings are like this:
{"files": [], "ident": {"id": 1, "country": null, "type ": "20"}}
The following SQL doesn't do the job (syntax error - SQL state = 42601):
UPDATE my_table SET "data" -> 'ident' -> 'country' = 'Belgium';
Is there a way to achieve that?
OK, here are two functions:
create or replace function set_jsonb_value(p_j jsonb, p_key text, p_value jsonb) returns jsonb as $$
select jsonb_object_agg(t.key, t.value) from (
select
key,
case
when jsonb_typeof(value) = 'object' then set_jsonb_value(value, p_key, p_value)
when key = p_key then p_value
else value
end as value from jsonb_each(p_j)) as t;
$$ language sql immutable;
The first one just changes the value of the existing key, regardless of the key path:
postgres=# select set_jsonb_value(
'{"files": [], "country": null, "ident": {"id": 1, "country": null, "type ": "20"}}',
'country',
'"foo"');
set_jsonb_value
--------------------------------------------------------------------------------------
{"files": [], "ident": {"id": 1, "type ": "20", "country": "foo"}, "country": "foo"}
(1 row)
create or replace function set_jsonb_value(p_j jsonb, p_path text[], p_value jsonb) returns jsonb as $$
select jsonb_object_agg(t.key, t.value) from (
select
key,
case
when jsonb_typeof(value) = 'object' then set_jsonb_value(value, p_path[2:1000], p_value)
when key = p_path[1] then p_value
else value
end as value from jsonb_each(p_j)
union all
select
p_path[1],
case
when array_length(p_path,1) = 1 then p_value
else set_jsonb_value('{}', p_path[2:1000], p_value) end
where not p_j ? p_path[1]) as t;
$$ language sql immutable;
The second one changes the value of the existing key using the specified path, or creates it if the path does not exist:
postgres=# select set_jsonb_value(
'{"files": [], "country": null, "ident": {"id": 1, "type ": "20"}}',
'{ident,country}'::text[],
'"foo"');
set_jsonb_value
-------------------------------------------------------------------------------------
{"files": [], "ident": {"id": 1, "type ": "20", "country": "foo"}, "country": null}
(1 row)
postgres=# select set_jsonb_value(
'{"files": [], "country": null, "ident": {"id": 1, "type ": "20"}}',
'{ident,foo,bar,country}'::text[],
'"foo"');
set_jsonb_value
-------------------------------------------------------------------------------------------------------
{"files": [], "ident": {"id": 1, "foo": {"bar": {"country": "foo"}}, "type ": "20"}, "country": null}
(1 row)
Hope it will help someone who uses PostgreSQL < 9.5.
Disclaimer: Tested on PostgreSQL 9.5
In PG 9.4 you are out of luck with "easy" solutions like jsonb_set() (9.5). Your only option is to unpack the JSON object, make the changes and re-build the object. That sounds very cumbersome and it is indeed: JSON is horrible to manipulate, no matter how advanced or elaborate the built-in functions.
CREATE TYPE data_ident AS (id integer, country text, "type" integer);
UPDATE my_table
SET "data" = json_build_object('files', "data"->'files', 'ident', ident.j)::jsonb
FROM (
SELECT gid, json_build_object('id', j.id, 'country', 'Belgium', 'type', j."type") AS j
FROM my_table
JOIN LATERAL jsonb_populate_record(null::data_ident, "data"->'ident') j ON true) ident
WHERE my_table.gid = ident.gid;
In the SELECT clause "data"->'ident' is unpacked into a record (for which you need to CREATE TYPE a structure). Then it is built right back into a JSON object with the new country name. In the UPDATE that "ident" object is re-joined with the "files" object and the whole thing cast to a jsonb.
A pure thing of beauty -- just so long as speed is not your thing...
My previous solution relied on 9.5 functionality.
I would recommend instead either going with abelisto's solutions below or using pl/perlu, plpythonu, or plv8js to write json mutators in a language that has better support for them.

How can I merge records inside two JSON arrays?

I have two Postgres SQL queries returning JSON arrays:
q1:
[
{"id": 1, "a": "text1a", "b": "text1b"},
{"id": 2, "a": "text2a", "b": "text2b"},
{"id": 2, "a": "text3a", "b": "text3b"},
...
]
q2:
[
{"id": 1, "percent": 12.50},
{"id": 2, "percent": 75.00},
{"id": 3, "percent": 12.50}
...
]
I want the result to be a union of both arrays, with the elements merged by id:
[
{"id": 1, "a": "text1a", "b": "text1b", "percent": 12.50},
{"id": 2, "a": "text2a", "b": "text2b", "percent": 75.00},
{"id": 3, "a": "text3a", "b": "text3b", "percent": 12.50},
...
]
How can this be done with SQL in Postgres 9.4?
Assuming data type jsonb and that you want to merge records of each JSON array that share the same 'id' value.
Postgres 9.5
makes it simpler with the new concatenate operator || for jsonb values:
SELECT json_agg(elem1 || elem2) AS result
FROM (
SELECT elem1->>'id' AS id, elem1
FROM (
SELECT '[
{"id":1, "percent":12.50},
{"id":2, "percent":75.00},
{"id":3, "percent":12.50}
]'::jsonb AS js
) t, jsonb_array_elements(t.js) elem1
) t1
FULL JOIN (
SELECT elem2->>'id' AS id, elem2
FROM (
SELECT '[
{"id": 1, "a": "text1a", "b": "text1b", "percent":12.50},
{"id": 2, "a": "text2a", "b": "text2b", "percent":75.00},
{"id": 3, "a": "text3a", "b": "text3b", "percent":12.50}]'::jsonb AS js
) t, jsonb_array_elements(t.js) elem2
) t2 USING (id);
The FULL [OUTER] JOIN makes sure you don't lose records without match in the other array.
The type jsonb has the convenient property to only keep the latest value for each key in the record. Hence, the duplicate 'id' key in the result is merged automatically.
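For instance, jsonb's "last value wins" behaviour for duplicate keys can be checked directly:
SELECT '{"id": 1, "id": 2}'::jsonb;
-- {"id": 2}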
The Postgres 9.5 manual also advises:
Note: The || operator concatenates the elements at the top level of
each of its operands. It does not operate recursively. For example, if
both operands are objects with a common key field name, the value of
the field in the result will just be the value from the right hand operand.
Postgres 9.4
Is a bit less convenient. My idea would be to extract the array elements, then extract all key/value pairs, UNION both results, aggregate into a single new jsonb value per id, and finally aggregate into a single array.
SELECT json_agg(j) -- ::jsonb
FROM (
SELECT json_object_agg(key, value)::jsonb AS j
FROM (
SELECT elem->>'id' AS id, x.*
FROM (
SELECT '[
{"id":1, "percent":12.50},
{"id":2, "percent":75.00},
{"id":3, "percent":12.50}]'::jsonb AS js
) t, jsonb_array_elements(t.js) elem, jsonb_each(elem) x
UNION ALL -- or UNION, see below
SELECT elem->>'id' AS id, x.*
FROM (
SELECT '[
{"id": 1, "a": "text1a", "b": "text1b", "percent":12.50},
{"id": 2, "a": "text2a", "b": "text2b", "percent":75.00},
{"id": 3, "a": "text3a", "b": "text3b", "percent":12.50}]'::jsonb AS js
) t, jsonb_array_elements(t.js) elem, jsonb_each(elem) x
) t
GROUP BY id
) t;
The cast to jsonb removes duplicate keys. Alternatively you could use UNION to fold duplicates (for instance if you want json as result). Test which is faster for your case.
Related:
How to turn json array into postgres array?
Merging Concatenating JSON(B) columns in query
For any single jsonb element, this use of the concatenation operator || works well for me, together with jsonb_strip_nulls() and another trick to get the result back as a single jsonb object (not an array).
select jsonb_array_elements(jsonb_strip_nulls(jsonb_agg(
'{
"a" : "unchanged value",
"b" : "old value",
"d" : "delete me"
}'::jsonb
|| -- The concat operator works as merge on jsonb, the right operand takes precedence
-- NOTE: it only works one JSON level deep
'{
"b" : "NEW value",
"c" : "NEW field",
"d" : null
}'::jsonb
)));
This gives the result
{"a": "unchanged value", "b": "NEW value", "c": "NEW field"}
which is properly typed jsonb