Update every value in an array in postgres json - json

In my postgres database I have json that looks similar to this:
{
"myArray": [
{
"myValue": 1
},
{
"myValue": 2
},
{
"myValue": 3
}
]
}
Now I want to rename myValue to otherValue. I can't be sure about the length of the array! Preferably I would like to use something like jsonb_set with a wildcard as the array index, but that does not seem to be supported. So what is the nicest solution?

You have to decompose a whole jsonb object, modify individual elements and build the object back.
The custom function will be helpful:
create or replace function jsonb_change_keys_in_array(arr jsonb, old_key text, new_key text)
returns jsonb language sql as $$
    select jsonb_agg(case
               when value->old_key is null then value
               else value - old_key || jsonb_build_object(new_key, value->old_key)
           end)
    from jsonb_array_elements(arr)
$$;
Use:
with my_table (id, data) as (
values(1,
'{
"myArray": [
{
"myValue": 1
},
{
"myValue": 2
},
{
"myValue": 3
}
]
}'::jsonb)
)
select
id,
jsonb_build_object(
'myArray',
jsonb_change_keys_in_array(data->'myArray', 'myValue', 'otherValue')
)
from my_table;
id | jsonb_build_object
----+------------------------------------------------------------------------
1 | {"myArray": [{"otherValue": 1}, {"otherValue": 2}, {"otherValue": 3}]}
(1 row)
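To actually persist the rename, the same expression can drive an UPDATE. A minimal sketch, assuming a real table my_table with a jsonb column data as in the example above (the || merge keeps any other top-level keys intact):
UPDATE my_table
SET data = data || jsonb_build_object(
        'myArray',
        jsonb_change_keys_in_array(data->'myArray', 'myValue', 'otherValue')
    );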

Using JSON functions is definitely the most elegant approach, but you can get by with plain character replacement. Cast the json(b) to text, perform the replace, then cast it back to json(b). In this example I included the quotes and colon so that the text replacement targets the JSON keys without colliding with values.
CREATE TABLE mytable ( id INT, data JSONB );
INSERT INTO mytable VALUES (1, '{"myArray": [{"myValue": 1},{"myValue": 2},{"myValue": 3}]}');
INSERT INTO mytable VALUES (2, '{"myArray": [{"myValue": 4},{"myValue": 5},{"myValue": 6}]}');
SELECT * FROM mytable;
UPDATE mytable
SET data = REPLACE(data :: TEXT, '"myValue":', '"otherValue":') :: JSONB;
SELECT * FROM mytable;
http://sqlfiddle.com/#!17/1c28a/9/4

Related

Update value inside of nested json array

I have JSON stored in a table. The JSON is nested and has the following structure
[
{
"name": "abc",
"ques": [
{
"qId": 100
},
{
"qId": 200
}
]
},{
"name": "xyz",
"ques": [
{
"qId": 100
},
{
"qId": 300
}
]
}
]
I am trying to update the qId value in the JSON: if qId is 100, I want to update it to 101. My attempt:
Update TABLE_NAME
set COLUMN_NAME = jsonb_set(COLUMN_NAME, '{ques,qId}', '101')
WHERE COLUMN_NAME->>'qId'=100
1st solution: simple, but to be used carefully
Convert your json data to text and use the replace function:
Update TABLE_NAME
set COLUMN_NAME = replace(COLUMN_NAME :: text,'"qId": 100}', '"qId": 101}') :: jsonb
2nd solution: more elegant, but more complex
jsonb_set cannot make several replacements in the same jsonb data in a single call. To do so, you need to create your own aggregate based on the jsonb_set function:
CREATE OR REPLACE FUNCTION jsonb_set(x jsonb, y jsonb, path text[], new_value jsonb) RETURNS jsonb LANGUAGE sql AS $$
SELECT jsonb_set(COALESCE(x, y), path, new_value) ; $$ ;
CREATE OR REPLACE AGGREGATE jsonb_set_agg(x jsonb, path text[], new_value jsonb)
( stype = jsonb, sfunc = jsonb_set);
Then you get your result with the following query :
UPDATE TABLE_NAME
SET COLUMN_NAME =
( SELECT jsonb_set_agg(COLUMN_NAME :: jsonb, array[(a.id - 1) :: text, 'ques', (b.id - 1) :: text], jsonb_build_object('qId', 101))
FROM jsonb_path_query(COLUMN_NAME :: jsonb, '$[*]') WITH ORDINALITY AS a(content, id)
CROSS JOIN LATERAL jsonb_path_query(a.content->'ques', '$[*]') WITH ORDINALITY AS b(content, id)
WHERE (b.content)->'qId' = to_jsonb(100)
)
Note that this query is not universal: it must break down the jsonb data according to its structure.
Note that jsonb_array_elements could be used in place of jsonb_path_query, but jsonb_array_elements raises an error when the jsonb data is not an array, whereas jsonb_path_query does not in lax mode, which is the default.
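A quick illustration of that difference (a sketch, run against a plain object rather than an array):
select jsonb_path_query('{"a": 1}'::jsonb, '$[*]');   -- lax mode wraps the object, returns {"a": 1}
select jsonb_array_elements('{"a": 1}'::jsonb);       -- raises an error: the value is not an array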
Full test results in dbfiddle
You must specify the whole path to the value.
In this case your json is an array, so you need to address which element of this array you are trying to modify.
A direct approach (over your example) would be:
jsonb_set(
jsonb_set(
COLUMN_NAME
, '{0,ques,qId}'
, '101'
)
, '{1,ques,qId}'
, '101'
)
Of course, if you want to modify every element of arrays of different lengths, you need to extend this approach by disassembling the array and modifying every contained element, as sketched below.
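A minimal sketch of that idea, assuming COLUMN_NAME is jsonb and every top-level element has a ques array; it rebuilds the whole value with jsonb_array_elements and jsonb_agg, so the array lengths no longer matter:
UPDATE TABLE_NAME
SET COLUMN_NAME = (
    SELECT jsonb_agg(
               jsonb_set(elem, '{ques}',
                   (SELECT jsonb_agg(
                               CASE WHEN q->'qId' = to_jsonb(100)
                                    THEN jsonb_set(q, '{qId}', '101')
                                    ELSE q
                               END)
                    FROM jsonb_array_elements(elem->'ques') AS q)))
    FROM jsonb_array_elements(COLUMN_NAME) AS elem
);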

Add a new key/value pair into a nested array inside a PostgreSQL JSON column

Using PostgreSQL 13.4 I have a table with a JSON column in a structure like the following sample:
{
"username": "jsmith",
"location": "United States",
"posts": [
{
"id":"1",
"title":"Welcome",
"newKey":true <----------- insert new key/value pair here
},
{
"id":"4",
"title":"What started it all",
"newKey":true <----------- insert new key/value pair here
}
]
}
For changing keys on the first level, I used a simple query like this
UPDATE
sample_table_json
SET
json = json::jsonb || '{"active": true}';
But this doesn't work for nested objects and objects in an array like in the sample.
How would I insert a key/value pair into a JSON column with nested objects in an array?
You have to use the jsonb_set function and specify the right path; see the manual.
For a single json update:
UPDATE sample_table_json
SET json = jsonb_set( json::jsonb
                    , '{posts,0,active}'
                    , 'true'
                    , true
                    )
For a (very) limited set of json updates:
UPDATE sample_table_json
SET json = jsonb_set(jsonb_set( json::jsonb
                              , '{posts,0,active}'
                              , 'true'
                              , true
                              )
                    , '{posts,1,active}'
                    , 'true'
                    , true
                    )
For a larger set of json updates of the same json data, you can create the "aggregate version" of the jsonb_set function :
CREATE OR REPLACE FUNCTION jsonb_set(x jsonb, y jsonb, p text[], e jsonb, b boolean)
RETURNS jsonb LANGUAGE sql AS $$
SELECT jsonb_set(COALESCE(x,y), p, e, b) ; $$ ;
CREATE OR REPLACE AGGREGATE jsonb_set_agg(x jsonb, p text[], e jsonb, b boolean)
( STYPE = jsonb, SFUNC = jsonb_set) ;
and then use the new aggregate function jsonb_set_agg while iterating over a query result from which the path and val fields can be computed:
SELECT jsonb_set_agg('{"username": "jsmith","location": "United States","posts": [{"id":"1","title":"Welcome"},{"id":"4","title":"What started it all"}]}' :: jsonb
, l.path :: text[]
, to_jsonb(l.val)
, true)
FROM (VALUES ('{posts,0,active}', 'true'), ('{posts,1,active}', 'true')) AS l(path, val) -- this list could be the result of a subquery
This query can finally be used to update the data:
WITH list AS
(
SELECT id
, jsonb_set_agg(json :: jsonb
, l.path :: text[]
, to_jsonb(l.val)
, true) AS res
FROM sample_table_json
CROSS JOIN (VALUES ('{posts,0,active}', 'true'), ('{posts,1,active}', 'true')) AS l(path, val)
GROUP BY id
)
UPDATE sample_table_json AS t
SET json = l.res
FROM list AS l
WHERE t.id = l.id
see the test result in dbfiddle
This gets a bit complicated: loop through the array, add the new key/value pair to each array element, re-aggregate the array, then rebuild the whole object.
with t(j) as
(
values ('{
"username": "jsmith",
"location": "United States",
"posts": [
{
"id":"1", "title":"Welcome", "newKey":true
},
{
"id":"4", "title":"What started it all", "newKey":true
}]
}'::jsonb)
)
select j ||
jsonb_build_object
(
'posts',
(select jsonb_agg(je||'{"active":true}') from jsonb_array_elements(j->'posts') je)
)
from t;
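Applied to the question's table, the same expression can be used in an UPDATE. A sketch, assuming the column is named json and needs a cast to jsonb as in the question:
UPDATE sample_table_json
SET json = json::jsonb ||
    jsonb_build_object(
        'posts',
        (SELECT jsonb_agg(je || '{"active": true}')
         FROM jsonb_array_elements(json::jsonb -> 'posts') AS je)
    );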

SQL Server For JSON Path dynamic column name

We are exploring the JSON feature in SQL Server, and for one of the scenarios we want to come up with SQL that can return JSON like the following:
[
{
"field": {
"uuid": "uuid-field-1"
},
"value": {
"uuid": "uuid-value" //value is an object
}
},
{
"field": {
"uuid": "uuid-field-2"
},
"value": "1". //value is simple integer
}
... more rows
]
The value field can be a simple integer/string or a nested object.
We are able to come up with a table which looks like below:
field.uuid   | value.uuid | value
-------------|------------|------
uuid-field-1 | value-uuid | null
uuid-field-2 | null       | 1
... more rows
But as soon as we apply FOR JSON PATH, it fails saying:
Property 'value' cannot be generated in JSON output due to a conflict with another column name or alias. Use different names and aliases for each column in SELECT list.
Is it possible to generate this somehow? The value will be either in value.uuid or in value, never both.
Note: We are open to the possibility of converting each row to an individual JSON object and adding all of them into an array.
select
json_query((select v.[field.uuid] as 'uuid' for json path, without_array_wrapper)) as 'field',
value as 'value',
json_query((select v.[value.uuid] as 'uuid' where v.[value.uuid] is not null for json path, without_array_wrapper)) as 'value'
from
(
values
('uuid-field-1', 'value-uuid1', null),
('uuid-field-2', null, 2),
('uuid-field-3', 'value-uuid3', null),
('uuid-field-4', null, 4)
) as v([field.uuid], [value.uuid], value)
for json auto;--, without_array_wrapper;
The reason for this error is that, as mentioned in the documentation, the FOR JSON PATH clause uses the column alias or column name to determine the key name in the JSON output. If an alias contains dots, the PATH option creates nested objects. In your case value.uuid and value both generate a key named value.
I can suggest an approach (probably not the best one), which uses JSON_MODIFY() to generate the expected JSON from an empty JSON array:
Table:
CREATE TABLE Data (
[field.uuid] varchar(100),
[value.uuid] varchar(100),
[value] int
)
INSERT INTO Data
([field.uuid], [value.uuid], [value])
VALUES
('uuid-field-1', 'value-uuid', NULL),
('uuid-field-2', NULL, 1),
('uuid-field-3', NULL, 3),
('uuid-field-4', NULL, 4)
Statement:
DECLARE @json nvarchar(max) = N'[]'
SELECT @json = JSON_MODIFY(
   @json,
   'append $',
   JSON_QUERY(
      CASE
         WHEN [value.uuid] IS NOT NULL THEN (SELECT d.[field.uuid], [value.uuid] FOR JSON PATH, WITHOUT_ARRAY_WRAPPER)
         WHEN [value] IS NOT NULL THEN (SELECT d.[field.uuid], [value] FOR JSON PATH, WITHOUT_ARRAY_WRAPPER)
      END
   )
)
FROM Data d
SELECT @json
Result:
[
{
"field":{
"uuid":"uuid-field-1"
},
"value":{
"uuid":"value-uuid"
}
},
{
"field":{
"uuid":"uuid-field-2"
},
"value":1
},
{
"field":{
"uuid":"uuid-field-3"
},
"value":3
},
{
"field":{
"uuid":"uuid-field-4"
},
"value":4
}
]

MySQL 8 search JSON key by value in array

I've got a MySQL table with a JSON field, where I store data in the following format.
{
"fields": {
"1": {
"s": "y"
},
"2": {
"s": "n"
}
}
}
I need to obtain the keys in fields, e.g. 1 or 2 given the value of s.
Example query:
create table mytable ( mycol json );
insert into mytable set mycol = '{"fields": {"1": {"s": "y"},"2": {"s": "n"}}}';
select j.* from mytable, JSON_TABLE(mycol,
'$.fields.*' COLUMNS (
json_key VARCHAR(10) PATH '$',
s VARCHAR(10) PATH '$.s'
)
) AS j where j.s = 'y';
gives:
# json_key, s
null, y
I would expect to get
# json_key, s
1, y
Is it possible to get that data somehow?
I don't need the results in row/table format. I would be happy to get a comma-separated list of IDs (json_keys) meeting my criterion.
EDIT:
I was also thinking about getting the paths using JSON_SEARCH and passing them to JSON_EXTRACT, which was achieved here: Combining JSON_SEARCH and JSON_EXTRACT get me: "Invalid JSON path expression."
Unfortunately I would need to use JSON_SEARCH in 'all' mode, as I need all results. In that mode JSON_SEARCH returns a list of paths, whereas JSON_EXTRACT accepts a list of arguments.
Try FOR ORDINALITY (see 12.17.6 JSON Table Functions); this column type enumerates the rows produced in the COLUMNS clause:
SELECT
JSON_UNQUOTE(
JSON_EXTRACT(
JSON_KEYS(`mycol` ->> '$.fields'),
CONCAT('$[', `j`.`row` - 1, ']')
)
) `json_key`,
`j`.`s`
FROM
`mytable`,
JSON_TABLE(
`mycol`,
'$.fields.*' COLUMNS (
`row` FOR ORDINALITY,
`s` VARCHAR(10) PATH '$.s'
)
) `j`
WHERE
`j`.`s` = 'y';
See dbfiddle.
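If a comma-separated list of the matching keys is preferred, as mentioned in the question, the same expression can simply be wrapped in GROUP_CONCAT. A sketch, reusing mytable/mycol from above:
SELECT
  GROUP_CONCAT(
    JSON_UNQUOTE(
      JSON_EXTRACT(
        JSON_KEYS(`mycol` ->> '$.fields'),
        CONCAT('$[', `j`.`row` - 1, ']')
      )
    )
  ) AS `json_keys`
FROM
  `mytable`,
  JSON_TABLE(
    `mycol`,
    '$.fields.*' COLUMNS (
      `row` FOR ORDINALITY,
      `s` VARCHAR(10) PATH '$.s'
    )
  ) `j`
WHERE
  `j`.`s` = 'y';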

Does Postgres allow json[] or jsonb[]?

So I have been trying to find an answer on the internet, with zero luck.
Does postgres support having arrays of objects in a single field, e.g.
[
{
key: value,
another: value
},
{
key: value,
value: key
}
]
and saving this to a single field?
Also, how would you perform a single INSERT or UPDATE?
Would it be: UPDATE db SET value='[{ key: val }, { key: val }]' ?
Postgres supports any valid json values, including json arrays.
What you are going to use is a single json (jsonb) column, not a Postgres array:
create table example (id int, val jsonb);
insert into example
values (1, '[{ "name": "aga" }, { "gender": "female" }]');
select * from example;
id | val
----+-----------------------------------------
1 | [{"name": "aga"}, {"gender": "female"}]
(1 row)
It depends on your definition of objects, I guess.
You can use JSON: http://www.postgresql.org/docs/current/static/functions-json.html and insert unstructured data:
# create table test (field json);
CREATE TABLE
# insert into test values ('[1,2,3]');
INSERT 0 1
# insert into test values ('[{"key": "value"}, {"key": "value"}]');
INSERT 0 1
# select * from test;
field
--------------------------------------
[1,2,3]
[{"key": "value"}, {"key": "value"}]
There is also support for arrays: http://www.postgresql.org/docs/current/static/arrays.html
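For completeness, Postgres also accepts a native array of jsonb values (jsonb[]), as opposed to a single jsonb column holding a JSON array. A minimal sketch (test_arr is a hypothetical table name):
create table test_arr (field jsonb[]);
insert into test_arr values (array['{"key": "value"}'::jsonb, '{"another": "value"}'::jsonb]);
select field[1] from test_arr;  -- returns {"key": "value"}; Postgres arrays are 1-based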