postgres cast json vs. to_json function (why different output)

I have a Postgres table like:
create table json_str (
  foo int,
  js text
);
insert into json_str (foo, js) values (1, '{ "key": "value", "foo": 1}');
and want to cast the column js of type text to JSON.
Why is the output of cast different from to_json?
Ideally, I could fit the type change into an alter table statement. However, a simple cast is not accepted and I am forced to use to_json (which returns doubly-quoted, escaped JSON). Is there a way to get the standard (direct) JSON layout in an alter table statement?
select * from json_str;
-- { "key": "value", "foo": 1}
select cast(js as jsonb) from json_str;
-- {"foo": 1, "key": "value"}
select to_json(js) from json_str;
-- "{ \"key\": \"value\", \"foo\": 1}"
alter table json_str alter column js type jsonb using to_jsonb(js);
select * from json_str;
-- "{ \"key\": \"value\", \"foo\": 1}"

This is the working alter table statement:
alter table json_str alter column js type jsonb using js::jsonb;
As to why the outputs differ: a cast (js::jsonb) parses the text as JSON, while to_json/to_jsonb treats the text as a plain string value and wraps it in a JSON string, escaping the embedded quotes.
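A quick sanity check after the conversion, using jsonb_typeof to confirm the column now holds a JSON object rather than a JSON string:
select js, jsonb_typeof(js) from json_str;
-- {"foo": 1, "key": "value"} | object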

Update value inside of nested json array

I have JSON stored in a table. The JSON is nested and has the following structure:
[
  {
    "name": "abc",
    "ques": [
      { "qId": 100 },
      { "qId": 200 }
    ]
  },
  {
    "name": "xyz",
    "ques": [
      { "qId": 100 },
      { "qId": 300 }
    ]
  }
]
update TABLE_NAME
set COLUMN_NAME = jsonb_set(COLUMN_NAME, '{ques,qId}', '101')
where COLUMN_NAME->>'qId' = 100
I am trying to update the qId value in the JSON: if qId is 100, I want to update it to 101.
1st solution: simple, but to be used carefully
Convert your json data to text and use the replace function:
update TABLE_NAME
set COLUMN_NAME = replace(COLUMN_NAME::text, '"qId": 100}', '"qId": 101}')::jsonb
2nd solution: more elegant, but more complex
jsonb_set cannot make several replacements in the same jsonb data at the same time. To do so, you need to create your own aggregate based on the jsonb_set function:
CREATE OR REPLACE FUNCTION jsonb_set(x jsonb, y jsonb, path text[], new_value jsonb)
RETURNS jsonb LANGUAGE sql AS $$
  SELECT jsonb_set(COALESCE(x, y), path, new_value);
$$;
CREATE OR REPLACE AGGREGATE jsonb_set_agg(x jsonb, path text[], new_value jsonb)
(
  stype = jsonb,
  sfunc = jsonb_set
);
Then you get your result with the following query:
UPDATE TABLE_NAME
SET COLUMN_NAME = (
  SELECT jsonb_set_agg(
           COLUMN_NAME::jsonb,
           array[(a.id - 1)::text, 'ques', (b.id - 1)::text],
           jsonb_build_object('qId', 101)
         )
  FROM jsonb_path_query(COLUMN_NAME::jsonb, '$[*]') WITH ORDINALITY AS a(content, id)
  CROSS JOIN LATERAL jsonb_path_query(a.content->'ques', '$[*]') WITH ORDINALITY AS b(content, id)
  WHERE (b.content)->'qId' = to_jsonb(100)
)
Note that this query is not universal: it has to break down the jsonb data according to its particular structure.
Note that jsonb_array_elements can be used in place of jsonb_path_query, but jsonb_array_elements raises an error when the jsonb data is not an array, whereas jsonb_path_query does not in lax mode (which is the default mode).
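A small illustration of that difference (the error comment is paraphrased, not the literal message):
select jsonb_array_elements('{"a": 1}'::jsonb);
-- fails: jsonb_array_elements cannot extract elements from a non-array
select jsonb_path_query('{"a": 1}'::jsonb, '$[*]');
-- lax mode auto-wraps the object as a one-element array and returns it: {"a": 1}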
Full test results in dbfiddle
You must specify the whole path to the value.
In this case your json is an array, so you need to address which element of the array you are trying to modify.
A direct approach (for your example) would be:
jsonb_set(
  jsonb_set(
    COLUMN_NAME,
    '{0,ques,0,qId}',
    '101'
  ),
  '{1,ques,0,qId}',
  '101'
)
Of course, if you want to modify every element of arrays of different lengths, you would need to elaborate on this approach, disassembling the array to modify every contained element, as sketched below.
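One generic way to do that (a sketch, assuming the column holds the two-level structure from the question) is to rebuild both arrays with jsonb_agg instead of addressing fixed indexes:
UPDATE TABLE_NAME
SET COLUMN_NAME = (
  SELECT jsonb_agg(
           jsonb_set(elem, '{ques}',
             (SELECT jsonb_agg(
                       CASE WHEN q->>'qId' = '100'
                            THEN jsonb_set(q, '{qId}', '101')  -- replace the matching value
                            ELSE q                             -- keep everything else as-is
                       END)
              FROM jsonb_array_elements(elem->'ques') AS q)))
  FROM jsonb_array_elements(COLUMN_NAME) AS elem
);
-- note: an element with an empty ques array would get null here; guard with coalesce if that can occur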

Export SQLite table which contains JSON column without double escaping

Let a SQLite table be built like:
create table t(id, j json);
insert into t values (1, '{"name": "bob"}');
insert into t values (2, '{"name": "alice", "age":20, "hobbies":[ "a", "b", "c"] }');
What's the easiest way to export the whole table as valid JSON, without JSON strings being escaped?
sqlite> .mode json
sqlite> select id, j from t;
[{"id":1,"j":"{\"name\": \"bob\"}"},
{"id":2,"j":"{\"name\": \"alice\", \"age\":20, \"hobbies\":[ \"a\", \"b\", \"c\"] }"}] -- WRONG!
The JSON column may vary, so I couldn't do it with the json_extract function. I am expecting parsable JSON:
[{"id":1,"j":{"name": "bob"}},
{"id":2,"j":{"name": "alice", "age":20, "hobbies":[ "a", "b", "c"] }}]
Use the function json() to remove the escaping from the column j and json_group_array() to aggregate the rows:
select json_group_array(json_object('id', id, 'j', json(j))) result
from t
See the demo.
Results:
result
[{"id":1,"j":{"name":"bob"}},{"id":2,"j":{"name":"alice","age":20,"hobbies":["a","b","c"]}}]

Update every value in an array in postgres json

In my postgres database I have json that looks similar to this:
{
  "myArray": [
    { "myValue": 1 },
    { "myValue": 2 },
    { "myValue": 3 }
  ]
}
Now I want to rename myValue to otherValue. I can't be sure about the length of the array! Preferably I would like to use something like jsonb_set with a wildcard as the array index, but that does not seem to be supported. So what is the nicest solution?
You have to decompose the whole jsonb object, modify the individual elements, and build the object back. A custom function will be helpful:
create or replace function jsonb_change_keys_in_array(arr jsonb, old_key text, new_key text)
returns jsonb language sql as $$
  select jsonb_agg(
    case
      when value->old_key is null then value
      else (value - old_key) || jsonb_build_object(new_key, value->old_key)
    end)
  from jsonb_array_elements(arr)
$$;
Use:
with my_table (id, data) as (
  values (1,
    '{
      "myArray": [
        {"myValue": 1},
        {"myValue": 2},
        {"myValue": 3}
      ]
    }'::jsonb)
)
select
  id,
  jsonb_build_object(
    'myArray',
    jsonb_change_keys_in_array(data->'myArray', 'myValue', 'otherValue')
  )
from my_table;
id | jsonb_build_object
----+------------------------------------------------------------------------
1 | {"myArray": [{"otherValue": 1}, {"otherValue": 2}, {"otherValue": 3}]}
(1 row)
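To persist the rename rather than just select the rewritten value, the same expression fits into an UPDATE (a sketch assuming a real table my_table(id, data jsonb)):
update my_table
set data = jsonb_build_object(
  'myArray',
  jsonb_change_keys_in_array(data->'myArray', 'myValue', 'otherValue')
);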
Using json functions is definitely the most elegant approach, but you can get by with character replacement. Cast the json(b) to text, perform the replace, then cast it back to json(b). In this example I included the quotes and colon so the text replace targets the json keys without conflicting with values.
CREATE TABLE mytable ( id INT, data JSONB );
INSERT INTO mytable VALUES (1, '{"myArray": [{"myValue": 1},{"myValue": 2},{"myValue": 3}]}');
INSERT INTO mytable VALUES (2, '{"myArray": [{"myValue": 4},{"myValue": 5},{"myValue": 6}]}');
SELECT * FROM mytable;
UPDATE mytable
SET data = REPLACE(data::TEXT, '"myValue":', '"otherValue":')::JSONB;
SELECT * FROM mytable;
http://sqlfiddle.com/#!17/1c28a/9/4

Does Postgres allow json[] or jsonb[]?

So I have been trying to find an answer on the internet with no luck.
Does postgres support having arrays of objects in a single field, e.g.
[
  {
    key: value,
    another: value
  },
  {
    key: value,
    value: key
  }
]
and saving this to a single field?
Also, how would you perform the single INSERT or UPDATE? Would it be: UPDATE db SET value='[{ key: val }, { key: val }]'?
Postgres supports any valid json value, including json arrays.
What you are going to use is a single json (jsonb) column, not a Postgres array:
create table example (id int, val jsonb);
insert into example
values (1, '[{ "name": "aga" }, { "gender": "female" }]');
select * from example;
id | val
----+-----------------------------------------
1 | [{"name": "aga"}, {"gender": "female"}]
(1 row)
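Reading values back out, and the UPDATE the question asks about, would look like this (a sketch against the example table above; note that real JSON needs double-quoted keys, unlike the pseudo-JSON in the question, and the added age entry is just for illustration):
select val->0->>'name' from example;   -- aga
update example
set val = '[{"name": "aga"}, {"gender": "female", "age": 20}]'
where id = 1;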
It depends on your definition of objects, I guess.
You can use JSON: http://www.postgresql.org/docs/current/static/functions-json.html and insert unstructured data:
# create table test (field json);
CREATE TABLE
# insert into test values ('[1,2,3]');
INSERT 0 1
# insert into test values ('[{"key": "value"}, {"key": "value"}]');
INSERT 0 1
# select * from test;
field
--------------------------------------
[1,2,3]
[{"key": "value"}, {"key": "value"}]
There is also support for arrays: http://www.postgresql.org/docs/current/static/arrays.html
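For completeness, Postgres does also accept a jsonb[] column, i.e. a native Postgres array whose elements are jsonb values, though a single jsonb column holding a JSON array (as above) is usually the more convenient choice (example_arr is an assumed table name):
create table example_arr (id int, vals jsonb[]);
insert into example_arr
values (1, array['{"key": "value"}'::jsonb, '{"another": "value"}'::jsonb]);
select vals[1] from example_arr;
-- {"key": "value"}   (Postgres arrays are 1-based)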

Sort by length of nested JSON array

Let's assume I have a PostgreSQL table with the following schema:
id: 1,
attribute_a: 'value',
attribute_b: 'value',
data: { attribute_c: 'value', array_of_values: [1,2,3] }
where data is stored in a JSON structure. Is it possible to order the rows in the table by array_of_values length?
Use json_array_length() or jsonb_array_length() like Eggplant commented. Assuming jsonb:
SELECT *
FROM tbl
ORDER BY jsonb_array_length(data -> 'array_of_values')
BTW, the syntax for your JSON value should be:
{"attribute_c": "value", "array_of_values": [1, 2, 3]}