Suppose a SQLite table is built like this:
create table t(id, j json);
insert into t values (1, '{"name": "bob"}');
insert into t values (2, '{"name": "alice", "age":20, "hobbies":[ "a", "b", "c"] }');
What's the easiest way to export the whole table as valid JSON, without JSON strings being escaped?
sqlite> .mode json
sqlite> select id, j from t;
[{"id":1,"j":"{\"name\": \"bob\"}"},
{"id":2,"j":"{\"name\": \"alice\", \"age\":20, \"hobbies\":[ \"a\", \"b\", \"c\"] }"}] -- WRONG!
The structure of the JSON column may vary, so I couldn't do it with the json_extract function. I'm expecting parsable JSON:
[{"id":1,"j":{"name": "bob"}},
{"id":2,"j":{"name": "alice", "age":20, "hobbies":[ "a", "b", "c"] }}]
Use the json() function to remove the escaping from column j, and json_group_array() to aggregate the rows:
select json_group_array(json_object('id', id, 'j', json(j))) result
from t
Results:
result
[{"id":1,"j":{"name":"bob"}},{"id":2,"j":{"name":"alice","age":20,"hobbies":["a","b","c"]}}]
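If you want to verify this outside the sqlite3 shell, here is a minimal sketch using Python's built-in sqlite3 module (assuming your SQLite build includes the JSON1 functions, which modern builds do):

```python
import json
import sqlite3

# Rebuild the table from the question in an in-memory database.
con = sqlite3.connect(":memory:")
con.execute("create table t(id, j json)")
con.execute("""insert into t values (1, '{"name": "bob"}')""")
con.execute("""insert into t values (2, '{"name": "alice", "age":20, "hobbies":["a","b","c"]}')""")

# json(j) re-parses the stored text so it is embedded as a JSON value
# instead of an escaped string; json_group_array() aggregates the rows.
row = con.execute(
    "select json_group_array(json_object('id', id, 'j', json(j))) from t"
).fetchone()

result = json.loads(row[0])  # parses cleanly: the escaping is gone
```

json.loads succeeding here is the proof that the exported text is valid, unescaped JSON.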
Let's say I have inserted a record like this:
id | u_id | the_data
-------+------+--------------------------------------------------
1 | 2863 |[{name: a, body: lorem}]
using this command:
CREATE TABLE users (
id SERIAL PRIMARY KEY,
u_id INT,
the_data JSON
);
INSERT INTO users (u_id, the_data) VALUES (2863, '[{"name": "a", "body": "lorem"}]');
But now I want to insert more data into the same record without losing the old JSON array. How can I do this kind of insertion?
id | u_id | the_data
-------+------+------------------------------------------------------------------------------
1 | 2863 |[ {name: a, body: lorem}, {name: b, body: ipsum} ]
Please note: the command below creates a new record, which I don't want.
INSERT INTO users (u_id, the_data)
VALUES (2863, '[{"name": "b", "body": "ipsum"}]');
I'm not looking for solutions like the ones below, since they insert everything at the same time:
INSERT INTO users (u_id, the_data)
VALUES (2863, '[{"name": "a", "body": "lorem"}, {"name": "b", "body": "ipsum"}]');
INSERT INTO users (u_id, the_data)
VALUES (2863, '[{"name": "a", "body": "lorem"}]'), (2863, '[{"name": "b", "body": "ipsum"}]');
As the top-level JSON value is an array, you can use the standard concatenation operator || to append an element to the array:
update users
set the_data = the_data || '{"name": "b", "body": "ipsum"}'
where u_id = 2863;
You should change your column definition to jsonb as that offers a lot more possibilities for querying or changing the value. Otherwise you will be forced to cast the column to jsonb every time you want to do something more interesting with it.
If you can't or don't want to change the data type you need to cast it:
set the_data = the_data::jsonb || '....'
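Postgres isn't required to see what the append does; the following Python sketch models the array-append case of the jsonb || operator with the standard json module (note that || on two jsonb objects merges their keys instead, so this models only the array case):

```python
import json

def append_element(array_text: str, element_text: str) -> str:
    """Rough model of Postgres jsonb `array || object`:
    the right-hand object becomes a new element of the array."""
    arr = json.loads(array_text)
    arr.append(json.loads(element_text))
    return json.dumps(arr)

the_data = '[{"name": "a", "body": "lorem"}]'
the_data = append_element(the_data, '{"name": "b", "body": "ipsum"}')
```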
You can create a list of objects and loop over it. For example:
var data = {
    Id: Id,
    Name: Name
};
JSON request:
Data: data
Well, that's not a simple JSON object. You're trying to add an object to an array of values that is saved in a JSON field.
So it's not about keeping the old array, but rather keeping the objects that are already present in the array saved in the JSON field and adding the new one.
I tried this on Postgres 12 and it works. As someone else said, you need to cast to jsonb if the column is of type json, and use the || operator to concatenate the new value:
UPDATE users
SET the_data = the_data::jsonb || '{"name": "b", "body": "ipsum"}'
WHERE id = 1;
Taken from here:
https://stackoverflow.com/a/69630521/9231145
I have a Postgres table like:
CREATE table json_str(
foo int,
js text
);
insert into json_str(foo, js) values (1, '{ "key": "value", "foo": 1}');
and want to cast the column js from type text to JSON.
Why is the output of cast different from to_json?
Ideally, I could fit the type change into an alter table statement. However, a simple cast is not accepted, and I am forced to use to_json (which returns doubly quoted, escaped JSON). Is there a way to get the standard (direct) JSON layout in an alter table statement?
select * from json_str;
-- { "key": "value", "foo": 1}
select cast(js as jsonb) from json_str;
-- {"foo": 1, "key": "value"}
select to_json(js) from json_str;
-- "{ \"key\": \"value\", \"foo\": 1}"
alter table json_str alter column js type jsonb using to_jsonb(js);
select * from json_str;
-- "{ \"key\": \"value\", \"foo\": 1}"
This is the working alter table statement:
alter table json_str alter column js type jsonb using js::jsonb;
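The difference between the cast and to_json is easiest to see in miniature: a cast parses the text as JSON, while to_json treats the text as an opaque string and encodes it as a JSON string value. A Python analogy with the standard json module:

```python
import json

js = '{ "key": "value", "foo": 1}'

# cast(js as jsonb) parses the text as a JSON document -- like json.loads:
parsed = json.loads(js)    # a real object: {'key': 'value', 'foo': 1}

# to_json(js) wraps the raw text as a JSON *string* value -- like json.dumps
# applied to the text itself, which is where the backslash escapes come from:
wrapped = json.dumps(js)
```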
There are many examples of JSON parsing in Postgres that pull data from a table. I have a raw JSON string handy and would like to practice using the JSON functions and operators. Is it possible to do this without using tables? Or, what is the most straightforward way to declare it as a variable? Something like:
# Declare
foojson = "{'a':'foo', 'b':'bar'}"
# Use
jsonb_array_elements(foojson) -> 'a'
Basically I'd like the last line to print to console or be wrappable in a SELECT statement so I can rapidly "play" with some of these operators.
You can pass it directly to the function
select '{"a": "foo", "b": "bar"}'::jsonb ->> 'a';
select *
from jsonb_each('{"a": "foo", "b": "bar"}');
select *
from jsonb_array_elements('[{"a": "foo"}, {"b": "bar"}]');
Or, if you want to pretend it's part of a table:
with data (json_value) as (
values
('{"a": "foo", "b": "bar"}'::jsonb),
('{"foo": 42, "x": 100}')
)
select e.*
from data d
cross join jsonb_each(d.json_value) as e;
with data (json_value) as (
values
('{"a": 1, "b": "x"}'::jsonb),
('{"a": 42, "b": "y"}')
)
select d.json_value ->> 'a',
d.json_value ->> 'b'
from data d;
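SQLite allows the same kind of table-free experimentation, if that happens to be your playground; a sketch via Python's sqlite3 module (json_extract in place of ->>, assuming a JSON1-enabled build):

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Pass the JSON literal straight to the function; no table needed.
a = con.execute(
    """select json_extract('{"a": "foo", "b": "bar"}', '$.a')"""
).fetchone()[0]

# json_each() plays the role of jsonb_each() for quick exploration.
pairs = con.execute(
    """select key, value from json_each('{"a": "foo", "b": "bar"}')"""
).fetchall()
```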
I have a JSON-type column in MySQL named names, and it's a simple JSON array (not key/value). I couldn't find any example of using JSON_EXTRACT, JSON_SET, JSON_REPLACE, or JSON_INSERT on a simple JSON array field.
I know there are other ways to manipulate a JSON array in a JSON field, but is it possible to use these functions for a JSON array?
For example, if the name field contains ["A","B","C"], how can I use these functions to perform an update, insert, and delete on this JSON?
Update: the query must be executed from a PHP script.
The functions you refer to all work exactly as expected and described in the manual; that is to say JSON_SET will insert or replace if a value already exists, JSON_INSERT will insert if a value doesn't already exist, and JSON_REPLACE will replace a pre-existing value. You can use JSON_ARRAY_INSERT and JSON_ARRAY_APPEND to more easily add values to a JSON array.
-- extract second element
select json_extract('["A", "B", "C"]', '$[1]')
-- "B"
-- replace second element
select json_set('["A", "B", "C"]', '$[1]', 'D')
-- ["A", "D", "C"]
-- insert fourth element
select json_set('["A", "B", "C"]', '$[3]', 'E')
-- ["A", "B", "C", "E"]
-- attempt to insert second element fails as it already exists
select json_insert('["A", "B", "C"]', '$[1]', 'F')
-- ["A", "B", "C"]
-- use json_array_insert to insert a new second element and move the other elements right
select json_array_insert('["A", "B", "C"]', '$[1]', 'F')
-- ["A", "F", "B", "C"]
-- insert fourth element
select json_insert('["A", "B", "C"]', '$[3]', 'F')
-- ["A", "B", "C", "F"]
-- or use json_array_append to add an element at the end
select json_array_append('["A", "B", "C"]', '$', 'F')
-- ["A", "B", "C", "F"]
-- replace second element
select json_replace('["A", "B", "C"]', '$[1]', 'G')
-- ["A", "G", "C"]
-- attempt to replace non-existing element fails
select json_replace('["A", "B", "C"]', '$[3]', 'G')
-- ["A", "B", "C"]
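SQLite's JSON1 functions of the same names behave the same way for these calls, so you can check the semantics without a MySQL server; a sketch using Python's sqlite3 module (one difference: SQLite returns the extracted scalar as plain text rather than a quoted JSON string):

```python
import json
import sqlite3

con = sqlite3.connect(":memory:")

def q(sql):
    return con.execute(sql).fetchone()[0]

# extract second element (plain 'B' here; MySQL would show "B")
second = q("""select json_extract('["A", "B", "C"]', '$[1]')""")

# replace second element
replaced = q("""select json_set('["A", "B", "C"]', '$[1]', 'D')""")

# attempting to insert at an existing index is a no-op, as in MySQL
unchanged = q("""select json_insert('["A", "B", "C"]', '$[1]', 'F')""")
```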
To use these functions on a column in a table, simply replace the ["A", "B", "C"] in the above calls with the column name, for example:
create table test (j json);
insert into test values ('["A", "B", "C"]');
select json_array_insert(j, '$[1]', 'F')
from test
-- ["A", "F", "B", "C"]
I think I found the solution.
For a JSON array, it's not possible to use JSON_EXTRACT, JSON_SET, JSON_REPLACE, or JSON_INSERT by array value; you have to know the position of each value in the array (in my opinion, a weakness).
For example, to select the 2nd value you can use $[1],
but to insert values you can use
JSON_ARRAY_APPEND and JSON_ARRAY_INSERT
So I have been trying to find an answer on the internet with zero luck.
Does postgres support having arrays of objects in a single field, e.g.
[
{
key: value,
another: value
},
{
key: value,
value: key
}
]
and saving this to a single field?
Also, how would you perform the single INSERT or UPDATE?
Would it be: UPDATE db SET value='[{ key: val }, { key: val }]'?
Postgres supports any valid json values, including json arrays.
What you are going to use is a single json (jsonb) column, not a Postgres array:
create table example (id int, val jsonb);
insert into example
values (1, '[{ "name": "aga" }, { "gender": "female" }]');
select * from example;
id | val
----+-----------------------------------------
1 | [{"name": "aga"}, {"gender": "female"}]
(1 row)
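The same pattern carries over to SQLite, for what it's worth; a quick sketch using Python's sqlite3 module (SQLite stores the JSON as text, but the JSON1 functions query into it, assuming they are compiled in):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table example (id int, val text)")  # JSON kept as text
con.execute(
    """insert into example values (1, '[{"name": "aga"}, {"gender": "female"}]')"""
)

# Drill into the array of objects stored in the single column.
name = con.execute(
    "select json_extract(val, '$[0].name') from example where id = 1"
).fetchone()[0]
```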
It depends on your definition of objects, I guess.
You can use JSON: http://www.postgresql.org/docs/current/static/functions-json.html and insert unstructured data:
# create table test (field json);
CREATE TABLE
# insert into test values ('[1,2,3]');
INSERT 0 1
# insert into test values ('[{"key": "value"}, {"key": "value"}]');
INSERT 0 1
# select * from test;
field
--------------------------------------
[1,2,3]
[{"key": "value"}, {"key": "value"}]
There is also support for arrays: http://www.postgresql.org/docs/current/static/arrays.html