postgres force json datatype - json

When working with the JSON datatype, is there a way to ensure the input JSON must have certain elements? I don't mean a primary key; I want the JSON that gets inserted to at least have the id and name elements. It can have more, but at the minimum the id and name must be there.
Thanks

This function checks for what you want:
create or replace function json_has_id_and_name(val json)
returns boolean language sql as $$
  select coalesce(
    (
      -- true when both 'id' and 'name' appear among the top-level keys
      select array['id', 'name'] <@ array_agg(key)
      from json_object_keys(val) key
    ),
    false)
$$;
select json_has_id_and_name('{"id":1, "name":"abc"}'), json_has_id_and_name('{"id":1}');
 json_has_id_and_name | json_has_id_and_name
----------------------+----------------------
 t                    | f
(1 row)
You can use it in a check constraint, e.g.:
create table my_table (
  id int primary key,
  jdata json check (json_has_id_and_name(jdata))
);
insert into my_table values (1, '{"id":1}');
ERROR: new row for relation "my_table" violates check constraint "my_table_jdata_check"
DETAIL: Failing row contains (1, {"id":1}).
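As an aside, on PostgreSQL 9.4 or later you can switch the column to jsonb and let the built-in ?& operator do the same key-existence test without a helper function. A minimal sketch, assuming jsonb is acceptable for your use case:
create table my_table (
  id int primary key,
  -- ?& is true only when all of the listed keys exist at the top level
  jdata jsonb check (jdata ?& array['id', 'name'])
);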

Related

Insert date in table using json_populate_recordset

I'm trying to insert data into a table using this query:
INSERT INTO table (
  url,
  v_count,
  v_date)
SELECT
  url,
  v_count,
  v_date
FROM json_populate_recordset(null::record,
  '[{"url_site":"test.com","visit_count":1,"visit_date":"2022-08-31"},
    {"url_site":"dev.com","visit_count":2,"visit_date":"2022-08-31"}]'::json)
AS ("url" varchar(700), "v_count" integer, "v_date" date)
And I'm getting this error:
null value in column "v_date" of relation table violates not null constraint
Since my json could have hundreds of entries at times, how should I send the date in my json? Is there another (efficient) way to insert this data into the table?
Edit: in Postico 1.5.20 my example above works as long as the json keys are named the same as the table columns. How can I reference different names in my json keys?
Since v_date can resolve to null, you'll need to either skip those rows or provide a value when null appears.
To skip the null values, add a WHERE v_date IS NOT NULL clause to your SELECT statement.
Otherwise, you can use COALESCE() to assign a value when v_date is null. For example: ... SELECT url, v_count, COALESCE(v_date, now()) FROM json_populate_recordset ...
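As for the edit: json_populate_recordset matches JSON keys to the record's column names, which is why differently named keys come back as null. One way around that is json_to_recordset, where the column definition list uses the JSON key names and the SELECT list does the renaming. A sketch, assuming the table and types from the question:
INSERT INTO table (url, v_count, v_date)
SELECT url_site, visit_count, visit_date  -- rename to the table's column names here
FROM json_to_recordset(
  '[{"url_site":"test.com","visit_count":1,"visit_date":"2022-08-31"},
    {"url_site":"dev.com","visit_count":2,"visit_date":"2022-08-31"}]'::json)
AS (url_site varchar(700), visit_count integer, visit_date date);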

How to insert a json object with ORACLE 19 and 21

Because I don't use Oracle 21, I can't use the JSON type in the definition of a table.
CREATE TABLE TABLE_TEST_QUERY_2
(
  TTQ_NR INTEGER GENERATED BY DEFAULT AS IDENTITY,
  TTQ_QUERY_TO_BE_TESTED VARCHAR2 (4000 BYTE),
  TTQ_RESULT CLOB,
  --RESULT JSON, UPGRADE oracle 21
  TTQ_TTQ_CREATION_DATE DATE DEFAULT SYSDATE,
  TTQ_ALREADY_TESTED INTEGER DEFAULT 0,
  TTQ_TEST_PASSED INTEGER,
  PRIMARY KEY (TTQ_NR),
  CONSTRAINT RESULT CHECK (TTQ_RESULT IS JSON)
)
I want to add a json object to ttq_result, not a string representing a json.
I have a way to transform a json into a clob:
select to_clob(utl_raw.cast_to_raw (json_object('a' value 2))) from dual;
But it's not working if I try to insert the clob created from a json into the table:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 TTQ_RESULT
VALUES to_clob(utl_raw.cast_to_raw (json_object(a value '2')));
[Error] Execution (3: 13): ORA-03001: unimplemented feature
code (Oracle 18)
update:
I've tried to add a json on dbfiddle with oracle 21. I'm using the json type to define a column.
CREATE TABLE TABLE_TEST_QUERY_2
(
  TTQ_NR INTEGER GENERATED BY DEFAULT AS IDENTITY,
  TTQ_QUERY_TO_BE_TESTED VARCHAR2 (4000 BYTE),
  TTQ_RESULT JSON,
  TTQ_TTQ_CREATION_DATE DATE DEFAULT SYSDATE,
  TTQ_ALREADY_TESTED INTEGER DEFAULT 0,
  TTQ_TEST_PASSED INTEGER,
  PRIMARY KEY (TTQ_NR)
)
INSERT INTO TABLE_TEST_QUERY_2 TTQ_RESULT
VALUES json_object('a' value 2);
I have the same error.
ORA-03001: unimplemented feature
Maybe these 2 problems are related.
code (Oracle 21)
Your first problem is that you are using the wrong syntax: you have omitted the brackets around the column identifiers and the column values:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 (TTQ_RESULT)
VALUES ( to_clob(utl_raw.cast_to_raw (json_object(a value '2'))));
Which fixes the unimplemented feature exception but now you get:
ORA-00984: column not allowed here
Which is because you are using a different query to the SELECT: you have changed json_object('a' value 2) to json_object(a value '2'), and the query cannot find a column a.
If you fix that by using the original code from the SELECT, with 'a' as a string literal and not as a column identifier:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 (TTQ_RESULT)
VALUES ( to_clob(utl_raw.cast_to_raw (json_object('a' value 2))));
You will then get the error:
ORA-02290: check constraint (FIDDLE_FCJHJVMCPHKXUCUPDUSV.RESULT) violated
Because converting to a RAW and then to a CLOB will mangle the value.
You need something much simpler:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 (TTQ_RESULT)
VALUES (json_object('a' value 2));
or:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 (TTQ_RESULT)
VALUES (EMPTY_CLOB() || json_object('a' value 2));
Which both work.
db<>fiddle here
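And yes, the two problems are related: the Oracle 21 insert fails with ORA-03001 for the same reason, the missing brackets around the column list and the values. The same fix should work there too; an untested sketch against the native JSON column:
INSERT INTO TABLE_TEST_QUERY_2 (TTQ_RESULT)
VALUES (json_object('a' value 2));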

Oracle reading JSON data using json_query

While working with the Oracle json datatype and trying to extract data from it, I am not able to extract the name & value elements. I tried all the notations I know, but I keep getting null.
select json_query(po_document, '$.actions.parameters[0]') from j_purchaseorder where ID='2';
You can use the JSON_VALUE function as follows:
SQL> select JSON_VALUE('{"_class":"123", "name":"tejash","value":"so"}', '$.name') as name,
2 JSON_VALUE('{"_class":"123", "name":"tejash","value":"so"}', '$.value') as value
3 from dual;
NAME       VALUE
---------- ----------
tejash     so
SQL>
Thanks for your help. I got the required output using the query below:
select json_value(json_query(po_document, '$.actions.parameters[0]'), '$.value')
from j_purchaseorder
where ID = '2'
  and json_value(json_query(po_document, '$.actions.parameters[0]'), '$.name') = 'SERVERUSER';
As explained, for example, in the Oracle documentation, multiple calls to JSON_VALUE() on the same JSON document may result in very poor performance. When we need to extract multiple values from a single document, it is often best (for performance) to make a single call to JSON_TABLE().
Here is how that would work on the provided document. First I create and populate the table, then I show the query and the output. Note the handling of column (attribute) "_class", both in the JSON document and in the SQL SELECT statement. In both cases the name must be enclosed in double-quotes, because it begins with an underscore.
create table j_purchaseorder (
id number primary key,
po_document clob check (po_document is json)
);
insert into j_purchaseorder (id, po_document) values (
2, '{"_class":"hudson.model.StringParameterValue","name":"SERVERUSER","value":"avlipwcnp04"}'
);
commit;
select "_CLASS", name, value
from j_purchaseorder
cross apply
json_table(po_document, '$'
columns (
"_CLASS" varchar2(40) path '$."_class"',
name varchar2(20) path '$.name',
value varchar2(20) path '$.value'
)
)
where id = 2
;
_CLASS NAME VALUE
---------------------------------------- ------------------ ------------------
hudson.model.StringParameterValue SERVERUSER avlipwcnp04
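If the real document is nested the way the original query suggests (an actions.parameters array), the same single-call approach works by pointing JSON_TABLE at that array path. A sketch, assuming that hypothetical document shape:
-- assumed shape: {"actions":{"parameters":[{"name":"SERVERUSER","value":"avlipwcnp04"}]}}
select jt.name, jt.value
from j_purchaseorder p,
     json_table(p.po_document, '$.actions.parameters[*]'
       columns (
         name  varchar2(20) path '$.name',
         value varchar2(20) path '$.value'
       )
     ) jt
where p.id = 2
  and jt.name = 'SERVERUSER';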

Is it possible to transform a settings table with unique names into a json hash object in postgresql?

I have a settings table with two columns - name and value. Names are unique. I can easily read it into memory and then create a dictionary using the entry names as the keys.
I was wondering whether this can be done entirely from the SQL using some postgresql functions and applying the row_to_json function at the end.
I have version 9.2
Is it possible? It should be.
I think what you'd have to do is create a function that takes a record as an argument, transforms it into a record of arbitrary type, and turns that into JSON.
This was done on 9.1 with the json extension.
create or replace function to_json(test) returns json language plpgsql
as $$
declare
  t_row record;
begin
  -- build "SELECT $1 AS <name>" dynamically, so the setting's name becomes the JSON key
  EXECUTE $E$ SELECT $1 AS $E$ || quote_ident($1.name) INTO t_row
    USING $1.value;
  RETURN row_to_json(t_row);
end;
$$;
Then I can:
select * from test;
name | value
-------+--------
test1 | foo
test2 | foobar
(2 rows)
SELECT to_json(test) from test;
to_json
--------------------
{"test1":"foo"}
{"test2":"foobar"}
Now if you want to merge these all into one object you have a little more work to do but it could be done using the same basic tools.
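For what it's worth, on PostgreSQL 9.4 and later the merge is a one-liner, because json_object_agg aggregates key/value pairs into a single object (not available on the asker's 9.2):
-- one object built from the whole settings table
SELECT json_object_agg(name, value) FROM test;
-- {"test1" : "foo", "test2" : "foobar"}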
This should work in postgres-9.3. (untested, since I don't have 9.3 available here yet)
DROP SCHEMA tmp CASCADE;
CREATE SCHEMA tmp ;
SET search_path=tmp;
CREATE table pipo (name varchar NOT NULL PRIMARY KEY
, value varchar);
INSERT INTO pipo (name, value ) VALUES
('memory' , '10Mb'), ('disk' , '1Gb'), ('video' , '10Mpix/sec'), ('sound' , '100dB');
SELECT row_to_json( ROW(p.name,p.value) )
FROM pipo p ;
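One caveat: wrapping the columns in a bare ROW(...) constructor loses the column names, so the keys come out as f1 and f2. Passing the row variable itself keeps the names, and this works on 9.2 as well:
SELECT row_to_json(p) FROM pipo p;
-- e.g. {"name":"memory","value":"10Mb"}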

PostgreSQL: insert data into table from json

Right now I manually parse json into an insert string, like so:
insert into Table (field1, field2) values (val1, val2)
but it's not a comfortable way to insert data from json!
I found the function json_populate_record and tried to use it:
create table test (id serial, name varchar(50));
insert into test select * from json_populate_record(NULL::test, '{"name": "John"}');
but it fails with the message: null value in column "id" violates not-null constraint
PG knows that id is serial but pretends to be a fool. It does the same for all fields with defaults.
Is there a more elegant way to insert data from json into a table?
There's no easy way for json_populate_record to return a marker that means "generate this value".
PostgreSQL does not allow you to insert NULL to specify that a value should be generated. If you ask for NULL, Pg expects you to mean NULL and doesn't want to second-guess you. Additionally, it's perfectly OK to have a generated column that has no NOT NULL constraint, in which case it's perfectly fine to insert NULL into it.
If you want to have PostgreSQL use the table default for a value there are two ways to do this:
Omit that column from the INSERT column-list; or
Explicitly write DEFAULT, which is only valid in a VALUES expression
Since you can't use VALUES(DEFAULT, ...) here, your only option is to omit the column from the INSERT column-list:
regress=# create table test (id serial primary key, name varchar(50));
CREATE TABLE
regress=# insert into test(name) select name from json_populate_record(NULL::test, '{"name": "John"}');
INSERT 0 1
Yes, this means you must list the columns. Twice, in fact, once in the SELECT list and once in the INSERT column-list.
To avoid the need for that, PostgreSQL would need to have a way of specifying DEFAULT as a value for a record, so json_populate_record could return DEFAULT instead of NULL for columns that aren't defined. That might not be what you intended for all columns, and it would lead to the question of how DEFAULT would be treated when json_populate_record was not being used in an INSERT expression.
So I guess json_populate_record might be less useful than you hoped for rows with generated keys.
Continuing from Craig's answer, you probably need to write some sort of stored procedure to perform the necessary dynamic SQL, like the following:
CREATE OR REPLACE FUNCTION jsoninsert(relname text, reljson text)
RETURNS record AS
$BODY$
DECLARE
  ret RECORD;
  inputstring text;
BEGIN
  -- build a comma-separated, safely quoted list of the json's top-level keys
  SELECT string_agg(quote_ident(key), ',') INTO inputstring
  FROM json_object_keys(reljson::json) AS X (key);
  -- insert only the columns present in the json, so table defaults apply to the rest
  EXECUTE 'INSERT INTO ' || quote_ident(relname)
    || '(' || inputstring || ') SELECT ' || inputstring
    || ' FROM json_populate_record( NULL::' || quote_ident(relname) || ' , json_in($1)) RETURNING *'
  INTO ret USING reljson::cstring;
  RETURN ret;
END;
$BODY$
LANGUAGE plpgsql VOLATILE;
Which you'd then call with
SELECT jsoninsert('test', '{"name": "John"}');
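To confirm the defaults fired, select the row back; the serial id should be populated. Assuming a fresh test table, you would expect something like:
SELECT * FROM test;
--  id | name
-- ----+------
--   1 | John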