Because I don't use Oracle 21, I can't use the JSON type in a table definition.
CREATE TABLE TABLE_TEST_QUERY_2
(
TTQ_NR INTEGER GENERATED BY DEFAULT AS IDENTITY,
TTQ_QUERY_TO_BE_TESTED VARCHAR2 (4000 BYTE),
TTQ_RESULT CLOB,
--RESULT JSON, UPGRADE oracle 21
TTQ_TTQ_CREATION_DATE DATE DEFAULT SYSDATE,
TTQ_ALREADY_TESTED INTEGER DEFAULT 0,
TTQ_TEST_PASSED INTEGER,
PRIMARY KEY (TTQ_NR),
CONSTRAINT RESULT CHECK (TTQ_RESULT IS JSON)
)
I want to add a JSON object in TTQ_RESULT, not a string representing JSON.
I have a way to transform JSON into a CLOB:
select to_clob(utl_raw.cast_to_raw (json_object('a' value 2))) from dual;
But it does not work if I try to insert the CLOB created from the JSON into the table:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 TTQ_RESULT
VALUES to_clob(utl_raw.cast_to_raw (json_object(a value '2')));
[Error] Execution (3: 13): ORA-03001: unimplemented feature
code (Oracle 18)
Update:
I've tried to add JSON on db<>fiddle with Oracle 21. This time I'm using the JSON type to define the column.
CREATE TABLE TABLE_TEST_QUERY_2
(
TTQ_NR INTEGER GENERATED BY DEFAULT AS IDENTITY,
TTQ_QUERY_TO_BE_TESTED VARCHAR2 (4000 BYTE),
TTQ_RESULT JSON,
TTQ_TTQ_CREATION_DATE DATE DEFAULT SYSDATE,
TTQ_ALREADY_TESTED INTEGER DEFAULT 0,
TTQ_TEST_PASSED INTEGER,
PRIMARY KEY (TTQ_NR)
)
INSERT INTO TABLE_TEST_QUERY_2 TTQ_RESULT
VALUES json_object('a' value 2);
I have the same error.
ORA-03001: unimplemented feature
Maybe these two problems are related.
code (Oracle 21)
Your first problem is a syntax error: you have omitted the parentheses around the column list and around the value in the VALUES clause:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 (TTQ_RESULT)
VALUES ( to_clob(utl_raw.cast_to_raw (json_object(a value '2'))));
Which fixes the unimplemented feature exception but now you get:
ORA-00984: column not allowed here
This is because you have changed the expression from the SELECT: json_object('a' value 2) became json_object(a value '2'), and the query cannot find a column called a.
If you fix that by using the original code from the SELECT, with 'a' as a string literal and not a column identifier:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 (TTQ_RESULT)
VALUES ( to_clob(utl_raw.cast_to_raw (json_object('a' value 2))));
You will then get the error:
ORA-02290: check constraint (FIDDLE_FCJHJVMCPHKXUCUPDUSV.RESULT) violated
Because converting to a RAW and then to a CLOB will mangle the value.
You need something much simpler:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 (TTQ_RESULT)
VALUES (json_object('a' value 2));
or:
INSERT INTO BV_OWN.TABLE_TEST_QUERY_2 (TTQ_RESULT)
VALUES (EMPTY_CLOB() || json_object('a' value 2));
Which both work.
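If you want to check what was stored, a quick sanity query (just a sketch against the table above; it assumes the identity column gave the new row TTQ_NR = 1, and JSON_VALUE is available from Oracle 12c onwards):
-- read a scalar back out of the stored CLOB to confirm it is valid JSON
SELECT JSON_VALUE(TTQ_RESULT, '$.a') AS a_value
FROM BV_OWN.TABLE_TEST_QUERY_2
WHERE TTQ_NR = 1;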
db<>fiddle here
I'm trying to insert data into a table using this query:
INSERT INTO table (
url,
v_count,
v_date)
SELECT
url,
v_count,
v_date FROM json_populate_recordset(null::record,
'[{"url_site":"test.com","visit_count":1,"visit_date":"2022-08-31"},
{"url_site":"dev.com","visit_count":2,"visit_date":"2022-08-31"}]'::json)
AS ("url" varchar(700), "v_count" integer, "v_date" date)
And I'm getting this error:
null value in column "v_date" of relation table violates not null constraint
Since my JSON could have hundreds of entries at times, how should I send the date in my JSON?
Is there another (more efficient) way to insert this data into the table?
Edit: in Postico 1.5.20 my example above works as long as the JSON keys are named the same as the table columns. How can I reference different names in my JSON keys?
Since v_date can resolve to null, you'll need to either skip those rows or provide a value when null appears.
To skip the null values, you can add a WHERE v_date IS NOT NULL clause to your SELECT statement.
Otherwise, you can use COALESCE() to assign a value when v_date is null. For example ... SELECT url, v_count, COALESCE(v_date, now()) FROM json_populate_recordset...
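On the edit: the names in the AS column-definition list have to match the JSON keys, but you can map them onto differently named table columns in the outer SELECT. A sketch using json_to_recordset (your_table stands in for your real table name):
INSERT INTO your_table (url, v_count, v_date)
SELECT url_site,
       visit_count,
       COALESCE(visit_date, CURRENT_DATE)  -- fallback if a date is missing
FROM json_to_recordset(
  '[{"url_site":"test.com","visit_count":1,"visit_date":"2022-08-31"},
    {"url_site":"dev.com","visit_count":2,"visit_date":"2022-08-31"}]'::json)
AS x(url_site varchar(700), visit_count integer, visit_date date);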
When working with the JSON datatype, is there a way to ensure the input JSON has certain elements? I don't mean a primary key: I want the JSON that gets inserted to have at least the id and name elements. It can have more, but at a minimum id and name must be there.
Thanks.
This function checks what you want:
create or replace function json_has_id_and_name(val json)
returns boolean language sql as $$
  select coalesce(
    (
      -- true when both 'id' and 'name' appear among the object's top-level keys
      select array['id', 'name'] <@ array_agg(key)
      from json_object_keys(val) key
    ),
    false)
$$;
select json_has_id_and_name('{"id":1, "name":"abc"}'), json_has_id_and_name('{"id":1}');
 json_has_id_and_name | json_has_id_and_name
----------------------+----------------------
 t                    | f
(1 row)
You can use it in a check constraint, e.g.:
create table my_table (
id int primary key,
jdata json check (json_has_id_and_name(jdata))
);
insert into my_table values (1, '{"id":1}');
ERROR: new row for relation "my_table" violates check constraint "my_table_jdata_check"
DETAIL: Failing row contains (1, {"id":1}).
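If you can use the jsonb type instead of json, the same requirement can be written without a helper function, using the ?& operator (do all of the listed strings exist as top-level keys?). A sketch with a hypothetical table name:
create table my_table_jsonb (
  id int primary key,
  jdata jsonb check (jdata ?& array['id', 'name'])
);
insert into my_table_jsonb values (1, '{"id":1}');               -- rejected by the check
insert into my_table_jsonb values (2, '{"id":2, "name":"abc"}'); -- accepted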
I have tried to insert a value into a table in MySQL but I can't make it work. I am using the following query:
INSERT into articulo values (32,'Sala',CAST('$10,000.45999' AS DECIMAL(10,5)),40.2399,200.2399,3,'kid 3');
MySQL shows the following error:
1 row(s) affected, 1 warning(s): 1292 Truncated incorrect DECIMAL value: '$10,000.45999'
And it stores a truncated, incorrect value in the table.
Of course, I created the table 'articulo' beforehand:
CREATE Table articulo
(
id_art int NOT NULL,
nom_art varchar (25) DEFAULT 'XXXXXXXXXXXXX',
prec_art decimal (10,5) DEFAULT 0.000,
peso_art decimal (10,5),
existencia float,
color_art int, CONSTRAINT chk_color1 CHECK (color_art between 0 and 20),
um_art varchar (10) DEFAULT 'DEF_PZA',
primary key (id_art)
);
I have seen many examples of casting, but all of them use the CAST function inside a SELECT statement.
Any idea how I can do what I want?
I want to store $10,000.45999 into the table as a decimal value.
This would be 10000.45999
Thanks for your support!
You can insert the value by fixing up the number. For your case, this should work:
INSERT into articulo
SELECT 32, 'Sala',
CAST(REPLACE(REPLACE('$10,000.45999', ',', ''), '$', '') AS DECIMAL(10,5)),
40.2399, 200.2399, 3, 'kid 3';
Strictly speaking, the cast() is not necessary, but I like to avoid implicit conversions -- these can lead to hard-to-detect problems.
As a note: it is a good idea to include the column list in the insert statement.
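For example, here is the same insert sketched with an explicit column list (columns taken from the CREATE TABLE above):
INSERT INTO articulo (id_art, nom_art, prec_art, peso_art, existencia, color_art, um_art)
VALUES (32, 'Sala',
        CAST(REPLACE(REPLACE('$10,000.45999', ',', ''), '$', '') AS DECIMAL(10,5)),
        40.2399, 200.2399, 3, 'kid 3');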
You can't use commas or the dollar symbol in your value in that query.
You could rewrite your query as:
INSERT into articulo values (32,'Sala',CAST('10000.45999' AS DECIMAL(10,5)),40.2399,200.2399,3,'kid 3');
However, you don't need to cast the value as a decimal if your column is already well defined as DECIMAL(10,5).
Simply write:
INSERT into articulo values (32,'Sala',10000.45999,40.2399,200.2399,3,'kid 3');
INSERT INTO `ree`.`media`
(`CREATEDATE`, `FILETYPE`, `MIMETYPE`, `MLSNUMBER`, `MODIFYDATE`, `POSITION`, `URL`) VALUES
('2011-12-27T15:00:16', 'PRIMARY PHOTO', 'image/jpeg', 5030011414, '2011-12-27T15:00:16', 1, 'http://image.realcomponline.com/photos.rps?PATH=PROPERTY/57FA/57FAA44C48854C/3QQGONGA03I7CN.jpg&g=100&sp=0&l=0&t=0&r=10000&b=10000&o=0&1cf=0&w=320&h=240'),
('2011-12-27T15:00:18', 'PRIMARY PHOTO', 'image/jpeg', 5030011507, '2011-12-27T15:00:18', 1, 'http://image.realcomponline.com/photos.rps?PATH=PROPERTY/6FC7/6FC7B6F88D8F45/3SQGONGA01RXH1.jpg&g=100&sp=0&l=0&t=0&r=10000&b=10000&o=0&1cf=0&w=320&h=240')
Error: Duplicate entry '2147483647-1' for key 'uneek'
It seems like my UNIQUE key of MLSNUMBER isn't parsing the entire number before differentiating the two inserts.
Both start with 5030011...
Here is how I build the key:
ADD UNIQUE uneek ( MLSNUMBER , POSITION )
Is there a way to build this key so it accepts the entire 10 digits instead of the first 7 digits?
Thanks in advance!
You have run out of the range of INT.
2147483647 = 2^31 - 1
You might change that field from INT to a 64-bit type, e.g.:
ALTER TABLE media MODIFY COLUMN MLSNUMBER BIGINT NOT NULL;
(Adjust UNSIGNED and NOT NULL to your needs.)
It seems that you've defined the MLSNUMBER column as an Integer type, and the large values are getting truncated to the largest 32 bit signed value, which is 2147483647.
I verified this by attempting to add the value 5030011507 to an Int column, and it ended up storing the value 2147483647. That matches the number in your error message. I also received a warning when the value was truncated.
You can try changing the column type to BIGINT, which will allow values up to 9223372036854775807.
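For example, a sketch of the cleanup using the table and column names from the question (rows that were already clamped to INT's maximum will need to be re-imported after the change):
-- rows whose MLSNUMBER was silently clamped to 2^31 - 1
SELECT * FROM `ree`.`media` WHERE `MLSNUMBER` = 2147483647;
-- widen the column so 10-digit MLS numbers fit
ALTER TABLE `ree`.`media` MODIFY COLUMN `MLSNUMBER` BIGINT NOT NULL;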
I am using MySQL database.
I have one table having column with datatype binary(16).
I need help with the insert statement for this table.
Example:
CREATE TABLE `assignedresource` (
`distid` binary(16) NOT NULL
)
insert into assignedresource values ('9fad5e9e-efdf-b449');
Error : Lookup Error - MySQL Database Error: Data too long for column 'distid' at row 1
How to resolve this issue?
You should remove the hyphens to make the value match the length of the field...
Example:
CREATE TABLE `assignedresource` (
`distid` binary(16) NOT NULL
)
insert into assignedresource values ('9fad5e9eefdfb449');
Also, the standard MySQL notation for a binary (hex) literal is X'9fad5e9eefdfb449', i.e.
insert into assignedresource values (X'9fad5e9eefdfb449');
Well, assuming that you want to insert a hexadecimal string, first you need to remove the dashes and then "unhex" your string before inserting it into a binary(16) column. The code would go like this:
INSERT INTO `assignedresource` VALUES(UNHEX(REPLACE('9fad5e9e-efdf-b449','-','')));
Also... the "usable" data you are inserting is actually 8 bytes after undashing it, so binary(8) would do fine if you plan on not storing the dashes.
You can strip the hyphens and prepend 0x to the value, unquoted, like this:
insert into assignedresource values (0x9fad5e9eefdfb449);
As well as, as this (mentioned in other answers):
insert into assignedresource values (X'9fad5e9eefdfb449');
Both are valid notation for a hexadecimal literal.
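If the value is really meant to be a full 36-character UUID (binary(16) is a common way to store one), the usual pattern is the same REPLACE/UNHEX trick applied to the whole UUID; a sketch using MySQL's own UUID() function for illustration:
insert into assignedresource values (UNHEX(REPLACE(UUID(), '-', '')));
-- read it back in a human-readable form
select HEX(distid) from assignedresource;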
Your string is 18 characters long; alternatively, change the column to match:
CREATE TABLE `assignedresource` (
`distid` binary(18) NOT NULL
)