I have saved data in PostgreSQL as shown below:
{"tags": "Tag 1,Tag 2,Tag 3"}
{"tags": "Tag 1,Tag 4,Tag 5"}
{"tags": "Tag 6,Tag 1,Tag 2"}
I want to search for records where 'Tag 2' or 'Tag 3' exists.
The table schema and insert statements are given below.
--create table
CREATE TABLE "tblIrsInputTagging" (
"IrsInputTaggId" serial NOT NULL,
"Irs_tags" json NOT NULL, CONSTRAINT "tblIrsInputTagging_pkey"
PRIMARY KEY ("IrsInputTaggId")
) WITH ( OIDS=FALSE );
ALTER TABLE "tblIrsInputTagging" OWNER TO "postgre";
--insert json record
INSERT INTO "tblIrsInputTagging" ("Irs_tags")
VALUES ( '{"tags": "Tag 1,Tag 2,Tag 3"}' );
INSERT INTO "tblIrsInputTagging" ("Irs_tags")
VALUES ( '{"tags": "Tag 1,Tag 4,Tag 5"}' );
INSERT INTO "tblIrsInputTagging" ("Irs_tags")
VALUES ( '{"tags": "Tag 6,Tag 1,Tag 2"}' );
I don't think there is a specific function to check individual JSON items, but you can do so by casting the column to text and then checking it with the LIKE or ILIKE operator:
SELECT col1, col2 FROM table_name WHERE CAST(col1 AS text) ilike '%value%'
Note that col1 here is a json column.
Let's say the column name is tags. In that case, you can query as below. If you need something more specific, let me know and I will update the snippet.
SELECT col_1, col_2, ... col_n
FROM your_table
WHERE tags::json->>'tags' ILIKE '%Tag 2%'
   OR tags::json->>'tags' ILIKE '%Tag 3%';
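One caveat with the pattern-match approach: `'%Tag 2%'` also matches longer tags such as "Tag 20". A common workaround is to pad the comma-separated list with delimiters and match the whole tag. The sketch below illustrates this in Python with SQLite's JSON1 functions (where `json_extract` plays the role of PostgreSQL's `->>`); the "Tag 20" row is my own addition to show the false positive, not part of the original data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE t (id INTEGER PRIMARY KEY, irs_tags TEXT)')
rows = [
    '{"tags": "Tag 1,Tag 2,Tag 3"}',
    '{"tags": "Tag 1,Tag 4,Tag 5"}',
    '{"tags": "Tag 6,Tag 1,Tag 20"}',   # hypothetical row: "Tag 20" would fool a bare '%Tag 2%'
]
conn.executemany('INSERT INTO t (irs_tags) VALUES (?)', [(r,) for r in rows])

# Wrapping the extracted tag list in commas turns every tag into ',Tag N,',
# so only exact tags match, never substrings of longer tags.
matches = conn.execute("""
    SELECT id FROM t
    WHERE ',' || json_extract(irs_tags, '$.tags') || ',' LIKE '%,Tag 2,%'
       OR ',' || json_extract(irs_tags, '$.tags') || ',' LIKE '%,Tag 3,%'
""").fetchall()
print(matches)   # only the first row has an exact "Tag 2" or "Tag 3"
```

The same `',' || ... || ','` trick works verbatim in PostgreSQL against `"Irs_tags"->>'tags'`.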
Related
I am looking for a way to find rows where a given element of the JSON column matches a pattern.
Let's start with the MySQL table:
CREATE TABLE `person` (
`attributes` json DEFAULT NULL
);
INSERT INTO `person` (`attributes`)
VALUES ('[{"scores": 1, "name": "John"},{"scores": 1, "name": "Adam"}]');
INSERT INTO `person` (`attributes`)
VALUES ('[{"scores": 1, "name": "Johny"}]');
INSERT INTO `person` (`attributes`)
VALUES ('[{"scores": 1, "name": "Peter"}]');
How can I find all records where attributes[*].name matches the John* pattern?
For John*, the query should return two records (John and Johny).
SELECT DISTINCT person.*
FROM person
CROSS JOIN JSON_TABLE(person.attributes, '$[*]' COLUMNS (name TEXT PATH '$.name')) parsed
WHERE parsed.name LIKE 'John%';
https://sqlize.online/sql/mysql80/c9e4a3ffa159c4be8c761d696e06d946/
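The same expand-then-filter idea can be sketched outside MySQL. In the illustrative Python/SQLite version below, `json_each()` plays the role of `JSON_TABLE ... '$[*]'`, producing one row per array element, and `DISTINCT` collapses persons with more than one matching name:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE person (id INTEGER PRIMARY KEY, attributes TEXT)')
conn.executemany('INSERT INTO person (attributes) VALUES (?)', [
    ('[{"scores": 1, "name": "John"},{"scores": 1, "name": "Adam"}]',),
    ('[{"scores": 1, "name": "Johny"}]',),
    ('[{"scores": 1, "name": "Peter"}]',),
])

# json_each() expands the JSON array to one row per element; each element's
# JSON text is in je.value, from which we pull the name for the LIKE filter.
rows = conn.execute("""
    SELECT DISTINCT person.id
    FROM person, json_each(person.attributes) AS je
    WHERE json_extract(je.value, '$.name') LIKE 'John%'
    ORDER BY person.id
""").fetchall()
print(rows)   # -> [(1, ), (2, )]: the John/Adam row and the Johny row
```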
I have a JSON file and need to insert its data into an Oracle table dynamically. For that, the table columns and the corresponding JSON data columns are kept in a lookup table with the structure below:
Json_Column | Table_Column
When a JSON file is received, I select the table columns and corresponding JSON columns from the lookup table into variables A and B respectively. B defines the values to select from the JSON file, and A defines the table columns into which those values are inserted.
For that, I used the query below:
execute immediate ' insert into oracle_tbl('||A||') values ('||B||')';
But I got the error below:
ORA-00984: column not allowed here
Finally, I realised that the error occurs because the statement is not fetching the data from the JSON file: B contains only the column names, not the corresponding values from the JSON file. What I actually need are the values of the JSON columns listed in variable B.
Please advise how I can fix this issue. Thanks in advance.
Monica
I am reading the JSON data as follows:
FOR ALL_REC1 IN (SELECT * FROM JSON_TABLE (PJSON_DATA, '$.ExcelData[*]'
                   COLUMNS ( Column1 NUMBER PATH '$.col1',
                             Column2 NUMBER PATH '$.col2'
                           )))
Column1 and Column2 are the names held in variable B, but the values of Column1 and Column2 are not picked up by EXECUTE IMMEDIATE. What I get after executing the EXECUTE IMMEDIATE statement is:
execute immediate ' insert into oracle_tbl(tbl_clo1, tbl_col2) values (column1, column2)';
But column1 and column2 have values in the JSON file, and those values are not being used. I am expecting this instead:
execute immediate ' insert into oracle_tbl(tbl_clo1, tbl_col2) values (value of column1, value of column2)';
Maybe this example will give you some ideas:
WITH colnames(column1, column2) AS (
  SELECT 'col1', 'col2' FROM DUAL
)
SELECT
  REPLACE(
    REPLACE(
      REPLACE(
        REPLACE(
          q'~INSERT INTO ora_table( {col1name}, {col2name} ) VALUES( '{col1value}', '{col2value}')~',
          '{col1name}', c.column1
        ),
        '{col2name}', c.column2
      ),
      '{col1value}', t.c1
    ),
    '{col2value}', t.c2
  ) AS sql
FROM colnames c,
     JSON_TABLE(
       q'~[ { "col1" : "v11", "col2": "v12" }, { "col1" : "v21", "col2": "v22" } ]~',
       '$[*]'
       COLUMNS (
         c1 VARCHAR2 PATH '$.col1',
         c2 VARCHAR2 PATH '$.col2'
       )
     ) t;
Then "FOR rec IN ..." loop on the query and "execute immediate" the rec.sql returned.
I am trying to query a key/value table to fetch the instructor name where the salary is over 80000. I am not able to write the SELECT statement; I am using JSON here.
CREATE TABLE instructortest (
ID INT PRIMARY KEY,
info VARCHAR(max) NOT NULL
);
ALTER TABLE instructortest
ADD CONSTRAINT "valid JSON"
CHECK (ISJSON (info) = 1);
INSERT INTO instructortest
VALUES (78699,'{"name":"Pingr","Department":"Statistics","salary":"59303.62"}' )
select JSON_VALUE(info, '$.name') from instructortest
where ('$.salary') = 59303.62
You just need to use JSON_VALUE again in your WHERE clause:
select JSON_VALUE(info, '$.name') from instructortest
where cast(JSON_VALUE(info, '$.salary') as decimal(18,2)) > 80000
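The same extract-and-cast pattern can be sketched in Python with SQLite's `json_extract` standing in for `JSON_VALUE`. Since the sample data contains only one row (below 80000), I add a hypothetical second instructor so the filter has something to return:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE instructortest (id INTEGER PRIMARY KEY, info TEXT)')
conn.executemany('INSERT INTO instructortest VALUES (?, ?)', [
    (78699, '{"name":"Pingr","Department":"Statistics","salary":"59303.62"}'),
    (78700, '{"name":"Kim","Department":"Physics","salary":"91200.00"}'),  # hypothetical row
])

# The salary is stored as a JSON *string*, so cast it to a number before
# comparing, mirroring CAST(... AS decimal(18,2)) in the T-SQL answer.
names = conn.execute("""
    SELECT json_extract(info, '$.name')
    FROM instructortest
    WHERE CAST(json_extract(info, '$.salary') AS REAL) > 80000
""").fetchall()
print(names)   # -> [('Kim',)]
```

Without the cast, the comparison would be string-based and quietly give wrong answers.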
When working with the JSON data type, is there a way to ensure the input JSON has required elements? I don't mean a primary key; I want the JSON that gets inserted to have at least the id and name elements. It can have more, but at minimum id and name must be there.
thanks
This function checks what you want:
create or replace function json_has_id_and_name(val json)
returns boolean language sql as $$
  select coalesce(
    (
      select array['id', 'name'] <@ array_agg(key)
      from json_object_keys(val) key
    ),
    false)
$$;
select json_has_id_and_name('{"id":1, "name":"abc"}'), json_has_id_and_name('{"id":1}');
json_has_id_and_name | json_has_id_and_name
----------------------+----------------------
t | f
(1 row)
You can use it in a check constraint, e.g.:
create table my_table (
id int primary key,
jdata json check (json_has_id_and_name(jdata))
);
insert into my_table values (1, '{"id":1}');
ERROR: new row for relation "my_table" violates check constraint "my_table_jdata_check"
DETAIL: Failing row contains (1, {"id":1}).
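For comparison, here is an illustrative Python/SQLite sketch of the same constraint idea, using an inline CHECK instead of a helper function. One caveat: `json_extract` returns NULL both when a key is missing and when its value is JSON null, so this is slightly stricter than the key-existence check in the PostgreSQL answer:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Require that inserted JSON has both an 'id' and a 'name' element.
conn.execute("""
    CREATE TABLE my_table (
        id INTEGER PRIMARY KEY,
        jdata TEXT CHECK (
            json_extract(jdata, '$.id') IS NOT NULL
            AND json_extract(jdata, '$.name') IS NOT NULL
        )
    )
""")
conn.execute("INSERT INTO my_table VALUES (?, ?)", (1, '{"id":1, "name":"abc"}'))
try:
    # Missing 'name' -> the CHECK constraint rejects the row
    conn.execute("INSERT INTO my_table VALUES (?, ?)", (2, '{"id":2}'))
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```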
INSERT INTO `dictionary2` (word,verb)
SELECT SUBSTRING(word FROM 2)
FROM `dictionary1`
WHERE `dictionary1`.word LIKE "w%"
I have two tables, dictionary1(word) and dictionary2(word, verb).
I would like to insert into dictionary2 the values from dictionary1 where word starts with 'w' and the value is not already present in dictionary2.word.
In the same insert I would like to set the value of dictionary2.verb to 1.
You could use this:
INSERT INTO dictionary2 (word, verb)
SELECT dictionary1.word, 1
FROM
dictionary1 LEFT JOIN dictionary2
ON dictionary1.word = dictionary2.word
WHERE
dictionary1.word LIKE 'w%'
AND dictionary2.word IS NULL
Please see fiddle here.
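The anti-join above can be sketched end to end; the Python/SQLite version below uses made-up sample words (not from the original post) to show which rows get inserted:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dictionary1 (word TEXT);
    CREATE TABLE dictionary2 (word TEXT, verb TEXT);
    INSERT INTO dictionary1 (word) VALUES ('water'), ('wind'), ('apple'), ('walk');
    INSERT INTO dictionary2 (word, verb) VALUES ('walk', '1');
""")

# LEFT JOIN keeps every dictionary1 row; dictionary2.word IS NULL then keeps
# only the unmatched ones, i.e. the words not yet present in dictionary2.
conn.execute("""
    INSERT INTO dictionary2 (word, verb)
    SELECT d1.word, 1
    FROM dictionary1 d1
    LEFT JOIN dictionary2 d2 ON d1.word = d2.word
    WHERE d1.word LIKE 'w%' AND d2.word IS NULL
""")
result = sorted(conn.execute("SELECT word FROM dictionary2").fetchall())
print(result)   # -> [('walk',), ('water',), ('wind',)]
```

'walk' survives only once (it was already there), 'apple' is skipped by the LIKE, and 'water'/'wind' are the new rows.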
Try this:
SQL Query:
INSERT INTO `dictionary2` (word, verb)
SELECT word, 1
FROM `dictionary1`
WHERE `dictionary1`.word LIKE 't%'
  AND `dictionary1`.word NOT IN (SELECT word FROM dictionary2);
Sample data:
CREATE TABLE `dictionary2` (
ID INT AUTO_INCREMENT PRIMARY KEY,
word VARCHAR (50),
verb VARCHAR (50)
);
CREATE TABLE `dictionary1` (
ID INT AUTO_INCREMENT PRIMARY KEY,
word VARCHAR (50)
);
INSERT INTO `dictionary2`(word,verb)
VALUES
('test','test'),
('test1','test1');
INSERT INTO `dictionary1`(word) VALUES('test'),('testing1');
By the way, you can safely change LIKE 't%' into LIKE 'w%'. ;-)
SQL FIDDLE DEMO