MySQL JSON reverse search

I have a MySQL table with a column of type json. The values of this column are JSON arrays, not JSON objects. I need to find the records of this table where at least one value in the JSON column is a substring of a given string/phrase.
Let's suppose the table looks like this:
create table if not exists test(id int, col json);
insert into test values (1, '["ab", "cd"]');
insert into test values (2, '["ef", "gh", "ij"]');
insert into test values (3, '["xyz"]');
If the input string/phrase is "acf ghi z", the second row must be returned as the result, because "gh" is a substring of the input. I read a lot about json_contains, json_extract, json_search and even json_overlaps, but couldn't manage to solve this problem.
What is the correct sql syntax to retrieve the related rows?
MySQL version is 8.0.20

You can use json_table() to extract the JSON array as rows in a table. Then just filter:
select *
from test t cross join
json_table(t.col, '$[*]' columns (str varchar(255) path '$')) j
where 'acf ghi z' like concat('%', j.str, '%');
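If several elements of the same array match, the cross join returns that row once per matching element. A small variation of the same query (same schema, nothing else assumed) adds DISTINCT so each matching row comes back only once:
select distinct t.*
from test t cross join
json_table(t.col, '$[*]' columns (str varchar(255) path '$')) j
where 'acf ghi z' like concat('%', j.str, '%');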


MySQL - append new array elements to JSON column

I have a table with JSON columns that default to empty arrays ([]).
Old table:
id    myJson
---   ----------
A1    [1, 2]
A12   []
I want the table updated to the following:
id    myJson
---   ------------------
A1    [1, 2, 321, 432]
A12   [222]
Tried - INSERT INTO table (id, myJson) VALUES ("A1", "[321, 432]"), ("A12", "[222]") ON DUPLICATE KEY UPDATE myJson = JSON_ARRAY_APPEND(myJson, "$", myJson)
The query above and others I have tried so far did not produce the desired result.
What query can I use to append the new arrays to the old ones, as shown in the tables?
What version of MySQL are you using?
One option is to use JSON_MERGE_PRESERVE or JSON_MERGE_PATCH (as needed):
INSERT INTO `table` (`id`, `myJson`)
VALUES ('A1', '[321, 432]'), ('A12', '[222]') AS `new`
ON DUPLICATE KEY UPDATE
`table`.`myJson` = JSON_MERGE_PRESERVE(`table`.`myJson`, `new`.`myJson`);
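The VALUES ... AS `new` row alias used above requires MySQL 8.0.19 or later, which is why the version matters. As a sketch for earlier 8.0 releases (using the older VALUES() function form, which is deprecated from 8.0.20 onwards), the same idea would look like:
INSERT INTO `table` (`id`, `myJson`)
VALUES ('A1', '[321, 432]'), ('A12', '[222]')
ON DUPLICATE KEY UPDATE
`myJson` = JSON_MERGE_PRESERVE(`myJson`, VALUES(`myJson`));
Note the difference between the two merge functions here: JSON_MERGE_PRESERVE concatenates the two arrays, while JSON_MERGE_PATCH would simply replace the stored array with the new one.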

Transform records of a column from int to JSON?

I have a column with records of int datatype. I want to first convert them to varchar so I can represent them as JSON inside the table. Is this possible?
Edit:
I created a test table so I could explain better what I meant with my question.
This is the query I made:
SELECT TOP (1000) [test_id], [ToJsonTestValue]
FROM [Test].[dbo].[Test]
The query returns plain int values. What I want is to convert the column "ToJsonTestValue" to actual JSON: it is of datatype int, but the intent is to alter it to varchar and then represent it as JSON.
Solution
I was overthinking on this one. I just needed to make a set and an update like this:
UPDATE dbo.TestTwo
SET ToJsonTestValue = '["' + ToJsonTestValue + '"]'
The output wraps each value as a single-element JSON array of strings, e.g. ["1"].
Original answer:
If I understand the question correctly, what you may try is to generate JSON content for each row using FOR JSON PATH. The following basic example is a possible solution to your problem:
Table:
CREATE TABLE Test (
TestId int,
ToJsonTestValue int
)
INSERT INTO Test (TestId, ToJsonTestValue)
VALUES
(2, 1),
(3, 1),
(4, 2),
(5, 3)
ALTER and UPDATE table:
ALTER TABLE Test ALTER COLUMN ToJsonTestValue varchar(1000)
UPDATE Test
SET ToJsonTestValue = (SELECT ToJsonTestValue FOR JSON PATH)
Result:
TestId ToJsonTestValue
--------------------------------
2 [{"ToJsonTestValue":"1"}]
3 [{"ToJsonTestValue":"1"}]
4 [{"ToJsonTestValue":"2"}]
5 [{"ToJsonTestValue":"3"}]
Update:
Note that with FOR JSON you can't generate a JSON array of scalar values ([1, 2, 3]), but you may try an approach using JSON_MODIFY() (of course, concatenating strings to build the expected output is always an option):
ALTER TABLE Test ALTER COLUMN ToJsonTestValue varchar(1000)
UPDATE Test
SET ToJsonTestValue = JSON_MODIFY('[]', 'append $', ToJsonTestValue)
Result:
TestId ToJsonTestValue
----------------------
2 ["1"]
3 ["1"]
4 ["2"]
5 ["3"]
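As an optional follow-up sketch (assuming SQL Server 2016 or later, where ISJSON is available), you can check that every updated value now parses as valid JSON:
SELECT TestId, ToJsonTestValue, ISJSON(ToJsonTestValue) AS IsValidJson
FROM Test;
ISJSON returns 1 for valid JSON and 0 otherwise.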

Oracle reading JSON data using json_query

While working with the Oracle JSON datatype and trying to extract data from it, I am not able to extract the name and value elements. I tried all the notations I know of, but I keep getting null.
select json_query(po_document, '$.actions.parameters[0]') from j_purchaseorder where ID='2';
You can use the JSON_VALUE function as follows:
SQL> select JSON_VALUE('{"_class":"123", "name":"tejash","value":"so"}', '$.name') as name,
2 JSON_VALUE('{"_class":"123", "name":"tejash","value":"so"}', '$.value') as value
3 from dual;
NAME VALUE
---------- ----------
tejash so
SQL>
Thanks for your help. I got the required output using the query below:
select json_value(json_query(po_document, '$.actions.parameters[0]'),'$.value') from j_purchaseorder where ID='2' and
json_value(json_query(po_document, '$.actions.parameters[0]'),'$.name') = 'SERVERUSER';
As explained, for example, in the Oracle documentation, multiple calls to JSON_VALUE() on the same JSON document may result in very poor performance. When we need to extract multiple values from a single document, it is often best (for performance) to make a single call to JSON_TABLE().
Here is how that would work on the provided document. First I create and populate the table, then I show the query and the output. Note the handling of column (attribute) "_class", both in the JSON document and in the SQL SELECT statement. In both cases the name must be enclosed in double-quotes, because it begins with an underscore.
create table j_purchaseorder (
id number primary key,
po_document clob check (po_document is json)
);
insert into j_purchaseorder (id, po_document) values (
2, '{"_class":"hudson.model.StringParameterValue","name":"SERVERUSER","value":"avlipwcnp04"}'
);
commit;
select "_CLASS", name, value
from j_purchaseorder
cross apply
json_table(po_document, '$'
columns (
"_CLASS" varchar2(40) path '$."_class"',
name varchar2(20) path '$.name',
value varchar2(20) path '$.value'
)
)
where id = 2
;
_CLASS NAME VALUE
---------------------------------------- ------------------ ------------------
hudson.model.StringParameterValue SERVERUSER avlipwcnp04
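Going back to the path from the original question, the same single JSON_TABLE call also works when the name/value pair is nested. This is only a sketch and assumes the real document keeps that pair under $.actions.parameters, as the question's path suggests:
select p.name, p.value
from j_purchaseorder
cross apply
json_table(po_document, '$.actions.parameters[0]'
columns (
name varchar2(20) path '$.name',
value varchar2(20) path '$.value'
)
) p
where id = 2
and p.name = 'SERVERUSER';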

How to convert text to jsonb entirely for a postgresql column

What I have is a text column in PostgreSQL that I want to convert to a jsonb column.
What I have tried is this:
CREATE TABLE test (id serial, sec text, name text);
INSERT INTO test (id, sec, name) VALUES (1,'{"gender":"male","sections":{"a":1,"b":2}}','subject');
ALTER TABLE test ALTER COLUMN sec TYPE JSONB USING sec::JSONB;
This did convert the text column to jsonb.
However, if I try to query:
SELECT sec->>'sections'->>'a' FROM test
I get an error.
I see the conversion is done only at one level (i.e., sec->>'sections' works fine).
The query SELECT pg_typeof(sec->>'sections') FROM test; gives me the type as text.
Is there a way I can convert the text to jsonb entirely, such that I can query SELECT sec->>'sections'->>'a' FROM test; successfully?
I don't want to convert the text to json in the query like the one below, as I need to create an index on 'a' later.
select (sec->>'sections')::json->>'a' from test;
The operator ->> returns text as a result. Use -> if you want jsonb:
select
pg_typeof(sec->>'sections') a,
pg_typeof(sec->'sections') b
from test;
a | b
------+-------
text | jsonb
(1 row)
Use:
select sec->'sections'->>'a'
from test;
Or better yet, use the operator #>>:
SELECT sec #>> '{sections,a}' FROM test;
And to use this in an expression index you need extra parentheses:
CREATE INDEX foo ON test ((sec #>> '{sections,a}'));
Make sure to use a matching expression (without parentheses) in queries to allow index usage.
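For example, a query of this shape (the filter value is just a placeholder) uses the exact same expression and can therefore use the index:
SELECT * FROM test WHERE sec #>> '{sections,a}' = '1';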

Query Postgres for number of items in JSON

I am running Postgres 9.3 and have a problem with a query involving a JSON column that I cannot seem to crack.
Let's assume this is the table:
# CREATE TABLE aa (a int, b json);
# INSERT INTO aa VALUES (1, '{"f1":1,"f2":true}');
# INSERT INTO aa VALUES (2, '{"f1":2,"f2":false,"f3":"Hi I''m \"Dave\""}');
# INSERT INTO aa VALUES (3, '{"f1":3,"f2":true,"f3":"Hi I''m \"Popo\""}');
I now want to create a query that returns all rows that have exactly three items/keys in the root node of the JSON column (i.e., rows 2 and 3). Whether the JSON is nested doesn't matter.
I tried to use json_object_keys and json_each but couldn't get it to work.
json_each(json) should do the job. Counting only root elements:
SELECT aa.*
FROM aa, json_each(aa.b) elem
GROUP BY aa.a -- possible, assuming aa.a is the primary key
HAVING count(*) = 3;
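A variation that does not rely on aa.a being unique (a sketch over the same table) counts the root elements in a correlated subquery instead:
SELECT *
FROM aa
WHERE (SELECT count(*) FROM json_each(aa.b)) = 3;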