Below is the JSON stored in a table called "Sample"; the column name is "argument". I want to fetch all records that have a particular value in a specified argument. I can query by the argument name, but not by a particular value, since each argument is an array of strings. (Note that my keys contain a "." in them.)
{
"arguments":{
"app.argument1.appId":["123", "456"],
"app.argument2.testId":["546", "567"]
}
}
This gives me all the records having a particular argument:
select * from sample where json_exists(argument, '$.arguments."app.argument1.appId"');
But I also need to match an argument value. I tried the query below but get a JSON expression error:
select * from sample where json_exists(argument, '$.arguments."app.argument1.appId[*]"?(# == "123"));
Please help. Thanks in advance.
You have the quotation marks in the wrong place; the double quotes should come before the square brackets for the array, not after them:
select *
from sample
where json_exists(
argument,
'$.arguments."app.argument1.appId"[*]?(# == "123")'
);
Which, for the sample data:
CREATE TABLE sample ( argument CLOB CHECK ( argument IS JSON ) );
INSERT INTO sample ( argument ) VALUES ( '{
"arguments":{
"app.argument1.appId":["123", "456"],
"app.argument2.testId":["546", "567"]
}
}');
Outputs:
| ARGUMENT |
| :----------------------------------------------------------------------------------------------------------------------- |
| {<br> "arguments":{<br> "app.argument1.appId":["123", "456"],<br> "app.argument2.testId":["546", "567"]<br> }<br>} |
db<>fiddle here
Do you know a way to do this in 12.1?
You could also use EXISTS with a correlated JSON_TABLE (which is available from Oracle 12c Release 1 (12.1.0.2)):
select *
from sample
where EXISTS (
SELECT 1
FROM JSON_TABLE(
argument,
'$.arguments."app.argument1.appId"[*]'
COLUMNS (
value VARCHAR2(100) PATH '$'
)
)
WHERE value = '123'
);
db<>fiddle here
I have a column data consisting of {"name":["John","Peter"],id:["20","30"]}
If I do
SELECT JSON_VALUE(data,'$.name[0]') from table
it returns John but doing
SELECT JSON_VALUE(data,'$') from db
SELECT JSON_VALUE(data,'$.name') from table
returns NULL in both.
How come it does not return:
{"name":["John","Peter"],id:["20","30"]}
["John","Peter"]
As mentioned in the remarks section of the JSON_VALUE documentation, there is a table stating that for an array value in the JSON you should use JSON_QUERY instead (JSON_VALUE returns only scalar values):
SELECT json_query(j,'$.name') from a;
Fiddle
I have some JSON in an oracle table:
{"orders":[{"timestamp": "2016-08-10T06:15:00.4"}]}
And using JSON_TABLE to select/create a view:
SELECT jt.*
FROM table1
JSON_TABLE (table1.json_data, '$.orders[*]' ERROR ON ERROR
COLUMNS ( StartTime TIMESTAMP PATH '$.timestamp')) AS jt;
However no matter what format I put the date/time in JSON I always get:
ORA-01830: date format picture ends before converting entire input
string
Is there a way to format the json, or something I am missing? If i pass in a date like "2016-08-10", then it will successfully create a DATE column.
When running your query on my Oracle 19.6.0.0.0 database, I do not have any problem parsing your example (see below). If you are on an older version of Oracle, it may help to apply the latest patch set. You also might have to parse it out as a string, then use TO_DATE based on the format of the date you are receiving.
SQL> SELECT jt.*
2 FROM (SELECT '{"orders":[{"timestamp": "2016-08-10T06:15:00.4"}]}' AS json_data FROM DUAL) table1,
3 JSON_TABLE (table1.json_data,
4 '$.orders[*]'
5 ERROR ON ERROR
6 COLUMNS (StartTime TIMESTAMP PATH '$.timestamp')) AS jt;
STARTTIME
__________________________________
10-AUG-16 06.15.00.400000000 AM
In Oracle 18c, your query also works if you add CROSS JOIN, CROSS APPLY or a comma (for a legacy cross join) after table1 and change $.timeStamp to lower case.
However, if you can't get it working in Oracle 12c then you can get the string value and use TO_TIMESTAMP to convert it:
SELECT StartTime,
TO_TIMESTAMP( StartTime_Str, 'YYYY-MM-DD"T"HH24:MI:SS.FF9' )
AS StartTime_FromStr
FROM table1
CROSS JOIN
JSON_TABLE(
table1.json_data,
'$.orders[*]'
ERROR ON ERROR
COLUMNS (
StartTime TIMESTAMP PATH '$.timestamp',
StartTime_Str VARCHAR2(30) PATH '$.timestamp'
)
) jt;
So, for your sample data:
CREATE TABLE table1 ( json_data VARCHAR2(4000) CHECK ( json_data IS JSON ) );
INSERT INTO table1 ( json_data )
VALUES ( '{"orders":[{"timestamp": "2016-08-10T06:15:00.4"}]}' );
This outputs:
STARTTIME | STARTTIME_FROMSTR
:------------------------ | :---------------------------
10-AUG-16 06.15.00.400000 | 10-AUG-16 06.15.00.400000000
db<>fiddle here
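As an aside, the same literal can be sanity-checked outside the database; a minimal Python sketch (the strptime mask mirrors the Oracle format model above, and %f accepts one to six fractional digits):

```python
from datetime import datetime

# Parse the question's timestamp literal; %f right-pads "4" to 400000 microseconds.
ts = datetime.strptime("2016-08-10T06:15:00.4", "%Y-%m-%dT%H:%M:%S.%f")
print(ts)
```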
I have a table in which one column contains JSON array data. In some rows this JSON is very big (10,000+ JSON objects), like below. Is there any way to select just the first 250 objects from the array?
[
{
"product":"Vegetable",
"name":"Potato",
"price":"$60.00"
},
{
"product":"Fruit",
"name":"Mango",
"price":"$3.30"
},
{
"product":"Milk",
"name":"Milk",
"price":"$1.08"
},
.....10,000
]
I investigated the question and found that one can use json_query to select single entries from the JSON:
CREATE TABLE json_table ( JSON varchar(1024) NOT NULL , constraint CK_JSON_IS_JSON check (JSON is json));
insert into json_table (JSON) values ('[ { "product":"Vegetable", "name":"Potato", "price":"$60.00" }, { "product":"Fruit", "name":"Mango", "price":"$3.30" }, { "product":"Milk", "name":"Milk", "price":"$1.08" }]');
select json_query(JSON, '$[0]'),
json_query(JSON, '$[1]'),
json_query(JSON, '$[2]'),
json_query(JSON, '$[3]')
from json_table;
This selects entries 0 through 3; entry 3 does not exist, so it comes back NULL.
You could probably stitch together a database stored procedure to return a list of the first n entries in the JSON.
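If the trimming can instead happen in application code, slicing the parsed array is straightforward; a minimal Python sketch (the sample data is from the question, the function name is illustrative):

```python
import json

def first_n_objects(json_text, n=250):
    """Return the first n objects of a JSON array, re-serialized as JSON."""
    return json.dumps(json.loads(json_text)[:n])

doc = ('[{"product":"Vegetable","name":"Potato","price":"$60.00"},'
       '{"product":"Fruit","name":"Mango","price":"$3.30"},'
       '{"product":"Milk","name":"Milk","price":"$1.08"}]')
trimmed = first_n_objects(doc, n=2)
```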
An analytic function such as ROW_NUMBER() can be used within a subquery to determine the restriction, and then a JSON_ARRAYAGG() and JSON_OBJECT() combination can rebuild the reduced array:
SELECT JSON_ARRAYAGG(
JSON_OBJECT('product' VALUE product,
'name' VALUE name,
'price' VALUE price) ) AS "Result"
FROM
(
SELECT t.*, ROW_NUMBER() OVER (ORDER BY 1) AS rn
FROM tab
CROSS JOIN
JSON_TABLE(jsdata, '$[*]' COLUMNS (
product VARCHAR(100) PATH '$.product',
name VARCHAR(100) PATH '$.name',
price VARCHAR(100) PATH '$.price'
)
) t
)
WHERE rn <= 250
Demo
table looks like this:
my_table

| id (int) | create_time (timestamp) | meta_data (json)       |
| :------- | :---------------------- | :--------------------- |
| 1        | 2019-01-20 18:35:42     | {"my property": "123"} |
| 2        | 2019-01-20 19:35:42     | {"more data": "456"}   |
I've tried querying with:
SELECT * FROM my_table
WHERE meta_data = '{"my property": "123"}';
SELECT * FROM my_table
WHERE meta_data = '\{\"my property\"\: \"123\"\}';
Neither works. How can I query an exact string match on a JSON field?
I noticed this DOES work...
SELECT * FROM my_table
WHERE meta_data LIKE '\{\"my property\"\: \"123\"\}';
Do I need to use LIKE? Why doesn't = work?
I know the JSON field is a special field type that is designed to let you easily query specific properties, but I wanted to be able to just check against the full JSON easily. And clearly the JSON field has some parameters that cause the = not to work as I expected.
This is the solution I figured out, cast JSON as CHAR:
SELECT * FROM my_table
WHERE CAST(meta_data as CHAR) = '{"my property": "123"}';
Another option:
SELECT * FROM my_table
WHERE meta_data = CAST('{"my property": "123"}' AS JSON);
From the MySQL documentation: "You can also obtain JSON values by casting values of other types to the JSON type using CAST(value AS JSON)."
https://dev.mysql.com/doc/refman/5.7/en/json-creation-functions.html#function_json-quote
Use JSON_CONTAINS:
SELECT *
FROM my_table
WHERE JSON_CONTAINS(meta_data, '"123"', '$."my property"');
Demo
Notes:
Since the target value to match for the key my property is a literal string, we need to include the double quotes in the search value.
Because the key my property contains a space, it must be quoted with double quotes inside the JSON path.
If you want to compare a JSON column to a JSON value, then use JSON_OBJECT() to create the value.
Demo:
create table t (id int primary key, data json);
insert into t values (1, json_object('my property', '123'));
insert into t values (2, json_object('more_data', 456));
select * from t where data = json_object('more_data', 456);
+----+--------------------+
| id | data |
+----+--------------------+
| 2 | {"more_data": 456} |
+----+--------------------+
You should refer to the JSON search functions documentation for querying JSON data:
https://dev.mysql.com/doc/refman/5.7/en/json-search-functions.html
Try the following (adjust the property name to match yours; since the value is stored as a JSON string, unquote it before comparing):
SELECT * FROM my_table WHERE JSON_UNQUOTE(JSON_EXTRACT(meta_data, '$."my property"')) = '123';
I need to get json from a restful service in a pl/pgsql function. (I have no control on restful webservice. It's published by someone else). I could get the json but I couldn't convert it to rows (for inserting to a table). The simplified format of it is below. Every GUID is random and I don't know their content before. When I convert it from text (with ::json) I get no errors, so it's valid json.
l_json := '{"GUID-0001":{"Id":"1","Field1":"aaa1","Field2":"bbb1"}, "GUID-0002":{"Id":"2","Field1":"aaa2","Field2":"bbb2"}}'::json;
I tried several PostgreSQL JSON functions, but each time I got a different error; e.g., when I use the json_array_elements() function, I get "ERROR: cannot call json_array_elements on a non-array", and when I try the json_each_text() function I get "ERROR: query has no destination for result data".
I need a resultset as the following:
GUID | Id | Field1 | Field2
---------+----+--------+-------
GUID-0001| 1 | aaa1 | bbb1
GUID-0002| 2  | aaa2   | bbb2
You can get all the keys using jsonb_object_keys() and then use those to access the fields inside the JSON:
with data(doc) as (
values ('{"GUID-0001":{"Id":"1","Field1":"aaa1","Field2":"bbb1"}, "GUID-0002":{"Id":"2","Field1":"aaa2","Field2":"bbb2"}}'::jsonb)
)
select t.uid,
d.doc -> t.uid ->> 'Id' as id,
d.doc -> t.uid ->> 'Field1' as column1,
d.doc -> t.uid ->> 'Field2' as column2
from data d, jsonb_object_keys(doc) as t(uid);
returns:
uid | id | column1 | column2
----------+----+---------+--------
GUID-0001 | 1 | aaa1 | bbb1
GUID-0002 | 2 | aaa2 | bbb2
You can put that into a function that accepts a jsonb as a parameter:
create or replace function store_json(p_doc jsonb)
returns void
as
$$
insert into the_table (guid, id, column1, column2)
select t.uid,
(d.doc -> t.uid ->> 'Id')::int,
d.doc -> t.uid ->> 'Field1',
d.doc -> t.uid ->> 'Field2'
from (select p_doc) as d(doc),
jsonb_object_keys(doc) as t(uid);
$$
language sql;
I think the easiest way to tackle this problem is to adjust your json a bit prior to inserting and then use json_populate_recordset to convert the json to rows.
Convert the outer object to an array in your app code, and move the GUID-000x value inside the relevant object under a GUID key like so:
[
{
"GUID": "GUID-0001",
"Id": "1",
...
},
{
"GUID": "GUID-0002",
"Id": "2",
...
}
...
]
I don't know what language you are using for your app code, but I assume you have some form of reduce at your disposal to do the job.
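For instance, in Python the reshaping could look like this (function and variable names are illustrative, not part of the original service):

```python
import json

raw = ('{"GUID-0001":{"Id":"1","Field1":"aaa1","Field2":"bbb1"},'
       '"GUID-0002":{"Id":"2","Field1":"aaa2","Field2":"bbb2"}}')

def to_recordset(json_text):
    """Move each top-level key inside its own object, under a "GUID" key."""
    doc = json.loads(json_text)
    return json.dumps([{"GUID": guid, **fields} for guid, fields in doc.items()])

records = to_recordset(raw)
```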
Once you have your data in the proper format, you can use json_populate_recordset like so:
insert into your_table
select GUID, Id, Field1, Field2
from json_populate_recordset(null::your_table, _your_modified_json_from_above)
;
json_populate_recordset basically takes your json and matches keys to columns defined on your_table and adds the values accordingly. The catch is your object keys must match the column names exactly, and your values need to match (or be able to be cast to match) the data types defined on those columns.
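That matching rule can be pictured with a small Python sketch (purely an illustration of the behavior, not what PostgreSQL executes: keys matching a column fill it, unknown keys are ignored, absent keys become NULL):

```python
import json

# Hypothetical lower-case column names defined on your_table.
columns = ["guid", "id", "field1", "field2"]

def populate_recordset(columns, json_text):
    """Mimic json_populate_recordset's key-to-column matching."""
    return [{col: obj.get(col) for col in columns}
            for obj in json.loads(json_text)]

rows = populate_recordset(columns, '[{"guid":"GUID-0001","id":"1","extra":"x"}]')
```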