Format timestamp in JSON_TABLE COLUMNS

I have some JSON in an oracle table:
{"orders":[{"timestamp": "2016-08-10T06:15:00.4"}]}
And using JSON_TABLE to select/create a view:
SELECT jt.*
FROM table1
JSON_TABLE (table1.json_data, '$.orders[*]' ERROR ON ERROR
COLUMNS ( StartTime TIMESTAMP PATH '$.timestamp')) AS jt;
However, no matter what format I put the date/time in the JSON, I always get:
ORA-01830: date format picture ends before converting entire input string
Is there a way to format the JSON, or is there something I am missing? If I pass in a date like "2016-08-10", then it will successfully create a DATE column.

When running your query on my Oracle 19.6.0.0.0 database, I do not have any problem parsing your example (see below). If you are on an older version of Oracle, it may help to apply the latest patch set. You also might have to parse it out as a string, then use TO_DATE based on the format of the date you are receiving.
SQL> SELECT jt.*
  2    FROM (SELECT '{"orders":[{"timestamp": "2016-08-10T06:15:00.4"}]}' AS json_data FROM DUAL) table1,
  3         JSON_TABLE (table1.json_data,
  4                     '$.orders[*]'
  5                     ERROR ON ERROR
  6                     COLUMNS (StartTime TIMESTAMP PATH '$.timestamp')) AS jt;

STARTTIME
__________________________________
10-AUG-16 06.15.00.400000000 AM

In Oracle 18c, your query also works, provided you join table1 to the JSON_TABLE call (with CROSS JOIN, CROSS APPLY or a comma, for a legacy cross join) and the JSON path matches the key's case exactly ($.timestamp, not $.timeStamp).
However, if you can't get it working in Oracle 12c, then you can extract the string value and use TO_TIMESTAMP to convert it:
SELECT StartTime,
       TO_TIMESTAMP( StartTime_Str, 'YYYY-MM-DD"T"HH24:MI:SS.FF9' )
         AS StartTime_FromStr
FROM   table1
       CROSS JOIN
       JSON_TABLE(
         table1.json_data,
         '$.orders[*]'
         ERROR ON ERROR
         COLUMNS (
           StartTime     TIMESTAMP    PATH '$.timestamp',
           StartTime_Str VARCHAR2(30) PATH '$.timestamp'
         )
       ) jt;
So, for your sample data:
CREATE TABLE table1 ( json_data VARCHAR2(4000) CHECK ( json_data IS JSON ) );
INSERT INTO table1 ( json_data )
VALUES ( '{"orders":[{"timestamp": "2016-08-10T06:15:00.4"}]}' );
This outputs:
STARTTIME | STARTTIME_FROMSTR
:------------------------ | :---------------------------
10-AUG-16 06.15.00.400000 | 10-AUG-16 06.15.00.400000000
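As an aside, the TO_TIMESTAMP format mask 'YYYY-MM-DD"T"HH24:MI:SS.FF9' used in the workaround simply mirrors the ISO-8601 shape of the input, with the FF element accepting variable-length fractional seconds. A quick cross-check of that shape outside Oracle (an illustrative Python sketch, not part of the Oracle solution):

```python
from datetime import datetime

# %f accepts one to six fractional-second digits, much as Oracle's FF
# element accepts variable precision, so the single ".4" parses fine
ts = datetime.strptime("2016-08-10T06:15:00.4", "%Y-%m-%dT%H:%M:%S.%f")
print(ts)  # 2016-08-10 06:15:00.400000
```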


MySQL how to search JSON Array or JSON_CONTAINS in where statement with column name

I am currently using MySQL 5.7. The products table contains a column that stores category ids as a JSON string. I am looking for the most efficient method to count the number of products in each category.
I have the following categories and products tables.

Categories:

| id | name        |
| :- | :---------- |
| 1  | clothes     |
| 2  | Electronics |

Products:

| id | name    | category_id |
| :- | :------ | :---------- |
| 1  | test 01 | ["1"]       |
| 2  | test 02 | ["1","2"]   |
| 3  | test 03 | ["2"]       |
| 4  | test 04 | NULL        |
My Query:

SELECT
  `categories`.`id`,
  `categories`.`name`,
  (
    SELECT COUNT(`id`)
    FROM `products`
    WHERE JSON_CONTAINS(
      `products`.`category_id`,
      '\"`categories`.`id`\"'
    )
  ) AS `products_count`
FROM `categories`
ORDER BY `products_count`
But I am getting products_count as 0. If I use a literal value instead of the column name, like
JSON_CONTAINS( `products`.`category_id`, '"2"')
then I get the correct products_count (and likewise in other test queries using literal values).
I have tried many answers, but none produce the expected results; most of them are based on literal values or inner JSON keys/values only. Some of the queries I tested (which give a JSON-parameter or similar error) are:
JSON_SEARCH(`courses`.`category_id`, 'all', `categories`.`id`)
JSON_CONTAINS( `courses`.`category_id`, `categories`.`id`, '$')
JSON_CONTAINS(`courses`.`sub_category_id`, JSON_QUOTE(`categories`.`id`), '$')
I am using MySQL 5.7, PHP 7.4
Thanks in advance...
SELECT
  `Categories`.`id`,
  `Categories`.`name`,
  (
    SELECT COUNT(`id`)
    FROM `Products`
    WHERE JSON_CONTAINS(
      `Products`.`category_id`,
      CONCAT('"', `Categories`.`id`, '"')
    )
  ) AS `products_count`
FROM `Categories`
ORDER BY `products_count`
The values in the JSON array are strings, whereas the ids in the categories table are numbers. MySQL will not implicitly convert data types inside JSON the way it does for other data types, because from MySQL's point of view the " characters in JSON are not just type markers but part of the value. So the value to be searched for must carry the double quotes, which is what the CONCAT above builds.
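To see the same distinction outside MySQL: the category_id column holds a JSON array of strings, so a numeric id is not a member of it, but its quoted form is. A small illustrative Python sketch (not MySQL itself):

```python
import json

# category_id as stored in the products table
category_id = json.loads('["1","2"]')

# the array elements are strings, so the bare number 1 does not match...
print(1 in category_id)    # False

# ...but the double-quoted value does, which is why the query wraps the
# id in quotes with CONCAT('"', `Categories`.`id`, '"')
print("1" in category_id)  # True
```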

query json array having only strings oracle

Below is the JSON stored in a table called "Sample"; the column name is "argument". I want to fetch all records having a particular value in a specified argument. I can query by the argument name, but am not able to query a particular value, as it is an array of strings. (Please note my keys have . in them.)
{
  "arguments": {
    "app.argument1.appId": ["123", "456"],
    "app.argument2.testId": ["546", "567"]
  }
}
This gives me all the records having a particular argument:
select * from sample where json_exists(argument, '$.arguments."app.argument1.appId"');
But I need to match on the argument's value. I tried the below, but I am getting a JSON expression error:
select * from sample where json_exists(argument, '$.arguments."app.argument1.appId[*]"?(# == "123"));
Please help. Thanks in advance.
You have the quotation marks in the wrong place; you want the double quotes before the square brackets for the array, not after them:
select *
from   sample
where  json_exists(
         argument,
         '$.arguments."app.argument1.appId"[*]?(# == "123")'
       );
Which, for the sample data:
CREATE TABLE sample ( argument CLOB CHECK ( argument IS JSON ) );
INSERT INTO sample ( argument ) VALUES ( '{
"arguments":{
"app.argument1.appId":["123", "456"],
"app.argument2.testId":["546", "567"]
}
}');
Outputs:
| ARGUMENT |
| :------- |
| {"arguments":{ "app.argument1.appId":["123", "456"], "app.argument2.testId":["546", "567"] }} |
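For intuition, the path expression '$.arguments."app.argument1.appId"[*]?(# == "123")' treats the dotted key as a single quoted member name and then filters the array elements. The equivalent check, sketched in Python (illustrative only, not Oracle):

```python
import json

doc = json.loads('''{
  "arguments": {
    "app.argument1.appId": ["123", "456"],
    "app.argument2.testId": ["546", "567"]
  }
}''')

# The quoted member name selects the whole dotted key in one step,
# then [*]?(# == "123") tests each element of the array for equality
matches = "123" in doc["arguments"]["app.argument1.appId"]
print(matches)  # True
```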
Do you know a way to do this in 12.1?
You could also use EXISTS with a correlated JSON_TABLE (which is available from Oracle 12c Release 1, 12.1.0.2):
select *
from   sample
where  EXISTS (
         SELECT 1
         FROM   JSON_TABLE(
                  argument,
                  '$.arguments."app.argument1.appId"[*]'
                  COLUMNS (
                    value VARCHAR2(100) PATH '$'
                  )
                )
         WHERE  value = '123'
       );

How can I convert values of a column into an array in SQL Server?

I want to convert the values of a column into an array, but I don't know how. Can anyone help?
Below is the current output that I want to change.
[{"entity":"Job","value":"400072 "},{"entity":"Job","value":"400087"}]
Expected result:
[{"entity":"Job","value":[400072, 400087]}]
The code I tried:
SELECT (
  SELECT ose.TaggedEntity AS 'entity', ose.TaggedEntityId AS 'value'
  FROM #OldSharedEntity AS ose
  WHERE ose.TaggedEntityId NOT IN (
    SELECT nse.TaggedEntityId
    FROM #NewSharedEntity AS nse
  )
  FOR JSON PATH, INCLUDE_NULL_VALUES
) AS json
If your table's name is #yourtable, you can try this:
SELECT entity,
       (SELECT JSON_QUERY('[' + STRING_AGG(value, ',') + ']')
        FROM #yourtable t2
        WHERE t2.entity = entity) value
FROM #yourtable t
GROUP BY entity
FOR JSON PATH
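The core of this answer is plain string assembly: STRING_AGG joins the values with commas, brackets are concatenated around the result, and JSON_QUERY then makes FOR JSON emit that string as raw JSON rather than as a quoted string. The assembly step, sketched in Python (illustrative, not T-SQL):

```python
# the two TaggedEntityId values from the example
values = ["400072", "400087"]

# STRING_AGG(value, ',') joins with commas; '[' + ... + ']' wraps the
# result so it reads as a JSON array literal
json_array = "[" + ",".join(values) + "]"
print(json_array)  # [400072,400087]
```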

how to ORDER BY json object in MySQL

I'm trying to execute a query that gives me the 10 rows with the biggest score. The score column in my table is a JSON object like:
{fa="7",en="7"}
How can I set my query to order by this JSON object? (It doesn't matter which of them, en or fa, is used, because they are always the same.)
Assuming your JSON is actually {"fa":"7","en":"7"} and is stored in a my_json_col column, you can access the value with the -> operator (which takes a JSON path) and order by it:
SELECT *
FROM my_table
ORDER BY my_json_col->"$.fa"
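One caveat worth adding (my note, not part of the answer): the scores here are JSON strings, and strings order lexicographically, so a score of "10" would sort before "2". Casting the extracted value to a number (e.g. with MySQL's ->> unquoting operator plus CAST) avoids this. The effect, illustrated in Python:

```python
scores = ["10", "2", "9"]

# lexicographic order: "10" < "2" because '1' < '2'
lex = sorted(scores)
print(lex)  # ['10', '2', '9']

# numeric order, analogous to CAST(my_json_col->>"$.fa" AS UNSIGNED)
num = sorted(scores, key=int)
print(num)  # ['2', '9', '10']
```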

Export DB2 select to CSV with headers

I am trying to export a DB2 select with headers, but without any success. My actual code is:
db2 "EXPORT TO /tmp/result5.csv OF DEL MODIFIED BY NOCHARDEL
SELECT 1 as id, 'DEVICE_ID', 'USER_ID' from sysibm.sysdummy1
UNION ALL (SELECT 2 as id, DEVICE_ID, USER_ID FROM MOB_DEVICES) ORDER BY id"
This is not working (I suspect because USER_ID is an INTEGER). When I change it to:
db2 "EXPORT TO /tmp/result5.csv OF DEL MODIFIED BY NOCHARDEL
SELECT 1 as id, 'DEVICE_ID', 'PUSH_ID' from sysibm.sysdummy1
UNION ALL (SELECT 2 as id, DEVICE_ID, PUSH_ID FROM MOB_DEVICES) ORDER BY id"
It works; DEVICE_ID and PUSH_ID are both VARCHAR.
Any suggestion how to solve this? Thanks for the advice.
DB2 will not export a CSV file with the headers, because the headers would be included as data. Normally, a CSV file is for storage, not viewing. If you want to view a file with its headers, you have the following options:
- Export to an IXF file. This is not a flat file, though; you will need a spreadsheet to view it.
- Export to a CSV file and include the headers by either:
  - selecting the column names from the catalog, then performing an extra step to prepend them to the file. You can use the describe command or a select on syscat.columns for this purpose, but the process is manual; or
  - performing a select union: one part the data, the other part the headers (as in your query).
- Perform a select and redirect the output to a file, without using export:
  db2 "select * from myTable" > myTable.txt
Ignoring the EXPORT and looking exclusively at the problematic UNION ALL query:
DB2 will try to conform the mismatched data types to the numeric one; in this scenario, to INTEGER. Because the literal string 'USER_ID' is not a valid representation of a numeric value, it cannot be cast to INTEGER, hence the error.
However, you can explicitly request the opposite casting direction, converting the INTEGER values from the column into VARCHAR. Explicit casting ensures the data types of the corresponding columns of the UNION are compatible, by forcing the values from the INTEGER column to match the data type of the character-string literal 'USER_ID':
with
  mob_devices (DEVICE_ID, USER_ID, PUSH_ID) as
  ( values ( varchar('dev', 1000), int(1), varchar('pull', 1000) ) )
( SELECT 1 as id, 'DEVICE_ID', 'USER_ID'
  from sysibm.sysdummy1
)
UNION ALL
( SELECT 2 as id, DEVICE_ID, cast( USER_ID as varchar(1000) )
  FROM MOB_DEVICES
)
ORDER BY id
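The casting rule can be mimicked in any typed setting: unifying a numeric column with a string literal has to pick one direction, and only the string-wards cast can succeed here. An illustrative Python sketch of the same idea (not DB2 itself):

```python
user_id = 2  # a USER_ID value from the INTEGER column

# casting the header literal towards the numeric type fails,
# which is what the implicit conversion in the UNION attempts
try:
    int("USER_ID")
    header_casts_to_int = True
except ValueError:
    header_casts_to_int = False
print(header_casts_to_int)  # False

# casting the numeric value towards the string type succeeds,
# which is what CAST(USER_ID AS VARCHAR(1000)) requests explicitly
print(str(user_id))  # 2
```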