How to parse datetime data in a json string in BigQuery - json

I am trying to parse a datetime column inside a JSON string in BigQuery, but so far BQ only returns the year part (as a number). For example, this script
select json_extract(json_data, '$.application_date') from (select '{"user_id":"10000561","application_date":2020-08-03 12:55:21}' as json_data)
returns 2020 instead of the desired result 2020-08-03 12:55:21.
Please help me. How can I fix this?
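As a side note, the JSON in the question is not actually valid: the datetime value is unquoted, so a lenient parser reads 2020 as a number and stops there. A minimal Python sketch (standard json module only, assuming nothing about BigQuery internals) showing the difference quoting makes:

```python
import json
from datetime import datetime

# The unquoted datetime makes the whole document invalid JSON:
# a strict parser rejects it outright.
bad = '{"user_id":"10000561","application_date":2020-08-03 12:55:21}'
try:
    json.loads(bad)
except json.JSONDecodeError:
    print("invalid JSON: the datetime value must be quoted")

# With the value quoted, extraction and parsing both work;
# the strptime pattern mirrors parse_datetime("%F %T", ...).
good = '{"user_id":"10000561","application_date":"2020-08-03 12:55:21"}'
raw = json.loads(good)["application_date"]
parsed = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S")
print(parsed)
```

Once the source JSON quotes the value, either of the answers below returns the full datetime.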

I would use json_extract_scalar(), and then parse_datetime() on the returned string.
Using your example:
select parse_datetime("%F %T", json_extract_scalar(json_data, '$.application_date'))
from (
select '{"user_id":"10000561","application_date":"2020-08-03 12:55:21"}'
as json_data
)

For your requirement, json_value can be used to parse the date and time from the JSON string. You can try the code below to get the expected output.
Code
select parse_datetime("%F %T", json_value(json_data, '$.application_date')) as date_and_time
from (
select '{"user_id":"10000561","application_date":"2020-08-03 12:55:21"}'
as json_data
)
Output
2020-08-03T12:55:21

Related

Athena CTAS saves json as string with special char escaped

I'm creating a new table using CTAS in Athena. Everything works fine except for a JSON string column in the raw table (not defined as a struct).
It was
"screen_orientation":"{"angle":"0"}",
Now becomes:
"screen_orientation":"{\"angle\":\"0\"}",
CTAS statement is straight forward:
CREATE TABLE destination_table
WITH (
format='JSON',
partitioned_by=ARRAY['partition_date'],
write_compression='GZIP'
)
AS
SELECT * FROM src_table
Source column is of type string.
Is there any way I could prevent this from happening? I can't redefine the source table's column definition due to a permission issue.
This behavior is expected in Athena. For example, if I run the query below, which casts a string to JSON, the double quotes are escaped with a backslash (\).
SQL:
WITH dataset AS (
SELECT
CAST('{"test": "value"}' AS JSON) AS hello_msg
)
SELECT * FROM dataset
But you can always work around this by using the json_format function, as shown below:
SQL:
WITH dataset AS (
SELECT
json_format(JSON '{"test": "value"}' ) as hello_msg
)
SELECT * FROM dataset
So you can wrap the column with json_format in the SELECT of your CTAS statement, and the backslashes will not be embedded.
If your JSON comes as a string, you can also use json_parse:
WITH dataset AS (
SELECT
json_parse('{"test": "value"}') as hello_msg
)
SELECT * FROM dataset
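The escaping itself is just what happens whenever a JSON document is stored as a plain string value inside another JSON document. A small Python sketch of the same effect (standard json module only; nothing Athena-specific):

```python
import json

inner = '{"angle":"0"}'

# Storing the JSON text as a string value escapes its inner quotes --
# the same backslashes the CTAS output shows.
escaped = json.dumps({"screen_orientation": inner})
print(escaped)

# Parsing the string first (the json_parse approach) embeds it as a
# real object, so no escaping is needed.
clean = json.dumps({"screen_orientation": json.loads(inner)})
print(clean)
```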

Extract all values from a PostgreSQL JSON array given a key

How can I extract values from a JSON array in Postgres? Given
[{"val":"2","dsk":"one"},{"val":"2","dsk":"two"},{"val":"3","dsk":"three"}]
I want all the dsk values, but my attempt returns null:
SELECT '[{"val":"2","dsk":"one"},{"val":"2","dsk":"two"},{"val":"3","dsk":"three"}]'::json->'dsk'
You can use the jsonb_path_query_array function to extract all the dsk values from the array:
select jsonb_path_query_array('[{"val":"2","dsk":"one"},{"val":"2","dsk":"two"},{"val":"3","dsk":"three"}]','$[*].dsk')
Demo in DBfiddle
As mentioned, you cannot use your approach because the value is an array, but you can try a different one with a JSON function:
WITH data
AS (
SELECT *
FROM json_array_elements('[{"val":"2","dsk":"one"},{"val":"2","dsk":"two"},{"val":"3","dsk":"three"}]'::json)
)
SELECT value->'dsk'
FROM data
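For comparison, the same extraction in plain Python (standard json module; nothing Postgres-specific), which is effectively what the jsonpath expression $[*].dsk does:

```python
import json

data = json.loads(
    '[{"val":"2","dsk":"one"},{"val":"2","dsk":"two"},{"val":"3","dsk":"three"}]'
)

# Equivalent of $[*].dsk: collect the dsk value from every array element.
dsk_values = [item["dsk"] for item in data]
print(dsk_values)  # ['one', 'two', 'three']
```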

Format timestamp in JSON_TABLE COLUMNS

I have some JSON in an Oracle table:
{"orders":[{"timestamp": "2016-08-10T06:15:00.4"}]}
And using JSON_TABLE to select/create a view:
SELECT jt.*
FROM table1
JSON_TABLE (table1.json_data, '$.orders[*]' ERROR ON ERROR
COLUMNS ( StartTime TIMESTAMP PATH '$.timestamp')) AS jt;
However no matter what format I put the date/time in JSON I always get:
ORA-01830: date format picture ends before converting entire input
string
Is there a way to format the JSON, or is there something I am missing? If I pass in a date like "2016-08-10", then it will successfully create a DATE column.
When running your query on my Oracle 19.6.0.0.0 database, I do not have any problem parsing your example (see below). If you are on an older version of Oracle, it may help to apply the latest patch set. You also might have to parse it out as a string, then use TO_DATE based on the format of the date you are receiving.
SQL> SELECT jt.*
2 FROM (SELECT '{"orders":[{"timestamp": "2016-08-10T06:15:00.4"}]}' AS json_data FROM DUAL) table1,
3 JSON_TABLE (table1.json_data,
4 '$.orders[*]'
5 ERROR ON ERROR
6 COLUMNS (StartTime TIMESTAMP PATH '$.timestamp')) AS jt;
STARTTIME
__________________________________
10-AUG-16 06.15.00.400000000 AM
In Oracle 18c, your query also works (if you add a CROSS JOIN, CROSS APPLY, or a comma for a legacy cross join after table1, and change $.timeStamp to lower case).
However, if you can't get it working in Oracle 12c then you can get the string value and use TO_TIMESTAMP to convert it:
SELECT StartTime,
TO_TIMESTAMP( StartTime_Str, 'YYYY-MM-DD"T"HH24:MI:SS.FF9' )
AS StartTime_FromStr
FROM table1
CROSS JOIN
JSON_TABLE(
table1.json_data,
'$.orders[*]'
ERROR ON ERROR
COLUMNS (
StartTime TIMESTAMP PATH '$.timestamp',
StartTime_Str VARCHAR2(30) PATH '$.timestamp'
)
) jt;
So, for your sample data:
CREATE TABLE table1 ( json_data VARCHAR2(4000) CHECK ( json_data IS JSON ) );
INSERT INTO table1 ( json_data )
VALUES ( '{"orders":[{"timestamp": "2016-08-10T06:15:00.4"}]}' );
This outputs:
STARTTIME | STARTTIME_FROMSTR
:------------------------ | :---------------------------
10-AUG-16 06.15.00.400000 | 10-AUG-16 06.15.00.400000000
db<>fiddle here
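The format mask in the TO_TIMESTAMP fallback (YYYY-MM-DD"T"HH24:MI:SS.FF9) corresponds to an ISO-8601 timestamp with fractional seconds. A quick Python sanity check of the same pattern (standard library only; nothing Oracle-specific):

```python
from datetime import datetime

raw = "2016-08-10T06:15:00.4"

# %f accepts 1-6 fractional digits, so the single ".4" parses as
# 400000 microseconds, matching Oracle's 06.15.00.400000000 output.
ts = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%f")
print(ts)
```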

how to ORDER BY json object in MySQL

I'm trying to execute a query that gives me the 10 rows with the biggest score. The score column in my table is a JSON object like:
{fa="7",en="7"}
How can I set my query to order by this JSON object? (It doesn't matter which of them (en or fa) is used, because they are always the same.)
Assuming your JSON is actually {"fa":"7","en":"7"} and is stored in a my_json_col column, you can access a member with the -> operator and order by it:
SELECT *
from my_table
order by my_json_col->"$.fa" desc
limit 10
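For reference, the same "top N by a JSON field" logic in plain Python (standard library; the rows and column names are made up for illustration, since the real schema isn't shown):

```python
import json

# Hypothetical rows; each score column holds a JSON document as text.
rows = [
    {"id": 1, "score": '{"fa":"7","en":"7"}'},
    {"id": 2, "score": '{"fa":"12","en":"12"}'},
    {"id": 3, "score": '{"fa":"3","en":"3"}'},
]

# Extract "fa", cast to int (the values are quoted strings, so lexical
# order would be wrong), sort descending, keep the top 10 --
# the ORDER BY ... DESC LIMIT 10 equivalent.
top = sorted(rows, key=lambda r: int(json.loads(r["score"])["fa"]), reverse=True)[:10]
print([r["id"] for r in top])  # [2, 1, 3]
```

Note the cast: in MySQL, too, the extracted value is a quoted string, so for multi-digit scores you would want CAST(my_json_col->>"$.fa" AS UNSIGNED) to get a numeric sort.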

Query json column based on the integer value in Postgres

I am using postgres v9.3
I have a table called temp which has a column all_data. The value looks something like this:
{"Accountid" : "1364", "Personalid" : "4629-87c3-04e6a7a60208", "quote_number" : "QWQA62364384"}
Now, I want to query the all_data column by accountid=1364.
Could you please tell me what the query would be?
Use the ->> operator
select *
from temp
where all_data ->> 'Accountid' = '1364';
Online example for 9.3: http://sqlfiddle.com/#!15/55981/1
However the above will not work if the JSON contains an array instead of a JSON object. E.g. a value like '[1,2]'::json will cause that query to fail.
With 9.4 and above you could check that using json_typeof(), but with your soon-to-be-unsupported version the only workaround I can think of is to convert the column to text and exclude values that start with a [
with valid_rows as (
select *
from temp
where all_data::text not like '[%'
)
select *
from valid_rows
where all_data ->> 'Accountid' = '1364';
Online example for 9.3: http://sqlfiddle.com/#!15/d01f43/3
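The same guard can be sketched in Python: treat each all_data value as text, skip anything that isn't a JSON object, then compare the Accountid field (standard json module; the row values below are made up for illustration):

```python
import json

# Hypothetical all_data values: two JSON objects and one array.
all_data_rows = [
    '{"Accountid" : "1364", "quote_number" : "QWQA62364384"}',
    '{"Accountid" : "9999", "quote_number" : "QWQA00000000"}',
    '[1,2]',
]

matches = []
for raw in all_data_rows:
    doc = json.loads(raw)
    if not isinstance(doc, dict):       # the json_typeof() / NOT LIKE '[%' guard
        continue
    if doc.get("Accountid") == "1364":  # the ->> 'Accountid' = '1364' test
        matches.append(doc)

print(len(matches))  # 1
```

Like the ->> operator, this compares text: Accountid is stored as the string "1364", so the comparison value is a string as well.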