Snowflake reads "null" from S3 JSON

This seems like it should be easy, but for the life of me on a Friday night...
I have a JSON file I'm reading from S3:
{"name":"bob", "currentTime":"null"}
I created a stage in snowflake.
When I do,
Select $1:name, $2:currentTime
from @myStage/mydocument
I get as expected
$1:name $2:currentTime
"bob" "null"
I have a snowflake table
create table test_bob
(
name varchar
,currentTime TIMESTAMP_NTZ
)
But when I do
Copy into test_bob
from (Select $1:name, $2:currentTime
from @myStage/mydocument)
I get an error,
Failed to cast variant value "null" to TIMESTAMP_NTZ
I tried using NULL_IF as suggested here, and I also tried STRIP_NULL_VALUES as a file format option; neither helped.

It looks like you have a string of "null" rather than a null value. Did you try this?
Copy into test_bob
from (Select $1:name, NULLIF($2:currentTime::string,'null')::timestamp_ntz
from @myStage/mydocument)
This should check the value of the string before trying to convert the JSON attribute to a timestamp_ntz.
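The NULLIF guard can be mirrored outside SQL. Here is a minimal Python sketch of the same idea (the helper name and timestamp format are assumptions for illustration, not Snowflake API): map the literal string "null" to a real NULL before attempting the timestamp cast.

```python
from datetime import datetime

def parse_current_time(raw):
    """Mimic NULLIF(raw, 'null')::timestamp_ntz: treat the literal
    string 'null' as a missing value instead of trying to cast it."""
    if raw is None or raw == "null":
        return None
    # TIMESTAMP_NTZ carries no time zone, so parse as a naive datetime
    return datetime.strptime(raw, "%Y-%m-%d %H:%M:%S")

print(parse_current_time("null"))                 # None
print(parse_current_time("2018-09-15 15:00:00"))  # 2018-09-15 15:00:00
```

The key design point is the same as in the SQL answer: the string comparison happens first, so the cast only ever sees values that can legitimately become timestamps.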

Related

Big Query JSON Extract Function

I'm extracting two fields from a JSON payload using JSON_EXTRACT in BigQuery, as follows:
select JSON_EXTRACT_SCALAR('Event_Value','$.user_id') as cid, JSON_EXTRACT_SCALAR('Event_Value','$.tsts') as ts
If the JSON is missing one of the fields, I'm receiving NULLs all over the place.
Is there a way to overcome this?
I feel the fix is quite simple:
select JSON_EXTRACT_SCALAR(Event_Value,'$.user_id') as cid, JSON_EXTRACT_SCALAR(Event_Value,'$.tsts') as ts
So there were extra quotes (') around Event_Value; thus Event_Value was treated not as a column name but as a string literal.
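The effect of those stray quotes can be reproduced in a quick Python sketch (a rough stand-in for JSON_EXTRACT_SCALAR, not BigQuery itself): extracting from the literal string 'Event_Value' yields nothing, while extracting from the column's actual JSON value works.

```python
import json

def json_extract_scalar(value, field):
    """Rough stand-in for JSON_EXTRACT_SCALAR: return the field as a
    string, or None when the input is not JSON or the field is absent."""
    try:
        doc = json.loads(value)
    except (TypeError, ValueError):
        return None
    v = doc.get(field) if isinstance(doc, dict) else None
    return None if v is None else str(v)

# Quoted: the literal string 'Event_Value' is not JSON -> None
print(json_extract_scalar("Event_Value", "user_id"))        # None
# Unquoted: the column's actual JSON value -> the scalar
print(json_extract_scalar('{"user_id": "42"}', "user_id"))  # 42
```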

Change datatype of SSIS flat file data with string "NULL" values

In my SSIS project I have to retrieve my data from a flat csv file. The data itself looks something like this:
AccountType,SID,PersonID,FirstName,LastName,Email,Enabled
NOR,0001,0001,Test,Test0001,Test1@email.com,TRUE
NOR,1001,NULL,Test,Test1002,Test2@email.com,FALSE
TST,1002,NULL,Test,Test1003,Test3@email.com,TRUE
I need to read this data and make sure it has the correct datatypes for future checks. Meaning SID and PersonID should have a numeric datatype, Enabled should be a boolean. But I would like to keep the same columns and names as my source file.
It seems like the only way to read this data through the Flat File Source task is as String. Otherwise I keep getting errors, because "NULL" is literally a String and not a NULL value.
Next I perform a Derived Column transformation to get rid of all "NULL" values. For example, I use the following expression for PersonId:
(TRIM(PersonID) == "" || UPPER(PersonID) == "NULL") ? (DT_WSTR,50)NULL(DT_WSTR,50) : PersonID
I would like to immediately convert it to the correct datatype by adding the cast to the expression above, but it seems impossible to select another datatype for the same column when I select 'Replace 'PersonId'' in the Derived Column dropdown box.
So next I thought of using the Data Conversion task to change the datatypes of these columns, but it only creates new columns, even when I set the output alias to the original column name.
How could I alter my solution to efficiently and correctly read this data and convert its values to the correct datatypes?
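For reference, the combined replace-and-convert step that the Derived Column UI won't do in one go can be sketched outside SSIS (plain Python; column names are taken from the sample file above, the helper names are illustrative assumptions):

```python
def to_int_or_none(raw):
    """Mirror the Derived Column expression: '' or 'NULL' become a real
    NULL (None), anything else is cast to the target numeric type."""
    s = raw.strip()
    if s == "" or s.upper() == "NULL":
        return None
    return int(s)

def to_bool(raw):
    """Convert the TRUE/FALSE strings in the Enabled column."""
    return raw.strip().upper() == "TRUE"

# One row from the sample CSV, read entirely as strings
row = {"SID": "1001", "PersonID": "NULL", "Enabled": "FALSE"}
typed = {
    "SID": to_int_or_none(row["SID"]),
    "PersonID": to_int_or_none(row["PersonID"]),
    "Enabled": to_bool(row["Enabled"]),
}
print(typed)  # {'SID': 1001, 'PersonID': None, 'Enabled': False}
```

The point of the sketch is that the NULL-replacement check and the type cast are one logical operation, which is exactly what the SSIS tooling forces you to split across two components.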

JSON update single value in MySQL table

I have JSON stored in the details column of the MySQL payment table. I need to update a single value in this JSON. What is the procedure to update JSON using MySQL?
JSON Array
{"items":[{"ca_id":18,"appointment_date":"2018-09-15 15:00:00","service_name":"Software Installation / Up-gradation","service_price":165}],"coupon":{"code":"GSSPECIAL","discount":"10","deduction":"0.00"},"subtotal":{"price":165,"deposit":0},"tax_in_price":"included","adjustments":[{"reason":"Over-time","amount":"20","tax":"0"}]}
I need to update the appointment_date value from 2018-09-15 15:00:00 to 2018-09-28 15:00:00.
Here is a pure MySQL JSON way of doing this:
UPDATE yourTable
SET col = JSON_REPLACE(col, '$.items[0].appointment_date', '2018-09-28 15:00:00');
The best I could come up with is to address the first element of the JSON array called items, and then update the appointment_date field in that array element.
A live demo shows that this JSON replacement syntax/logic works.
But you could equally well do this JSON work in your PHP layer; it might make more sense there.
If you want to do this in PHP, the steps to follow are:
1. Select the respective column from the table.
2. Use json_decode to convert the string to an array.
3. Now that you have the JSON object, apply your modifications.
4. Use json_encode to convert your JSON object back to a string.
5. Save this string back to the table.
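The same decode/modify/encode round trip can be sketched in a few lines (shown here in Python rather than PHP; json.loads and json.dumps play the role of json_decode and json_encode, and the details value is abbreviated from the question):

```python
import json

# The details column fetched from the payment table (abbreviated)
details = '{"items":[{"ca_id":18,"appointment_date":"2018-09-15 15:00:00"}]}'

doc = json.loads(details)                                    # json_decode
doc["items"][0]["appointment_date"] = "2018-09-28 15:00:00"  # modify
details = json.dumps(doc)                                    # json_encode
# `details` is now ready to be written back with an UPDATE
print(details)
```

Note that the path mirrors the SQL answer's '$.items[0].appointment_date': first element of the items array, then its appointment_date field.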

JSON_EXTRACT not working for nested json data

I want to select data from my table, which holds JSON. My table data looks like this:
user_id: 1
metaname: mymetaname
meta_value: a:1:{i:0;a:10:{s:7:"street1";s:36:"shiv plaza";s:4:"city";s:5:"surat";s:5:"state";s:7:"gujarat";s:7:"zipcode";s:6:"395010";s:14:"dollet_country";s:2:"IN";s:10:"tostreet1l";s:5:"surat";s:7:"tocityl";s:5:"surat";s:8:"tostatel";s:5:"surat";s:10:"tozipcodel";s:6:"395000";s:17:"todollet_countryl";s:2:"IN";}}
And I am trying to run this query:
SELECT user_id,JSON_EXTRACT(meta_value, '$."city"') FROM `usermetatable`
But it shows this error:
Invalid JSON text in argument 1 to function json_extract: "Invalid value." at position 0.
The data in my table cannot be changed, and I'm sure it's correct JSON. Could anyone correct the above query?
That's not JSON data. It looks like a serialized PHP object. See http://php.net/serialize
There's no MySQL function for extracting a field from that serialized object. You should fetch the whole object into a PHP app, call unserialize() on it, and then access the object members.

How do I ignore invalid JSON when using json_parse with PrestoDB?

I am fairly new to Presto, and am trying to parse a bunch of records containing JSON data. It appears that some of the data is invalid, which causes Presto to abort the query during the call to json_parse. Is it possible to somehow return NULL instead of throwing an error in this case?
It seems like previously you could use try_cast(value as json), but that was removed in favor of json_parse. Is there any sort of configuration I can change to resolve this, or do I need to resort to creating a custom SerDe?
It looks like json_extract(data, '$') will return NULL for invalid JSON:
presto:default> select json_extract('{', '$');
_col0
-------
NULL
(1 row)
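The same swallow-the-error behaviour can be sketched in Python (json.loads standing in for json_parse; the analogy is an assumption for illustration, not a statement about Presto internals):

```python
import json

def safe_json_parse(raw):
    """Return the parsed document, or None for invalid JSON, mirroring
    how json_extract(data, '$') yields NULL instead of failing the query."""
    try:
        return json.loads(raw)
    except (TypeError, ValueError):
        return None

print(safe_json_parse("{"))         # None  (invalid JSON)
print(safe_json_parse('{"a": 1}'))  # {'a': 1}
```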