JSON file size loaded into Snowflake VARIANT column

We have JSON files loaded into a Snowflake table (VARIANT column); the table has two columns - file name and a VARIANT column holding the JSON records.
I am able to get the whole table size from the information schema; however, we are interested in getting the size of each row/record in the table.
Can you please point me to a formula/function that can help me get the row size?

LEN(expression) returns the length of an input string or binary value.
You can concatenate the columns using the type-cast operator and take the length of the resulting string.
Like this:
len(col1::string || col2::string || colN::string)
len(variant_column) also works
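
A minimal sketch of that for the two-column table in the question, assuming placeholder names json_table for the table, file_name for the file-name column and json_data for the VARIANT column:

-- Approximate per-row size in characters: the VARIANT rendered as a string
select file_name,
       len(json_data::string) as row_size
from json_table;

This measures the JSON as text, so it is an approximation; the space Snowflake actually uses for the row can differ because the data is stored compressed.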

Related

Store JSON in multiple columns and concatenate them to query them

I have a table that currently has an extended column 32,768 bytes in size, so the database uses that space whether we put in 1 byte or all 32,768.
I store JSON in this column.
I need to reduce the size this column is taking.
Can I store the json in multiple columns and then concatenate the columns to work with the complete JSON?
For example, the column has data:
'{"REC_TYPE_IND":"1","ID":"999999","2nd ID":"1111","location":"0003","BEGIN_DT":"20000101","END_DT":"20991231"}'
I want to split it out like this:
column1:
'{"REC_TYPE_IND":"1","ID":"999999","2nd '
column2:
'ID":"1111","location":"0003","BEGIN_DT":"20000101","END_DT":"20991231"}'
How do I then use built-in functions like json_value(column1 || column2, 'location') to get a value?
The error I get when trying the above is:
ORA-00932: inconsistent datatypes: expected - got CHAR
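
For reference, a sketch of the query being attempted, with a placeholder table name t and the asker's column names; note that Oracle JSON path expressions normally start with '$.':

-- Recombine the two halves and extract a scalar with JSON_VALUE
select json_value(column1 || column2, '$.location') as location
from   t;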

Change datatype of SSIS flat file data with string "NULL" values

In my SSIS project I have to retrieve my data from a flat csv file. The data itself looks something like this:
AccountType,SID,PersonID,FirstName,LastName,Email,Enabled
NOR,0001,0001,Test,Test0001,Test1#email.com,TRUE
NOR,1001,NULL,Test,Test1002,Test2#email.com,FALSE
TST,1002,NULL,Test,Test1003,Test3#email.com,TRUE
I need to read this data and make sure it has the correct datatypes for future checks, meaning SID and PersonID should have a numeric datatype and Enabled should be a Boolean. But I would like to keep the same columns and names as in my source file.
It seems like the only correct way to read this data through the 'Flat File Source' task is as a string; otherwise I keep getting errors because "NULL" is literally a string and not a NULL value.
Next I perform a Derived Column transformation to get rid of all "NULL" values. For example, I use the following expression for PersonId:
(TRIM(PersonID) == "" || UPPER(PersonID) == "NULL") ? (DT_WSTR,50)NULL(DT_WSTR,50) : PersonID
I would like to immediately convert it to the correct datatype by adding the cast to the expression above, but it seems impossible to select another datatype for the same column when I select 'Replace 'PersonId'' in the Derived Column dropdown box.
So next I thought of using a Data Conversion task to change the datatypes of these columns, but it only creates new columns, even when I set the output alias to the same name.
How could I alter my solution to efficiently and correctly read this data and convert its values to the correct datatypes?

How to set a value in a MySQL (5.6) column that contains a JSON document as a string

For example, say we have a table user with three columns id, name and jsonConfig, and the jsonConfig column contains data as a JSON document:
{"key1":"val1","key2":"val2","key3":"val3"}
I would like to replace the value val1 with, say, val4 in the jsonConfig column.
Can we do that using MySQL(5.6) queries?
I don't think there is a direct way to do this; in later versions a lot of JSON support was added, such as JSON_EXTRACT, JSON_CONTAINS, etc. You might have to write your own custom function.
With MySQL 5.6, since it does not have the JSON data type or the supporting functions, you are going to have to replace the entire string via an UPDATE query if you want to change any part of the JSON document in your string.
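
A rough sketch of that approach, using the table and columns from the question (the WHERE clause is illustrative). REPLACE here is a plain string function, not a JSON-aware one, so it only behaves as intended if the matched text appears exactly once in the document:

-- Rewrite the stored string so key1 maps to val4 instead of val1
update user
set    jsonConfig = replace(jsonConfig, '"key1":"val1"', '"key1":"val4"')
where  id = 1;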

Add value from every row in a table and output (Cast JSON string to int)

I'm querying an SQL database that I have read-only access to (I cannot edit tables, create columns, etc.).
My table contains a column with JSON strings that have the following syntax (the actual strings are much larger; this is just an example):
{"value":"442","country":"usa"}
I would like to add together the values contained in the JSON string from each row and output the total in readable form, if this is possible.
The values sit at the same point in the JSON, as shown above. They also vary in length; most are 3-4 characters long.
Try the following (for MySQL v5.7+):
select sum(json_extract(jsonString, '$.value')) from mytable;
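If the extracted value comes back as a quoted JSON string rather than a number, an explicit unquote and cast (as the question title suggests) is a common variant, using the same table and column names as the answer above:

-- Unquote the JSON scalar and cast it to an integer before summing
select sum(cast(json_unquote(json_extract(jsonString, '$.value')) as unsigned)) as total
from mytable;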

Preserving decimal values in SSIS

I have a column from my .csv file coming in with values such as 1754625.24, whereas we have to save it as an integer in our database. So I am trying to split the number on '.' and divide the second part by 1000 (24/1000), as I want a 3-digit number.
So I get 0.024, but I am having issues storing/preserving that value as a decimal.
I tried a (DT_DECIMAL,3) conversion but the result I get is '0'.
My idea is to then append the '024' part to the original first part, so my final result should look like 1754625024.
Please help
I am not convinced why you would store 1754625.24 as 1754625024 when storing it as an int.
But still, for your case, we can use a Derived Column task and
apply the REPLACE function to the source column of the csv, e.g.
REPLACE("1754625.24", ".", "0")
which yields "1754625024" (in SSIS expressions, string literals use double quotes and the replacement argument must be a string).