Talend Casting of JSON string to JSON or JSONB in PostgreSQL

I'm trying to use Talend to get JSON data that is stored in MySQL as a VARCHAR datatype and export it into a PostgreSQL 9.4 table of the following type:
CREATE TABLE myTable (myJSON JSONB);
When I try running the job I get the following error:
ERROR: column "json_string" is of type json but expression is of type character varying
Hint: You will need to rewrite or cast the expression. Position: 54
If I use Python or just plain SQL with a PostgreSQL INSERT, I can insert a string such as '{"Name":"blah"}' and it is understood:
INSERT INTO myTable(myJSON) VALUES ('{"Name":"blah"}');
Any ideas how this can be done in Talend?

You can add a type cast by opening the "Advanced settings" tab on your tPostgresqlOutput component. Consider the following example:
In this case, the input row to tPostgresqlOutput_1 has one column, data. This column is of type String and is mapped to the database column data of type VARCHAR (as suggested by default by Talend):
Next, open the component settings for tPostgresqlOutput_1 and locate the "Advanced settings" tab:
On this tab, you can replace the existing data column by a new expression:
In the name column, specify the target column name.
In the SQL Expression column, do your type casting. In this case: "?::json". Note the usage of the placeholder character ?, which will be replaced with the original value.
In Position, specify Replace. This will replace the value proposed by Talend with your SQL expression (including the type cast).
As Reference Column use the source value.
This should do the trick.
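For reference, here is a minimal sketch of the kind of statement Talend ends up issuing once the "?::json" expression replaces the plain placeholder (the table and column names are taken from the question; the exact SQL Talend generates may differ):
-- The driver binds the JSON string to the ? parameter and PostgreSQL casts it to json.
INSERT INTO myTable (myJSON) VALUES (?::json);
-- e.g. with the parameter bound to '{"Name":"blah"}'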

Here is a sample schema where I have an input row 'r' with question_json and choice_json columns, which are JSON strings. I know which keys I want to extract, and here is how I do it:
You should look at the columns question_value and choice_value. Hope this helps.
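(The actual Talend expressions referenced above are only shown in screenshots that are not reproduced here. As an assumed illustration only, the equivalent key extraction could be done on the PostgreSQL side with the ->> operator; the key and table names below are hypothetical.)
-- Hypothetical SQL equivalent: pull single keys out of the JSON strings.
SELECT question_json::json ->> 'question' AS question_value,
       choice_json::json  ->> 'choice'    AS choice_value
FROM   some_input_table;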

Related

Change datatype of SSIS flat file data with string "NULL" values

In my SSIS project I have to retrieve my data from a flat csv file. The data itself looks something like this:
AccountType,SID,PersonID,FirstName,LastName,Email,Enabled
NOR,0001,0001,Test,Test0001,Test1#email.com,TRUE
NOR,1001,NULL,Test,Test1002,Test2#email.com,FALSE
TST,1002,NULL,Test,Test1003,Test3#email.com,TRUE
I need to read this data and make sure it has the correct datatypes for future checks, meaning SID and PersonID should have a numeric datatype and Enabled should be a Boolean. But I would like to keep the same columns and names as my source file.
It seems like the only correct way to read this data through the 'Flat File Source' task is as String. Otherwise I keep getting errors because "NULL" is literally a String and not a NULL value.
Next I perform a Derived Column transformation to get rid of all "NULL" values. For example, I use the following expression for PersonId:
(TRIM(PersonID) == "" || UPPER(PersonID) == "NULL") ? (DT_WSTR,50)NULL(DT_WSTR,50) : PersonID
I would like to immediately convert it to the correct datatype by adding the conversion to the expression above, but it seems impossible to select another datatype for the same column when I select 'Replace PersonID' in the Derived Column dropdown box.
So next I thought of using the Data Conversion task to change the datatypes of these columns, but it only creates new columns, even when I set the output alias to the same name.
How could I alter my solution to efficiently and correctly read this data and convert its values to the correct datatypes?

Alter column varchar to json in psql

Hi, I am trying to convert a column in my table from varchar to json, and the table already has some string data. I tried doing that with the command below.
Database=# alter table table_name alter column message type json using message::json;
But the command failed with the below error.
ERROR: invalid input syntax for type json
DETAIL: Token "This" is invalid.
CONTEXT: JSON data, line 1: This...
Note: the message column contains a set of words with spaces, like below.
"This is a message"
I am not sure what went wrong. Thanks in advance.
You can use to_jsonb() rather than casting:
alter table table_name
alter column message type jsonb using to_jsonb(message);
If you really want to use json (although jsonb is recommended), then cast the result back to a json type:
alter table table_name
alter column message type json using to_jsonb(message)::json;
But this seems rather strange for a column that doesn't contain "real" json values, only plain strings.
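A quick way to see the difference, using the sample string from the question: to_jsonb() wraps the plain text into a JSON string value, while a direct cast expects valid JSON and fails.
SELECT to_jsonb('This is a message'::text);  -- "This is a message"
SELECT 'This is a message'::json;            -- ERROR: invalid input syntax for type json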
In my case, I wanted to split a varchar that has a separator between elements. For example:
varchar "ab,cd,ef,gh" -> json ["ab","cd","ef","gh"]
First, I convert the varchar into an array, using a comma (",") as the delimiter:
ALTER table my_table ALTER column my_column TYPE text[] USING string_to_array(my_column,',')
Then, convert the text array into json (which you might use, for example, with a GraphQL database):
ALTER table my_table ALTER column my_column TYPE json USING array_to_json(my_column)
That gives me a JSON array result like ["ab","cd","ef","gh"].
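The two conversions can be checked step by step on the example value (assumed sample data):
SELECT string_to_array('ab,cd,ef,gh', ',');                -- {ab,cd,ef,gh}
SELECT array_to_json(string_to_array('ab,cd,ef,gh', ','));  -- ["ab","cd","ef","gh"]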
You can simply run the command below; it will convert the field type as well as the existing records. You might also need to update the model file of your framework so you can work with the new type.
In my case, I was converting the saved_states table's column prev_months_access_counts from text to jsonb.
ALTER TABLE saved_states ALTER COLUMN prev_months_access_counts TYPE jsonb
using to_jsonb(prev_months_access_counts);

How to set value in MySQL(5.6) column if that contains json document as string

For example, say we have a table user with three columns, id, name and jsonConfig, and the jsonConfig column contains data as a JSON document:
{"key1":"val1","key2":"val2","key3":"val3"}
I would like to replace the value val1 with, say, val4 in the jsonConfig column.
Can we do that using MySQL 5.6 queries?
I don't think there is a direct way to do this; in later versions a lot of JSON support was added (JSON_EXTRACT, JSON_CONTAINS, etc.). You might have to write your own custom function.
With MySQL 5.6, since it does not have the JSON data type or the supporting functions, you are going to have to replace the entire string via an UPDATE query if you want to change any part of the JSON document in your string.
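A minimal sketch of such an UPDATE, assuming the table and column names from the question and that the "key1":"val1" pair appears exactly once in the document; REPLACE() is a plain string function and is not JSON-aware, so it can over-match on real data:
-- String-level replacement only; MySQL 5.6 has no JSON functions.
UPDATE user
SET jsonConfig = REPLACE(jsonConfig, '"key1":"val1"', '"key1":"val4"')
WHERE id = 1;  -- the WHERE condition is an assumption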

postgres: how to convert hstore to JSON datatypes

I'm trying to write a migration to convert an existing hstore column to JSON (not JSONB).
I tried different solutions, such as json USING cast(hstore_column as json) and some functions found on GitHub, but nothing really worked out.
The main issue is that there's no direct conversion; second, even if I cast the column to text as an intermediate step, I need to change the column's default value to json as well.
Has anyone already done this?
You can simply use
alter table my_table alter column h_store_column type json using hstore_to_json(h_store_column)
Of course you will need to drop any defaults set on the column that don't align with the json data type first.
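A sketch of the full sequence with the default handled, assuming the column has an hstore default (e.g. ''::hstore) that must be dropped before the type change and optionally re-created for the new type:
ALTER TABLE my_table ALTER COLUMN h_store_column DROP DEFAULT;
ALTER TABLE my_table ALTER COLUMN h_store_column TYPE json USING hstore_to_json(h_store_column);
ALTER TABLE my_table ALTER COLUMN h_store_column SET DEFAULT '{}'::json;  -- optional new default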

Cast JSON to HSTORE in Postgres 9.3+?

I've read the docs and it appears that there's no discernible way to perform an ALTER TABLE ... ALTER COLUMN ... USING statement to directly convert a json type column to an hstore type. There's no function available (that I'm aware of) to perform the cast.
The next best alternative I have is to create a new column of type hstore, copy my JSON data to that new column using some external tool, drop the old json column and rename the new hstore column to the old column's name.
Is there a better way?
What I have so far is:
$ CREATE TABLE blah (unstructured_data JSON);
$ ALTER TABLE blah ALTER COLUMN unstructured_data
TYPE hstore USING CAST(unstructured_data AS hstore);
ERROR: cannot cast type json to hstore
Unfortunately, PostgreSQL doesn't allow all kinds of expressions within the USING clause of ALTER TABLE ... SET DATA TYPE ... (e.g. sub-queries are disallowed).
But you can write a function to overcome this; you just need to decide what to do with advanced types (in the object's values), like arrays & objects. Here is an example, which simply converts them to strings:
CREATE OR REPLACE FUNCTION my_json_to_hstore(json)
RETURNS hstore
IMMUTABLE
STRICT
LANGUAGE sql
AS $func$
SELECT hstore(array_agg(key), array_agg(value))
FROM json_each_text($1)
$func$;
After that, you can use this in your ALTER TABLE, like:
ALTER TABLE blah
ALTER COLUMN unstructured_data
SET DATA TYPE hstore USING my_json_to_hstore(unstructured_data);
There is "trap" for repeated keys - allowed by both json and hstore input, but unfortunately resolved differently (!). Consider this example value:
json '{"double_key":"key1","foo":null,"double_key":"key2"}'
In json, 'double_key' is effectively 'key2'. The manual:
Because the json type stores an exact copy of the input text, it will
preserve semantically-insignificant white space between tokens, as
well as the order of keys within JSON objects. Also, if a JSON object
within the value contains the same key more than once, all the
key/value pairs are kept. (The processing functions consider the last value as the operative one.)
Bold emphasis mine.
In hstore, however, for the same order of key/value pairs, 'double_key' might effectively be 'key1'. The manual:
Each key in an hstore is unique. If you declare an hstore with
duplicate keys, only one will be stored in the hstore and there is no guarantee as to which will be kept:
Typically, the first instance of a key, but that's an implementation detail that might change.
A simple and fast option to always preserve the effective, operative value: cast to jsonb before the conversion. The manual again:
[...] jsonb does not preserve white space, does not preserve
the order of object keys, and does not keep duplicate object keys.
If duplicate keys are specified in the input, only the last value is kept.
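A quick check of this behaviour, using the example value from above:
SELECT json  '{"double_key":"key1","foo":null,"double_key":"key2"}';
-- stores the text verbatim, both key/value pairs preserved
SELECT jsonb '{"double_key":"key1","foo":null,"double_key":"key2"}';
-- {"foo": null, "double_key": "key2"}   (only the last value is kept)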
Modifying @pozs's conversion function:
CREATE OR REPLACE FUNCTION json2hstore(json)
RETURNS hstore AS
$func$
SELECT hstore(array_agg(key), array_agg(value))
FROM jsonb_each_text($1::jsonb) -- !
$func$ LANGUAGE sql IMMUTABLE STRICT;
Requires Postgres 9.4 or later. Postgres 9.3 has the json type, but not jsonb yet. A no-op in PL/v8 might be an alternative there, as @jpmc mentioned.