Alter column varchar to json in psql

Hi, I am trying to convert a column in my table from varchar to json; the table already has some string data. I tried doing that with the command below.
Database=# alter table table_name alter column message type json using
message::json;
But the command failed with the below error.
ERROR: invalid input syntax for type json
DETAIL: Token "This" is invalid.
CONTEXT: JSON data, line 1: This...
Note: the message column contains plain strings of words with spaces, like the one below.
"This is a message"
I am not sure what went wrong. Thanks in advance.

You can use to_jsonb() rather than casting:
alter table table_name
alter column message type jsonb using to_jsonb(message);
If you really want to use json (although jsonb is recommended), then cast the result back to a json type:
alter table table_name
alter column message type json using to_jsonb(message)::json;
But this seems rather strange for a column that doesn't contain "real" json values, only plain strings.
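For a plain string, to_jsonb() simply produces a quoted JSON string value. A quick sanity check (a sketch, using the question's sample message as an assumed value):
-- to_jsonb() wraps the plain text in a JSON string
select to_jsonb('This is a message'::text);
-- returns "This is a message"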

In my case, I wanted to split a varchar whose value is a list with a separator character. For example
varchar "ab,cd,ef,gh" -> json ["ab","cd","ef","gh"]
First, convert the varchar into an array, using the comma (",") as the delimiter:
ALTER table my_table ALTER column my_column TYPE text[] USING string_to_array(my_column,',')
Then, convert the array of text into json (which you might use, for example, with a GraphQL database):
ALTER table my_table ALTER column my_column TYPE json USING array_to_json(my_column)
That gives a JSON array like ["ab","cd","ef","gh"]
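Both steps can also be combined into a single statement (a sketch, assuming the same comma-separated values as above):
-- split on commas and convert the resulting text[] to a JSON array in one ALTER
ALTER table my_table ALTER column my_column TYPE json USING array_to_json(string_to_array(my_column, ','));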

You can simply run the command below; it converts the column type as well as the existing records. You might also need to update the model file of your framework so it matches the new type.
In my case, I was converting the saved_states table's column prev_months_access_counts from text to jsonb.
ALTER TABLE saved_states ALTER COLUMN prev_months_access_counts TYPE jsonb
using to_jsonb(prev_months_access_counts);
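To verify the change afterwards, a quick check should report the new type (a sketch, using the same table and column):
-- confirm the column now holds jsonb values
select pg_typeof(prev_months_access_counts) from saved_states limit 1;
-- returns jsonb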

Related

Storing non-json data into a column with type not NVARCHAR(MAX), which is used for a JSON property index, throws an error

I have a problem storing non-json data in a column that is used for a JSON property index. After the index has been created, storing non-json data in the column results in the following error: [S0001][13609] Line 1: JSON text is not properly formatted. Unexpected character 'n' is found at position 0.
I've created the computed column needed for the index and the index itself like this:
ALTER TABLE foo ADD vBar AS IIF(ISJSON(DETAILS_JSON) > 0, JSON_VALUE(DETAILS_JSON, '$.bar'), NULL)
CREATE INDEX IDX_FOO_BAR_ID ON foo (vBar)
When trying to store non-json data in the column (e.g. UPDATE foo SET bar = 'simple text') it results in the error mentioned above.
However, when I execute the example and store non-json data in the column it works...
The problem was the length of the column where the JSON data is stored. The length has to be MAX; if it is different (e.g. NVARCHAR(4000) or VARCHAR(4000)), it results in the mentioned error.
I.e., the type of the column has to be NVARCHAR(MAX) or VARCHAR(MAX).
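A minimal sketch of the fix, assuming the JSON is stored in the DETAILS_JSON column from the question. The computed column and index depend on it, so they have to be dropped and recreated around the type change:
-- drop the dependent index and computed column first
DROP INDEX IDX_FOO_BAR_ID ON foo;
ALTER TABLE foo DROP COLUMN vBar;
-- widen the JSON column to MAX
ALTER TABLE foo ALTER COLUMN DETAILS_JSON NVARCHAR(MAX);
-- recreate the computed column and the index
ALTER TABLE foo ADD vBar AS IIF(ISJSON(DETAILS_JSON) > 0, JSON_VALUE(DETAILS_JSON, '$.bar'), NULL);
CREATE INDEX IDX_FOO_BAR_ID ON foo (vBar);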

json-ize string column in postgresql

I have the following data structure:
create table test (tags VARCHAR, tags_json VARCHAR);
insert into test (tags, tags_json)
values ('A B', '["A", "B"]')
And I want to convert the tags column to a JSON column. If I were to do it with the tags_json column, it would be pretty easy:
select tags_json::JSON from test
But when I run it using the tags column,
select tags::JSON from test
I get
SQL Error [22P02]: ERROR: invalid input syntax for type json
How can I convert the column tags to a JSON column in postgresql?
You need to first convert your "plain text" to an array, then you can use to_jsonb() to convert that to a proper JSON value:
select to_jsonb(regexp_split_to_array(tags, '\s+'))
from test;
If you want to permanently change the column's data type, you can use that expression in an ALTER statement:
alter table test
alter tags type jsonb
using to_jsonb(regexp_split_to_array(tags, '\s+'));
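To preview what the expression produces before altering the table, a quick check with the sample value from the question (a sketch):
-- split on whitespace, then convert the resulting text[] to a JSON array
select to_jsonb(regexp_split_to_array('A B', '\s+'));
-- returns ["A", "B"]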

Talend Casting of JSON string to JSON or JSONB in PostgreSQL

I'm trying to use Talend to get JSON data that is stored in MySQL as a VARCHAR datatype and export it into a PostgreSQL 9.4 table of the following type:
CREATE TABLE myTable (myJSON JSONB);
When I try running the job I get the following error:
ERROR: column "json_string" is of type json but expression is of type
character varying
Hint: You will need to rewrite or cast the expression. Position:
54
If I use python or just plain SQL with PostgreSQL insert I can insert a string such as '{"Name":"blah"}' and it understands it.
INSERT INTO myTable(myJSON) VALUES ('{"Name":"blah"}');
Any ideas how this can be done in Talend?
You can add a type cast by opening the "Advanced settings" tab on your "tPostgresqlOutput" component. Consider the following example:
In this case, the input row to "tPostgresqlOutput_1" has one column, data. This column is of type String and is mapped to the database column data of type VARCHAR (as suggested by Talend's defaults):
Next, open the component settings for tPostgresqlOutput_1 and locate the "Advanced settings" tab:
On this tab, you can replace the existing data column by a new expression:
In the name column, specify the target column name.
In the SQL Expression column, do your type casting, in this case "?::json". Note the usage of the placeholder character "?", which will be replaced with the original value.
In Position, specify Replace. This will replace the value proposed by Talend with your SQL expression (including the type cast).
As Reference Column use the source value.
This should do the trick.
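With that replacement in place, the statement Talend issues effectively casts the bound string on the database side, roughly like this (a sketch of the resulting SQL, assuming the myTable/myJSON names from the question):
-- the "?" placeholder is bound to the JSON string from the input row
INSERT INTO myTable (myJSON) VALUES (?::json);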
Here is a sample schema where the input row 'r' has question_json and choice_json columns that are JSON strings. I know which keys I want to extract, and this is how I do it:
You should look at the columns question_value and choice_value. Hope this helps you.

PostgreSQL convert column_1 text[] type to column_2 json type

Is there a reasonably simple approach to copy column_1 (data type text[]) to column_2 (data type JSON)?
...or...
Is there a reasonably simple approach to directly convert a column's data type from text[] to JSON?
The table parts_bak1 I'm working with has two columns named material_size (text[]) and material_size_json (json).
I tried directly converting the column as follows:
ALTER TABLE parts_bak1 ALTER COLUMN material_size TYPE JSON USING material_size::text[];
ERROR: column "material_size" cannot be cast automatically to type json
HINT: Specify a USING expression to perform the conversion.
I'm not sure how, or even if, I should approach the challenge using USING.
Input is welcome, this seems to work:
UPDATE parts_bak1
SET material_size_json = subq.material_size
FROM (SELECT id, array_to_json(material_size) AS material_size FROM parts_bak1) AS subq
WHERE parts_bak1.id=subq.id;
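If the goal is to change the column's type in place rather than copy the data into a second column, a direct ALTER with array_to_json() in the USING clause should also work (a sketch, using the question's table and column):
-- convert the text[] values to json while changing the column type
ALTER TABLE parts_bak1 ALTER COLUMN material_size TYPE json USING array_to_json(material_size);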

Cast JSON to HSTORE in Postgres 9.3+?

I've read the docs and it appears that there's no discernible way to perform an ALTER TABLE ... ALTER COLUMN ... USING statement to directly convert a json type column to an hstore type. There's no function available (that I'm aware of) to perform the cast.
The next best alternative I have is to create a new column of type hstore, copy my JSON data to that new column using some external tool, drop the old json column and rename the new hstore column to the old column's name.
Is there a better way?
What I have so far is:
$ CREATE TABLE blah (unstructured_data JSON);
$ ALTER TABLE blah ALTER COLUMN unstructured_data
TYPE hstore USING CAST(unstructured_data AS hstore);
ERROR: cannot cast type json to hstore
Unfortunately, PostgreSQL doesn't allow all kinds of expressions within the USING clause of ALTER TABLE ... SET DATA TYPE ... (for example, sub-queries are disallowed).
But you can write a function to overcome this; you just need to decide what to do with nested types (in the object's values), like arrays and objects. Here is an example, which simply converts them to strings:
CREATE OR REPLACE FUNCTION my_json_to_hstore(json)
RETURNS hstore
IMMUTABLE
STRICT
LANGUAGE sql
AS $func$
SELECT hstore(array_agg(key), array_agg(value))
FROM json_each_text($1)
$func$;
After that, you can use this in your ALTER TABLE, like:
ALTER TABLE blah
ALTER COLUMN unstructured_data
SET DATA TYPE hstore USING my_json_to_hstore(unstructured_data);
There is "trap" for repeated keys - allowed by both json and hstore input, but unfortunately resolved differently (!). Consider this example value:
json '{"double_key":"key1","foo":null,"double_key":"key2"}'
In json, 'double_key is effectively 'key2'. The manual:
Because the json type stores an exact copy of the input text, it will
preserve semantically-insignificant white space between tokens, as
well as the order of keys within JSON objects. Also, if a JSON object
within the value contains the same key more than once, all the
key/value pairs are kept. (The processing functions consider the last value as the operative one.)
Bold emphasis mine.
In hstore, however, for the same order of key/value pairs, 'double_key' might effectively be 'key1'. The manual:
Each key in an hstore is unique. If you declare an hstore with
duplicate keys, only one will be stored in the hstore and there is no guarantee as to which will be kept:
Typically, the first instance of a key, but that's an implementation detail that might change.
A simple and fast option to always preserve the effective, operative value: cast to jsonb before the conversion. The manual again:
[...] jsonb does not preserve white space, does not preserve
the order of object keys, and does not keep duplicate object keys.
If duplicate keys are specified in the input, only the last value is kept.
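For example, casting the sample value from above to jsonb keeps only the last occurrence of the duplicate key (jsonb also reorders keys):
-- duplicate keys collapse to the last value when cast to jsonb
select '{"double_key":"key1","foo":null,"double_key":"key2"}'::jsonb;
-- returns {"foo": null, "double_key": "key2"}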
Modifying @pozs's conversion function:
CREATE OR REPLACE FUNCTION json2hstore(json)
RETURNS hstore AS
$func$
SELECT hstore(array_agg(key), array_agg(value))
FROM jsonb_each_text($1::jsonb) -- !
$func$ LANGUAGE sql IMMUTABLE STRICT;
Requires Postgres 9.4 or later. Postgres 9.3 has the json type, but not jsonb yet. A no-op in PL/v8 might be an alternative there, like @jpmc mentioned.
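The ALTER TABLE then looks the same as before, just with the jsonb-based function (a usage sketch):
-- convert the json column to hstore, resolving duplicate keys via jsonb first
ALTER TABLE blah
ALTER COLUMN unstructured_data
SET DATA TYPE hstore USING json2hstore(unstructured_data);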