I have been struggling with how to insert into a table in PG that has a JSON column...
I am extracting a bit column from another table, and now I want to insert that bit into the JSON column, for example:
INSERT INTO TABLE (JSON) VALUES( '{"IsAllowed":"' || is_allowed::TEXT || '"}'::JSON )
is_allowed is the bit column from the other table...
And I get:
"column "XXX" is of type json but expression is of type text"
Does anyone know how can I achieve this?
Thanks!
You should be using jsonb instead of json, but that is not part of your question.
PostgreSQL does not allow casting between the bit and boolean types in either direction, which is why a comparison is used below instead of a cast.
Please try this:
insert into table (json) values
(json_build_object('IsAllowed', is_allowed = 1::bit));
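As a sketch of how that comparison produces a real JSON boolean (assuming a hypothetical source table settings with a bit column is_allowed and a target table t with a json column payload):

```sql
-- is_allowed = 1::bit evaluates to boolean, which json_build_object
-- serializes as true/false rather than as a quoted string.
INSERT INTO t (payload)
SELECT json_build_object('IsAllowed', is_allowed = 1::bit)
FROM settings;
```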
Your command:
INSERT INTO TABLE (JSON) VALUES( '{"IsAllowed":"' || is_allowed::TEXT || '"}'::JSON )
is casting '"}' to JSON, because the cast binds more tightly than the || operator. Try adding some brackets:
INSERT INTO TABLE (JSON) VALUES( ('{"IsAllowed":"' || is_allowed::TEXT || '"}')::JSON )
Note: I could not replicate the error you are receiving (got "Token ""}" is invalid.") but with the brackets this worked fine.
I have a table in Bigquery which has 2 columns - job_id and json_column(string which is in JSON format). When I tried to read the data and identify some objects it gives me error as below:
SyntaxError:Unexpected end of JSON input at undefined line XXXX, columns xx-xx
It always reports line 5931; the second time I execute it, it reports line 6215.
If it is a JSON structure issue, how can I find out which row/job_id the number 5931 corresponds to? If I filter for a specific job_id the query returns values, but when I run it over the complete table I get this error. I also looked at the job_ids at the row numbers mentioned, and the code works fine for those job_ids.
Do you think it is a JSON structure issue, and how can I identify which row/job_id has it?
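One way to narrow down which rows have malformed JSON is a throwaway UDF that only checks whether the string parses (a sketch using the same temporary-function pattern as the query below; the table and column names are taken from it):

```sql
CREATE TEMPORARY FUNCTION IS_VALID_JSON(json STRING)
RETURNS BOOL
LANGUAGE js AS """
try { JSON.parse(json); return true; } catch (e) { return false; }
""";

-- Lists the job_ids whose JSON cannot be parsed.
SELECT job_id
FROM lz_fdp_op.fdp_json_file
WHERE NOT IS_VALID_JSON(TRIM(conv_column));
```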
Table Structure:
Code:
CREATE TEMPORARY FUNCTION CUSTOM_JSON_EXTRACT(json STRING, json_path STRING)
RETURNS ARRAY<STRING>
LANGUAGE js AS """
var result = jsonPath(JSON.parse(json), json_path);
if(result){return result;}
else {return [];}
"""
OPTIONS (
library="gs://json_temp/jsonpath-0.8.0.js"
);
SELECT job_id,dist,gm,sub_gm
FROM lz_fdp_op.fdp_json_file,
UNNEST(CUSTOM_JSON_EXTRACT(trim(conv_column), '$.Project.OpsLocationInfo.iDistrictId')) dist ,
UNNEST(CUSTOM_JSON_EXTRACT(trim(conv_column), '$.Project.GeoMarketInfo.Geo')) gm,
UNNEST(CUSTOM_JSON_EXTRACT(trim(conv_column), '$.Project.GeoMarketInfo.SubGeo')) sub_gm
Would this work for you?
WITH
T AS (
SELECT
'1000149.04.14' AS job_id,
'{"Project":{"OpsLocationInfo":{"iDistrictId":"A"},"GeoMarketInfo":{"Geo":"B","SubGeo":"C"}}}' AS conv_column
)
SELECT
JSON_EXTRACT_SCALAR(conv_column, '$.Project.OpsLocationInfo.iDistrictId') AS dist,
JSON_EXTRACT_SCALAR(conv_column, '$.Project.GeoMarketInfo.Geo') AS gm,
JSON_EXTRACT_SCALAR(conv_column, '$.Project.GeoMarketInfo.SubGeo') AS sub_gm
FROM
T
BigQuery JSON Functions docs:
https://cloud.google.com/bigquery/docs/reference/standard-sql/json_functions
how can I read multiple arrays in an object in JSON without using unnest?
Could you explain what you mean with a sample input?
I am trying YugaByte's Cassandra API (YCQL) and interested in using the JSONB data type extensions.
But I am having trouble both updating an attribute in an existing JSONB column as well as adding a new attribute to an existing JSONB column.
Is this supported in YugaByte? Here is what I tried:
Consider the following example, which has one row with a simple key and a JSONB column.
cqlsh:k> CREATE TABLE T (key int PRIMARY KEY, value jsonb);
cqlsh:k> INSERT INTO T(key, value) VALUES(1, '{"author": "Charles", "title": "Hello World"}');
cqlsh:k> SELECT * FROM T;
key | value
-----+--------------------------------------------
1 | {"author":"Charles","title":"Hello World"}
(1 rows)
So far so good.
If I try to update an existing attribute inside the doc, I see the following error:
cqlsh:k> UPDATE T SET value->'author' = 'Bruce' WHERE key=1;
InvalidRequest: Error from server: code=2200 [Invalid query] message="SQL error: \
Invalid Arguments. Corruption: JSON text is corrupt: Invalid value.
If I try to add a new attribute into an existing JSONB attribute, I get the following error:
cqlsh:k> UPDATE T SET value->'price' = '10' WHERE key=1;
InvalidRequest: Error from server: code=2200 [Invalid query] message="SQL error: \
Execution Error. Could not find member:
Is this supported, and if so what is the correct syntax?
When updating a string value you must enclose the new value in double quotes inside the single quotes. For example:
cqlsh:k> UPDATE T SET value->'author' = '"Bruce"' WHERE key=1;
cqlsh:k> SELECT * FROM T;
key | value
-----+------------------------------------------
1 | {"author":"Bruce","title":"Hello World"}
(1 rows)
Regarding the second question on ability to add new attributes:
For UPDATE, currently (as of 1.1) YugaByte DB allows updating a specific attribute/field if it already exists, but does not allow adding new attributes to an existing JSONB column. If you need the latter, you have to read the old value into the app and write the new JSON in its entirety.
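The read-modify-write described above happens in the application; the JSON merge itself is plain object manipulation (a Python sketch of just that step; the surrounding SELECT and UPDATE would go through your YCQL driver):

```python
import json

def add_attribute(doc: str, key: str, value) -> str:
    """Parse a JSONB column value, add (or overwrite) one attribute,
    and return the full document to write back."""
    obj = json.loads(doc)
    obj[key] = value
    return json.dumps(obj)

# The app reads the old value, transforms it, then writes the whole document:
old = '{"author": "Charles", "title": "Hello World"}'
new = add_attribute(old, "price", 10)
```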
The table definition is:
chat_id serial primary key, last_update timestamp, messages JSON[]
and I want to insert a record like this:
insert into chats (messages) values ('{{"sender":"pablo","body":"they are on to us"}}');
but it fails with:
ERROR: malformed array literal: "{{"sender":"pablo","body":"they are on to us"}}"
LINE 1: insert into chats (messages) values ('{{"sender":"pablo","bo...
I have also tried this approach :
insert into chats (messages) values (ARRAY('{"sender":"pablo","body":"they are on to us"}'));
Note that updating the row and inserting with array_append works OK.
I think this is a clash between the JSON notation, which starts with {, and the shorthand array notation in Postgres, where the string representation of an array is also denoted by {.
The following works:
insert into chats
(messages)
values
(array['{"sender":"pablo","body":"they are on to us"}']::json[]);
This avoids the ambiguity of the {{ by using an explicit array constructor.
To make the array a json array you need to either cast the string to a json or the resulting array to a json[] (see the example above). Casting the whole array makes it easier if you have more than one JSON document in that row:
insert into chats
(messages)
values
(array['{"sender":"pablo","body":"they are on to us"}',
'{"sender":"arthur"}']::json[]);
alternatively:
insert into chats
(messages)
values
(array['{"sender":"pablo","body":"they are on to us"}'::json,
'{"sender":"arthur"}'::json]);
It is quite complicated to escape the special characters. I would use INSERT ... SELECT to make it easier.
Create the JSON array.
Convert it to a set of JSON values.
Convert the set to a SQL array.
The SELECT part is below.
WITH bu AS (SELECT json_array_elements('[{"name":"obj1"},{"name":"obj2"},{"name":"obj3"},{"name":"obj4"}]'::json) AS bu)
SELECT array_agg(bu) FROM bu;
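To turn that into the actual insert, the CTE can sit inside the INSERT statement (a sketch against the chats table from the question):

```sql
INSERT INTO chats (messages)
WITH bu AS (
    SELECT json_array_elements(
        '[{"name":"obj1"},{"name":"obj2"},{"name":"obj3"},{"name":"obj4"}]'::json
    ) AS bu
)
-- array_agg over the json elements yields a json[] value for messages.
SELECT array_agg(bu) FROM bu;
```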
I want to use a WHERE condition on a JSON object in a table in PostgreSQL. How do I do this? For example: I have a table 'test' with three columns: name (varchar), url (varchar), and more (json). I need to retrieve the rows where css21Colors = Purple.
more is a json column; below is an example value of the more field.
What should the syntax for this query be?
more = {
  "colorTree": {
    "Purple": [{"Spanish Violet": "#522173"}],
    "Brown": [{"Dark Puce": "#4e3347"}],
    "White": [{"White": "#ffffff"}],
    "Black": [{"Eerie Black": "#1d0d27"}],
    "Gray": [{"Rose Quartz": "#a091a4"}]
  },
  "sizeoutscount": 0,
  "css21Colors": {"Purple": 69, "Brown": 5, "White": 4, "Black": 17, "Gray": 3},
  "sizeins": [],
  "sizeinscount": 0,
  "sizeouts": [],
  "allsizes": ["8", "10", "16"],
  "css3Colors": {"Rose Quartz": 3, "White": 4, "Dark Puce": 5, "Eerie Black": 17, "Spanish Violet": 69},
  "hexColors": {"#522173": 69, "#4e3347": 5, "#ffffff": 4, "#1d0d27": 17, "#a091a4": 3}
}
SELECT more->'css21Colors'->'Purple' FROM test;
Additionally, you can query only the rows containing that key:
SELECT
more->'css21Colors'->'Purple'
FROM
test
WHERE
(more->'css21Colors')::jsonb ? 'Purple';
Consider switching to the jsonb data type.
My table has a column with a JSON string that has nested objects (so a simple REPLACE function cannot solve this problem). For example: {'name':'bob', 'blob': {'foo':'bar'}, 'age': 12}. What is the easiest query to append a value to the end of the JSON string? For the example, I want the end result to look like this: {'name':'bob', 'blob': {'foo':'bar'}, 'age': 12, 'gender': 'male'}. The solution should be generic enough to work for any JSON values.
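As an aside: if the column holds valid JSON (double-quoted keys and values, unlike the single quotes in the example) and you are on MySQL 5.7 or later, the built-in JSON functions avoid string surgery entirely. A sketch, with hypothetical table and column names:

```sql
-- JSON_SET adds the key if it is missing, or overwrites it if present.
UPDATE mytable
SET json_col = JSON_SET(json_col, '$.gender', 'male');
```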
What about this
UPDATE table SET table_field1 = CONCAT(table_field1,' This will be added.');
EDIT:
I personally would have done the manipulation in a language like PHP before inserting it; that is much easier. Anyway, is this what you want? This should work provided the JSON you are adding is in the format {'key':'value'}:
UPDATE table
SET col = CONCAT_WS(",", SUBSTRING(col, 1, CHAR_LENGTH(col) - 1),SUBSTRING('newjson', 2));
I think you can use the REPLACE function to achieve this:
UPDATE table
SET column = REPLACE(column, '{\'name\':\'bob\', \'blob\': {\'foo\':\'bar\'}, \'age\': 12}', '{\'name\':\'bob\', \'blob\': {\'foo\':\'bar\'}, \'age\': 12, \'gender\': \'male\'}')
Take care to properly escape all quotes inside json
Regarding your request about nested JSON: I think you can just remove the last character of the string with the SUBSTRING function and then append whatever you need with CONCAT:
UPDATE table
SET column = CONCAT(SUBSTRING(column, 1, CHAR_LENGTH(column) - 1), 'newjsontoappend')
This modifies Jack's answer so it works even when the column value is empty or NULL on the first update:
update table
set column_name = case
    when column_name is null or column_name = ''
    then "{'foo':'bar'}"
    else CONCAT_WS(",", SUBSTRING(column_name, 1, CHAR_LENGTH(column_name) - 1), SUBSTRING("{'foo':'bar'}", 2))
end