How to use a WHERE condition on a JSON hierarchy column in Postgres?

I've created a small DW in Postgres with a table that contains an XML column, a JSON column, and a geometry column. I want to apply a WHERE condition to the JSON column only.
Let's focus on this table, which contains the XML, JSON, and geometry columns.
I inserted some rows into the table; here is the table view in Postgres:
I want to display the rows where the city is Tanger.
Note: the query I tried did not work.

Based on the documentation, the JSON operators are:
- The operator -> returns a JSON object field by key (as JSON).
- The operator ->> returns a JSON object field as text.
So the mistake is in the operator: you must use -> for the address key, for example like this:
https://dbfiddle.uk/?rdbms=postgres_10&fiddle=67df669c90741366f89671ea9494b287

columnName->>'field'
accesses the field and returns the value as a string, whereas:
columnName->'field'
accesses the field and returns the value as JSONB, which allows you to continue traversing the object to lower levels.
So -> gives you JSONB, and ->> gives you strings.
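Putting that together, the filter would look like this. (The column name details and the nested address key are guesses based on the question; substitute your actual names.)

```sql
-- Traverse the hierarchy with -> (returns jsonb),
-- then extract the leaf with ->> (returns text) for the comparison:
SELECT *
FROM my_table
WHERE details -> 'address' ->> 'city' = 'Tanger';
```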

Related

Storing non-json data into a column with type not NVARCHAR(MAX), which is used for a JSON property index, throws an error

I have a problem storing non-JSON data in a column that is used for a JSON property index. After the index has been created, storing non-JSON data in the column results in the following error: [S0001][13609] Line 1: JSON text is not properly formatted. Unexpected character 'n' is found at position 0.
I've created the computed column needed for the index and the index itself like this:
ALTER TABLE foo ADD vBar AS IIF(ISJSON(DETAILS_JSON) > 0, JSON_VALUE(DETAILS_JSON, '$.bar'), NULL)
CREATE INDEX IDX_FOO_BAR_ID ON foo (vBar)
When trying to store non-json data in the column (e.g. UPDATE foo SET bar = 'simple text') it results in the error mentioned above.
However, when I execute the example and store non-json data in the column it works...
The problem was the length of the column where the JSON data is stored. The length has to be MAX; if it is anything else (e.g. NVARCHAR(4000) or VARCHAR(4000)), it results in the error mentioned above.
I.e., the type of the column has to be NVARCHAR(MAX) or VARCHAR(MAX).
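A minimal sketch of the working setup, using the column and index names from the question (the table layout and sample data are hypothetical):

```sql
-- DETAILS_JSON must be NVARCHAR(MAX), not NVARCHAR(4000),
-- for non-JSON content to be accepted alongside the computed-column index:
CREATE TABLE foo (
    ID INT PRIMARY KEY,
    DETAILS_JSON NVARCHAR(MAX)
);
ALTER TABLE foo ADD vBar AS IIF(ISJSON(DETAILS_JSON) > 0, JSON_VALUE(DETAILS_JSON, '$.bar'), NULL);
CREATE INDEX IDX_FOO_BAR_ID ON foo (vBar);
-- With MAX, non-JSON content no longer errors; vBar simply computes to NULL:
INSERT INTO foo (ID, DETAILS_JSON) VALUES (1, 'simple text');
```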

How to select part of a text column as a new column in a MySQL query

I have a text column that stores a JSON string.
I want to select a specific element of the JSON as a new column, and I do not want to change the type of this column to JSON.
Is it possible? How can I do that?
My table name is 'logs', my column name is 'response', and my target element in the JSON string is 'server_response_time'.
If you have a valid JSON string stored in a string column, you can directly use JSON functions on it. MySQL will happily convert it to JSON under the hood.
So:
with t as (select '{"foo": "bar", "baz": "zoo"}' col)
select col, col ->> '$.foo' as foo
from t
If your string is not valid JSON, this generates a runtime error. This is one of the reasons why I would still recommend storing your data as JSON rather than string: that way, data integrity is enforced at the time when your data is stored, rather than delayed until it is read.
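Applied to the table from the question (assuming the response column always holds valid JSON), the query would be:

```sql
-- JSON functions work directly on the TEXT column;
-- ->> extracts the path and unquotes the value:
SELECT response ->> '$.server_response_time' AS server_response_time
FROM logs;
```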

Postgres how to turn a field into a json array

I saved a JSON value like this in a column of my DB:
{"a":137,"b":"28","c":"1","d":"5","e":19,"f":true}
Is it possible, with a query, to transform "e" into an array without removing the value?
{"a":137,"b":"28","c":"1","d":"5","e":[19],"f":true}
update the_table
set the_column = the_column||jsonb_build_object('e', array_to_json(array[the_column -> 'e']))
where ...
array[the_column -> 'e'] creates a "native" array out of the single element referenced by the key 'e'. This array is converted to JSON and a new JSON value is created using jsonb_build_object() which is then concatenated to the existing value. This will overwrite the existing key "e".
The above assumes that the column is defined as jsonb (which it should be). If it's only json, you need to cast it to make the replacement with || work.
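For a column defined as json rather than jsonb, the cast could be sketched like this (same hypothetical table and column names as above):

```sql
-- Cast to jsonb so the || concatenation works, then cast back to json:
UPDATE the_table
SET the_column = (the_column::jsonb
    || jsonb_build_object('e', array_to_json(array[the_column::jsonb -> 'e'])))::json
WHERE ...;
```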

Talend Casting of JSON string to JSON or JSONB in PostgreSQL

I'm trying to use Talend to get JSON data that is stored in MySQL as a VARCHAR datatype and export it into PostgreSQL 9.4 table of the following type:
CREATE TABLE myTable (myJSON JSONB);
When I try running the job I get the following error:
ERROR: column "json_string" is of type json but expression is of type character varying
Hint: You will need to rewrite or cast the expression. Position: 54
If I use python or just plain SQL with PostgreSQL insert I can insert a string such as '{"Name":"blah"}' and it understands it.
INSERT INTO myTable(myJSON) VALUES ('{"Name":"blah"}');
Any ideas how this can be done in Talend?
You can add a type cast by opening the "Advanced settings" tab on your tPostgresqlOutput component. Consider the following example:
In this case, the input row to "tPostgresqlOutput_1" has one column data. This column is of type String and is mapped to the database column data of type VARCHAR (as by the default suggested by Talend):
Next, open the component settings for tPostgresqlOutput_1 and locate the "Advanced settings" tab:
On this tab, you can replace the existing data column by a new expression:
In the name column, specify the target column name.
In the SQL Expression column, do your type casting, in this case: "?::json". Note the use of the placeholder character ?, which will be replaced with the original value.
In Position, specify Replace. This will replace the value proposed by Talend with your SQL expression (including the type cast).
As Reference Column use the source value.
This should do the trick.
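Outside of Talend, the same cast can be sketched in plain SQL, using the table name from the question. In a prepared statement the literal below becomes the ? placeholder, which is what the "?::json" expression in the Advanced settings tab produces:

```sql
-- Cast the incoming string parameter to jsonb at insert time:
INSERT INTO myTable (myJSON) VALUES ('{"Name":"blah"}'::jsonb);
```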
Here is a sample schema where I have the input row 'r', which has question_json and choice_json columns containing JSON strings. I know which key I want to extract, and here is how I do it:
Look at the columns question_value and choice_value. Hope this helps.

Get data type of JSON field in Postgres

I have a Postgres JSON column where some rows have data like:
{"value":90}
{"value":99.9}
...whereas other rows have data like:
{"value":"A"}
{"value":"B"}
The -> operator (i.e. fields->'value') would cast the value to JSON, whereas the ->> operator (i.e. fields->>'value') casts the value to text, as reported by pg_typeof. Is there a way to find the "actual" data type of a JSON field?
My current approach would be to use Regex to determine whether the occurrence of fields->>'value' in fields::text is surrounded by double quotes.
Is there a better way?
As @pozs mentioned in a comment, from version 9.4 the json_typeof(json) and jsonb_typeof(jsonb) functions are available:
Returns the type of the outermost JSON value as a text string. Possible types are object, array, string, number, boolean, and null.
https://www.postgresql.org/docs/current/functions-json.html
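A quick check against the sample documents from the question:

```sql
-- json_typeof inspects the JSON value returned by -> :
SELECT json_typeof('{"value":90}'::json -> 'value');   -- number
SELECT json_typeof('{"value":"A"}'::json -> 'value');  -- string
```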
Applied to your case, here is an example of how this could be used:
SELECT
json_data.key,
jsonb_typeof(json_data.value) AS json_data_type,
COUNT(*) AS occurrences
FROM tablename, jsonb_each(tablename.columnname) AS json_data
GROUP BY 1, 2
ORDER BY 1, 2;
I ended up getting access to PLv8 in my environment, which made this easy:
CREATE FUNCTION value_type(fields JSON) RETURNS TEXT AS $$
return typeof fields.value;
$$ LANGUAGE plv8;
As mentioned in the comments, there will be a native function for this in 9.4.
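For completeness, hypothetical usage of the PLv8 helper above; since the function body is JavaScript, the result comes from JavaScript's typeof operator:

```sql
SELECT value_type('{"value":90}'::json);   -- 'number'
SELECT value_type('{"value":"A"}'::json);  -- 'string'
```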