I've imported a raw data export from Unity into PostgreSQL using the JSON file that Unity offers.
Sample of data:
{"name":"EVENT1","ts":1534117312648,"userid":"1e77723b38980460ea307db5fca875fd","sessionid":"3188654687037448331","platform":"AndroidPlayer","sdk_ver":"u2017.4.1f1","debug_device":false,"user_agent":"Dalvik/2.1.0 (Linux; U; Android 7.0; LG-H820 Build/NRD90U)","submit_time":1534118874283,"custom_params":{"the Daily Bonus":"RewardGems"},"country":"US","city":"Fayetteville","appid":"50d97d88-096c-4a4b-8daa-390e239974f8","type":"custom"}
{"name":"GAME1","ts":1534107814910,"userid":"f029c3982539e4eeea171132bd9cf8c9","sessionid":"220388644753439310","platform":"AndroidPlayer","sdk_ver":"u2017.4.1f1","debug_device":false,"user_agent":"Dalvik/2.1.0 (Linux; U; Android 7.0; SM-G570M Build/NRD90M)","submit_time":1534118931705,"custom_params":{"Heavy Slam":"1","Flame Breath":"1","Cure":"1","Chop":"1","Bite":"7","Fang Fireball":"10","Axe Throw":"2"},"country":"BR","city":"Várzea Grande","appid":"50d97d88-096c-4a4b-8daa-390e239974f8","type":"custom"}
The JSON data is in the values column of the temp_json table as a text data type. When trying to CAST the column to the JSON data type, I get the following error:
ERROR: invalid input syntax for type json
DETAIL: Character with value 0x0a must be escaped.
CONTEXT: JSON data, line 1: ...tric Jacket":"1","Bolt":"5","Axe Throw":"1","Toss
SQL state: 22P02
As a result I've had to use to_json(values) to convert the text column into JSON. When I attempt to run the following query:
SELECT to_json(values) -> 'name'
FROM temp_json
My query results in NULL. I've searched around and found an answer that stated to try the following query:
select json_array_elements(to_json(values)) ->> 'name'
from temp_json
However, that results in the following error:
ERROR: cannot call json_array_elements on a scalar
SQL state: 22023
I'm extremely new to JSON and PostgreSQL so apologies for the noob question. Any help would be very much appreciated. I feel like this should be an easy thing to figure out, but I can't seem to find a solution.
If you are trying to return the "name" parameter from the JSON string, you could use the JSON_EXTRACT_PATH_TEXT function. See the docs:
https://docs.aws.amazon.com/redshift/latest/dg/JSON_EXTRACT_PATH_TEXT.html
Here is an example of how to use it, assuming the JSON string is in the values column of the temp_json table:
SELECT JSON_EXTRACT_PATH_TEXT(values, 'name') AS name
FROM temp_json;
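On plain PostgreSQL (rather than Redshift) the built-in equivalent is json_extract_path_text, or simply the ->> operator, but the text has to be valid json first. Note that to_json(values) wraps the whole text value in a single JSON string scalar, which is why -> 'name' on it returns NULL. A minimal sketch, assuming the raw newlines that caused the 0x0a error can simply be collapsed to spaces before the cast (values is the text column from the question, double-quoted because VALUES is a reserved word):
SELECT regexp_replace("values", E'[\n\r]+', ' ', 'g')::json ->> 'name' AS name
FROM temp_json;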
Hope this helps!
The PostgreSQL version is 10.8.
This is the SQL:
update nt_order set common_field='{"bind_channel":"company","bind_status":"binding","version":1234}' where id=1 and (common_field is null or (common_field::json->>'version')::bigint is null or (common_field::json->>'version')::bigint < 12345);
When executing this SQL, there is an error:
ERROR: invalid input syntax for type json
DETAIL: The input string ended unexpectedly.
CONTEXT: JSON data, line 1
If the PostgreSQL version is 9.5, there is no problem executing the above SQL.
So how can I solve this problem in PostgreSQL 10.8?
Based on the DETAIL and CONTEXT, I would say you have an empty string in that field for some row. This cannot be cast to json, not even in version 9.5, so your 9.5 server must have different data in it.
Presumably you want to do the same thing for '' as you do for NULL, so just add that test into your OR list.
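For example (a sketch that only extends the statement from the question; the one new test is common_field = ''):
update nt_order
set common_field = '{"bind_channel":"company","bind_status":"binding","version":1234}'
where id = 1
  and (common_field is null
       or common_field = ''
       or (common_field::json->>'version')::bigint is null
       or (common_field::json->>'version')::bigint < 12345);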
The Magento 2 table 'sales_order_payment' has a column 'additional_information' whose data type is 'text'. The data inside this field looks like JSON (but stored as a string, i.e. text):
{"raw_details_info":{"CustomerFirstname":"Mary",...
How can I select this as json?
I have tried this. First I created a view where I cast this column as json:
CREATE VIEW JSONTEST as SELECT cast(additional_information as json) as test FROM sales_order_payment;
Then I try to select this as json:
SELECT test->>'$.CustomerFirstname' from JSONTEST
But the result is null. Any idea what is wrong?
Your path must start from the top of the document.
SELECT test->>'$.raw_details_info.CustomerFirstname' ...
Also the JSON stored in your TEXT column must be valid JSON, or else your cast will return an error. This is an advantage of the JSON data type in MySQL 5.7 and later, because it will require the content to be valid JSON.
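Putting both points together, a sketch (JSON_VALID is available from MySQL 5.7, and the first query is only there to spot rows whose text would break the cast in the view):
-- rows whose additional_information is not valid JSON
SELECT additional_information
FROM sales_order_payment
WHERE JSON_VALID(additional_information) = 0;
-- full path from the top of the document, against the view from the question
SELECT test->>'$.raw_details_info.CustomerFirstname' AS customer_firstname
FROM JSONTEST;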
I want to select the data from my table, which is JSON data. My table data looks like this:
user_id: 1
metaname: mymetaname
meta_value: a:1:{i:0;a:10:{s:7:"street1";s:36:"shiv plaza";s:4:"city";s:5:"surat";s:5:"state";s:7:"gujarat";s:7:"zipcode";s:6:"395010";s:14:"dollet_country";s:2:"IN";s:10:"tostreet1l";s:5:"surat";s:7:"tocityl";s:5:"surat";s:8:"tostatel";s:5:"surat";s:10:"tozipcodel";s:6:"395000";s:17:"todollet_countryl";s:2:"IN";}}
And I am trying to run this query:
SELECT user_id,JSON_EXTRACT(meta_value, '$."city"') FROM `usermetatable`
But it's showing this error:
[Invalid JSON text in argument 1 to function json_extract: "Invalid value." at position 0.]
My JSON data in the table cannot be changed to another format, and it's correct JSON for sure. Could anyone correct the above query?
That's not JSON data. It looks like a serialized PHP object. See http://php.net/serialize
There's no MySQL function for extracting a field from that serialized object. You should fetch the whole object into a PHP app, and call unserialize() on it, then access the object members.
I have a MySQL table with a JSON column called sent. The entries in the column have information like below:
{
  "data": {
    "12": "1920293"
  }
}
I'm trying to use this MySQL query:
select sent->"$.data.12" from mytable
but I get an exception:
Invalid JSON path expression. The error is around character position 9.
Any idea how I can extract the information? The query works fine for non-numeric subfields.
#Ibrahim,
You have an error in your code. If you use a number (or words containing spaces) as a key in a JSON document in MySQL, you'll need to double-quote it in the path expression.
Therefore, the correct MySQL statement in your case is:
select sent->'$.data."12"' FROM mytable;
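If you also want the value without the surrounding JSON quotes, the ->> operator (shorthand for JSON_UNQUOTE(JSON_EXTRACT(...)), available since MySQL 5.7.13) should work with the same quoted key:
select sent->>'$.data."12"' FROM mytable;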
Thanks,
#JeffreyKilelo
I have some JSON data stored in a JSON (not JSONB) column in my postgresql database (9.4.1). Some of these JSON structures contain unicode sequences in their attribute values. For example:
{"client_id": 1, "device_name": "FooBar\ufffd\u0000\ufffd\u000f\ufffd" }
When I try to query this JSON column (even if I'm not directly trying to access the device_name attribute), I get the following error:
ERROR: unsupported Unicode escape sequence
Detail: \u0000 cannot be converted to text.
You can recreate this error by executing the following command on a postgresql server:
select '{"client_id": 1, "device_name": "FooBar\ufffd\u0000\ufffd\u000f\ufffd" }'::json->>'client_id'
The error makes sense to me - there is simply no way to represent the unicode sequence NULL in a textual result.
Is there any way for me to query the same JSON data without having to perform "sanitation" on the incoming data? These JSON structures change regularly so scanning a specific attribute (device_name in this case) would not be a good solution since there could easily be other attributes that might hold similar data.
After some more investigations, it seems that this behavior is new for version 9.4.1 as mentioned in the changelog:
...Therefore \u0000 will now also be rejected in json values when conversion to de-escaped form is required. This change does not break the ability to store \u0000 in json columns so long as no processing is done on the values...
Was this really the intention? Is a downgrade to pre 9.4.1 a viable option here?
As a side note, this property is taken from the name of the client's mobile device - it's the user that entered this text into the device. How on earth did a user insert NULL and REPLACEMENT CHARACTER values?!
\u0000 is the one Unicode code point which is not valid in a string. I see no other way than to sanitize the string.
Since json is just a string in a specific format, you can use the standard string functions, without worrying about the JSON structure. A one-line sanitizer to remove the code point would be:
SELECT (regexp_replace(the_string::text, '\\u0000', '', 'g'))::json;
But you can also insert any character of your liking, which would be useful if the zero code point is used as some form of delimiter.
Note also the subtle difference between what is stored in the database and how it is presented to the user. You can store the code point in a JSON string, but you have to pre-process it to some other character before processing the value as a json data type.
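Applied against a whole table rather than a literal, the same one-liner can be used inline. A sketch, assuming a hypothetical table my_table with the json column the_string (device_name is the attribute from the question):
SELECT (regexp_replace(the_string::text, '\\u0000', '', 'g'))::json ->> 'device_name' AS device_name
FROM my_table;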
The solution by Patrick didn't work out of the box for me. Regardless, an error was always thrown. I then researched a little more and was able to write a small custom function that fixed the issue for me.
First I could reproduce the error by writing:
select json '{ "a": "null \u0000 escape" }' ->> 'a' as fails
Then I added a custom function which I used in my query:
CREATE OR REPLACE FUNCTION null_if_invalid_string(json_input JSON, record_id UUID)
RETURNS JSON AS $$
DECLARE
    json_value JSON DEFAULT NULL;
BEGIN
    BEGIN
        -- Probe the value; de-escaping a string containing \u0000 raises an error here.
        -- ('location' is the key used in my schema; adjust it to the key you need to check.)
        json_value := json_input ->> 'location';
    EXCEPTION WHEN OTHERS THEN
        -- Log which record was invalid and swallow the error.
        RAISE NOTICE 'Invalid json value: "%". Returning NULL.', record_id;
        RETURN NULL;
    END;
    RETURN json_input;
END;
$$ LANGUAGE plpgsql;
To call the function, do this (you should not receive an error):
select null_if_invalid_string('{ "a": "null \u0000 escape" }', id) from my_table
Whereas this should return the json as expected:
select null_if_invalid_string('{ "a": "null" }', id) from my_table
You can fix all entries with SQL like this:
update ___MY_TABLE___
set settings = REPLACE(settings::text, '\u0000', '' )::json
where settings::text like '%\u0000%'
I found a solution that works for me:
SELECT (regexp_replace(the_string::text, '(?<!\\)\\u0000', '', 'g'))::json;
Note the match pattern '(?<!\\)\\u0000': the negative lookbehind leaves alone any \u0000 that is preceded by another backslash (i.e. an escaped backslash followed by a literal u0000, which is not a real escape sequence).
Just for web searchers who end up here:
This is not a solution to the exact question, but in some similar cases it is the solution, if you just don't want the datasets containing null bytes in your json. Just add:
AND json NOT LIKE '%\u0000%'
in your WHERE statement.
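For example (a sketch; my_table and its json column payload are placeholder names, and the doubled backslash is the LIKE escape for a literal backslash):
SELECT *
FROM my_table
WHERE payload::text NOT LIKE '%\\u0000%';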
You could also use SQL's REPLACE function to sanitize the data:
REPLACE(source_field, '\u0000', '' );
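For example, used at query time (a sketch; my_table and the text column source_field are placeholder names):
SELECT REPLACE(source_field, '\u0000', '')::json AS cleaned
FROM my_table;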