I've ended up with some JSON data imported into the database which contains Unicode escapes in the JSON keys, and I can't seem to find a way to address the data. The simplest example is:
select '{"test\u0007":123}'::json->'test\u0007'
Instead of getting 123 back, I get NULL. Can anyone help?
The operator takes json and text operands. In text, per the SQL spec, backslash escapes have no significance.
If you want to match that key, you'll need to embed the unescaped text literally in the string, or use a PostgreSQL extension, the E'' escape string, e.g.:
regress=> select '{"test\u0007":123}'::json ->> E'test\u0007';
?column?
----------
123
(1 row)
The json gets compared in its decoded form, which is why the original didn't work.
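If you'd rather avoid the E'' extension, two equivalent sketches: build the key with chr(), or use the SQL-standard Unicode string syntax (both assume the same sample JSON):

-- build the key with chr(7), the BEL character
select '{"test\u0007":123}'::json ->> ('test' || chr(7));

-- or use a SQL-standard Unicode escape string
select '{"test\u0007":123}'::json ->> U&'test\0007';

Both should return 123 as well.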
I have one column (varchar) containing only a JSON string within one table. I want to replace all keys with "" (empty string) in that column. How can I do that using SQL? My database is MySQL.
For example:
|--------------------------------------------------------------------|
| t_column |
|--------------------------------------------------------------------|
| {"name":"mike","email":"xxx#example.com","isManage":false,"age":22}|
|--------------------------------------------------------------------|
SELECT replace(t_column, regexp, "") FROM t_table
I expect:
mikexxx#example.comfalse22
How to write that regexp?
Start from
select t_column->'$.*' from test
This will return a JSON array of attribute values:
[22, "mike", "xxx#example.com", false]
This might already be all you need, and you can try something like
select *
from test
where t_column->'$.*' like '%mike%';
Unfortunately, there seems to be no native way to join array values into a single string (something like a hypothetical JSON_ARRAY_CONCAT()). In MySQL 8.0 you can try REGEXP_REPLACE() and strip all JSON punctuation:
select regexp_replace(t_column->'$.*', '[" ,\\[\\]]', '') from test
which will return '22mikexxx#example.comfalse'.
If the values can contain one of those characters, they will also be removed.
Note: this isn't very reliable, but it's all I can do in a "simple" way.
See demo on db-fiddle.
I may be making it too simplistic, but this is just a mock-up based on your comment; I can formalize it into a query if it fits your requirement.
Say you get your JSON string into this format: replace all the double quotes and curly brackets, then add a comma at the end. After playing with REPLACE and CONCAT_WS, you are left with:
name:mike,email:xxx#example.com,isManage:false,age:22,
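A minimal sketch of that preprocessing (table and column names taken from the question; the exact REPLACE chain is my assumption, with CONCAT standing in for the CONCAT_WS experiments):

SELECT CONCAT(
  REPLACE(REPLACE(REPLACE(t_column, '"', ''), '{', ''), '}', ''),
  ','
) AS flattened
FROM t_table;
-- → name:mike,email:xxx#example.com,isManage:false,age:22,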
With this format, every value is now preceded by a colon and followed by a comma, which is not true for the keys. Say you now want to see whether this JSON string contains the value "mike". You could achieve this using
select * from your_table where json_col like '%:mike,%';
If you really want to solve the problem with your approach, then the question becomes:
What is the regex that selects all the undesired text from the string {"name":"mike","email":"xxx#example.com","isManage":false,"age":22} ?
Then the answer would be: {\"name\":\"|\"email\":\"|\",\"isManage\":|,\"age\":|}
But as others have pointed out, I would actually approach the problem by parsing the JSON. Look up the functions json_value and json_query.
Hope this helps.
PS: Pay close attention to how I structured the bolded sentence; any difference changes the problem.
EDIT:
If you want a more generic expression, something like "select all the text that is not a value in a JSON-formatted string", you can use this one:
{|",|"\w+\":|"|,|}
Using MySQL 5.7, how do I set the value of a JSON key in a JSON column to a JSON object rather than a string?
I used this query:
SELECT json_set(profile, '$.twitter', '{"key1":"val1", "key2":"val2"}')
from account WHERE id=2
Output:
{"twitter": "{\"key1\":\"val1\", \"key2\":\"val2\"}", "facebook": "value", "googleplus": "google_val"}
But it seems like it considers it as a string since the output escapes the JSON characters in it. Is it possible to do that without using JSON_OBJECT()?
There are a couple of options that I know of:
Use the JSON_UNQUOTE function to unquote the output (i.e. not cast it to string), as documented here
Possibly use the ->> operator and select a specific path, documented here
This has a lot of implications, but you could disable backslash as an escape character. I haven't tried this, so I don't even know if it works, but it's mentioned in the docs
On balance, I'd either use the ->> operator, or handle the conversion on the client side, depending on what you want to do.
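A further option, sketched under the assumption that CAST(... AS JSON) behaves as documented for MySQL 5.7: casting the literal marks it as a JSON document, so json_set stores an object rather than a quoted string:

SELECT json_set(profile, '$.twitter',
                CAST('{"key1":"val1", "key2":"val2"}' AS JSON))
FROM account WHERE id = 2;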
Using PostgreSQL 9.3, the json_array_elements function returns each string element in an array as json strings.
select value from json_array_elements('["a", "b"]');
value
-------
"a"
"b"
I would like to convert these to regular Postgres TEXT values but I'm at a loss. I tried value::TEXT but they are still double quoted, i.e. json strings.
As simple as (note that json_array_elements_text() was added in PostgreSQL 9.4):
select value from json_array_elements_text('["a", "b"]');
I think you want this:
select REPLACE(value::TEXT,'"','') from json_array_elements('["a", "b"]');
I have some JSON data stored in a JSON (not JSONB) column in my postgresql database (9.4.1). Some of these JSON structures contain unicode sequences in their attribute values. For example:
{"client_id": 1, "device_name": "FooBar\ufffd\u0000\ufffd\u000f\ufffd" }
When I try to query this JSON column (even if I'm not directly trying to access the device_name attribute), I get the following error:
ERROR: unsupported Unicode escape sequence
Detail: \u0000 cannot be converted to text.
You can recreate this error by executing the following command on a postgresql server:
select '{"client_id": 1, "device_name": "FooBar\ufffd\u0000\ufffd\u000f\ufffd" }'::json->>'client_id'
The error makes sense to me: there is simply no way to represent the NULL character (\u0000) in PostgreSQL's text type.
Is there any way for me to query the same JSON data without having to perform "sanitation" on the incoming data? These JSON structures change regularly so scanning a specific attribute (device_name in this case) would not be a good solution since there could easily be other attributes that might hold similar data.
After some more investigations, it seems that this behavior is new for version 9.4.1 as mentioned in the changelog:
...Therefore \u0000 will now also be rejected in json values when conversion to de-escaped form is required. This change does not break the ability to store \u0000 in json columns so long as no processing is done on the values...
Was this really the intention? Is a downgrade to pre-9.4.1 a viable option here?
As a side note, this property is taken from the name of the client's mobile device - it's the user that entered this text into the device. How on earth did a user insert NULL and REPLACEMENT CHARACTER values?!
\u0000 is the one Unicode code point that is not valid in PostgreSQL's text type. I see no other way than to sanitize the string.
Since json is just a string in a specific format, you can use the standard string functions without worrying about the JSON structure. A one-line sanitizer to remove the code point would be:
SELECT (regexp_replace(the_string::text, '\\u0000', '', 'g'))::json;
But you can also insert any character of your liking, which would be useful if the zero code point is used as some form of delimiter.
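For example, a sketch that swaps the escape for the visible placeholder U+2400 (SYMBOL FOR NULL) instead of deleting it; the choice of placeholder is arbitrary:

SELECT (regexp_replace(the_string::text, '\\u0000', '\\u2400', 'g'))::json;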
Note also the subtle difference between what is stored in the database and how it is presented to the user. You can store the code point in a JSON string, but you have to pre-process it to some other character before processing the value as a json data type.
The solution by Patrick didn't work out of the box for me; an error was still thrown regardless. I then researched a little more and was able to write a small custom function that fixed the issue for me.
First I could reproduce the error by writing:
select json '{ "a": "null \u0000 escape" }' ->> 'a' as fails
Then I added a custom function which I used in my query:
CREATE OR REPLACE FUNCTION null_if_invalid_string(json_input JSON, record_id UUID)
RETURNS JSON AS $$
DECLARE
    json_value JSON DEFAULT NULL;
BEGIN
    BEGIN
        -- probe a key ('location' in my data) to force de-escaping;
        -- this raises an error if the value contains \u0000
        json_value := json_input ->> 'location';
    EXCEPTION WHEN OTHERS THEN
        RAISE NOTICE 'Invalid json value: "%". Returning NULL.', record_id;
        RETURN NULL;
    END;
    RETURN json_input;
END;
$$ LANGUAGE plpgsql;
To call the function, do the following; you should not receive an error:
select null_if_invalid_string('{ "a": "null \u0000 escape" }', id) from my_table
Whereas this should return the json as expected:
select null_if_invalid_string('{ "a": "null" }', id) from my_table
You can fix all entries with SQL like this:
update ___MY_TABLE___
set settings = REPLACE(settings::text, '\u0000', '' )::json
where settings::text like '%\u0000%'
I found a solution that works for me:
SELECT (regexp_replace(the_string::text, '(?<!\\)\\u0000', '', 'g'))::json;
Note the match pattern '(?<!\\)\\u0000': the negative lookbehind skips occurrences where the backslash is itself escaped, i.e. a literal backslash followed by the text u0000 rather than a real Unicode escape.
Just for web searchers who end up here:
This is not a solution to the exact question, but in some similar cases it is the solution, if you simply don't want the rows whose JSON contains null bytes. Just add:
AND json NOT LIKE '%\u0000%'
to your WHERE clause.
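In context, a minimal sketch (my_table and json_col are hypothetical names; note the ::text cast, since LIKE operates on text, and the doubled backslash, since \ is LIKE's default escape character):

SELECT *
FROM my_table
WHERE json_col::text NOT LIKE '%\\u0000%';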
You could also use the REPLACE SQL-syntax to sanitize the data:
REPLACE(source_field, '\u0000', '' );
I have a Postgres JSON column where some rows have data like:
{"value":90}
{"value":99.9}
...whereas other rows have data like:
{"value":"A"}
{"value":"B"}
The -> operator (i.e. fields->'value') returns the value as json, whereas the ->> operator (i.e. fields->>'value') returns it as text, as reported by pg_typeof. Is there a way to find the "actual" data type of a JSON field?
My current approach would be to use Regex to determine whether the occurrence of fields->>'value' in fields::text is surrounded by double quotes.
Is there a better way?
As @pozs mentioned in a comment, version 9.4 introduced the json_typeof(json) and jsonb_typeof(jsonb) functions:
Returns the type of the outermost JSON value as a text string. Possible types are object, array, string, number, boolean, and null.
https://www.postgresql.org/docs/current/functions-json.html
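For a single value, the type can be checked directly:

SELECT json_typeof('{"value":90}'::json -> 'value');   -- number
SELECT json_typeof('{"value":"A"}'::json -> 'value');  -- string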
Applying this to your case, here is an example of how it could be used to survey a whole column:
SELECT
json_data.key,
jsonb_typeof(json_data.value) AS json_data_type,
COUNT(*) AS occurrences
FROM tablename, jsonb_each(tablename.columnname) AS json_data
GROUP BY 1, 2
ORDER BY 1, 2;
I ended up getting access to PLv8 in my environment, which made this easy:
CREATE FUNCTION value_type(fields JSON) RETURNS TEXT AS $$
return typeof fields.value;
$$ LANGUAGE plv8;
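Usage, with the data shapes from the question:

SELECT value_type('{"value":90}'::json);   -- number
SELECT value_type('{"value":"A"}'::json);  -- string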
As mentioned in the comments, there will be a native function for this in 9.4.