postgres 9.3 json array of strings to text - json

Using PostgreSQL 9.3, the json_array_elements function returns each string element in an array as a json string.
select value from json_array_elements('["a", "b"]');
value
-------
"a"
"b"
I would like to convert these to regular Postgres TEXT values but I'm at a loss. I tried value::TEXT but they are still double quoted, i.e. json strings.

As simple as:
select value from json_array_elements_text('["a", "b"]');
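Note that json_array_elements_text() only exists from PostgreSQL 9.4 onward. On 9.3 itself, one possible workaround (a sketch using the same re-wrapping trick as the containment answer further down) is:
-- wrap each json element in a single-element array, then pull it back out as text with ->>
select array_to_json(array[value])->>0 as value
from json_array_elements('["a", "b"]');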

I think you want this.
select REPLACE(value::TEXT,'"','') from json_array_elements('["a", "b"]');

Related

SQL regex to check if within JSON array there exists at least one item

I have a table in MySQL 5.6 with a text-type field "custom_info" that stores JSON as a string. The JSON has a "clients" field containing an array.
In order to select records where "clients" is empty, I use this query:
select custom_info from users where custom_info like '%"clients":[]%'
How can I retrieve records where the "clients" array has at least one element?
MySQL natively supports JSON as of version 5.7. Instead of storing it as a string literal, I recommend storing it in a JSON data type column. Doing so opens up several native functions like JSON_CONTAINS, JSON_EXTRACT, JSON_ARRAY, and JSON_OBJECT.
You could then use JSON_CONTAINS or JSON_EXTRACT to evaluate your results; something like:
SELECT *
FROM users
WHERE JSON_CONTAINS(custom_info, '"some_value_youd_expect_to_be_here"', '$.clients')
Or:
SELECT *
FROM users
WHERE JSON_EXTRACT(custom_info, '$.clients[0]') IS NOT NULL
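Along the same lines, JSON_LENGTH (not one of the functions named above; a hedged sketch assuming MySQL 5.7+ and the question's users table and custom_info column) answers the "at least one element" question directly:
-- JSON_LENGTH counts the elements of the array at the given path
SELECT custom_info
FROM users
WHERE JSON_LENGTH(custom_info, '$.clients') > 0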
Since you want to return only entries that do not have a ] immediately after "clients":[, you may use
where custom_info REGEXP '"clients":\\[[^]]'
The \\[ is actually a \[ pattern matching a literal [ char, and [^]] is a negated bracket expression matching any char but a ].
In case you want to keep your field as a text type, you could use _%, which will match at least one character:
select custom_info from users where custom_info like '%"clients":[_%]%'

MySQL 5.7 - Query to set the value of a JSON key to a JSON Object

Using MySQL 5.7, how can I set the value of a JSON key in a JSON column to a JSON object rather than a string?
I used this query:
SELECT json_set(profile, '$.twitter', '{"key1":"val1", "key2":"val2"}')
from account WHERE id=2
Output:
{"twitter": "{\"key1\":\"val1\", \"key2\":\"val2\"}", "facebook": "value", "googleplus": "google_val"}
But it seems like it considers it as a string since the output escapes the JSON characters in it. Is it possible to do that without using JSON_OBJECT()?
There are a couple of options that I know of:
Use the JSON_UNQUOTE function to unquote the output (i.e. not have it cast to a string), as documented here
Possibly use the ->> operator and select a specific path, as documented here
It has a lot of implications, but you could disable backslashes as an escape character. I haven't tried this, so I don't even know if it works, but it's mentioned in the docs
On balance, I'd either use the ->> operator, or handle the conversion on the client side, depending on what you want to do.
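For example, reading the nested value back out without the escaping could look like this (a sketch against the question's account table and profile column; the ->> operator needs MySQL 5.7.13 or later):
-- option 1: extract, then unquote explicitly
SELECT JSON_UNQUOTE(JSON_EXTRACT(profile, '$.twitter')) FROM account WHERE id = 2;
-- option 2: ->> is shorthand for JSON_UNQUOTE(JSON_EXTRACT(...))
SELECT profile->>'$.twitter' FROM account WHERE id = 2;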

Query Postgres 9.3 JSON to check if array contains a string?

A line in my JSON column looks something like this:
{"general": {
"somekey": "somevalue",
"tags": ["first_tag", "second_tag", "third_tag"]}}
And I need to return lines with tags list that contains certain tag (e.g. "first_tag"). Is there a way to do this in PostgreSQL 9.3?
Assuming that the table is called t and the column is called x:
SELECT *
FROM t
WHERE exists(
  SELECT 1
  FROM json_array_elements(x #> '{general,tags}')
  WHERE array_to_json(array[value])->>0 = 'first_tag'
);
This does not use jsonb or other newer stuff so it should work on 9.3. See also sqlfiddle.
The idea is to use the json_array_elements function, which converts a json array into a sql table. That table has a single column called value of type json.
In this case, we use json_array_elements in a subquery to iterate through the list of all tags. One complication is that value (which here represents a tag) is of type json, so we have to convert it to text. Unfortunately, PostgreSQL 9.3 doesn't have a function to do that directly, so we have to convert it into a single-element array and then extract that single element as text. Then we can compare that text with the tag we are looking for (first_tag in the above example). The rows returned by the subquery are not themselves significant; we are only interested in whether the subquery returns anything at all.
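To see just that conversion trick in isolation (a minimal sketch, not part of the original answer):
-- the json string "first_tag" comes back as the plain text value first_tag
select array_to_json(array['"first_tag"'::json])->>0;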
You can use:
CREATE OR REPLACE FUNCTION fn_json_array_contains(a_json json, a_e anyelement)
RETURNS BOOLEAN AS $BODY$
BEGIN
  RETURN to_json(a_e)::TEXT IN (SELECT value::TEXT FROM json_array_elements(a_json) e);
END;
$BODY$ LANGUAGE plpgsql IMMUTABLE;
SELECT * FROM t WHERE fn_json_array_contains(x#>'{general,tags}', 'first_tag');

postgres 9.4 and unicode in json

I've ended up with some json data imported into the database which contains unicode escapes in the json key. And I can't seem to find a way to address the data. The simplest example is:
select '{"test\u0007":123}'::json->'test\u0007'
Instead of getting 123 back, I get NULL. Can anyone help?
The operator takes json and text operands. In text, per the SQL spec, backslash escapes have no significance.
If you want to match that key, you'll need to insert the unescaped text literally in the string, or use a PostgreSQL extension, the E'' string, e.g.:
regress=> select '{"test\u0007":123}'::json ->> E'test\u0007';
?column?
----------
123
(1 row)
The json gets compared in its decoded form, which is why the original didn't work.
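The same point applies to the question's original -> form; it is the right-hand text literal that has to contain the real character (a sketch):
-- returns the json value 123 rather than NULL
select '{"test\u0007":123}'::json -> E'test\u0007';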

Get data type of JSON field in Postgres

I have a Postgres JSON column where some rows have data like:
{"value":90}
{"value":99.9}
...whereas other rows have data like:
{"value":"A"}
{"value":"B"}
The -> operator (i.e. fields->'value') returns the value as json, whereas the ->> operator (i.e. fields->>'value') returns it as text, as reported by pg_typeof. Is there a way to find the "actual" data type of a JSON field?
My current approach would be to use a regex to determine whether the occurrence of fields->>'value' in fields::text is surrounded by double quotes.
Is there a better way?
As #pozs mentioned in a comment, from version 9.4 the json_typeof(json) and jsonb_typeof(jsonb) functions are available:
Returns the type of the outermost JSON value as a text string. Possible types are object, array, string, number, boolean, and null.
https://www.postgresql.org/docs/current/functions-json.html
Applied to your case, here is an example of how this could be used:
SELECT
  json_data.key,
  jsonb_typeof(json_data.value) AS json_data_type,
  COUNT(*) AS occurrences
FROM tablename, jsonb_each(tablename.columnname) AS json_data
GROUP BY 1, 2
ORDER BY 1, 2;
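For classifying just the one value key, the function can also be applied directly (a sketch reusing the fields column from the question and the tablename placeholder from above):
-- returns 'number' for {"value":90} and 'string' for {"value":"A"}
SELECT json_typeof(fields->'value') FROM tablename;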
I ended up getting access to PLv8 in my environment, which made this easy:
CREATE FUNCTION value_type(fields JSON) RETURNS TEXT AS $$
return typeof fields.value;
$$ LANGUAGE plv8;
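A usage sketch, again assuming the fields column and the tablename placeholder:
SELECT value_type(fields) FROM tablename;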
As mentioned in the comments, there will be a native function for this in 9.4.