I updated a few fields like this:
UPDATE designs
SET prices = '{ "at": 507, "ch": 751, "de": 447 }'
WHERE prices IS NULL;
Now I want to find all those rows:
SELECT * FROM designs
WHERE prices = '{ "at": 507, "ch": 751, "de": 447 }';
But I get this error:
ERROR: operator does not exist: json = unknown
Variations like WHERE prices LIKE '%"at": 507, "ch": 751, "de": 447%' don't work either.
The prices column is of type json, and the PostgreSQL version in use is 9.3.
There is jsonb in Postgres 9.4, which has an equality operator. This data type effectively ignores insignificant white space (and some other insignificant details), so your query would work as is:
SELECT *
FROM designs
WHERE prices = '{ "at": 507, "ch": 751, "de": 447 }';
The same is not possible with json, which preserves insignificant white space, so "equality" between two json values is hard to establish. You could compare text representations, but that's not reliable. See: How to query a json column for empty objects?
Using pg 9.4 once more, you could also make this work with a json column, by casting the value to jsonb on the fly:
SELECT *
FROM designs
WHERE prices::jsonb = '{ "at": 507, "ch": 751, "de": 447 }'::jsonb;
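If you run this equality check frequently, an expression index on the cast can help. This is only a sketch, assuming Postgres 9.4+ and the designs/prices names from the question:
-- btree expression index on the cast; the query above (prices::jsonb = ...) can then use it
CREATE INDEX designs_prices_jsonb_idx ON designs ((prices::jsonb));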
Unfortunately, the = operator is not defined for the json type. If you really want to do this, your only option is to cast to TEXT, but I'm sure you understand the potential problems with that approach, e.g.,
SELECT * FROM designs WHERE prices::TEXT = '{ "x": 3 }';
However, it just occurred to me that a safe approach to that would be:
SELECT * FROM designs WHERE prices::TEXT = '{ "x": 3 }'::JSON::TEXT;
Nope, this doesn't work. Apparently, the JSON data type preserves the whitespace of the original JSON, so if the whitespace in the two strings is different, it won't work. (I regard this as a bug, but others might disagree.)
My answer is correct for 9.3, which the questioner is using, but if you are using 9.4+, Erwin Brandstetter's answer is the better choice.
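For readers stuck on 9.3, a whitespace-safe alternative is to compare the individual keys with the ->> operator, which is defined for json since 9.3. This is only a sketch: it checks just the three keys from the question rather than full-document equality, so rows with extra keys would still match:
SELECT *
FROM designs
WHERE prices ->> 'at' = '507'
  AND prices ->> 'ch' = '751'
  AND prices ->> 'de' = '447';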
I have a table like this (with a jsonb column):
https://dbfiddle.uk/lGuHdHEJ
If I load this json with python into a dataframe:
import pandas as pd
import json

data = {
    "id": [1, 2],
    "myyear": [2016, 2017],
    "value": [5, 9]
}
data = json.dumps(data)
df = pd.read_json(data)
print(df)
I get this result:
id myyear value
0 1 2016 5
1 2 2017 9
How can I get this result directly from the json column via SQL in a Postgres view?
Note: This assumes that your id, myyear, and value arrays are consistent and have the same length.
This answer uses PostgreSQL's jsonb_array_elements_text function to expand the array elements into rows.
select jsonb_array_elements_text(payload -> 'id') as "id",
       jsonb_array_elements_text(payload -> 'bv_year') as "myyear",
       jsonb_array_elements_text(payload -> 'value') as "value"
from main;
This gives the following output:
id myyear value
1 2016 5
2 2017 9
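The answer above puts three set-returning functions in the SELECT list, which (in Postgres 10 and later) step through the arrays in lockstep. If you prefer to make the element pairing explicit, WITH ORDINALITY lets you join the arrays by position instead. This is only a sketch, assuming a table main(payload jsonb) as in the fiddle and keys named id, myyear and value (the query above reads bv_year; use whatever key your payload actually contains):
-- pair array elements by their position (ord) rather than relying on lockstep expansion
SELECT i.elem AS "id",
       y.elem AS "myyear",
       v.elem AS "value"
FROM main
CROSS JOIN LATERAL jsonb_array_elements_text(payload -> 'id')     WITH ORDINALITY AS i(elem, ord)
CROSS JOIN LATERAL jsonb_array_elements_text(payload -> 'myyear') WITH ORDINALITY AS y(elem, ord)
CROSS JOIN LATERAL jsonb_array_elements_text(payload -> 'value')  WITH ORDINALITY AS v(elem, ord)
WHERE y.ord = i.ord
  AND v.ord = i.ord;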
That said, storing the properties this way in a jsonb object is not the best design and could lead to data inconsistencies later. If it's within your control, I would suggest storing the data so that each property's mapping is explicit. Some suggestions:
You can instead have separate columns for each property.
If you want to keep it as jsonb, consider [{"id": "", "year": "", "value": ""}] instead (see the sketch below).
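With that restructured shape, each row of the result comes from a single object, so the query no longer depends on positional alignment at all. Again only a sketch, assuming the same main(payload jsonb) table now holding [{"id": 1, "year": 2016, "value": 5}, ...]:
-- one output row per object in the array
SELECT elem ->> 'id'    AS "id",
       elem ->> 'year'  AS "myyear",
       elem ->> 'value' AS "value"
FROM main,
     jsonb_array_elements(payload) AS elem;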
I have found that sometimes a jsonb object:
{"a": 1, "b": 2}
will get re-encoded and stored as a jsonb string:
"{\"a\": 1, \"b\": 2}"
Is there a way to write a function that will reparse the string when the input is not a jsonb object?
The #>> operator (Extracts JSON sub-object at the specified path as text) does the job:
select ('"{\"a\": 1, \"b\": 2}"'::jsonb #>> '{}')::jsonb
This operator behavior is not officially documented. It appears to be a side effect of its underlying function. Oddly enough, its twin operator #> doesn't work that way, though it would be even more logical. It's probably worth asking Postgres developers to solve this, preferably by adding a new decoding function. While waiting for a system solution, you can define a simple SQL function to make queries clearer in cases where the problem occurs frequently.
create or replace function jsonb_unescape(text)
returns jsonb language sql immutable as $$
select ($1::jsonb #>> '{}')::jsonb
$$;
Note that the function works well both on escaped and plain strings:
with my_data(str) as (
values
('{"a": 1, "b": 2}'),
('"{\"a\": 1, \"b\": 2}"')
)
select str, jsonb_unescape(str)
from my_data;
str | jsonb_unescape
------------------------+------------------
{"a": 1, "b": 2} | {"a": 1, "b": 2}
"{\"a\": 1, \"b\": 2}" | {"a": 1, "b": 2}
(2 rows)
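If you want to repair double-encoded rows in place rather than decode them at query time, the same trick can drive an UPDATE. A minimal sketch, assuming a hypothetical table my_table with a jsonb column doc; jsonb_typeof picks out the rows whose top-level value is a string:
-- re-parse only the rows that are stored as a JSON string
UPDATE my_table
SET    doc = (doc #>> '{}')::jsonb
WHERE  jsonb_typeof(doc) = 'string';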
I have a very large table (over 4M records; I work with MySQL), and it has a lot of records containing this string: \\"
I'm trying to export this table to MongoDB, but when I import the JSON file MongoDB throws this error:
Failed: error processing document #18: invalid character 't' after object key:value pair
This is my query:
MySQL
SELECT json_object(
"id", id,
"execution_id", execution_id,
"type", type,
"info", info,
"position", position,
"created_at", json_object("$date", DATE_FORMAT(created_at,'%Y-%m-%dT%TZ')),
"updated_at", json_object("$date", DATE_FORMAT(updated_at,'%Y-%m-%dT%TZ'))
)as 'json'
FROM myTable
INTO OUTFILE 'myPath';
I know the problem is the string. My question is: how can I change this particular string to \"? Changing it manually is not an option, and my knowledge of SQL is limited. Please help, and thank you for reading.
The column that has this character is "info"; here is an example:
{
"id": 30,
"execution_id": 2,
"type": "PHASE",
"info": "{ \\r\\n \\"title\\": \\"Phase\\",
\\r\\n \\"order\\": \\"1\\",
\\r\\n \\"description\\": \\"Example Phase 1\\",
\\r\\n \\"step\\": \\"end\\",
\\r\\n \\"status\\": \\"True\\"\\r\\n}",
"position": 24,
"created_at": {"$date": "2018-01-11T15:01:46Z"},
"updated_at": {"$date": "2018-01-11T15:01:46Z"}
}
You should be able to do this using the MySQL REPLACE() function.
The backslash is a bit of a special case in the MySQL REPLACE() function, so you will need to use \\ to represent each literal \. Thus, to replace \\ with \, you need to run something like this:
REPLACE(info,'\\\\','\\')
Your full query would look something like this:
SELECT json_object(
"id", id,
"execution_id", execution_id,
"type", type,
"info", REPLACE(info,'\\\\','\\'),
"position", position,
"created_at", json_object("$date", DATE_FORMAT(created_at,'%Y-%m-%dT%TZ')),
"updated_at", json_object("$date", DATE_FORMAT(updated_at,'%Y-%m-%dT%TZ'))
)as 'json'
FROM myTable
INTO OUTFILE 'myPath';
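Before running the full 4M-row export, it may be worth checking the replacement on a few rows first. A quick sketch, reusing the table name from the question:
-- compare the original and the fixed value side by side
SELECT info,
       REPLACE(info, '\\\\', '\\') AS info_fixed
FROM myTable
LIMIT 10;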
I have a NiFi flow that takes JSON files and evaluates a JSON Path argument against them. It works perfectly except when dealing with records that contain Korean text. The Jayway JSONPath evaluator does not seem to recognize the escape "\" in the headline field before the double quote character. Here is an example (newlines added to help with formatting):
{"data": {"body": "[이데일리 김관용 기자] 우리 군이 2018 남북정상회담을 앞두고 남
북간 군사적 긴장\r\n완화와 평화로운 회담 분위기 조성을 위해 23일 0시를 기해 군사분계선
(MDL)\r\n일대에서의 대북확성기 방송을 중단했다.\r\n\r\n국방부는 이날 남북정상회담 계기
대북확성기 방송 중단 관련 내용을 발표하며\r\n“이번 조치가 남북간 상호 비방과 선전활동을
중단하고 ‘평화, 새로운 시작’을\r\n만들어 나가는 성과로 이어지기를 기대한다”고 밝혔
다.\r\n\r\n전방부대 우리 군 장병이 대북확성기 방송을 위한 장비를 점검하고 있다.\r\n[사
진=국방부공동취재단]\r\n\r\n\r\n\r\n▶ 당신의 생활 속 언제 어디서나 이데일리 \r\n▶
스마트 경제종합방송 ‘이데일리 TV’ | 모바일 투자정보 ‘투자플러스’\r\n▶ 실시간 뉴스와
속보 ‘모바일 뉴스 앱’ | 모바일 주식 매매 ‘MP트래블러Ⅱ’\r\n▶ 전문가를 위한 국내 최상의
금융정보단말기 ‘이데일리 마켓포인트 3.0’ | ‘이데일리 본드웹 2.0’\r\n▶ 증권전문가방송
‘이데일리 ON’ 1666-2200 | ‘ON스탁론’ 1599-2203\n<ⓒ종합 경제정보 미디어 이데일리 -
무단전재 & 재배포 금지> \r\n",
"mimeType": "text/plain",
"language": "ko",
"headline": "국방부 \"軍 대북확성기 방송, 23일 0시부터 중단\"",
"id": "EDYM00251_1804232beO/5WAUgdlYbHS853hYOGrIL+Tj7oUjwSYwT"}}
If this object is in my file, the JSON path evaluates to blanks for all the path arguments. Is there a way to force the Jayway engine to recognize the "\"? It appears to function correctly in other languages.
I was able to resolve this after understanding the difference between definite and indefinite paths. The Jayway GitHub README points out that the following will make a path indefinite and return a list:
When evaluating a path you need to understand the concept of when a path is definite. A path is indefinite if it contains:
.. - a deep scan operator
?(<expression>) - an expression
[<number>, <number> (, <number>)] - multiple array indexes
Indefinite paths always returns a list (as represented by current JsonProvider).
My JSON looked like the following:
{
"Version":"14",
"Items":[
{"data": {"body": "[이데일리 ... \r\n",
"mimeType": "text/plain",
"language": "ko",
"headline": "국방부 \"軍 ... 중단\"",
"id": "1"}
},
{"data": {"body": "[이데일리 ... \r\n",
"mimeType": "text/plain",
"language": "ko",
"headline": "국방부 \"軍 ... 중단\"",
"id": "2"}
...
}
]
}
The JSON path selector I was using ($.data.headline) did not grab the values as I expected; it instead returned null values.
Changing it to $.Items[*].data.headline or $..data.headline returns a list of each headline.
I'm using PostgreSQL jsonb and have the following in my database record:
{"tags": "[\"apple\",\" orange\",\" pineapple\",\" fruits\"]",
"filename": "testname.jpg", "title_en": "d1", "title_ja": "1",
"description_en": "d1", "description_ja": "1"}
Both SELECT statements below retrieved no results:
SELECT "photo"."id", "photo"."datadoc", "photo"."created_timestamp","photo"."modified_timestamp"
FROM "photo"
WHERE datadoc @> '{"tags": ["apple"]}';
SELECT "photo"."id", "photo"."datadoc", "photo"."created_timestamp", "photo"."modified_timestamp"
FROM "photo"
WHERE datadoc -> 'tags' ? 'apple';
I wonder whether it is because of the extra backslashes added to the JSON array string, or whether the SELECT statement is incorrect.
I'm running "PostgreSQL 10.1, compiled by Visual C++ build 1800, 64-bit" on Windows 10.
As far as any JSON parser is concerned, the value of your tags key is a string, not an array.
"tags": "[\"apple\",\" orange\",\" pineapple\",\" fruits\"]"
The string itself happens to be another JSON document, like the common case in XML where the contents of a string happen to be an XML or HTML document.
["apple"," orange"," pineapple"," fruits"]
What you need to do is extract that string, then parse it as a new JSON object, and then query that new object.
I can't test it right now, but I think that would look something like this:
(datadoc ->> 'tags') ::jsonb ? 'apple'
That is, "extract the tags value as text, cast that text value as jsonb, then query that new jsonb value.
Hey there, I know this is a very late answer, but here is a good approach with the data I have.
Initial data in the db:
"{\"data\":{\"title\":\"test\",\"message\":\"string\",\"image\":\"string\"},\"registration_ids\":[\"s
tring\"],\"isAllUsersNotification\":false}"
To convert it to jsonb:
select (notificationData #>> '{}')::jsonb from sent_notification
result:
{"data": {"image": "string", "title": "string", "message": "string"}, "registration_ids": ["string"], "isAllUsersNotification": false}
getting a data object from json
select (notificationData #>> '{}' )::jsonb -> 'data' from sent_notification;
result:
{"image": "string", "title": "string", "message": "string"}
getting a field from the above result:
select (notificationData #>> '{}' )::jsonb -> 'data' ->>'title' from sent_notification;
result:
string
Performing WHERE operations:
Q: get records where title = 'string'
Ans:
select * from sent_notification where (notificationData #>> '{}')::jsonb -> 'data' ->> 'title' = 'string';
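If this kind of lookup is frequent, the repeated decode-and-cast can be indexed so the WHERE clause does not have to re-parse every row. A sketch only, assuming the sent_notification/notificationData names used in this answer:
-- expression index matching the expression used in the WHERE clause above
CREATE INDEX sent_notification_title_idx
    ON sent_notification (((notificationData #>> '{}')::jsonb -> 'data' ->> 'title'));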