Cannot extract element from a scalar - postgresql error - json

I'm getting this error when trying to access data in a JSON object. Does anybody know what is causing it?
This is the query:
SELECT id, data FROM cities WHERE data->'location'->>'population' = '270816'
This is the JSON object:
location": {
"population": 270816,
"type": "city"
}
Any help would be really appreciated. Thanks.

I was able to get this SELECT to work in Postgres 9.3.1. Here's an sqlfiddle which illustrates that.
Here is the DDL and INSERT statement I used in the sqlfiddle:
create table cities
(
id serial,
data json
);
insert into cities (data) VALUES
('{
"location": {
"population": 270816,
"type": "city"
}
}'
);
What version of Postgres are you using? How are you inserting the JSON? What's the DDL for your cities table?
I suspect it may be an issue with the way you are inserting the JSON data. Try inserting it similarly to the way I do in the sqlfiddle above and see if that works for you, i.e. as a pure SQL string, but one with valid JSON inside, into a column defined as json.
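For what it's worth, here is a minimal sketch of one way to provoke that exact error, assuming the JSON was accidentally double-encoded on insert. The outer double quotes make the stored value a JSON string (a scalar), so -> has nothing to descend into:
-- Hypothetical reproduction: this stores a JSON *string*, not an object.
insert into cities (data) values ('"{ \"location\": { \"population\": 270816 } }"');
-- On 9.3 with a json column, this then raises the error in the title:
select data->'location'->>'population' from cities;
If that matches how your data was loaded, re-inserting the value without the outer quotes should fix it.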

Just had what sounds like the same issue on Postgres 9.6.6: improper string escaping caused mysterious JSONB behavior. Using pgAdmin4:
CREATE TABLE json_test (json_data JSONB);
INSERT INTO json_test (json_data) VALUES ('"{\"id\": \"test1\"}"');
INSERT INTO json_test (json_data) VALUES ('{"id": "test2"}');
SELECT json_data, json_data->>'id' as id FROM json_test;
returns output in pgAdmin4 showing a baffling failure to find id test1. It turns out the pgAdmin4 display is misleading. The situation becomes clear using the text display from psql:
db=> CREATE TABLE json_test (json_data JSONB);
CREATE TABLE
db=> INSERT INTO json_test (json_data) VALUES ('"{\"id\": \"test1\"}"');
INSERT 0 1
db=> INSERT INTO json_test (json_data) VALUES ('{"id": "test2"}');
INSERT 0 1
db=> SELECT json_data, json_data->>'id' as id FROM json_test;
       json_data       |  id
-----------------------+-------
 "{\"id\": \"test1\"}" |
 {"id": "test2"}       | test2
(2 rows)
Here it is obvious that the first row was inserted as a string that merely looks like JSON, not as a nested JSON object.
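If you end up with rows like that first one, one possible cleanup (a sketch, assuming jsonb and Postgres 9.4+ for jsonb_typeof) is to unwrap the scalar and re-parse it:
-- #>> '{}' extracts the scalar's text value; the cast re-parses it as jsonb.
update json_test
set json_data = (json_data #>> '{}')::jsonb
where jsonb_typeof(json_data) = 'string';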

Related

MySQL JSON reverse search

I have a MySQL table with a column of type json. The values of this column are JSON arrays, not JSON objects. I need to find records of this table where at least one value of the json column is a substring of a given string/phrase.
Let's suppose the table looks like this:
create table if not exists test(id int, col json);
insert into test values (1, '["ab", "cd"]');
insert into test values (2, '["ef", "gh", "ij"]');
insert into test values (3, '["xyz"]');
If the input string/phrase is "acf ghi z", the second row must be returned as the result, because "gh" is a substring of the input. I read a lot about json_contains, json_extract, json_search, and even json_overlaps, but couldn't manage to solve this problem.
What is the correct sql syntax to retrieve the related rows?
MySQL version is 8.0.20
You can use json_table() to extract the JSON array as rows in a table. Then just filter:
select *
from test t cross join
json_table(t.col, '$[*]' columns (str varchar(255) path '$')) j
where 'acf ghi z' like concat('%', j.str, '%');
Here is a db<>fiddle.
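Note that the cross join produces one output row per matching array element, so a row whose array contains several matches appears more than once. A small variation (same assumptions as above, plus assuming id uniquely identifies a row) collapses the duplicates:
select *
from test
where id in (
  -- correlate each row's array elements against the input phrase
  select t.id
  from test t cross join
  json_table(t.col, '$[*]' columns (str varchar(255) path '$')) j
  where 'acf ghi z' like concat('%', j.str, '%')
);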

Import to PostgreSQL from partly csv, partly JSON

I'm running PostgreSQL 9.6, and I've got a table consisting of lots of columns.
I've got a csv-file containing the following format:
id, insert_time, JSON-object
The JSON-object has the following format:
{ column_nameX: valueX, column_nameY: valueY, ... }
The column_names in the JSON-object matches the columns in my PostgreSQL-table.
Is there a dynamic way to import such file, so I'll get the id, insert_time, and the remaining column values from the JSON object?
The order of columns in the JSON object might not match the order of the columns in the PostgreSQL table.
I am assuming you know how to get that csv file imported into postgresql and that you know the fields inside that json object field.
First we create a table to store the contents of the csv file. Notice the datatype of the JSON field is jsonb:
create table test11 (id int, insert_time timestamp, json_object jsonb )
Now you import the csv file, but for illustration purposes I will insert sample data into this table:
insert into test11 (id, insert_time, json_object) values (1, '2017-11-14'::timestamp, '{ "column_nameX": "3", "column_nameY": "4" }'::jsonb);
insert into test11 (id, insert_time, json_object) values (1, '2017-11-14'::timestamp, '{ "column_nameX": "13", "column_nameY": "14" }'::jsonb);
We now select from that table:
Select id, insert_time, json_object->>'column_nameY' as Column_NameY, json_object->>'column_nameX' as Column_NameX from test11
Your results should look like this:
id | insert_time | column_namey | column_namex
---+-------------+--------------+-------------
 1 | 11/14/2017  | 4            | 3
 1 | 11/14/2017  | 14           | 13
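Since the question asks for a dynamic mapping, one more option worth mentioning (a sketch, assuming a hypothetical target table whose column names match the JSON keys) is jsonb_populate_record, which matches keys to columns by name, so the key order inside the JSON object does not matter:
-- Hypothetical table; its column names must match the JSON keys.
create table target (column_namex int, column_namey int);
-- jsonb_populate_record fills a target-shaped record from each jsonb value.
select t.id, t.insert_time, r.*
from test11 t,
jsonb_populate_record(null::target, t.json_object) as r;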
-HTH

Using json as column name in cassandra table does not work after cassandra upgrade

Recently there was a driver upgrade for Cassandra from 2.0.4 to 3.1.0 in my project. I found that the insert statement does not work on tables that have 'json' as a column name. The same insert statement used to work on the previous Cassandra version.
Sample insert statement:
insert into table_name (name, id, json) values ("AAA", "123", '{"display-order":"1","product-id":"QWERTY"}');
When I tried the query in DataStax DevCenter, I got the error "no viable alternative at input 'json'".
Does that mean that we have to change the insert statements?
Please help!
Edit:
I made a mistake in the example query; the json column is not the issue. When I enabled debugging I found that the issue is with a timestamp column.
I did not mention that in my example as I thought json could be the reason.
This is because of the recent driver update to 3.1.0, as the same query does not work now.
When I tried to fix it, I could not find the timestamp converter function (toTimestamp) or the dateOf() function in the 3.1.0 version.
I see only now(), but it returns a value of type timeuuid.
Is there any other way I can convert the current date to a timestamp?
It's not because of the json field; your insert format is not correct.
Strings must be enclosed in single quotes:
cassandra#cqlsh:test> insert into test_json (name, id, json) values ('AAA', '123', '{"display-order":"1","product-id":"QWERTY"}');
cassandra#cqlsh:test> SELECT * FROM test_json ;
 id  | json                                        | name
-----+---------------------------------------------+------
 123 | {"display-order":"1","product-id":"QWERTY"} | AAA
Or use double dollar signs to enclose a string containing quotes, backslashes, or other characters that would normally need to be escaped:
cassandra#cqlsh:test> insert into test_json (name, id, json) values ('AAA', '123', $${"display-order":"1","product-id":"QWERTY"}$$);
cassandra#cqlsh:test> SELECT * FROM test_json ;
 id  | json                                        | name
-----+---------------------------------------------+------
 123 | {"display-order":"1","product-id":"QWERTY"} | AAA
By the way, if you want to insert the current timestamp, use dateof(now()) in CQL, or new Date() if you are using Java.
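Regarding the edit about dateOf() not being found: in Cassandra 2.2+ dateOf() is deprecated in favor of toTimestamp(), so (assuming a hypothetical created_at timestamp column on the table) the current time can be written like this:
-- toTimestamp(now()) converts the timeuuid returned by now() to a timestamp.
insert into test_json (name, id, json, created_at)
values ('BBB', '456', '{}', toTimestamp(now()));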

How to create postgres table with JSON?

Recent releases of PostgreSQL have capabilities for working like document-oriented databases (e.g. MongoDB). There are promising benchmarks that say Postgres is x times faster than Mongo. Can someone give me advice on how to work with Postgres as with MongoDB? I'm looking for a simple step-by-step example covering:
1) How to create the simplest table that contains JSON/JSONB objects, like documents in MongoDB
2) How to search it, at least by id, like I can do in MongoDB with collection.find({id: 'objectId'}), for example
3) How to create a new object or overwrite an existing one, at least by id, like I can do in MongoDB with
collection.update(
{id: objectId},
{$set: someSetObject, $unset: someUnsetObject}
{upsert: true, w: 1}
)
4) How to delete an object if it exists, at least by id, like I can do in MongoDB with collection.remove({id: 'objectId'})
This is too large a topic to be covered in one answer, so here are just some examples, as requested. For more information see the documentation:
8.14. JSON Types
9.15. JSON Functions and Operators
Create table:
create table test(
  id serial primary key,
  data jsonb
);
Search by id:
select * from test where id = 1;
Search by json value:
select * from test where data->>'a' = '1';
Insert and update data:
insert into test(id, data) values (1, '{"a": 1, "b": 2, "c": 3}');
update test set data = data - 'a' || '{"c": 5}' where id = 1;
Delete data by id:
delete from test where id = 1;
Delete data by json value:
delete from test where data->>'b' = '2';
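To mirror the upsert from point 3, one option (a sketch, assuming PostgreSQL 9.5+ for ON CONFLICT) combines the jsonb - and || operators, with - playing the role of $unset and || the role of $set:
-- Insert the row, or merge the new keys into the existing document.
insert into test(id, data) values (1, '{"c": 5}')
on conflict (id) do update
set data = (test.data - 'a') || excluded.data;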

Is there a way to address all elements of JSON array when creating a constraint in PostgreSQL?

Does PostgreSQL provide any notation/method for putting a constraint on each element of a JSON array?
An example:
create table orders(data json);
insert into orders values ('
{
"order_id": 45,
"products": [
{
"product_id": 1,
"name": "Book"
},
{
"product_id": 2,
"name": "Painting"
}
]
}
');
I can easily add a constraint on the order_id field:
alter table orders add check ((data->>'order_id')::integer >= 1);
Now I need to do the same with product_id. I can put a constraint on individual array items:
alter table orders add check ((data->'products'->0->>'product_id')::integer >= 1);
alter table orders add check ((data->'products'->1->>'product_id')::integer >= 1);
-- etc.
So obviously what I'm looking for is some kind of wildcard operator for matching any JSON array element:
alter table orders add check ((data->'products'->*->>'product_id')::integer >= 1);
-- ^ like this
I know that this can be done by extracting products into a separate products table with a foreign key to orders. But I want to know if this is possible within a single JSON column, so I can keep it in mind when designing a database schema.
So I asked this question on the PostgreSQL mailing list, as suggested by Craig Ringer, and I got the answer.
In short, the solution is to write a function that materializes the JSON array into a PostgreSQL array:
create function data_product_ids(json) returns integer[] immutable as $$
  select array_agg((a->>'product_id')::integer)
  from json_array_elements($1->'products') as a
$$ language sql;
and use that function in a CHECK constraint:
alter table orders add check (1 <= ALL(data_product_ids(data)));
For more details on how this works, see the answer on the PostgreSQL mailing list. Credits to Joel Hoffman.
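As a quick sanity check (a sketch against the table above), an order containing a non-positive product_id should now be rejected:
-- Expected to fail with a check constraint violation, since 0 < 1 is false.
insert into orders values ('{"order_id": 46, "products": [{"product_id": 0, "name": "Bad"}]}');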
From one of the developers of JSON for Postgres:
The path stuff does not support wildcards.