MySQL: how to create table with multidimensional integer array? - mysql

I'd like to know if it's possible to create a table with a multidimensional integer array.
I tried this syntax, but unfortunately it didn't work for me:
create table testarray(testarr INT(20)(10));
So what should I do in this case? Thanks.

You can't. You could do something like converting your array to a comma-separated string and storing that, e.g. use a serialize function and store the result in a VARCHAR field.
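A minimal sketch of that idea, with made-up table and column names:
-- store the whole "array" as one comma-separated string
CREATE TABLE scores (id INT PRIMARY KEY, values_csv VARCHAR(255));
INSERT INTO scores VALUES (1, '10,20,30,40');

-- FIND_IN_SET can test membership, but individual elements cannot be indexed
SELECT * FROM scores WHERE FIND_IN_SET('20', values_csv) > 0;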

No. But you could create a test array table and link to it.
CREATE TABLE testarray(ID INT, RowId INT, Element_0 INT, Element_1 INT, ..., Element_9 INT);
Now you reference the array with testarray.ID as a foreign key from wherever you originally wanted the array.
That's arguably better than using a comma-separated string.
Or stick a JSON-syntax array in a VARCHAR field.
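To make the linked-table idea concrete, a small hedged sketch (the owner table and its columns are invented for illustration):
-- InnoDB needs an index on the referenced column
CREATE INDEX idx_testarray_id ON testarray (ID);

-- hypothetical table that "owns" an array: it stores that array's ID
CREATE TABLE owner (
    id INT PRIMARY KEY,
    array_id INT,
    FOREIGN KEY (array_id) REFERENCES testarray (ID)
);

-- read one array back, one table row per array row
SELECT a.*
FROM owner o
JOIN testarray a ON a.ID = o.array_id
WHERE o.id = 1
ORDER BY a.RowId;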

Related

Read Json Value from a SQL Server table

I have a JSON value stored in a SQL Server table as ntext:
JSON (column: json_val):
[{"prime":{"image":{"id":"123","logo":"","productId":"4000","enable":true},"accountid":"78","productId":"16","parentProductId":"","aprx":"4.599"}}]
select JSON_VALUE(cast(json_val as varchar(8000)), '$.prime.aprx') as px
from table_1
where id = 1
Whenever I execute it, I receive a null. What's wrong with the query?
Thanks for your help!
The JSON string is an array with a single item. You need to specify the array index to retrieve a specific item, e.g.:
declare @t table (json_val nvarchar(4000))
insert into @t
values ('[{"prime":{"image":{"id":"123","logo":"","productId":"4000","enable":true},"accountid":"78","productId":"16","parentProductId":"","aprx":"4.599"}}]')
select JSON_VALUE(cast(json_val as varchar(8000)), '$[0].prime.aprx') as px
from @t
This returns 4.599
If you want to search all array entries, you'll have to use OPENJSON; a minimal sketch follows.
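For illustration only, a hedged sketch using the @t table variable from the example above; it filters on a field inside each array element (the accountid value is taken from the sample JSON):
select JSON_VALUE(j.value, '$.prime.aprx') as px
from @t
cross apply OPENJSON(json_val) as j
where JSON_VALUE(j.value, '$.prime.accountid') = '78';
If you find yourself needing to do that, though ...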
Avoid JSON if possible
JSON storage is not an alternative to using a proper table design though. JSON fields can't be indexed, so filtering by a specific field will always result in a full table scan. Given how regular this JSON string is, you should consider using proper tables instead.
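For instance, a hypothetical relational version of the sample document might look like this (table and column names are invented for illustration):
CREATE TABLE prime (
    prime_id        int IDENTITY PRIMARY KEY,
    accountid       int,
    productId       int,
    parentProductId int NULL,
    aprx            decimal(4,3),
    image_id        int,
    image_productId int,
    image_logo      nvarchar(400),
    image_enable    bit
);
-- ordinary indexes now apply, so filtering no longer needs a full table scan
CREATE INDEX IX_prime_accountid ON prime (accountid);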
As Panagiotis said in the comments:
As for the JSON path, this JSON string is an array with a single element
Instead, therefore, you can use OPENJSON, which inspects each element of the array:
DECLARE @JSON nvarchar(MAX) = N'[{"prime":{"image":{"id":"123","logo":"","productId":"4000","enable":true},"accountid":"78","productId":"16","parentProductId":"","aprx":"4.599"}}]';
SELECT aprx
FROM (VALUES(@JSON)) V(json_val)
CROSS APPLY OPENJSON(V.json_val)
WITH (aprx decimal(4,3) '$.prime.aprx');
As also mentioned, your JSON should already be stored in a string data type (it should be, and probably is, an nvarchar(MAX)), so there should be no reason to CAST it.

Update the value of JSON elements in Postgresql

I have a table with the following structure:
create table instances(
id bigint,
createdate timestamp,
createdby bigint,
lastmodifieddate timestamp,
lastmodifiedby bigint,
context text
)
The context field contains JSON data, e.g.:
insert into instances values
(1, '2020-06-01 22:10:04', 20112,'2020-06-01 22:10:04',20112,
'{"id":1,"details":[{"binduserid":90182}]}')
I need to replace all values of the JSON element binduserid that are currently 90182, using a Postgres query.
I have achieved this by using REPLACE function:
update instances
set context = replace(context, '"binduserid":90182','"binduserid":1000619')
Is there any other way to do this using Postgres JSON functions?
Firstly, consider storing the column as JSON or JSONB: those types are designed to hold such data properly and let you work with it productively, with no conversions between types needed, much as a DATE value is better held in a DATE column than in a STRING. In this case I treat the context column as being of JSONB type.
You can then use the JSONB_SET() function to get the desired result. Its first argument (the target) can be put into array form with the JSONB_BUILD_ARRAY() function, so that array indexes (the 0 in '{0,details}' here) can be used in the path, as in the DML statement below:
UPDATE instances
SET context =
JSONB_SET(JSONB_BUILD_ARRAY(context), '{0,details}','[{"binduserid":1000619}]')
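An alternative, path-based sketch is also possible, assuming context stays text (as in the CREATE TABLE above) and that details always has exactly one element: cast to jsonb, update by path, and cast back. The WHERE clause restricts the change to rows whose binduserid is currently 90182, mirroring the REPLACE version:
UPDATE instances
SET context = jsonb_set(context::jsonb, '{details,0,binduserid}', '1000619')::text
WHERE context::jsonb #>> '{details,0,binduserid}' = '90182';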

Alter column varchar to json in psql

Hi, I am trying to convert a column in my table from varchar to json, and the table already has some string data. I tried doing that with the command below.
Database=# alter table table_name alter column message type json using
message::json;
But the command failed with the below error.
ERROR: invalid input syntax for type json
DETAIL: Token "This" is invalid.
CONTEXT: JSON data, line 1: This...
Note: The message column holds sets of words with spaces, like below.
"This is a message"
I am not sure what went wrong. Thanks in advance.
You can use to_jsonb() rather than casting:
alter table table_name
alter column message type jsonb using to_jsonb(message);
If you really want to use json (although jsonb is recommended), then cast the result back to a json type:
alter table table_name
alter column message type json using to_jsonb(message)::json;
But this seems rather strange for a column that doesn't contain "real" json values, only plain strings.
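As a quick illustration of the difference (a standalone query, not part of the migration): to_jsonb() wraps the plain string in a JSON string value, which is why it succeeds where the direct cast from the question fails.
SELECT to_jsonb('This is a message'::text);  -- returns "This is a message"
-- 'This is a message'::json would raise the "invalid input syntax for type json" error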
In my case, I wanted to split a varchar whose values contain a separator into a JSON array. For example:
varchar "ab,cd,ef,gh" -> json ["ab","cd","ef","gh"]
First, convert the varchar into an array, using a comma (",") as the delimiter:
ALTER table my_table ALTER column my_column TYPE text[] USING string_to_array(my_column,',')
Then, convert the text array into json (which you might use, for example, with a GraphQL database):
ALTER table my_table ALTER column my_column TYPE json USING array_to_json(my_column)
That gives a json (array) result like ["ab","cd","ef","gh"].
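As a hypothetical check of the result (my_column and my_table as above), the json array can then be read element by element:
SELECT my_column ->> 0 AS first_element FROM my_table;  -- 'ab'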
You can simply run the command below; it will convert the field type as well as the existing records. You might also need to update the model file in your framework so it matches the new type.
In my case, I was converting the saved_states table's column prev_months_access_counts from text to jsonb.
ALTER TABLE saved_states ALTER COLUMN prev_months_access_counts TYPE jsonb
using to_jsonb(prev_months_access_counts);

How to create index on JSON field in Postgres?

In PostgreSQL 9.3 Beta 2, how do I create an index on a JSON field? I tried using the -> operator (as with hstore) but got the following error:
CREATE TABLE publishers(id INT, info JSON);
CREATE INDEX ON publishers((info->'name'));
ERROR: data type json has no default operator class for access method "btree"
HINT: You must specify an operator class for the index or define a default operator class for the data type.
Found:
CREATE TABLE publishers(id INT, info JSON);
CREATE INDEX ON publishers((info->>'name'));
As stated in the comments, the subtle difference here is ->> instead of ->. The former one returns the value as text, the latter as a JSON object.
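For a query to benefit from that expression index, it has to use the same expression; a sketch using the names from the example above:
SELECT *
FROM publishers
WHERE info ->> 'name' = 'Some Publisher';
-- EXPLAIN shows whether the planner actually chooses the index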

Create index on json field in PostgreSQL 9.2

I have a few tables with json columns and would like to build an index on these columns. However, I get an error about a missing default operator class (see http://www.postgresql.org/docs/9.2/static/sql-createopclass.html). Has someone already done this, or can you suggest an alternative?
To reproduce the problem, try:
>> create table foo (json json, id int);
CREATE TABLE
>> create index on foo (id);
CREATE INDEX
>> create index on foo (json);
ERROR: data type json has no default operator class for access method "btree"
HINT: You must specify an operator class for the index or define a default operator class for the data type.
The same applies for GiST or GIN indexes.
Interestingly, I don't get the error when I try the following:
>> create type "nested" as (json json, extra text);
CREATE TYPE
>> create table bar (id int, json nested);
CREATE TABLE
>> create index on bar (json);
CREATE INDEX
Is that because no index is created for the components?
Okay, the main issue is the missing default operator class. Any help or shared experience with that is appreciated.
Thanks.
There is a workaround for this if you feel like installing the PLV8 JavaScript language:
http://people.planetpostgresql.org/andrew/index.php?/archives/249-Using-PLV8-to-index-JSON.html
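Roughly, that approach wraps JSON extraction in an IMMUTABLE PLV8 function and indexes the expression. A hedged sketch along those lines (the function name and details are invented here, and it assumes CREATE EXTENSION plv8 has been run):
CREATE FUNCTION json_string(data text, key text) RETURNS text
IMMUTABLE STRICT
LANGUAGE plv8
AS $$
  return JSON.parse(data)[key];
$$;

CREATE INDEX ON foo (json_string(json::text, 'name'));

-- queries must use the same expression for the index to be considered
SELECT * FROM foo WHERE json_string(json::text, 'name') = 'some value';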
The solution that works best for my case is the following:
I simply treat json as text and create an index on the text.
>> create index on foo ((json::text));
Queries then have to be written so that they use the same expression.
EXPLAIN reveals whether the index is used or not.
>> explain select * from foo where json::text = 'foo';
There is no built-in index type for the JSON or XML types. These fields can hold values, but they cannot be indexed directly; you need auxiliary columns (hstore columns or similar).
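To make the auxiliary-column suggestion concrete, a hedged sketch (it assumes the hstore extension is available and that your application or a trigger keeps the column in sync with the JSON):
CREATE EXTENSION IF NOT EXISTS hstore;

-- keep the raw JSON, and maintain a parallel hstore column that can be indexed
ALTER TABLE foo ADD COLUMN json_kv hstore;
UPDATE foo SET json_kv = hstore('name', 'some value');  -- example value only

-- hstore has GIN/GiST operator classes, so this column is indexable
CREATE INDEX ON foo USING gin (json_kv);

-- filter with a GIN-indexable containment test instead of touching the JSON
SELECT * FROM foo WHERE json_kv @> hstore('name', 'some value');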