How do I update semi-structured data (JSON) in Snowflake?

We store a fair number of key/value pairs in a JSON object within a Snowflake column.
The set of key/value pairs is not pre-defined (hence semi-structured data).
What are my options for updating one of those pairs?
Am I supposed to extract the entire JSON, convert it to a string, modify the string, and overwrite the object column entirely?
Or is there a nice little function where I could update just the pair(s) I want?
create or replace table TB as
select $1 TB_ID, parse_json($2) my_json
from values
(1, '{ "FruitShape":"Round", "FruitSize":55 } '),
(2, '{ "FruitShape":"Square" } '),
(3, '{ "FruitShape":"Oblong", "FruitSize":22, "FruitColor":"Chartreuse" }')
;
This creates 3 rows with up to 3 key/value pairs per row.
Let's say I want to change the FruitShape property on the first row from "Round" to "square":
UPDATE TB
SET my_json = parse_json('{ "FruitShape":"square", "FruitSize":55 }')
WHERE TB_ID = 1;
Is this what I am supposed to do?

You should not try to update in place; instead, append a new row for each change and use a view to expose only the current values.
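A minimal sketch of that pattern, using the TB table from the question; the LOADED_AT column and TB_CURRENT view are hypothetical names added here for illustration:
-- record the change as a new row instead of an UPDATE
alter table TB add column LOADED_AT timestamp_ntz;
insert into TB (TB_ID, my_json, LOADED_AT)
select 1, parse_json('{ "FruitShape":"square", "FruitSize":55 }'), current_timestamp();
-- view exposing only the most recent version of each TB_ID
create or replace view TB_CURRENT as
select TB_ID, my_json
from TB
qualify row_number() over (partition by TB_ID order by LOADED_AT desc nulls last) = 1;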

You can do it with a JavaScript UDF. For example:
create or replace function merge_objects("a" object, "b" object)
returns object
language javascript
as
$$
return {...a, ...b}
$$;
And call like this:
select merge_objects(
object_construct('a',1,'b',2),
object_construct('b',3,'c',4)
);
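-- expected result: {"a": 1, "b": 3, "c": 4} (values from the second object win for duplicate keys)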
select merge_objects(
parse_json('{"a":1,"b":[1,2,3]}')::object,
parse_json('{"c":2,"b":[2,3,4]}')::object
);
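Applied to the original question, something like the following should change just the pair you care about while leaving the rest of the object intact (a sketch that reuses the TB table and the merge_objects UDF defined above):
-- only FruitShape is overwritten; all other keys in my_json are preserved by the merge
update TB
set my_json = merge_objects(my_json::object, object_construct('FruitShape', 'square'))
where TB_ID = 1;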

Related

Wrap/convert a JSON object into an array of objects in MySQL

I have a column named data and I have to update its content from something like {} to [{}] for every record in table A. I tried to use JSON_ARRAY(), but it gives me a quoted string:
["{\"something\": \"true\"}"]
whereas I'd like to have something like:
[{ "something": "true" }]
This is what I do now:
SELECT JSON_ARRAY(data) FROM A;
How should I update it, either with JSON_SET() or a plain UPDATE?
You need to use a path to get the data as JSON, rather than referring to the column by itself. The path $ means the top-level object.
UPDATE A
SET data = CASE
    WHEN data IS NULL THEN '[]'          -- NULL becomes empty array
    WHEN LEFT(data, 1) = '[' THEN data   -- leave existing array alone
    ELSE JSON_ARRAY(data->"$")           -- put object inside array
END;
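For a quick preview of the difference before running the update (sample query only; the results shown in the comments assume a row like the one in the question):
SELECT JSON_ARRAY(data)      AS quoted_string,  -- e.g. ["{\"something\": \"true\"}"]
       JSON_ARRAY(data->"$") AS nested_object   -- e.g. [{"something": "true"}]
FROM A;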
Try using:
SELECT JSON_ARRAYAGG(JSON_OBJECT(data)) FROM A;

How to check if a certain string is contained within an array returned by JSON_QUERY

I'm trying to write a SQL statement that will parse some JSON and return only rows where one of the arrays in the JSON Object contains a given value.
Example JSON:
Object 1:
{
"Key1": ["item1", "item2", "item3"]
}
Object 2:
{
"Key1": ["item1", "item3"]
}
I would like to only return rows where JSON_QUERY(object, '$.Key1').Contains("item2") is true (in this example, Object 1).
Of course, this magical function 'Contains()' does not exist in T-SQL, and I can't find any documentation of a function that does what I'd like.
EDIT:
My current solution (which I'm not very fond of and would like to replace) checks if the string literal '"item1"' is contained within the value returned by JSON_QUERY. I don't like this, because it's possible an entry in the array could have a value like '123123"item1"123123', and then the conditional would return true.
Christian,
Sounds to me like your SQL query could use a WHERE clause with a LIKE comparison on the column:
WHERE col1 LIKE 'some%funky%string'
The % is a wildcard character.
You can use OPENJSON to get a derived result set out of a JSON list or array.
The following query uses JSON_QUERY to retrieve the list of strings within Key1. This is passed as an argument into OPENJSON to return the strings as a derived table:
DECLARE @jsonTable TABLE(ID INT IDENTITY, JsonString NVARCHAR(MAX));
INSERT INTO @jsonTable VALUES
(N'{
"Key1": ["item1", "item2", "item3"]
}')
,(N'{
"Key1": ["item1", "item3"]
}');
DECLARE @LookFor NVARCHAR(100)='Item2';
SELECT jt.ID
      ,jt.JsonString
      ,A.value
FROM @jsonTable jt
OUTER APPLY OPENJSON(JSON_QUERY(jt.JsonString, N'$.Key1')) AS A
--WHERE A.value = @LookFor
Uncomment the final WHERE to reduce the list to the rows with a value of Item2 (as defined in the variable @LookFor).
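If you only need the matching rows themselves (one row per JSON document, with no substring false positives), a sketch using EXISTS over the same table variable should work on SQL Server 2016+:
SELECT jt.ID, jt.JsonString
FROM @jsonTable jt
WHERE EXISTS (
    SELECT 1
    FROM OPENJSON(JSON_QUERY(jt.JsonString, N'$.Key1')) AS A
    WHERE A.value = @LookFor   -- exact match against each array element
);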

MariaDB COLUMN_JSON query returns binary

I've been trying to use dynamic columns with an instance of MariaDB v10.1.12.
First, I send the following query:
INSERT INTO savedDisplays (user, name, body, dataSource, params) VALUES ('Marty', 'Hey', 'Hoy', 'temp', COLUMN_CREATE('type', 'tab', 'col0', 'champions', 'col1', 'averageResults'));
Where params' type was defined as a blob, just like the documentation suggests.
The query is accepted, the table updated. If I COLUMN_CHECK the results, it tells me it's fine.
But when I try to select:
SELECT COLUMN_JSON(params) AS params FROM savedDisplays;
I get a {type: "Buffer", data: Array} object containing binary data back, instead of the {"type":"tab", "col0":"champions", "col1":"averageResults"} I expect.
EDIT: I can use COLUMN_GET just fine, but I need every column inside the params field, and I need to check the type property first to know what kind of columns there are and how many are in the JSON / params field. I could probably make it work that way, but it would require multiple queries, as opposed to only one.
Any ideas?
Try:
SELECT CONVERT(COLUMN_JSON(params) USING utf8) AS params FROM savedDisplays
In MariaDB 10 this works on any table:
SELECT CONVERT(COLUMN_JSON(COLUMN_CREATE('t', text, 'v', value)) USING utf8)
as json FROM test WHERE 1 AND value LIKE '%12345%' LIMIT 10;
Output in Node.js:
[ TextRow { json: '{"t":"test text","v":"0.5339044212345805"}' } ]

PostgreSQL Auto-increment inside a JSON

Is it possible to auto-increment inside PostgreSQL's new JSON type using just SQL (like serial) and not server code?
I can't really imagine why you'd want to, but sure.
CREATE SEQUENCE whywouldyou_jsoncol_seq;
CREATE TABLE whywouldyou (
jsoncol json not null default json_object(ARRAY['id'], ARRAY[nextval('whywouldyou_jsoncol_seq')::text]),
dummydata text
);
ALTER SEQUENCE whywouldyou_jsoncol_seq OWNED BY whywouldyou.jsoncol;
insert into whywouldyou(dummydata) values('');
select * from whywouldyou;
jsoncol | dummydata
--------------+-----------
{"id" : "1"} |
(1 row)
Note that with this particular formulation it's the string "1" not the number 1 in the json. You might want to form the json object another way if you want to avoid that. This is just an example.
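For instance, on PostgreSQL 9.4+ a default built with json_build_object keeps the id numeric (a sketch only; the table name is hypothetical and it reuses the sequence above purely for illustration):
CREATE TABLE whywouldyou2 (
    -- nextval() is passed through as a number, so the stored json is {"id" : 1}
    jsoncol json NOT NULL DEFAULT json_build_object('id', nextval('whywouldyou_jsoncol_seq')),
    dummydata text
);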

How do I update a json column containing a JSON array in PostgreSQL?

In PostgreSQL, my column type is json, and the data is a JSON array like:
[{"attsId": "42a2ce04-52ab-4a3c-8dfb-98c3d14b307d", "planId": 46, "filePath": "fileOperate\\upload", "cfileName": "潜在客户名单 (1).xls", "ufileName": "42a2ce04-52ab-4a3c-8dfb-98c3d14b307d.xls"}, {"attsId": "1adb2f13-00b0-4780-ae76-7a068dc3289c", "planId": 46, "filePath": "fileOperate\\upload", "cfileName": "潜在客户名单.xls", "ufileName": "1adb2f13-00b0-4780-ae76-7a068dc3289c.xls"}, {"attsid": "452f6c62-28df-47c7-8c30-038339f7b223", "planid": 48.0, "filepath": "fileoperate\\upload", "cfilename": "技术市场印花税.xls", "ufilename": "452f6c62-28df-47c7-8c30-038339f7b223.xls"}]
I want to update one of the array elements, something like:
UPDATE plan_base set atts->1='{"planId":"71"}' where id= 46;
How can I do that?
Here are two helper functions to achieve your goal (requires PostgreSQL 9.3+).
This one behaves like an UPDATE (it only changes an index that already exists):
CREATE OR REPLACE FUNCTION "json_array_update_index"(
"json" json,
"index_to_update" INTEGER,
"value_to_update" anyelement
)
RETURNS json
LANGUAGE sql
IMMUTABLE
STRICT
AS $function$
SELECT concat('[', string_agg("element"::text, ','), ']')::json
FROM (SELECT CASE row_number() OVER () - 1
WHEN "index_to_update" THEN to_json("value_to_update")
ELSE "element"
END "element"
FROM json_array_elements("json") AS "element") AS "elements"
$function$;
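A quick standalone check of this function (literal values, purely illustrative):
SELECT json_array_update_index('[1,2,3]'::json, 1, 'x'::text);
-- expected: [1,"x",3]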
This one behaves like an UPSERT (it updates an index if it exists, or creates it if not, using a default value to fill up any unused indexes in between):
CREATE OR REPLACE FUNCTION "json_array_set_index"(
"json" json,
"index_to_set" INTEGER,
"value_to_set" anyelement,
"default_to_fill" json DEFAULT 'null'
)
RETURNS json
LANGUAGE sql
IMMUTABLE
STRICT
AS $function$
SELECT concat('[', string_agg((CASE "index"
WHEN "index_to_set" THEN to_json("value_to_set")
ELSE COALESCE("json" -> "index", "default_to_fill")
END)::text, ','), ']')::json
FROM generate_series(0, GREATEST("index_to_set", json_array_length("json") - 1)) AS "index"
$function$;
With these, you can UPDATE any json data, like:
UPDATE plan_base
SET atts = json_array_update_index(atts, 1, '{"planId":"71"}'::json)
WHERE id = 46;
Important! JSON arrays are indexed from 0 (unlike native PostgreSQL arrays, which are 1-based by default). My functions respect this kind of indexing.
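Usage of the UPSERT-style variant looks the same; for example, a sketch (with an illustrative value) that writes index 3 of the same column, where any positions beyond the current end of the array would be filled with the default (null):
UPDATE plan_base
SET atts = json_array_set_index(atts, 3, '{"planId":"99"}'::json)
WHERE id = 46;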
More about updating a JSON object:
How do I modify fields inside the new PostgreSQL JSON datatype?
Update: functions are now compacted.