I want to query JSON data with Redshift Spectrum to find out if a field in the JSON exists.
So, for example, given the data:
{ "field1" : { "one" : 1, "two" : 2}, "field2" : true }
{ "field2" : false }
And given I have defined my table as:
CREATE TABLE stackoverflow_sample (
    field1 struct<
        one:varchar,
        two:varchar
    >,
    field2 boolean
)
I want to be able to query it with something like:
SELECT field2 FROM stackoverflow_sample WHERE field1 IS NOT NULL;
And get the result:
TRUE
However, I keep getting the error "column field1 does not exist".
Any idea how to do this?
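For what it's worth, here is a rough sketch of the kind of Spectrum external-table DDL this setup usually needs (the schema name, SerDe and S3 location below are placeholders, not taken from the question). If I read the Spectrum nested-data limitations correctly, only scalar leaves of a struct can be referenced in a query, so checking a leaf such as field1.one is one way to approximate the existence test:

-- Sketch only: schema name, SerDe and S3 path are assumptions.
CREATE EXTERNAL TABLE spectrum.stackoverflow_sample (
    field1 struct<
        one:varchar(20),
        two:varchar(20)
    >,
    field2 boolean
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://my-bucket/stackoverflow-sample/';

-- Reference a scalar leaf of the struct rather than the struct itself:
SELECT field2
FROM spectrum.stackoverflow_sample
WHERE field1.one IS NOT NULL;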
What would be the right way of getting "Ajax", i.e. the value of the last occurrence of the key child1Dob1, from a JSON field whose data structure looks like the one below?
{
    "data": {
        "data": {
            "data": {
                "child1Dob1": "Andy"
            },
            "child1Dob1": "Bob"
        },
        "child1Dob1": "Rick"
    },
    "child1Dob1": "Ajax"
}
The query below was an attempt based on a similar question, but I am getting a null value, so obviously I am missing something.
SELECT JSON_EXTRACT(`containerValue`, CONCAT("$.data[", JSON_LENGTH(`containerValue` ->> '$.data') - 1, "]"))
FROM myTable
WHERE containerKey = 'theContainer';
For CREATE TABLE test (data JSON):
WITH RECURSIVE
cte AS (
    SELECT data, data -> '$.data' subdata
    FROM test
    UNION ALL
    SELECT subdata, subdata -> '$.data'
    FROM cte
    WHERE subdata IS NOT NULL
)
SELECT data ->> '$.child1Dob1'
FROM cte
WHERE subdata IS NULL;
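For anyone who wants to try this, a minimal setup sketch, assuming MySQL 8.0+ (WITH RECURSIVE is not available in 5.7); the table and column names match the query above:

CREATE TABLE test (data JSON);

INSERT INTO test (data) VALUES (
    '{"data": {"data": {"data": {"child1Dob1": "Andy"},
                        "child1Dob1": "Bob"},
               "child1Dob1": "Rick"},
      "child1Dob1": "Ajax"}'
);

-- Each recursion step descends one level through $.data; the final SELECT
-- reads child1Dob1 from the level at which no further $.data key exists.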
We are exploring the JSON feature in SQL Server, and for one of our scenarios we want to come up with SQL that can return JSON like the below:
[
    {
        "field": {
            "uuid": "uuid-field-1"
        },
        "value": {
            "uuid": "uuid-value"    // value is an object
        }
    },
    {
        "field": {
            "uuid": "uuid-field-2"
        },
        "value": "1"                // value is a simple integer
    }
    ... more rows
]
The value field can be a simple integer/string or a nested object.
We are able to come up with a table that looks like the one below:
field.uuid   | value.uuid | value
-------------|------------|------
uuid-field-1 | value-uuid | null
uuid-field-2 | null       | 1
... more rows
But as soon as we apply FOR JSON PATH, it fails with:
Property 'value' cannot be generated in JSON output due to a conflict with another column name or alias. Use different names and aliases for each column in SELECT list.
Is it somehow possible to generate this? The value will either be in value.uuid or in value, never both.
Note: We are open to the possibility of converting each row to an individual JSON object and adding all of them to an array.
This is the query we have tried so far:
select
    json_query((select v.[field.uuid] as 'uuid' for json path, without_array_wrapper)) as 'field',
    value as 'value',
    json_query((select v.[value.uuid] as 'uuid' where v.[value.uuid] is not null for json path, without_array_wrapper)) as 'value'
from
(
    values
        ('uuid-field-1', 'value-uuid1', null),
        ('uuid-field-2', null, 2),
        ('uuid-field-3', 'value-uuid3', null),
        ('uuid-field-4', null, 4)
) as v([field.uuid], [value.uuid], value)
for json auto; --, without_array_wrapper;
The reason for this error is that, as mentioned in the documentation, the FOR JSON PATH clause uses the column alias or column name to determine the key name in the JSON output, and if an alias contains dots, the PATH option creates nested objects. In your case value.uuid and value both generate a key named value.
I can suggest an approach (probably not the best one), which uses JSON_MODIFY() to generate the expected JSON from an empty JSON array:
Table:
CREATE TABLE Data (
    [field.uuid] varchar(100),
    [value.uuid] varchar(100),
    [value] int
)

INSERT INTO Data ([field.uuid], [value.uuid], [value])
VALUES
    ('uuid-field-1', 'value-uuid', NULL),
    ('uuid-field-2', NULL, 1),
    ('uuid-field-3', NULL, 3),
    ('uuid-field-4', NULL, 4)
Statement:
DECLARE @json nvarchar(max) = N'[]'

SELECT @json = JSON_MODIFY(
    @json,
    'append $',
    JSON_QUERY(
        CASE
            WHEN [value.uuid] IS NOT NULL THEN (SELECT d.[field.uuid], [value.uuid] FOR JSON PATH, WITHOUT_ARRAY_WRAPPER)
            WHEN [value] IS NOT NULL THEN (SELECT d.[field.uuid], [value] FOR JSON PATH, WITHOUT_ARRAY_WRAPPER)
        END
    )
)
FROM Data d

SELECT @json
Result:
[
    {
        "field": {
            "uuid": "uuid-field-1"
        },
        "value": {
            "uuid": "value-uuid"
        }
    },
    {
        "field": {
            "uuid": "uuid-field-2"
        },
        "value": 1
    },
    {
        "field": {
            "uuid": "uuid-field-3"
        },
        "value": 3
    },
    {
        "field": {
            "uuid": "uuid-field-4"
        },
        "value": 4
    }
]
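A variation on the same idea, sketched under the assumption of SQL Server 2017+ (for STRING_AGG): build each row's fragment with the same CASE and concatenate the fragments into an array, which avoids the variable-assignment SELECT:

SELECT '[' + STRING_AGG(j, ',') + ']' AS json_result
FROM (
    SELECT CASE
               WHEN d.[value.uuid] IS NOT NULL
                   THEN (SELECT d.[field.uuid], d.[value.uuid] FOR JSON PATH, WITHOUT_ARRAY_WRAPPER)
               WHEN d.[value] IS NOT NULL
                   THEN (SELECT d.[field.uuid], d.[value] FOR JSON PATH, WITHOUT_ARRAY_WRAPPER)
           END AS j
    FROM Data d
) AS t;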
I created a table with one JSON column, and the inserted data has the structure below:
{
    "options": {
        "info": [
            {"data": "data1", "verified": 0},
            {"data": "data2", "verified": 1},
            ... and more
        ],
        "otherkeys": "some data..."
    }
}
I want to run a query that returns the data of the "info" entries where verified = 1.
This is for MySQL 5.7 Community running on Windows 10.
select id, (meta->"$.options.info[*].data") AS `data`
from tbl
WHERE meta->"$.options.info[*].verified" = 1
I expect the output to be "data2", but the actual output is nothing.
The query below works perfectly:
select id, (meta->"$.options.info[*].data") AS `data`
from tbl
WHERE meta->"$.options.info[1].verified" = 1
but I need to search all items in the array, not only index 1.
How can I fix it?
(Sorry for my bad English.)
Try:
SELECT `id`, (`meta` -> '$.options.info[*].data') `data`
FROM `tbl`
WHERE JSON_CONTAINS(`meta` -> '$.options.info[*].verified', '1');
See dbfiddle.
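Note that the question targets 5.7, but this JSON_CONTAINS approach returns every data value from a matching row, not just the verified entries. On MySQL 8.0+, a JSON_TABLE sketch like the one below (the column types are assumptions) would return only the data values whose own entry has verified = 1:

SELECT t.id, j.data
FROM tbl AS t,
     JSON_TABLE(
         t.meta,
         '$.options.info[*]'
         COLUMNS (
             data     VARCHAR(255) PATH '$.data',     -- assumed length
             verified INT          PATH '$.verified'
         )
     ) AS j
WHERE j.verified = 1;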
In my postgres database I have json that looks similar to this:
{
    "myArray": [
        {
            "myValue": 1
        },
        {
            "myValue": 2
        },
        {
            "myValue": 3
        }
    ]
}
Now I want to rename myValue to otherValue. I can't be sure about the length of the array! Preferably I would like to use something like jsonb_set with a wildcard as the array index, but that does not seem to be supported. So what is the nicest solution?
You have to decompose the whole jsonb object, modify the individual elements, and build the object back up.
A custom function will be helpful:
create or replace function jsonb_change_keys_in_array(arr jsonb, old_key text, new_key text)
returns jsonb language sql as $$
    select jsonb_agg(
        case
            when value->old_key is null then value
            else value - old_key || jsonb_build_object(new_key, value->old_key)
        end)
    from jsonb_array_elements(arr)
$$;
Use:
with my_table (id, data) as (
    values (1,
        '{
            "myArray": [
                {"myValue": 1},
                {"myValue": 2},
                {"myValue": 3}
            ]
        }'::jsonb)
)
select
    id,
    jsonb_build_object(
        'myArray',
        jsonb_change_keys_in_array(data->'myArray', 'myValue', 'otherValue')
    )
from my_table;
id | jsonb_build_object
----+------------------------------------------------------------------------
1 | {"myArray": [{"otherValue": 1}, {"otherValue": 2}, {"otherValue": 3}]}
(1 row)
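If the goal is to rewrite the stored rows rather than just select the new shape, the same function works in an UPDATE; a sketch, assuming a real table my_table(id int, data jsonb):

UPDATE my_table
SET data = jsonb_build_object(
    'myArray',
    jsonb_change_keys_in_array(data->'myArray', 'myValue', 'otherValue')
);
-- Note: this keeps only the myArray key; use
-- data || jsonb_build_object('myArray', ...) instead if other top-level keys
-- must be preserved.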
Using JSON functions is definitely the most elegant approach, but you can get by with plain character replacement: cast the json(b) as text, perform the replace, then cast it back to json(b). In this example I included the quotes and colon so that the text replacement targets the JSON keys without conflicting with values.
CREATE TABLE mytable ( id INT, data JSONB );
INSERT INTO mytable VALUES (1, '{"myArray": [{"myValue": 1},{"myValue": 2},{"myValue": 3}]}');
INSERT INTO mytable VALUES (2, '{"myArray": [{"myValue": 4},{"myValue": 5},{"myValue": 6}]}');
SELECT * FROM mytable;
UPDATE mytable
SET data = REPLACE(data :: TEXT, '"myValue":', '"otherValue":') :: JSONB;
SELECT * FROM mytable;
http://sqlfiddle.com/#!17/1c28a/9/4
I am working on converting XML to a JSON string using PostgreSQL.
We have attribute-centric XML and would like to know how to convert it to JSON.
Example XML:
<ROOT><INPUT id="1" name="xyz"/></ROOT>
I need to get the JSON as follows:
{ "ROOT": { "INPUT": { "id": "1", "name": "xyz" }}}
I got the above JSON format from an online tool.
Any help or guidance will be appreciated.
Regards
Abdul Azeem
Basically, the breakdown of this problem is the following:
- extract values from the given XML using the xpath() function
- generate and combine JSON entries using the json_object_agg() function
The only tricky thing here is to combine the key,value pairs from the xpath()/json_object_agg() results, which are { "id": "1" } and { "name": "xyz" }.
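To see the xpath() building block in isolation: it returns an xml[] array (hence the [1] subscripts in the query below), and each element here is just the attribute value:

SELECT xpath('//INPUT/@id',   '<ROOT><INPUT id="1" name="xyz"/></ROOT>'::xml) AS id_attr,
       xpath('//INPUT/@name', '<ROOT><INPUT id="1" name="xyz"/></ROOT>'::xml) AS name_attr;
-- id_attr = {1}, name_attr = {xyz}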
WITH test_xml(data) AS ( VALUES
    ('<ROOT><INPUT id="1" name="xyz"/></ROOT>'::XML)
), attribute_id(value) AS (
    -- get '1' value from id
    SELECT (xpath('//INPUT/@id', test_xml.data))[1] AS value
    FROM test_xml
), attribute_name(value) AS (
    -- get 'xyz' value from name
    SELECT (xpath('//INPUT/@name', test_xml.data))[1] AS value
    FROM test_xml
), json_1 AS (
    -- generate JSON 1 {"id": "1"}
    SELECT json_object_agg('id', attribute_id.value) AS payload
    FROM attribute_id
), json_2 AS (
    -- generate JSON 2 {"name": "xyz"}
    SELECT json_object_agg('name', attribute_name.value) AS payload
    FROM attribute_name
), merged AS (
    -- Generate INPUT result - Step 1 - combine JSON 1 and 2 as a single key,value source
    SELECT key, value
    FROM json_1, json_each(json_1.payload)
    UNION ALL
    SELECT key, value
    FROM json_2, json_each(json_2.payload)
), input_json_value AS (
    -- Generate INPUT result - Step 2 - use json_object_agg to create JSON { "id": "1", "name": "xyz" }
    SELECT json_object_agg(merged.key, merged.value) AS data
    FROM merged
), input_json AS (
    -- Generate INPUT JSON as expected { "INPUT": { "id": "1", "name": "xyz" } }
    SELECT json_object_agg('INPUT', input_json_value.data) AS data
    FROM input_json_value
)
-- Generate the final result
SELECT json_object_agg('ROOT', input_json.data)
FROM input_json;
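For this specific example, a shorter variant is possible with xmltable() (PostgreSQL 10+). It is sketched below with the keys hard-coded, so it does not generalize to arbitrary attribute-centric XML the way the query above does:

SELECT json_build_object(
           'ROOT', json_build_object(
               'INPUT', json_build_object('id', x.id, 'name', x.name)))
FROM xmltable('//INPUT'
              PASSING '<ROOT><INPUT id="1" name="xyz"/></ROOT>'::xml
              COLUMNS id   text PATH '@id',
                      name text PATH '@name') AS x;
-- Produces { "ROOT": { "INPUT": { "id": "1", "name": "xyz" } } } (modulo whitespace).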