I have a column myColumn in table myTable with this value:
"6285":[
{
"75963":{"lookupId":"54","value":"0","version":null},
"75742":{"lookupId":"254","value":"991","version":null}
}
]
I need to write a select query using the JSON_VALUE or JSON_QUERY functions (my SQL Server version does not support OPENJSON). The query should return this result:
"75963-0, 75742-991"
As you can see, I need the values of the value parameter. Also note that I don't know in advance which elements the object inside the 6285 array will contain. I wouldn't know that there will be 2 elements (75963 and 75742) in it; there could be more or fewer, and their keys could be different, of course. However, there will always be only one object in the 6285 array.
What kind of select can I write to achieve it?
That's strange; I think your version does support OPENJSON(), so you may try the following statement. Note that the JSON in the question is not valid: it should be wrapped inside {}.
Table:
CREATE TABLE Data (JsonColumn varchar(1000))
INSERT INTO Data (JsonColumn)
VALUES ('{"6285":[{"75963":{"lookupId":"54","value":"0","version":null},"75742":{"lookupId":"254","value":"991","version":null}}]}')
Statement:
SELECT CONCAT(j2.[key], '-', JSON_VALUE(j2.[value], '$.value')) AS JsonValue
FROM Data d
CROSS APPLY OPENJSON(d.JsonColumn) j1
CROSS APPLY OPENJSON(j1.[value], '$[0]') j2
Result:
JsonValue
---------
75963-0
75742-991
If you need an aggregated result, you may use STRING_AGG():
SELECT STRING_AGG(CONCAT(j2.[key], '-', JSON_VALUE(j2.[value], '$.value')), ',') AS JsonValue
FROM Data d
CROSS APPLY OPENJSON(d.JsonColumn) j1
CROSS APPLY OPENJSON(j1.[value], '$[0]') j2
Result:
JsonValue
-----------------
75963-0,75742-991
I have a table with numeric/decimal columns and I am converting the rows to JSON:
select to_jsonb(t.*) from my_table t
I need to have the numeric columns cast to text before they are converted to JSON.
The reason I need this is that JavaScript doesn't handle really big numbers well, so I may lose precision. I use decimal.js, and a string representation is the best thing to construct a decimal.js number from.
I know I can do this
select to_jsonb(t.*) || jsonb_build_object('numeric_column', numeric_column::text) from my_table t
But I want to have it done automatically. Is there a way to somehow cast all numeric columns to text before passing them to the to_jsonb function?
It can be a user-defined Postgres function.
EDIT: Just to clarify my question: what I need is a function similar to to_jsonb, except that all columns of type numeric/decimal are stored as strings in the resulting JSON.
Thanks
You can run a query like:
select row_to_json(row(t.column1,t.column2,t.column_numeric::text)) from my_table t
Result here
This solution converts all the JSON values into text:
SELECT jsonb_object_agg(d.key, d.value)
FROM my_table AS t
CROSS JOIN LATERAL jsonb_each_text(to_jsonb(t.*)) AS d
GROUP BY t
whereas this solution only converts JSON numbers into text:
SELECT jsonb_object_agg(d.key, CASE WHEN jsonb_typeof(d.value) = 'number' THEN to_jsonb(d.value :: text) ELSE d.value END)
FROM my_table AS t
CROSS JOIN LATERAL jsonb_each(to_jsonb(t.*)) AS d
GROUP BY t
test result in dbfiddle.
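If you want this packaged as the user-defined function the question mentions, here is a minimal sketch along the lines of the second query above (the function name to_jsonb_numeric_as_text is made up for illustration, and it only handles top-level keys, like the query it wraps):
-- Hypothetical helper: like to_jsonb(), but numeric values become JSON strings
CREATE OR REPLACE FUNCTION to_jsonb_numeric_as_text(rec anyelement)
RETURNS jsonb
LANGUAGE sql STABLE
AS $$
    SELECT jsonb_object_agg(
               d.key,
               CASE WHEN jsonb_typeof(d.value) = 'number'
                    THEN to_jsonb(d.value :: text)
                    ELSE d.value
               END)
    FROM jsonb_each(to_jsonb(rec)) AS d;
$$;
-- Usage:
-- SELECT to_jsonb_numeric_as_text(t.*) FROM my_table t;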
I have a JSON column and the data stored looks like:
{"results":{"made":true,"cooked":true,"eaten":true}}
{"results":{"made":true,"cooked":true,"eaten":false}}
{"results":{"made":true,"eaten":true,"a":false,"b":true,"c":false}, "more": {"ignore":true}}
I need to find all rows where one or more values in $.results are false.
I tried using JSON_CONTAINS() but didn't find a way to get it to compare to a boolean JSON value, or to look at all values in $.results.
This needs to work with MySQL 5.7 but if it's not possible I will accept a MySQL 8+ answer.
I don't know of a way to search for a JSON true/false/null value using the JSON functions - in practice these values are treated as string-type values when searching with JSON_CONTAINS, JSON_SEARCH, etc.
Use a regular expression for the check. Something like:
SELECT id,
JSON_PRETTY(jsondata)
FROM test
WHERE jsondata REGEXP '"results": {[^}]+: false.*}';
DEMO
You could simply search the result of JSON_EXTRACT using a LIKE condition, this way:
SELECT * FROM table1 WHERE JSON_EXTRACT(json_dict, '$.results') LIKE '%: false%';
Check this DB FIDDLE
An alternative to the pattern matching in the other answers is to extract all values from $.results and check each entry against a helper table of running numbers:
SELECT DISTINCT v.id, v.json_value
FROM (
SELECT id, json_value, JSON_EXTRACT(json_value, '$.results.*') value_array
FROM json_table
) v
JOIN seq ON seq.n < JSON_LENGTH(v.value_array)
WHERE JSON_EXTRACT(v.value_array, CONCAT('$[', seq.n, ']')) = false
Here is the demo
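For reference, a minimal sketch of the seq helper table of running numbers this query assumes (any 0-based number table works; size it to the maximum number of entries you expect in $.results):
-- Helper table of running numbers used by the query above
CREATE TABLE seq (n INT PRIMARY KEY);
INSERT INTO seq (n) VALUES (0),(1),(2),(3),(4),(5),(6),(7),(8),(9);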
I have the following table:
I need to create a select that returns me something like this:
I have tried this code:
SELECT Code, json_extract_path(Registers::json,'sales', 'name')
FROM tbl_registers
The previous code returns NULL from json_extract_path. I have also tried the operator ::json->'sales'->>'name', but that doesn't work either.
You need to unnest the array, and then aggregate the names back. This can be done using json_array_elements with a scalar sub-query:
select code,
(select string_agg(e ->> 'name', ',')
from json_array_elements(t.products) as x(e)) as products
from tbl_registers t;
I would also strongly recommend changing your column's type to jsonb.
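For reference, a minimal sketch of that change, assuming the column is named products (as in the query above) and currently has type text; adjust the names to your actual schema:
-- Convert the text column to jsonb in place (hypothetical column name)
ALTER TABLE tbl_registers
    ALTER COLUMN products TYPE jsonb
    USING products::jsonb;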
step-by-step demo:db<>fiddle
SELECT
code,
string_agg( -- 3
elems ->> 'name', -- 2
','
) as products
FROM tbl_registers,
json_array_elements(products::json) as elems -- 1
GROUP BY code
1. If you have type text (strongly not recommended; please use an appropriate data type, json or jsonb), then you need to cast it into type json (I guess you have type text because you already do the cast in your example code). Afterwards you need to extract the array elements into one row per element.
2. Fetch the name value.
3. Reaggregate by grouping and use string_agg() to create the string list.
I want to find multiple rows where a JSON array contains a specific value or values. Sometimes all match items will need to match (ANDs), sometimes only some (ORs) and sometimes a combination of both (ANDs and ORs).
This is in Microsoft SQL Server 2017.
I've tried doing an AS statement in the select but that resulted in the alias created for the subquery not being recognised later on in the subquery.
The example below works; it just seems inefficient and has code duplication.
How would I specify SELECT VALUE FROM OPENJSON(JsonData, '$.categories') only once? Or perhaps there is some other way to do this?
DECLARE #TestTable TABLE
(
Id int,
JsonData nvarchar(4000)
);
INSERT INTO #TestTable
VALUES
(1,'{"categories":["one","two"]}'),
(2,'{"categories":["one"]}'),
(3,'{"categories":["two"]}'),
(4,'{"categories":["one","two","three"]}');
SELECT [Id]
FROM #TestTable
WHERE ISJSON(JsonData) = 1
-- These two lines are the offending parts of code
AND 'one' in (SELECT VALUE FROM OPENJSON(JsonData, '$.categories'))
AND 'two' in (SELECT VALUE FROM OPENJSON(JsonData, '$.categories'));
The table format cannot change, though I can add computed columns - if need be.
Well, I'm not sure if this helps you...
It might help to transform the nested array into a derived table and use it as a CTE. Check this out:
DECLARE #TestTable TABLE
(
Id int,
JsonData nvarchar(4000)
);
INSERT INTO #TestTable
VALUES
(1,'{"categories":["one","two"]}'),
(2,'{"categories":["one"]}'),
(3,'{"categories":["two"]}'),
(4,'{"categories":["one","two","three"]}');
--This is the query
WITH JsonAsTable AS
(
SELECT Id
,JsonData
,cat.*
FROM #TestTable tt
CROSS APPLY OPENJSON(tt.JsonData,'$.categories') cat
)
SELECT *
FROM JsonAsTable
The approach is very close to the query you formed yourself. The result is a table with one line per array entry. The former Id is a repeated grouping key, the key is the ordinal position within the array, and the value is one of the words you are searching for.
In your query you can use JsonAsTable like you'd use any other table in this place.
But - instead of the repeated FROM OPENJSON queries - you will need repeated EXISTS() predicates...
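For illustration, a minimal sketch of one way to read that suggestion, reusing JsonAsTable with one EXISTS() per search term (it still checks each term separately, so it does not remove all of the duplication the question asks about):
WITH JsonAsTable AS
(
    SELECT tt.Id, cat.[value]
    FROM #TestTable tt
    CROSS APPLY OPENJSON(tt.JsonData, '$.categories') cat
)
SELECT tt.Id
FROM #TestTable tt
WHERE EXISTS (SELECT 1 FROM JsonAsTable j WHERE j.Id = tt.Id AND j.[value] = 'one')
  AND EXISTS (SELECT 1 FROM JsonAsTable j WHERE j.Id = tt.Id AND j.[value] = 'two');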
A hacky solution might be this:
SELECT Id
,JsonData
,REPLACE(REPLACE(REPLACE(JsonData,'{"categories":[','",'),']}',',"'),'","',',')
FROM #TestTable
This will return all nested array values in one string, separated by a comma. You can query this using a LIKE pattern... You could return this as a computed column though...
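For illustration, a hypothetical sketch of that LIKE approach against the flattened string (each value ends up wrapped in commas, e.g. ',one,two,'); the commented ALTER TABLE only applies to a persisted table, not to the table variable used above:
SELECT Id
FROM #TestTable
WHERE REPLACE(REPLACE(REPLACE(JsonData,'{"categories":[','",'),']}',',"'),'","',',') LIKE '%,one,%'
  AND REPLACE(REPLACE(REPLACE(JsonData,'{"categories":[','",'),']}',',"'),'","',',') LIKE '%,two,%';
-- On a real table, the expression could become a computed column instead:
-- ALTER TABLE dbo.TestTable
--     ADD CategoriesFlat AS REPLACE(REPLACE(REPLACE(JsonData,'{"categories":[','",'),']}',',"'),'","',',');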
Given a table that contains a column of JSON like this:
{"payload":[{"type":"b","value":"9"}, {"type":"a","value":"8"}]}
{"payload":[{"type":"c","value":"7"}, {"type":"b","value":"3"}]}
How can I write a Presto query to give me the average b value across all entries?
So far I think I need to use something like Hive's lateral view explode, whose equivalent is cross join unnest in Presto.
But I'm stuck on how to write the Presto query for cross join unnest.
How can I use cross join unnest to expand all array elements and select them?
Here's an example of that
with example(message) as (
VALUES
(json '{"payload":[{"type":"b","value":"9"},{"type":"a","value":"8"}]}'),
(json '{"payload":[{"type":"c","value":"7"}, {"type":"b","value":"3"}]}')
)
SELECT
n.type,
avg(n.value)
FROM example
CROSS JOIN
UNNEST(
CAST(
JSON_EXTRACT(message,'$.payload')
as ARRAY(ROW(type VARCHAR, value INTEGER))
)
) as x(n)
WHERE n.type = 'b'
GROUP BY n.type
with defines a common table expression (CTE) named example with a column aliased as message
VALUES returns a verbatim table rowset
UNNEST is taking an array within a column of a single row and returning the elements of the array as multiple rows.
CAST is changing the JSON type into an ARRAY type that is required for UNNEST. It could easily have been an ARRAY<MAP< but I find ARRAY(ROW( nicer as you can specify column names, and use dot notation in the select clause.
JSON_EXTRACT is using a jsonPath expression to return the array value of the payload key
avg() and group by should be familiar SQL.
As you pointed out, this was finally implemented in Presto 0.79. :)
Here is an example of the syntax for the cast from here:
select cast(cast ('[1,2,3]' as json) as array<bigint>);
A special word of advice: there is no 'string' type in Presto like there is in Hive.
That means if your array contains strings, make sure you use type 'varchar'; otherwise you get an error message saying 'type array does not exist', which can be misleading.
select cast(cast ('["1","2","3"]' as json) as array<varchar>);
The problem was that I was running an old version of Presto.
unnest was added in version 0.79
https://github.com/facebook/presto/blob/50081273a9e8c4d7b9d851425211c71bfaf8a34e/presto-docs/src/main/sphinx/release/release-0.79.rst