SQL Server: problem finding values in a JSON object array

This is my JSON object, saved in a column:
{
"relationalKeys": [
{
"job_type": [
"8",
"5"
],
"job_speciality": [
"50",
"51"
],
"job_department": [
"70",
"71"
],
"job_grade": [
],
"job_work_pattern_id": [
"43"
],
"pay_band_id": [
"31"
],
"staff_group_id": [
"27"
]
}
]
}
I want to extract records matching a condition such as job_type IN (8, 5). The problem is that when we use CROSS APPLY on a huge database, results are returned very slowly.
This is my query, and it's working fine; however, I want to know if there is any solution that finds values without referring to a particular array index, i.e. [0], since the value can exist anywhere in the JSON array.
SELECT Json_Value(jsonData, '$.relationalKeys[0].job_type[0]'), *
FROM tbl_jobs_details_v2
WHERE Json_Value(jsonData, '$.relationalKeys[0].job_type[0]') IN (8, 5)
I want something that matches all elements of an array, like adding a *, but SQL Server doesn't support this.
SELECT Json_Value(jsonData, '$.relationalKeys[0].job_type[*]'),*
FROM tbl_jobs_details_v2
WHERE Json_Value(jsonData, '$.relationalKeys[0].job_type[*]') IN (8, 5)
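One way to match any element of job_type rather than only index [0] is to expand the array with OPENJSON and test for existence; a minimal sketch, assuming the table and column names from the query above (note the array elements are stored as JSON strings, so they are compared as strings):

```sql
SELECT d.*
FROM tbl_jobs_details_v2 AS d
WHERE EXISTS (
    SELECT 1
    FROM OPENJSON(d.jsonData, '$.relationalKeys[0].job_type') AS jt
    WHERE jt.[value] IN ('8', '5')   -- any element of the array may match
);
```

An EXISTS like this still parses the JSON on every row, so for a huge table the usual real fix is to persist the values, e.g. in an indexed computed column or a separate lookup table populated by the same OPENJSON expansion.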

Related

Snowflake Get MIN/MAX values from Nested JSON

I have the following sample structure in snowflake column, indicating a series of coordinates for a map polygon. Each pair of values is formed of "longitude, latitude" like this:
COLUMN ZONE
{
"coordinates": [
[
[
-58.467372,
-34.557908
],
[
-58.457565,
-34.569341
],
[
-58.446836,
-34.573511
],
[
-58.43482,
-34.553367
],
[
-58.441944,
-34.547923
]
]
],
"type": "POLYGON"
}
I need to get the smallest longitude and latitude, and the biggest longitude and latitude, for every table row.
So, for this example, I need to get something like:
MIN: -58.467372,-34.573511
MAX: -58.43482,-34.547923
Do you guys know if this is possible using a query?
I got as far as to navigate the json to get the sets of coordinates, but I'm not sure how to proceed from there. I tried doing a MIN to the coordinates column, but I'm not sure how to reference only the "latitude" or "longitude" value.
This obviously doesn't work:
MIN(ZONE['coordinates'][0])
Any suggestions?
You can do this with some Snowflake GIS functions, massaging the input data for easier parsing:
with data as (
select '
{
"coordinates": [
[
[
-58.467372,
-34.557908
],
[
-58.457565,
-34.569341
],
[
-58.446836,
-34.573511
],
[
-58.43482,
-34.553367
],
[
-58.441944,
-34.547923
]
]
],
"type": "POLYGON"
}
' x
)
select st_xmin(g), st_xmax(g), st_ymin(g), st_ymax(g)
from (
select to_geography(replace(x, 'POLYGON', 'MultiLineString')) g
from data
)
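If you'd rather avoid the GIS functions, the same result can be sketched with LATERAL FLATTEN over the nested array. This assumes the column is a VARIANT named ZONE, and my_table and ROW_ID are placeholder names for your table and row key:

```sql
SELECT ROW_ID,
       MIN(pt.value[0]::float) AS min_longitude,
       MAX(pt.value[0]::float) AS max_longitude,
       MIN(pt.value[1]::float) AS min_latitude,
       MAX(pt.value[1]::float) AS max_latitude
FROM my_table,
     LATERAL FLATTEN(input => ZONE:coordinates[0]) pt  -- one row per [lon, lat] pair
GROUP BY ROW_ID;
```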

Retrieve specific value from a JSON blob in MS SQL Server, using a property value?

In my DB I have a column storing JSON. The JSON looks like this:
{
"views": [
{
"id": "1",
"sections": [
{
"id": "1",
"isToggleActive": false,
"components": [
{
"id": "1",
"values": [
"02/24/2021"
]
},
{
"id": "2",
"values": []
},
{
"id": "3",
"values": [
"5393",
"02/26/2021 - Weekly"
]
},
{
"id": "5",
"values": [
""
]
}
]
}
]
}
]
}
I want to create a migration script that will extract a value from this JSON and store it in its own column.
In the JSON above, in that components array, I want to extract the second value from the component with an id of "3" (among other things, but this is a good example). So I want to extract the value "02/26/2021 - Weekly" to store in its own column.
I was looking at the JSON_VALUE docs, but I only see examples that specify indexes for the JSON properties. I can't figure out what kind of JSON path I'd need. Is this even possible with JSON_VALUE?
EDIT: To clarify, the views and sections arrays can use static indexes, so I can use views[0].sections[0] for them. Currently, this is all I have with my SQL query:
SELECT
    *
FROM OPENJSON(@jsonInfo, '$.views[0].sections[0]')
You need to use OPENJSON to break out the inner array, filter it with a WHERE, and finally select the correct value with JSON_VALUE:
SELECT
    JSON_VALUE(components.value, '$.values[1]')
FROM OPENJSON(@jsonInfo, '$.views[0].sections[0].components') components
WHERE JSON_VALUE(components.value, '$.id') = '3'
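For the migration itself, the SELECT above can be folded into an UPDATE via CROSS APPLY; a sketch under assumed names, where MyTable, JsonColumn, and NewColumn are placeholders for your actual table and columns:

```sql
UPDATE t
SET t.NewColumn = x.Extracted
FROM MyTable AS t
CROSS APPLY (
    -- same pattern as the answer: expand the components array, pick id = '3'
    SELECT JSON_VALUE(components.value, '$.values[1]') AS Extracted
    FROM OPENJSON(t.JsonColumn, '$.views[0].sections[0].components') AS components
    WHERE JSON_VALUE(components.value, '$.id') = '3'
) AS x;
```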

Using Power Query to extract data from nested arrays in JSON

I'm relatively new to Power Query, but I'm pulling in this basic structure of JSON from a web api
{
"report": "Cost History",
"dimensions": [
{
"time": [
{
"name": "2019-11",
"label": "2019-11",
…
},
{
"name": "2019-12",
"label": "2019-12",
…
},
{
"name": "2020-01",
"label": "2020-01",
…
},
…
]
},
{
"Category": [
{
"name": "category1",
"label": "Category 1",
…
},
{
"name": "category2",
"label": "Category 2",
…
},
…
]
}
],
"data": [
[
[
40419.6393798211
],
[
191.44
],
…
],
[
[
2299.652439184997
],
[
0.0
],
…
]
]
}
I actually have 112 categories and 13 "times". I figured out how to do multiple queries to turn the times into column headers and the categories into row labels (I think). But the data section is eluding me: because each item is a list within a list, I'm not sure how to expand it all out. Each object in the data array will have 112 numbers, and there will be 13 objects. If that all makes sense.
So ultimately I want to make it look like
2019-11 2019-12 2020-01 ...
Category 1 40419 2299
Category 2 191 0
...
First time asking a question on here, so hopefully this all makes sense and is clear. Thanks in advance for any help!
I am also researching this exact thing and looking for a solution. In PQ, nested arrays are displayed as a list, and there is a function to extract the values by choosing a separating character.
So a list column becomes plain text with this:
= Table.TransformColumns(#"Filtered Rows", {"aligned_to_ids", each Text.Combine(List.Transform(_, Text.From), ","), type text})
However, the problem I'm trying to solve is when the nested JSON has multiple values, i.e. the list items are records. When these lists are extracted the same way,
= Table.TransformColumns(#"Extracted Values1", {"collaborators", each Text.Combine(List.Transform(_, Text.From), ","), type text})
an error is raised:
Expression.Error: We cannot convert a value of type Record to type Text.
Details:
    Value=
        id=15890
        goal_id=323
        role_id=15
        Type=[Type]
It seems the multiple values are not handled and PQ does not recognise the underlying structure to enable the columns to be expanded.

How to filter in queries of WSO2 SP complex JSON processing

I have a requirement to implement some complex JSON processing.
I have almost completed it, but I'm stuck on the logic for filtering events based on an attribute. I know how to filter on a normal basis, but this is a case where the tokenizeAsObject function is written next to the from statement.
Where and how should the filtering part be written in this case?
Your help is greatly appreciated.
I could find normal filtering for queries, but couldn't find anything that uses a tokenizing method in the querying/analysing part of the code.
--THE CODE GOES HERE
#App:name('CompanyClientDetailingApp3')
#App:description('Description of the client details of the company')
#source(type='http', receiver.url='http://localhost:5005/clientDetails',
#map(type='json', #attributes(json = '$')
)
)
define stream CompanyClientStream3 (json string);
#sink(type = 'log',
#map(type = 'passThrough'))
define stream EachProjectStream3 (Client string, clientContractTerm string, projectName string, projectContractTerm int);
#info(name = 'clientProjectquery')
from CompanyClientStream3#json:tokenizeAsObject(json, '$.CompanyClients')
select json:getString(jsonElement, '$.Client') as Client, json:getString(jsonElement, '$.Invoice.ContractTerm') as clientContractTerm, json:getObject(jsonElement, '$.Invoice.Projects') as projectList
insert into EachClientProjectStream3;
#info(name = 'projectSttreamQuery')
from EachClientProjectStream3#json:tokenizeAsObject(projectList, "$")
select Client, clientContractTerm, json:getString(jsonElement, '$.ProjectName') as projectName, json:getInt(jsonElement, '$.ProjectTerm') as projectContractTerm
insert into EachProjectStream3;
Filtering is based on ProjectTerm, i.e. projects having ProjectTerm > 5 years must be streamed out.
--Inputs for the same
{
"CompanyClients": [
{
"Client": "C1",
"Invoice": {
"ContractTerm": "5",
"Unit": "years",
"Projects": [
{"ProjectName":"C1P1", "ProjectTerm":"5", "TermUnit": "years"},
{"ProjectName":"C1P2", "ProjectTerm":"3", "TermUnit": "years"},
{"ProjectName":"C1P3", "ProjectTerm":"2", "TermUnit": "years"}
]
}
},
{
"Client": "C3",
"Invoice": {
"ContractTerm": "10",
"Unit": "years",
"Projects": [
{"ProjectName":"C3P1", "ProjectTerm":"8", "TermUnit": "years"},
{"ProjectName":"C3P2", "ProjectTerm":"5", "TermUnit": "years"},
{"ProjectName":"C3P3", "ProjectTerm":"6", "TermUnit": "years"}
]
}
}
]
}
Thanks,
Kaushik.
Output from EachProjectStream3 can be filtered in a subsequent query:
from EachProjectStream3[projectContractTerm > 5]
select *
insert into FilteredStream;

Testing to see if a value exists in a nested JSON array

I have a SQL 2016 table that contains a column holding JSON data. A sample JSON document looks as follows:
{
"_id": "5a450f0383cac0d725cd6735",
"firstname": "Nanette",
"lastname": "Mccormick",
"registered": "2016-07-10T01:50:10 +04:00",
"friends": [
{
"id": 0,
"name": "Cote Collins",
"interests": [
"Movies",
"Movies",
"Cars"
]
},
{
"id": 1,
"name": "Ratliff Ellison",
"interests": [
"Birding",
"Birding",
"Chess"
]
},
{
"id": 2,
"name": "William Ratliff",
"interests": [
"Music",
"Chess",
"Software"
]
}
],
"greeting": "Hello, Nanette! You have 4 unread messages.",
"favoriteFruit": "apple"
}
I want to pull all documents in which the interests array of each friends object contains a certain value. I attempted this but got no results:
Select *
From <MyTable>
Where 'Chess' IN (Select value From OPENJSON(JsonValue, '$.friends.interests'))
I should have gotten several rows returned. I must not be referencing the interests array correctly or not understanding how SQL Server deals with a JSON array of this type.
Since interests is a nested array, you need to parse your way through the array levels. To do this, you can use CROSS APPLY with OPENJSON(). The first CROSS APPLY gets you the friend names and the JSON array of interests, and the second CROSS APPLY pulls the interests out of the array and correlates them with the appropriate friend names. Here's an example query:
Select [name]
From #MyTable
CROSS APPLY OPENJSON(JsonValue, '$.friends')
    WITH ([name] NVARCHAR(100) '$.name',
          interests NVARCHAR(MAX) AS JSON)
CROSS APPLY OPENJSON(interests)
    WITH (Interest NVARCHAR(100) '$')
WHERE Interest = 'Chess'
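Since the question asks to pull the whole documents rather than the friend names, the same two-level expansion can be moved into an EXISTS, so each matching row is returned once; a sketch, assuming the #MyTable / JsonValue names from the answer above:

```sql
SELECT t.*
FROM #MyTable AS t
WHERE EXISTS (
    SELECT 1
    FROM OPENJSON(t.JsonValue, '$.friends')
         WITH (interests NVARCHAR(MAX) AS JSON) AS f
    CROSS APPLY OPENJSON(f.interests) AS i   -- one row per interest value
    WHERE i.[value] = 'Chess'
);
```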