Snowflake: Get MIN/MAX values from nested JSON

I have the following sample structure in a Snowflake column, representing a series of coordinates for a map polygon. Each pair of values is "longitude, latitude", like this:
COLUMN ZONE
{
  "coordinates": [
    [
      [
        -58.467372,
        -34.557908
      ],
      [
        -58.457565,
        -34.569341
      ],
      [
        -58.446836,
        -34.573511
      ],
      [
        -58.43482,
        -34.553367
      ],
      [
        -58.441944,
        -34.547923
      ]
    ]
  ],
  "type": "POLYGON"
}
I need to get the smallest longitude and latitude and the biggest longitude and latitude for every table row.
So, for this example, I need to get something like:
MIN: -58.467372,-34.573511
MAX: -58.43482,-34.547923
Do you guys know if this is possible using a query?
I got as far as navigating the JSON to get the sets of coordinates, but I'm not sure how to proceed from there. I tried applying MIN to the coordinates column, but I'm not sure how to reference only the "latitude" or "longitude" value.
This obviously doesn't work:
MIN(ZONE['coordinates'][0])
Any suggestions?

You can do this with some Snowflake GIS functions, massaging the input data for easier parsing:
with data as (
  select '
  {
    "coordinates": [
      [
        [
          -58.467372,
          -34.557908
        ],
        [
          -58.457565,
          -34.569341
        ],
        [
          -58.446836,
          -34.573511
        ],
        [
          -58.43482,
          -34.553367
        ],
        [
          -58.441944,
          -34.547923
        ]
      ]
    ],
    "type": "POLYGON"
  }
  ' x
)
select st_xmin(g), st_xmax(g), st_ymin(g), st_ymax(g)
from (
  select to_geography(replace(x, 'POLYGON', 'MultiLineString')) g
  from data
)
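Applied to an actual table rather than a string literal, the same approach can run per row. A minimal sketch, assuming a table called my_zones holding the question's VARIANT column zone (both names are placeholders):

-- Sketch only: the VARIANT is serialized back to text with TO_JSON so the
-- non-standard "POLYGON" tag can be rewritten into a GeoJSON type that
-- TO_GEOGRAPHY accepts, then the GIS min/max functions give the bounding
-- box per row.
select st_xmin(g) as min_longitude,
       st_xmax(g) as max_longitude,
       st_ymin(g) as min_latitude,
       st_ymax(g) as max_latitude
from (
  select to_geography(replace(to_json(zone), 'POLYGON', 'MultiLineString')) g
  from my_zones
);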

Related

How to flip lines and change keys in JQ

I'm creating an interactive map of my campus; the idea is to replicate what I did on uMap, in this link. The GeoJSON was downloaded from uMap and I'm using the coordinates that came with it.
My first issue is that the coordinates in the JSON (originally GeoJSON) are in the wrong order: longitude comes first, then latitude, so Google Maps can't parse them properly.
Json:
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {
        "name": "Almoxarifado / Patrimônio"
      },
      "geometry": {
        "type": "Polygon",
        "coordinates": [
          [
            [
              -52.163317,
              -32.075462
            ],
            [
              -52.163884,
              -32.075467
            ],
            [
              -52.163883,
              -32.075336
            ],
            [
              -52.163321,
              -32.075332
            ],
            [
              -52.163317,
              -32.075462
            ]
          ]
        ]
      }
    },
    {
      ...
    },
    {
      ...
    },
    ...
  ]
}
So I have to flip the coordinate pairs to feed them properly into my Google Maps API.
And my second issue is changing the "type" key to "layer", for better separation of layers in my app.
I've tried:
.features[] | .["type"] |= .["new value"]
However, that changes the value and only accepts float values.
Any help, advice or guidance would be greatly appreciated.
Part 1
flip the coordinates lines
For clarity and ease of testing, let's define a helper function:
# input: a JSON object with a coordinates array of arrays of pairs
def flip: .coordinates |= map(map(reverse)) ;
or even better - before invoking reverse, check the array has the expected length, e.g.:
def flip:
  .coordinates
  |= map(map(if length == 2 then reverse
             else error("incorrect length: \(length)")
             end)) ;
To flip the coordinates, we can now simply write:
.features[].geometry |= flip
Part 2
change the "type" key to "layer"
{layer: .type} + .
| del(.type)
Putting it together
{layer:.type} + .
| del(.type)
| .features[].geometry |= flip
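A possible end-to-end invocation (the file names are placeholders):

jq '
  def flip: .coordinates |= map(map(reverse));
  {layer: .type} + .
  | del(.type)
  | .features[].geometry |= flip
' campus.geojson > flipped.geojson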

SQL Server: problem finding values in a JSON object array

This is my JSON array object saved in a column
{
  "relationalKeys": [
    {
      "job_type": [
        "8",
        "5"
      ],
      "job_speciality": [
        "50",
        "51"
      ],
      "job_department": [
        "70",
        "71"
      ],
      "job_grade": [
      ],
      "job_work_pattern_id": [
        "43"
      ],
      "pay_band_id": [
        "31"
      ],
      "staff_group_id": [
        "27"
      ]
    }
  ]
}
I want to extract records matching a condition such as job_type IN (8, 5). The problem is that when we use CROSS APPLY on a huge database, results are returned very slowly.
This is my query and it's working fine; however, I want to know if there is any solution available to find values without referring to a particular array index, i.e. [0], since the value can exist anywhere in the JSON array.
SELECT Json_Value(jsonData, '$.relationalKeys[0].job_type[0]'), *
FROM tbl_jobs_details_v2
WHERE Json_Value(jsonData, '$.relationalKeys[0].job_type[0]') IN (8, 5)
I want something that matches all elements of the array, like adding a *, but SQL Server doesn't support this:
SELECT Json_Value(jsonData, '$.relationalKeys[0].job_type[*]'),*
FROM tbl_jobs_details_v2
WHERE Json_Value(jsonData, '$.relationalKeys[0].job_type[*]') IN (8, 5)
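For what it's worth, here is a sketch of the OPENJSON / CROSS APPLY pattern the question alludes to, which matches a value anywhere in the array rather than at a fixed index (it still parses the JSON per row, so it does not by itself address the performance concern):

-- Sketch only: expand every element of job_type so the match is not tied
-- to index [0]; EXISTS avoids duplicating rows when several elements match.
SELECT d.*
FROM tbl_jobs_details_v2 AS d
WHERE EXISTS (
    SELECT 1
    FROM OPENJSON(d.jsonData, '$.relationalKeys') AS rk
    CROSS APPLY OPENJSON(rk.[value], '$.job_type') AS jt
    WHERE jt.[value] IN ('8', '5')
);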

Using Power Query to extract data from nested arrays in JSON

I'm relatively new to Power Query, but I'm pulling in this basic structure of JSON from a web api
{
  "report": "Cost History",
  "dimensions": [
    {
      "time": [
        {
          "name": "2019-11",
          "label": "2019-11",
          …
        },
        {
          "name": "2019-12",
          "label": "2019-12",
          …
        },
        {
          "name": "2020-01",
          "label": "2020-01",
          …
        },
        …
      ]
    },
    {
      "Category": [
        {
          "name": "category1",
          "label": "Category 1",
          …
        },
        {
          "name": "category2",
          "label": "Category 2",
          …
        },
        …
      ]
    }
  ],
  "data": [
    [
      [
        40419.6393798211
      ],
      [
        191.44
      ],
      …
    ],
    [
      [
        2299.652439184997
      ],
      [
        0.0
      ],
      …
    ]
  ]
}
I actually have 112 categories and 13 "times". I figured out how to do multiple queries to turn the times into column headers and the categories into row labels (I think). But the data section is eluding me. Because each item is a list within a list, I'm not sure how to expand it all out. Each object in the data array will have 112 numbers, and there will be 13 objects. If that all makes sense.
So ultimately I want to make it look like
            2019-11  2019-12  2020-01  ...
Category 1    40419     2299
Category 2      191        0
...
First time asking a question on here, so hopefully this all makes sense and is clear. Thanks in advance for any help!
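One possible way to get there in Power Query (a sketch only; the URL and step names are placeholders, and it assumes the shape shown above, where data is a list of 13 time slices, each a list of 112 single-element lists):

let
    Source = Json.Document(Web.Contents("https://example.com/cost-history")),
    // dimension labels
    times = List.Transform(Source[dimensions]{0}[time], each [name]),
    categories = List.Transform(Source[dimensions]{1}[Category], each [label]),
    // unwrap the single-element inner lists: one list of numbers per time slice
    values = List.Transform(Source[data], (slice) => List.Transform(slice, each _{0})),
    // transpose so each row is one category across all times
    rows = List.Zip(values),
    valueTable = Table.FromRows(rows, times),
    result = Table.FromColumns(
        {categories} & Table.ToColumns(valueTable),
        {"Category"} & times)
in
    result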
I am also researching this exact thing and looking for a solution. In Power Query, nested arrays are displayed as a List, and there is a function to extract the values by choosing a separating character, so the list column becomes a delimited text column:
= Table.TransformColumns(#"Filtered Rows", {"aligned_to_ids", each Text.Combine(List.Transform(_, Text.From), ","), type text})
However, the problem I'm trying to solve is when the nested JSON has multiple values per list item. When those lists are extracted with
= Table.TransformColumns(#"Extracted Values1", {"collaborators", each Text.Combine(List.Transform(_, Text.From), ","), type text})
an error message is raised:
Expression.Error: We cannot convert a value of type Record to type Text.
Details:
    Value=
        id=15890
        goal_id=323
        role_id=15
    Type=[Type]
It seems the multiple values are not handled and PQ does not recognise the underlying structure to enable the columns to be expanded.
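One possible workaround, sketched with the step and column names from the error above (everything else is an assumption): handle records inside the list explicitly before combining, for example by joining each record's field values.

= Table.TransformColumns(#"Extracted Values1",
    {"collaborators",
     each Text.Combine(
        List.Transform(_, (item) =>
            if item is record
            // join the record's field values with "|" before combining
            then Text.Combine(List.Transform(Record.FieldValues(item), Text.From), "|")
            else Text.From(item)),
        ","),
     type text})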

Merging JSON with array to create new file

I am trying to make a map of the U.S. with Mapbox that shows median home price by county. I have a .json file that contains all the counties and is already accepted by Mapbox tileset -
{
  "type": "Topology",
  "transform": {
    "scale": [
      0.035896170617061705,
      0.005347309530953095
    ],
    "translate": [
      -179.14734,
      17.884813
    ]
  },
  "objects": {
    "us_counties_20m": {
      "type": "GeometryCollection",
      "geometries": [
        {
          "type": "Polygon",
          "arcs": [],
          "id": "0500000US01001"
        },
        {
          "type": "Polygon",
          "arcs": [],
          "id": "0500000US01009"
        },
        {
          "type": "Polygon",
          "arcs": [],
          "id": "0500000US01017"
        },
        {
          "type": "Polygon",
          "arcs": [],
          "id": "0500000US01021"
        }
      ]
    }
  }
}
Basically, it's a json file with "type" (Polygon), "arcs" (to map the county), and "id", which is an ID for the county.
This is great and accepted by Mapbox Tilesets to give me a visualization by county, but I need to add in median home price by county (in order to get colors by county, based on price).
I have a second json file that is more like an array, which has
[
  {
    "0500000US01001": 51289.0,
    "0500000US01009": 46793.0,
    "0500000US01017": 39857.0,
    "0500000US01021": 48859.0
  }
]
and so on, but basically it has the ID -> median home price per county. The IDs are the same between these 2 files, and of the same quantity. So I need to get a third JSON file out of these, which has "type", "arcs", "id", and "PRICE" (the addition).
These files are huge - any suggestions? I tried using jq but received this error:
jq: error ... object ({"type":"To...) and array ([{"0500000U...) cannot be multiplied
Thanks in advance!
A straightforward approach would be saving the second file into a variable and using it as a reference while updating the first file. E.g:
jq 'add as $prices | input
| .objects.us_counties_20m.geometries[] |= . + {PRICE: $prices[.id]}' file2 file1
add can be substituted with .[0] if the array in file2 contains only one object.
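With the sample data above, each geometry in file1 then gains the matching price; for illustration, the first one ends up roughly as:
{
  "type": "Polygon",
  "arcs": [],
  "id": "0500000US01001",
  "PRICE": 51289
}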

Invalid GeoJSON for polygon

I am trying to plot a polygon on a Google map using GeoJSON. Here is the PHP code where I am trying to build the polygon using four bounds values returned from the query result array:
$arr3_poly = Array(
    "type" => "Polygon",
    "coordinates" => Array()
);
foreach ($q->result_array() as $row) {
    $arr3_poly["coordinates"][] = Array(
        floatval($row['v3_bounds_sw_lat']),
        floatval($row['v3_bounds_sw_lng']),
        floatval($row['v3_bounds_ne_lat']),
        floatval($row['v3_bounds_ne_lng']),
    );
}
When I then do json_encode($arr3_poly, JSON_PRETTY_PRINT);, this is the resulting output:
{
    "type": "Polygon",
    "coordinates": [
        [
            43.8526377,
            -79.0499898,
            43.8526509,
            -79.0499877
        ],
        [
            43.8526546,
            -79.0501977,
            43.8526678,
            -79.0501957
        ],
        [
            43.8526716,
            -79.0504057,
            43.8526848,
            -79.0504037
        ]
    ]
}
There must be something wrong with this GeoJSON, because when I try to validate it at geojsonlint.com it returns an error saying Failed to validate field 'coordinates' list schema.
Any ideas what I am doing wrong? Thanks.
This works for me in geojsonlint.com (changed your points slightly to make it not look like a straight line):
{
  "type": "Polygon",
  "coordinates": [
    [
      [43.8526377,-79.0499898],
      [43.854,-79.051],
      [43.8526716,-79.0504057],
      [43.8526377,-79.0499898]
    ]
  ]
}
However, looking closer at that map, those points are in Antarctica; you probably wanted this, which is in Canada, near Toronto:
{
  "type": "Polygon",
  "coordinates": [
    [
      [-79.0499898,43.8526377],
      [-79.051,43.854],
      [-79.0504057,43.8526716],
      [-79.0499898,43.8526377]
    ]
  ]
}
GeoJSON coordinates:
The order of elements must follow x, y, z order (longitude, latitude, altitude for coordinates in a geographic coordinate reference system).
Which is the opposite order from a google.maps.LatLng object (that is Latitude, Longitude).
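For completeness, a sketch of how the PHP loop could build valid GeoJSON from those bounds (the meaning of the bounds fields is assumed; each row becomes a closed rectangular ring of [lng, lat] positions, and if each row is really a separate box, a MultiPolygon may be the better fit):

$arr3_poly = Array(
    "type" => "Polygon",
    "coordinates" => Array()
);
foreach ($q->result_array() as $row) {
    $swLat = floatval($row['v3_bounds_sw_lat']);
    $swLng = floatval($row['v3_bounds_sw_lng']);
    $neLat = floatval($row['v3_bounds_ne_lat']);
    $neLng = floatval($row['v3_bounds_ne_lng']);
    // one ring per row: [lng, lat] positions, first point repeated at the end
    $arr3_poly["coordinates"][] = Array(
        Array($swLng, $swLat),
        Array($neLng, $swLat),
        Array($neLng, $neLat),
        Array($swLng, $neLat),
        Array($swLng, $swLat),
    );
}
echo json_encode($arr3_poly, JSON_PRETTY_PRINT);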