FIWARE Orion-LD Polygon Data Insert Error

Although the polygon data conforms to the standard structure, I get an error when I try to add it on the FIWARE side.
The polygon data we use is below:
I)
{"type":"Polygon","coordinates":[[[36.2461465,37.0648871],[36.2460744,37.0648355],[36.2459429,37.0647120],[36.2459742,37.0646980],[36.2457908,37.0645057],[36.2459484,37.0644000],[36.2460785,37.0643124],[36.2461040,37.0642925],[36.2462077,37.0642114],[36.2463421,37.0640831],[36.2464975,37.0642173],[36.2463498,37.0640466],[36.2466107,37.0642657],[36.2469810,37.0644730],[36.2474060,37.0642997],[36.2469520,37.0645424],[36.2469766,37.0645626],[36.2479976,37.0640023],[36.2480284,37.0640110],[36.2482431,37.0642928],[36.2482517,37.0642868],[36.2483085,37.0642595],[36.2480964,37.0639710],[36.2481014,37.0639406],[36.2483925,37.0637744],[36.2486570,37.0636293],[36.2487646,37.0638604],[36.2487989,37.0639320],[36.2488132,37.0639655],[36.2488716,37.0640896],[36.2486873,37.0642026],[36.2486031,37.0642577],[36.2485100,37.0643203],[36.2484413,37.0644238],[36.2483143,37.0645177],[36.2481813,37.0646141],[36.2480689,37.0646989],[36.2479859,37.0647609],[36.2477950,37.0648680],[36.2476609,37.0649424],[36.2476550,37.0649478],[36.2477896,37.0651105],[36.2478810,37.0652241],[36.2479723,37.0653378],[36.2481314,37.0655357],[36.2479478,37.0656056],[36.2479601,37.0656207],[36.2478733,37.0656585],[36.2477440,37.0657095],[36.2476484,37.0655959],[36.2476649,37.0655859],[36.2475542,37.0654415],[36.2474549,37.0653053],[36.2473347,37.0651292],[36.2472768,37.0650530],[36.2472231,37.0650896],[36.2465866,37.0654223],[36.2461465,37.0648871]],[[36.2478183,37.0636569],[36.2478183,37.0636569],[36.2478365,37.0636470],[36.2478183,37.0636569]]]}
II)
{"type":"Polygon","coordinates":[[[36.2453447,37.0638166],[36.2456068,37.0636280],[36.2457406,37.0635516],[36.2457631,37.0635874],[36.2461758,37.0633492],[36.2461813,37.0633044],[36.2462984,37.0632325],[36.2464379,37.0631695],[36.2467336,37.0630211],[36.2467560,37.0630524],[36.2470796,37.0628950],[36.2473921,37.0627600],[36.2477268,37.0625936],[36.2477325,37.0626071],[36.2478608,37.0625530],[36.2479334,37.0625395],[36.2479896,37.0626289],[36.2481179,37.0625749],[36.2481066,37.0625212],[36.2481569,37.0625211],[36.2482128,37.0625389],[36.2482465,37.0625836],[36.2482578,37.0626328],[36.2482750,37.0627671],[36.2483202,37.0629193],[36.2480245,37.0630498],[36.2475949,37.0632746],[36.2474387,37.0633376],[36.2472769,37.0634230],[36.2471207,37.0634995],[36.2468808,37.0636164],[36.2465963,37.0637827],[36.2463175,37.0639534],[36.2462952,37.0639848],[36.2461662,37.0638239],[36.2459464,37.0640333],[36.2459597,37.0639049],[36.2461662,37.0638239],[36.2462952,37.0639848],[36.2461728,37.0641329],[36.2459387,37.0643170],[36.2457045,37.0644607],[36.2456315,37.0643713],[36.2455532,37.0642045],[36.2454234,37.0639508],[36.2453447,37.0638166]]]}
Error:
Edit:
I solved the problem but couldn't update this post for a while: if the Polygon data contains overlapping lines or degenerate drawings such as single points, FIWARE does not allow it to be added.

Found another problem in the coordinates: [36.2461465,37.0648871]],[[36.2478183,37.0636569]. One array of points ends and another array starts there... If you change the "]],[[" to "],[", that part should be OK. But you still need to finish the polygon with the same point as the very first point of the polygon.
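A quick way to catch these issues before posting is to validate each ring of the Polygon up front. The sketch below is plain Python and the file name is only illustrative; it flags unclosed rings, consecutive duplicate points, and rings with too few positions:
import json

def check_polygon(geom):
    # Report common GeoJSON Polygon problems: unclosed rings,
    # consecutive duplicate points, and rings with fewer than 4 positions.
    problems = []
    for i, ring in enumerate(geom.get("coordinates", [])):
        if len(ring) < 4:
            problems.append("ring %d: fewer than 4 positions" % i)
        if ring and ring[0] != ring[-1]:
            problems.append("ring %d: first and last point differ" % i)
        for j in range(1, len(ring)):
            if ring[j] == ring[j - 1]:
                problems.append("ring %d: duplicate point at index %d" % (i, j))
    return problems

with open("polygon.json") as f:          # e.g. polygon I) above
    for problem in check_polygon(json.load(f)):
        print(problem)
Running it over polygon I) above, for example, flags the duplicate point in the small second ring.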

Polygons are supported as part of the location attribute (or indeed any attribute of type=GeoProperty). I'm not sure what your error is; the following request works fine for me using the Orion-LD context broker (version 1.1.0), so maybe your NGSI-LD request is incorrectly formatted.
curl -L -X POST 'http://localhost:1026/ngsi-ld/v1/entities/' \
-H 'Content-Type: application/json' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
--data-raw '{
"id": "urn:ngsi-ld:TemperatureSensor:001",
"type": "TemperatureSensor",
"category": {
"type": "Property",
"value": "sensor"
},
"temperature": {
"type": "Property",
"value": 25,
"unitCode": "CEL"
},
"location": {
"type": "GeoProperty",
"value": {
"type": "Polygon",
"coordinates": [
[
[
36.2461465,
37.0648871
],
[
36.2460744,
37.0648355
],
[
36.2459429,
37.064712
],
[
36.2459742,
37.064698
],
[
36.2457908,
37.0645057
],
[
36.2459484,
37.0644
],
[
36.2460785,
37.0643124
],
[
36.246104,
37.0642925
],
[
36.2462077,
37.0642114
],
[
36.2463421,
37.0640831
],
[
36.2464975,
37.0642173
],
[
36.2463498,
37.0640466
],
[
36.2466107,
37.0642657
],
[
36.246981,
37.064473
],
[
36.247406,
37.0642997
],
[
36.246952,
37.0645424
],
[
36.2469766,
37.0645626
],
[
36.2479976,
37.0640023
],
[
36.2480284,
37.064011
],
[
36.2482431,
37.0642928
],
[
36.2482517,
37.0642868
],
[
36.2483085,
37.0642595
],
[
36.2480964,
37.063971
],
[
36.2481014,
37.0639406
],
[
36.2483925,
37.0637744
],
[
36.248657,
37.0636293
],
[
36.2487646,
37.0638604
],
[
36.2487989,
37.063932
],
[
36.2488132,
37.0639655
],
[
36.2488716,
37.0640896
],
[
36.2486873,
37.0642026
],
[
36.2486031,
37.0642577
],
[
36.24851,
37.0643203
],
[
36.2484413,
37.0644238
],
[
36.2483143,
37.0645177
],
[
36.2481813,
37.0646141
],
[
36.2480689,
37.0646989
],
[
36.2479859,
37.0647609
],
[
36.247795,
37.064868
],
[
36.2476609,
37.0649424
],
[
36.247655,
37.0649478
],
[
36.2477896,
37.0651105
],
[
36.247881,
37.0652241
],
[
36.2479723,
37.0653378
],
[
36.2481314,
37.0655357
],
[
36.2479478,
37.0656056
],
[
36.2479601,
37.0656207
],
[
36.2478733,
37.0656585
],
[
36.247744,
37.0657095
],
[
36.2476484,
37.0655959
],
[
36.2476649,
37.0655859
],
[
36.2475542,
37.0654415
],
[
36.2474549,
37.0653053
],
[
36.2473347,
37.0651292
],
[
36.2472768,
37.065053
],
[
36.2472231,
37.0650896
],
[
36.2465866,
37.0654223
],
[
36.2461465,
37.0648871
]
]
]
}
}
}'

Related

Load GeoJSON file into Redshift using the COPY command

I have a file containing spatial data that I would like to load into Tableau using AWS. I found documentation that Amazon Redshift supports GeoJSON data through the GEOMETRY data type. I managed to upload the data into an S3 bucket and to create a table in my Redshift cluster through the following query:
CREATE TABLE public.data (
PC6 VARCHAR(100),
Aantal_adr INTEGER,
geometry GEOMETRY,
Gemeente2019 INTEGER,
Wijk2019 INTEGER,
Buurt2019 INTEGER
);
However, when I want to load the GeoJSON file from S3 into the table, the table remains empty and I don't even get an error. How do I get it to work?
I used the following command:
COPY public.data
FROM 's3://path/folder/data.json'
CREDENTIALS 'aws_access_key_id=x;aws_secret_access_key=y'
json 'auto'
region 'eu-west-1';
And this is a sample of my data:
{
"type": "FeatureCollection",
"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
"features": [
{ "type": "Feature", "properties": { "PC6": "1011AB", "Aantal_adr": 32, "Gemeente2019": 363, "Wijk2019": 36304, "Buurt2019": 3630400 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 4.905027952000069, 52.378425486000026 ], [ 4.905076113000064, 52.37841046300008 ], [ 4.905191411000033, 52.378376782000032 ], [ 4.905214767000075, 52.378369010000029 ], [ 4.905234707000034, 52.378360745000066 ], [ 4.905242522000037, 52.37835616600006 ], [ 4.905248247000031, 52.37835098000005 ], [ 4.905251343000032, 52.378345312000079 ], [ 4.905252578000045, 52.378339070000038 ], [ 4.905252531000031, 52.378332431000047 ], [ 4.905249474000072, 52.378312360000052 ], [ 4.905248898000025, 52.378300391000039 ], [ 4.905250074000037, 52.378294856000082 ], [ 4.905252799000039, 52.378289789000064 ], [ 4.905257110000036, 52.37828554400005 ], [ 4.905262918000062, 52.378281792000053 ], [ 4.905272242000024, 52.378277568000044 ], [ 4.905295016000025, 52.378270656000041 ], [ 4.905320961000029, 52.378264933000082 ], [ 4.90534841300007, 52.378259781000054 ], [ 4.905615848000025, 52.378213457000072 ], [ 4.905672094000067, 52.378204254000082 ], [ 4.905713623000054, 52.378198448000035 ], [ 4.905755373000034, 52.378193294000027 ], [ 4.905797334000056, 52.378188918000035 ], [ 4.905854017000024, 52.378184793000059 ], [ 4.906010385000059, 52.378175414000054 ], [ 4.906049546000077, 52.378171835000046 ], [ 4.906099170000061, 52.378165687000035 ], [ 4.906138697000074, 52.378159097000037 ], [ 4.906179196000039, 52.378151124000055 ], [ 4.906393302000026, 52.378105836000032 ], [ 4.906379850000064, 52.378077200000064 ], [ 4.906376915000067, 52.37806777000003 ], [ 4.906375354000033, 52.37805852300005 ], [ 4.906375845000071, 52.37805008600003 ], [ 4.906379084000037, 52.378043095000066 ], [ 4.906385125000043, 52.378038245000027 ], [ 4.906393308000077, 52.378034955000032 ], [ 4.906413366000038, 52.378029843000036 ], [ 4.906448001000058, 52.378023854000048 ], [ 4.906689757000038, 52.377990125000053 ], [ 4.906724868000026, 52.37798340900008 ], [ 4.906734994000033, 52.377980573000059 ], [ 4.90674369200002, 52.377977198000053 ], [ 4.906750343000056, 52.377973046000079 ], [ 4.906754179000075, 52.377967886000079 ], [ 4.90675559500005, 52.37796090900008 ], [ 4.906755038000028, 52.377952553000057 ], [ 4.906749900000023, 52.377933764000034 ], [ 4.906733622000047, 52.377896754000062 ], [ 4.906462646000023, 52.377214200000026 ], [ 4.906126387000029, 52.377292057000034 ], [ 4.905688946000055, 52.377410774000055 ], [ 4.905016945000057, 52.377587847000029 ], [ 4.90462009700002, 52.377724772000079 ], [ 4.904597594000052, 52.377732947000027 ], [ 4.904546418000052, 52.377751543000045 ], [ 4.904483645000028, 52.377774349000049 ], [ 4.904388632000064, 52.377808869000035 ], [ 4.904220272000032, 52.377870038000026 ], [ 4.904280897000035, 52.377932213000065 ], [ 4.904695869000022, 52.378344313000071 ], [ 4.904749813000024, 52.378331915000047 ], [ 4.904708505000031, 52.378356860000054 ], [ 4.904782821000026, 52.378430663000074 ], [ 4.904842127000052, 52.378489556000034 ], [ 4.905027952000069, 52.378425486000026 ] ] ] } } ] }
Loading GeoJSON directly into a GEOMETRY column is not currently supported. However, we will consider adding it to our roadmap. When new features are released they are noted in our regular maintenance announcements at the top of the forum.
You can only COPY to GEOMETRY columns from data in text or CSV format. The data must be in the hexadecimal form of the extended well-known binary (EWKB) format …
https://docs.aws.amazon.com/redshift/latest/dg/copy-usage_notes-spatial-data.html
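A possible workaround, sketched below on the assumption that the shapely library is available and that the column order matches the table above (file names are illustrative), is to pre-convert each feature into the hexadecimal EWKB form that COPY does accept and load the result as delimited text instead of json 'auto':
import csv, json
from shapely.geometry import shape
from shapely import wkb

with open("data.json") as src, open("data.csv", "w", newline="") as dst:
    writer = csv.writer(dst, delimiter="|")
    for feat in json.load(src)["features"]:
        props = feat["properties"]
        # hex-encoded EWKB with the SRID embedded, as required for COPY into GEOMETRY
        geom_hex = wkb.dumps(shape(feat["geometry"]), hex=True, srid=4326)
        writer.writerow([props["PC6"], props["Aantal_adr"], geom_hex,
                         props["Gemeente2019"], props["Wijk2019"], props["Buurt2019"]])
The COPY statement would then use DELIMITER '|' (plain text) instead of json 'auto'.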

Extract from JSON with jq by a given word

Can somebody help me extract the following with jq:
{
"status": "success",
"data": {
"resultType": "matrix",
"result": [
{
"metric": {
"pod": "dev-cds-5c97cf7f78-sw6b9"
},
"values": [
[
1588204800,
"0.3561394483796914"
],
[
1588215600,
"0.3607968456046861"
],
[
1588226400,
"0.3813882532417868"
],
[
1588237200,
"0.6264355815408573"
]
]
},
{
"metric": {
"pod": "uat-cds-66ccc9685-b5tvh"
},
"values": [
[
1588204800,
"0.9969746974696218"
],
[
1588215600,
"0.7400881057270005"
],
[
1588226400,
"1.2298959318837195"
],
[
1588237200,
"0.9482296838254507"
]
]
}
]
}
}
I need to obtain the values individually for pods matched by the given word dev-cds, without giving the full name dev-cds-5c97cf7f78-sw6b9.
Desired result:
{
"metric": {
"pod": "dev-cds-5c97cf7f78-sw6b9"
},
"values": [
[
1588204800,
"0.3561394483796914"
],
[
1588215600,
"0.3607968456046861"
],
[
1588226400,
"0.3813882532417868"
],
[
1588237200,
"0.6264355815408573"
]
]
}
You should first iterate over the result array, then check whether the pod field inside the metric object contains "dev-cds".
.data.result[] | if .metric.pod | contains("dev-cds") then . else empty end
https://jqplay.org/s/54OH83qHKP
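If you ever need the same selection outside of jq, the equivalent logic in Python looks roughly like this (the file name is illustrative):
import json

with open("metrics.json") as f:
    doc = json.load(f)

# keep only the result entries whose pod name contains "dev-cds"
for entry in doc["data"]["result"]:
    if "dev-cds" in entry["metric"]["pod"]:
        print(json.dumps(entry, indent=2))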

jq: iterating array objects to create new JSON objects

I've been thinking and searching for a long time, but I haven't found what I'm looking for.
I'm using jq to parse tshark (-ek) JSON output, but I'm a jq newbie.
When a frame is multi-valued I have JSON similar to this:
{
"timestamp": "1525627021656",
"layers": {
"frame_time_epoch": [
"1525627021.656417000"
],
"ip_src": [
"10.10.10.10"
],
"ip_src_host": [
"test"
],
"ip_dst": [
"10.10.10.11"
],
"ip_dst_host": [
"dest_test"
],
"diameter_Event-Timestamp": [
"May 6, 2018 19:17:02.000000000 CEST",
"May 6, 2018 19:17:02.000000000 CEST"
],
"diameter_Origin-Host": [
"TESTHOST",
"TESTHOST"
],
"diameter_Destination-Host": [
"DESTHOST",
"DESTHOST"
],
"diameter_CC-Request-Type": [
"2",
"2"
],
"diameter_CC-Request-Number": [
"10",
"3"
],
"diameter_Rating-Group": [
"9004",
"9001"
],
"diameter_Called-Station-Id": [
"testing",
"testing"
],
"diameter_User-Name": [
"testuser",
"testuser"
],
"diameter_Subscription-Id-Data": [
"66666666666",
"77777777777"
],
"gtp_qos_version": [
"0x00000008",
"0x00000005"
],
"gtp_qos_max_dl": [
"8640",
"42"
],
"diameter_Session-Id": [
"test1;sessionID1;test1",
"test2;sessionID2;test2"
]
}
}
As you can see, many keys are arrays, and I want to iterate over them to create separate JSON objects, with a result like this:
{
"frame_time_epoch": [
"1525627021.656417000"
],
"ip_src": [
"10.10.10.10"
],
"ip_src_host": [
"test"
],
"ip_dst": [
"10.10.10.11"
],
"ip_dst_host": [
"dest_test"
],
"diameter_Event-Timestamp": [
"May 6, 2018 19:17:02.000000000 CEST"
],
"diameter_Origin-Host": [
"TESTHOST"
],
"diameter_Destination-Host": [
"DESTHOST"
],
"diameter_CC-Request-Type": [
"2"
],
"diameter_CC-Request-Number": [
"3"
],
"diameter_Rating-Group": [
"9001"
],
"diameter_Called-Station-Id": [
"testing"
],
"diameter_User-Name": [
"testuser"
],
"diameter_Subscription-Id-Data": [
"77777777777"
],
"gtp_qos_version": [
"0x00000005"
],
"gtp_qos_max_dl": [
"42"
],
"diameter_Session-Id": [
"test2;sessionID2;test2"
]
}
{
"frame_time_epoch": [
"1525627021.656417000"
],
"ip_src": [
"10.10.10.10"
],
"ip_src_host": [
"test"
],
"ip_dst": [
"10.10.10.11"
],
"ip_dst_host": [
"dest_test"
],
"diameter_Event-Timestamp": [
"May 6, 2018 19:17:02.000000000 CEST"
],
"diameter_Origin-Host": [
"TESTHOST"
],
"diameter_Destination-Host": [
"DESTHOST"
],
"diameter_CC-Request-Type": [
"2"
],
"diameter_CC-Request-Number": [
"10"
],
"diameter_Rating-Group": [
"9004"
],
"diameter_Called-Station-Id": [
"testing"
],
"diameter_User-Name": [
"testuser"
],
"diameter_Subscription-Id-Data": [
"66666666666"
],
"gtp_qos_version": [
"0x00000008"
],
"gtp_qos_max_dl": [
"8640"
],
"diameter_Session-Id": [
"test1;sessionID1;test1"
]
}
Another hand-made example:
INPUT:
{
"key_single": ["single_value"],
"key2": ["single_value"],
"multiple_value_key": ["value1" , "value2"],
"any_key_name": ["value4" ,"value5"]
}
{
"key_single": ["single_value"],
"key2": ["single_value"],
"multiple_value_key": ["value6" , "value7", "value8"],
"any_key_name": ["value9" ,"value10" , "value11"]
}
Desired output:
{
"key_single": ["single_value"],
"key2": ["single_value"],
"multiple_value_key": ["value1"],
"any_key_name": ["value4"]
}
{
"key_single": ["single_value"],
"key2": ["single_value"],
"multiple_value_key": ["value2"],
"any_key_name": ["value5"]
}
{
"key_single": ["single_value"],
"key2": ["single_value"],
"multiple_value_key": ["value6"],
"any_key_name": ["value9"]
}
{
"key_single": ["single_value"],
"key2": ["single_value"],
"multiple_value_key": ["value7"],
"any_key_name": ["value10"]
}
{
"key_single": ["single_value"],
"key2": ["single_value"],
"multiple_value_key": ["value8"],
"any_key_name": ["value11"]
}
Could you help me?
Thanks in advance.
It looks like you want to take, in turn, the i-th element of the selected arrays. Using your second example, this could be done like so:
range(0; .multiple_value_key|length) as $i
| . + { multiple_value_key: [.multiple_value_key[$i]],
any_key_name: [.any_key_name[$i]] }
The output in compact form:
{"key_single":["single_value"],"key2":["single_value"],"multiple_value_key":["value1"],"any_key_name":["value4"]}
{"key_single":["single_value"],"key2":["single_value"],"multiple_value_key":["value2"],"any_key_name":["value5"]}
{"key_single":["single_value"],"key2":["single_value"],"multiple_value_key":["value6"],"any_key_name":["value9"]}
{"key_single":["single_value"],"key2":["single_value"],"multiple_value_key":["value7"],"any_key_name":["value10"]}
{"key_single":["single_value"],"key2":["single_value"],"multiple_value_key":["value8"],"any_key_name":["value11"]}
Here is a simple solution to the problem as described in the comments, though the output differs slightly from that shown in the question.
For clarity, a helper function is defined for producing the $i-th slice of an object; that is, for every array-valued key with array length greater than 1, the value is replaced by the $i-th item in the array.
def slice($i):
map_values(if (type == "array" and length>1)
then [.[$i]]
else . end);
The solution is then simply:
.layers
| range(0; [.[] | length] | max) as $i
| slice($i)
Output
{
"frame_time_epoch": [
"1525627021.656417000"
],
"ip_src": [
"10.10.10.10"
],
"ip_src_host": [
"test"
],
"ip_dst": [
"10.10.10.11"
],
"ip_dst_host": [
"dest_test"
],
"diameter_Event-Timestamp": [
"May 6, 2018 19:17:02.000000000 CEST"
],
"diameter_Origin-Host": [
"TESTHOST"
],
"diameter_Destination-Host": [
"DESTHOST"
],
"diameter_CC-Request-Type": [
"2"
],
"diameter_CC-Request-Number": [
"10"
],
"diameter_Rating-Group": [
"9004"
],
"diameter_Called-Station-Id": [
"testing"
],
"diameter_User-Name": [
"testuser"
],
"diameter_Subscription-Id-Data": [
"66666666666"
],
"gtp_qos_version": [
"0x00000008"
],
"gtp_qos_max_dl": [
"8640"
],
"diameter_Session-Id": [
"test1;sessionID1;test1"
]
}
{
"frame_time_epoch": [
"1525627021.656417000"
],
"ip_src": [
"10.10.10.10"
],
"ip_src_host": [
"test"
],
"ip_dst": [
"10.10.10.11"
],
"ip_dst_host": [
"dest_test"
],
"diameter_Event-Timestamp": [
"May 6, 2018 19:17:02.000000000 CEST"
],
"diameter_Origin-Host": [
"TESTHOST"
],
"diameter_Destination-Host": [
"DESTHOST"
],
"diameter_CC-Request-Type": [
"2"
],
"diameter_CC-Request-Number": [
"3"
],
"diameter_Rating-Group": [
"9001"
],
"diameter_Called-Station-Id": [
"testing"
],
"diameter_User-Name": [
"testuser"
],
"diameter_Subscription-Id-Data": [
"77777777777"
],
"gtp_qos_version": [
"0x00000005"
],
"gtp_qos_max_dl": [
"42"
],
"diameter_Session-Id": [
"test2;sessionID2;test2"
]
}

Add GeoJSON to a Google map without a Feature wrapper

Google Maps has the function addGeoJson to... well... add GeoJSON ;)
But it seems that it only accepts GeoJSON with a wrapping Feature.
This polygon is taken directly from http://geojsonlint.com/
{
"type": "Polygon",
"coordinates": [
[
[
-84.32281494140625,
34.9895035675793
],
[
-84.29122924804688,
35.21981940793435
],
[
-84.24041748046875,
35.25459097465022
],
[
-84.22531127929688,
35.266925688950074
],
[
-84.20745849609375,
35.26580442886754
],
[
-84.19921875,
35.24674063355999
],
[
-84.16213989257812,
35.24113278166642
],
[
-84.12368774414062,
35.24898366572645
],
[
-84.09072875976562,
35.24898366572645
],
[
-84.08798217773438,
35.264683153268116
],
[
-84.04266357421875,
35.27701633139884
],
[
-84.03030395507812,
35.291589484566124
],
[
-84.0234375,
35.306160014550784
],
[
-84.03305053710936,
35.32745068492882
],
[
-84.03579711914062,
35.34313496028189
],
[
-84.03579711914062,
35.348735749472546
],
[
-84.01657104492188,
35.35545618392078
],
[
-84.01107788085938,
35.37337460834958
],
[
-84.00970458984374,
35.39128905521763
],
[
-84.01931762695312,
35.41479572901859
],
[
-84.00283813476562,
35.429344044107154
],
[
-83.93692016601562,
35.47409160773029
],
[
-83.91220092773438,
35.47632833265728
],
[
-83.88885498046875,
35.504282143299655
],
[
-83.88473510742186,
35.516578738902936
],
[
-83.8751220703125,
35.52104976129943
],
[
-83.85314941406249,
35.52104976129943
],
[
-83.82843017578125,
35.52104976129943
],
[
-83.8092041015625,
35.53446133418443
],
[
-83.80233764648438,
35.54116627999813
],
[
-83.76800537109374,
35.56239491058853
],
[
-83.7432861328125,
35.56239491058853
],
[
-83.71994018554688,
35.56239491058853
],
[
-83.67050170898438,
35.569097520776054
],
[
-83.6334228515625,
35.570214567965984
],
[
-83.61007690429688,
35.576916524038616
],
[
-83.59634399414061,
35.574682600980914
],
[
-83.5894775390625,
35.55904339525896
],
[
-83.55239868164062,
35.56574628576276
],
[
-83.49746704101562,
35.563512051219696
],
[
-83.47000122070312,
35.586968406786475
],
[
-83.4466552734375,
35.60818490437746
],
[
-83.37936401367188,
35.63609277863135
],
[
-83.35739135742188,
35.65618041632016
],
[
-83.32305908203124,
35.66622234103479
],
[
-83.3148193359375,
35.65394870599763
],
[
-83.29971313476561,
35.660643649881614
],
[
-83.28598022460938,
35.67180064238771
],
[
-83.26126098632811,
35.6907639509368
],
[
-83.25714111328125,
35.69968630125201
],
[
-83.25576782226562,
35.715298012125295
],
[
-83.23516845703125,
35.72310272092263
],
[
-83.19808959960936,
35.72756221127198
],
[
-83.16238403320312,
35.753199435570316
],
[
-83.15826416015625,
35.76322914549896
],
[
-83.10333251953125,
35.76991491635478
],
[
-83.08685302734375,
35.7843988251953
],
[
-83.0511474609375,
35.787740890986576
],
[
-83.01681518554688,
35.78328477203738
],
[
-83.001708984375,
35.77882840327371
],
[
-82.96737670898438,
35.793310688351724
],
[
-82.94540405273438,
35.820040281161
],
[
-82.9193115234375,
35.85121343450061
],
[
-82.9083251953125,
35.86902116501695
],
[
-82.90557861328125,
35.87792352995116
],
[
-82.91244506835938,
35.92353244718235
],
[
-82.88360595703125,
35.94688293218141
],
[
-82.85614013671875,
35.951329861522666
],
[
-82.8424072265625,
35.94243575255426
],
[
-82.825927734375,
35.92464453144099
],
[
-82.80670166015625,
35.927980690382704
],
[
-82.80532836914062,
35.94243575255426
],
[
-82.77923583984375,
35.97356075349624
],
[
-82.78060913085938,
35.99245209055831
],
[
-82.76138305664062,
36.00356252895066
],
[
-82.69546508789062,
36.04465753921525
],
[
-82.64465332031249,
36.060201412392914
],
[
-82.61306762695312,
36.060201412392914
],
[
-82.60620117187499,
36.033552893400376
],
[
-82.60620117187499,
35.991340960635405
],
[
-82.60620117187499,
35.97911749857497
],
[
-82.5787353515625,
35.96133453736691
],
[
-82.5677490234375,
35.951329861522666
],
[
-82.53067016601562,
35.97244935753683
],
[
-82.46475219726562,
36.006895355244666
],
[
-82.41668701171875,
36.070192281208456
],
[
-82.37960815429686,
36.10126686921446
],
[
-82.35488891601562,
36.117908916563685
],
[
-82.34115600585936,
36.113471382052175
],
[
-82.29583740234375,
36.13343831245866
],
[
-82.26287841796874,
36.13565654678543
],
[
-82.23403930664062,
36.13565654678543
],
[
-82.2216796875,
36.154509006695
],
[
-82.20382690429688,
36.15561783381855
],
[
-82.19009399414062,
36.144528857027744
],
[
-82.15438842773438,
36.15007354140755
],
[
-82.14065551757812,
36.134547437460064
],
[
-82.1337890625,
36.116799556445024
],
[
-82.12142944335938,
36.10570509327921
],
[
-82.08984375,
36.10792411128649
],
[
-82.05276489257811,
36.12678323326429
],
[
-82.03628540039062,
36.12900165569652
],
[
-81.91268920898438,
36.29409768373033
],
[
-81.89071655273438,
36.30959215409138
],
[
-81.86325073242188,
36.33504067209607
],
[
-81.83029174804688,
36.34499652561904
],
[
-81.80145263671875,
36.35605709240176
],
[
-81.77947998046874,
36.34610265300638
],
[
-81.76162719726562,
36.33835943134047
],
[
-81.73690795898438,
36.33835943134047
],
[
-81.71905517578125,
36.33835943134047
],
[
-81.70669555664062,
36.33504067209607
],
[
-81.70669555664062,
36.342784223707234
],
[
-81.72317504882812,
36.357163062654365
],
[
-81.73278808593749,
36.379279167407965
],
[
-81.73690795898438,
36.40028364332352
],
[
-81.73690795898438,
36.41354670392876
],
[
-81.72454833984374,
36.423492513472326
],
[
-81.71768188476562,
36.445589751779174
],
[
-81.69845581054688,
36.47541104282962
],
[
-81.69845581054688,
36.51073994146672
],
[
-81.705322265625,
36.53060536411363
],
[
-81.69158935546875,
36.55929085774001
],
[
-81.68060302734375,
36.56480607840351
],
[
-81.68197631835938,
36.58686302344181
],
[
-81.04202270507812,
36.56370306576917
],
[
-80.74264526367186,
36.561496993252575
],
[
-79.89120483398438,
36.54053616262899
],
[
-78.68408203124999,
36.53943280355122
],
[
-77.88345336914062,
36.54053616262899
],
[
-76.91665649414062,
36.54163950596125
],
[
-76.91665649414062,
36.55046568575947
],
[
-76.31103515625,
36.551568887374
],
[
-75.79605102539062,
36.54936246839778
],
[
-75.6298828125,
36.07574221562703
],
[
-75.4925537109375,
35.82226734114509
],
[
-75.3936767578125,
35.639441068973916
],
[
-75.41015624999999,
35.43829554739668
],
[
-75.43212890625,
35.263561862152095
],
[
-75.487060546875,
35.18727767598896
],
[
-75.5914306640625,
35.17380831799959
],
[
-75.9210205078125,
35.04798673426734
],
[
-76.17919921875,
34.867904962568744
],
[
-76.41540527343749,
34.62868797377061
],
[
-76.4593505859375,
34.57442951865274
],
[
-76.53076171875,
34.53371242139567
],
[
-76.5911865234375,
34.551811369170494
],
[
-76.651611328125,
34.615126683462194
],
[
-76.761474609375,
34.63320791137959
],
[
-77.069091796875,
34.59704151614417
],
[
-77.376708984375,
34.45674800347809
],
[
-77.5909423828125,
34.3207552752374
],
[
-77.8326416015625,
33.97980872872457
],
[
-77.9150390625,
33.80197351806589
],
[
-77.9754638671875,
33.73804486328907
],
[
-78.11279296875,
33.8521697014074
],
[
-78.2830810546875,
33.8521697014074
],
[
-78.4808349609375,
33.815666308702774
],
[
-79.6728515625,
34.8047829195724
],
[
-80.782470703125,
34.836349990763864
],
[
-80.782470703125,
34.91746688928252
],
[
-80.9307861328125,
35.092945313732635
],
[
-81.0516357421875,
35.02999636902566
],
[
-81.0516357421875,
35.05248370662468
],
[
-81.0516357421875,
35.137879119634185
],
[
-82.3150634765625,
35.19625600786368
],
[
-82.3590087890625,
35.19625600786368
],
[
-82.40295410156249,
35.22318504970181
],
[
-82.4688720703125,
35.16931803601131
],
[
-82.6885986328125,
35.1154153142536
],
[
-82.781982421875,
35.06147690849717
],
[
-83.1060791015625,
35.003003395276714
],
[
-83.616943359375,
34.99850370014629
],
[
-84.05639648437499,
34.985003130171066
],
[
-84.22119140625,
34.985003130171066
],
[
-84.32281494140625,
34.9895035675793
]
],
[
[
-75.69030761718749,
35.74205383068037
],
[
-75.5914306640625,
35.74205383068037
],
[
-75.5419921875,
35.585851593232356
],
[
-75.56396484375,
35.32633026307483
],
[
-75.69030761718749,
35.285984736065735
],
[
-75.970458984375,
35.16482750605027
],
[
-76.2066650390625,
34.994003757575776
],
[
-76.300048828125,
35.02999636902566
],
[
-76.409912109375,
35.07946034047981
],
[
-76.5252685546875,
35.10642805736423
],
[
-76.4208984375,
35.25907654252574
],
[
-76.3385009765625,
35.294952147406576
],
[
-76.0858154296875,
35.29943548054543
],
[
-75.948486328125,
35.44277092585766
],
[
-75.8660888671875,
35.53669637839501
],
[
-75.772705078125,
35.567980458012094
],
[
-75.706787109375,
35.634976650677295
],
[
-75.706787109375,
35.74205383068037
],
[
-75.69030761718749,
35.74205383068037
]
]
]
}
And it won't show up on the map; instead I get the error
Uncaught InvalidValueError: not a Feature or FeatureCollection
Is there a way to add GeoJSON polygons to a Google map without wrapping them in a Feature/FeatureCollection?
It's easy to wrap your data (let's call it geom) in a Feature:
var feature = {
type: "Feature",
geometry: geom
};
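You can then hand the wrapped object to the map's data layer as usual, e.g. map.data.addGeoJson(feature), assuming map is your google.maps.Map instance.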

Swift: Point in polygon? How can I check if a user's location falls within a GeoJSON polygon?

I have a user's location coordinates. Given that the location is in one of the polygons, how do I return which polygon it falls in, by OBJECTID, using Swift or Objective-C?
The sample JSON polygon data I am using looks like this:
{ "type": "Feature", "properties": { "OBJECTID": 10, "District_N": "10", "WARD": 10 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -87.52389, 41.742767 ], [ -87.524668, 41.644637 ], [ -87.578588, 41.64473 ], [ -87.580619, 41.65028 ], [ -87.601709, 41.676572 ], [ -87.60172, 41.685837 ], [ -87.592251, 41.703002 ], [ -87.592202, 41.707656 ], [ -87.586051, 41.707765 ], [ -87.585322, 41.709059 ], [ -87.585162, 41.720732 ], [ -87.563165, 41.698022 ], [ -87.563275, 41.706395 ], [ -87.559645, 41.706399 ], [ -87.559775, 41.715404 ], [ -87.569434, 41.715639 ], [ -87.566814, 41.716484 ], [ -87.565902, 41.722585 ], [ -87.556604, 41.722701 ], [ -87.556693, 41.726351 ], [ -87.554907, 41.726374 ], [ -87.557838, 41.729746 ], [ -87.552591, 41.728229 ], [ -87.552639, 41.730057 ], [ -87.555405, 41.730014 ], [ -87.555585, 41.737329 ], [ -87.556887, 41.737313 ], [ -87.555191, 41.744277 ], [ -87.551526, 41.742147 ], [ -87.550325, 41.744674 ], [ -87.541923, 41.744779 ], [ -87.541772, 41.741144 ], [ -87.529645, 41.741412 ], [ -87.530874, 41.748023 ], [ -87.544007, 41.753548 ], [ -87.540739, 41.757435 ], [ -87.524445, 41.760035 ], [ -87.52389, 41.742767 ] ] ] } }
,
{ "type": "Feature", "properties": { "OBJECTID": 11, "District_N": "11", "WARD": 11 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -87.647655, 41.874319 ], [ -87.645662, 41.874331 ], [ -87.644767, 41.867185 ], [ -87.639218, 41.867254 ], [ -87.639014, 41.860018 ], [ -87.64194, 41.859984 ], [ -87.641864, 41.857621 ], [ -87.639967, 41.857648 ], [ -87.644938, 41.852666 ], [ -87.638532, 41.852792 ], [ -87.640478, 41.849467 ], [ -87.631965, 41.848252 ], [ -87.629944, 41.846683 ], [ -87.629594, 41.83097 ], [ -87.631504, 41.830939 ], [ -87.631465, 41.829064 ], [ -87.633842, 41.827265 ], [ -87.633763, 41.823616 ], [ -87.636354, 41.823583 ], [ -87.635839, 41.809046 ], [ -87.640679, 41.808952 ], [ -87.641275, 41.805025 ], [ -87.645432, 41.80494 ], [ -87.645538, 41.808864 ], [ -87.65524, 41.808734 ], [ -87.655246, 41.81237 ], [ -87.665064, 41.812222 ], [ -87.665568, 41.830501 ], [ -87.657712, 41.830726 ], [ -87.66436, 41.839375 ], [ -87.664449, 41.844617 ], [ -87.658043, 41.847724 ], [ -87.646473, 41.84919 ], [ -87.646495, 41.852647 ], [ -87.648132, 41.852613 ], [ -87.647673, 41.859898 ], [ -87.656479, 41.859765 ], [ -87.656554, 41.862513 ], [ -87.650937, 41.863499 ], [ -87.652815, 41.873034 ], [ -87.647655, 41.874319 ] ] ] } }
Create a Bézier path from each polygon's ring (make sure you close it), then use the containsPoint method (contains(_:) in Swift). You can choose the 'contains' algorithm via the usesEvenOddFillRule property. More info is available in Apple's docs.
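If you would rather not go through UIBezierPath, the underlying even-odd (ray-casting) test is small enough to implement yourself; here is a minimal sketch of the idea in Python (field names follow the GeoJSON above, and the translation to Swift is mechanical):
def point_in_ring(lon, lat, ring):
    # Even-odd ray casting: toggle "inside" each time a ray going right
    # from the point crosses an edge of the ring.
    inside = False
    for (x1, y1), (x2, y2) in zip(ring, ring[1:]):
        if (y1 > lat) != (y2 > lat) and lon < x1 + (lat - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside

def object_id_for(lon, lat, features):
    for feature in features:
        rings = feature["geometry"]["coordinates"]
        # inside the outer ring and not inside any hole
        if point_in_ring(lon, lat, rings[0]) and \
           not any(point_in_ring(lon, lat, r) for r in rings[1:]):
            return feature["properties"]["OBJECTID"]
    return None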