GeoJSON formatting needs to be nested more deeply

I have an exported GeoJSON file containing a long series of polygon coordinates. I'm attaching the first few rows; it continues in the same way:
{
"Type": 8,
"Features": [
{
"Type": 7,
"Id": null,
"Geometry": {
"Type": 4,
"Coordinates": [
{
"Type": 2,
"Coordinates": [
{
"Altitude": null,
"Latitude": 85.683948266763423,
"Longitude": 100.62897140939768
},
{
"Altitude": null,
"Latitude": 86.183185093020128,
"Longitude": 100.62897140939695
},
{
"Altitude": null,
"Latitude": 86.183185093020128,
"Longitude": 102.58500571589823
},
{
"Altitude": null,
"Latitude": 97.662303996119974,
"Longitude": 102.58500571589828
},
{
"Altitude": null,
"Latitude": 97.662303996119988,
"Longitude": 97.853903401585853
},
{
"Altitude": null,
"Latitude": 85.683948266763423,
"Longitude": 97.853903401585839
},
{
"Altitude": null,
"Latitude": 85.683948266763423,
"Longitude": 100.62897140939768
}
],
"BoundingBoxes": null,
"CRS": null
}
],
"BoundingBoxes": null,
"CRS": null
},
"Properties": {
"Name": "Shop",
"Area": 572.15969696515185
},
"BoundingBoxes": null,
"CRS": null
},
{
"Type": 7,
"Id": null,
"Geometry": {
"Type": 4,
"Coordinates": [
{
"Type": 2,
"Coordinates": [
{
"Altitude": null,
"Latitude": 91.298364266763443,
"Longitude": 86.631773715898134
},
{
I tried looking at various explanations of the GeoJSON format online, but haven't found anything about why "Type" is numeric instead of matching the actual type name, such as "Polygon".
In addition, while testing on GeoJSON Viewer & Validator, I tried debugging and converted some of the initial lines to:
{
"type": "MultiPolygon",
"coordinates": [
{
"type": 7,
"Id": null,
"Geometry": {
"Type": "Polygon",
"Coordinates": [
{
"Type": 2,
"Coordinates": [
{
"Altitude": null,
"Latitude": 85.683948266763423,
"Longitude": 100.62897140939768
},
I replaced "Type" with "type", 8 with "MultiPolygon", and so on.
With those changes in place, I get the following error for each polygon:
Line 5: a number was found where a coordinate array should have been found: this needs to be nested more deeply
I'm not sure what to change in the format to make it acceptable to the validator. I think the export format might be old, but I'm not the one who built it, so perhaps a manual replacement of some fields can solve the issue.
Any tips?

A bit late to the party, but it seems like it's not valid GeoJSON. You can find the official JSON Schemas that describe the specification here:
https://github.com/geojson/schema.
For instance, it is specified that a Feature must be a JSON object whose type is the string "Feature" rather than an integer, and that for each Feature, type, properties and geometry are required, etc.
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "https://geojson.org/schema/Feature.json",
"title": "GeoJSON Feature",
"type": "object",
"required": [
"type",
"properties",
"geometry"
]
(etc.)
For a MultiPolygon, here is the schema: https://geojson.org/schema/MultiPolygon.json
The way coordinates are stored in your file isn't standard GeoJSON either; they should be written as nested arrays of numbers:
"coordinates": {
"type": "array",
"items": {
"type": "array",
"items": {
"type": "array",
"minItems": 4,
"items": {
"type": "array",
"minItems": 2,
"items": {
"type": "number"
}
}
}
}
}
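In practice that means the export has to be restructured, not just renamed: the Latitude/Longitude objects become [longitude, latitude] number pairs, and the numeric type codes become GeoJSON type strings. A rough conversion sketch in TypeScript, assuming the codes in the export map as 8 = FeatureCollection, 7 = Feature, 4 = Polygon and 2 = linear ring (inferred from the shape of the data above, so verify against whatever library produced it):

interface ExportedPosition { Altitude: number | null; Latitude: number; Longitude: number; }
interface ExportedRing { Type: number; Coordinates: ExportedPosition[]; }
interface ExportedGeometry { Type: number; Coordinates: ExportedRing[]; }
interface ExportedFeature { Type: number; Id: unknown; Geometry: ExportedGeometry; Properties: Record<string, unknown>; }
interface ExportedCollection { Type: number; Features: ExportedFeature[]; }

// Rebuild standard GeoJSON from the exported structure shown in the question.
function toGeoJson(input: ExportedCollection) {
  return {
    type: "FeatureCollection",
    features: input.Features.map((f) => ({
      type: "Feature",
      properties: f.Properties ?? {},
      geometry: {
        type: "Polygon",
        // GeoJSON positions are [longitude, latitude] number arrays,
        // grouped into one array per linear ring.
        coordinates: f.Geometry.Coordinates.map((ring) =>
          ring.Coordinates.map((p) => [p.Longitude, p.Latitude])
        )
      }
    }))
  };
}

// e.g. const geojson = toGeoJson(JSON.parse(rawExport));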

Related

Is there any way to define a scoping mechanism in JSON Schema for Arrays of Objects?

I would like to use JSON Schema to validate my data, which exists as an array of objects. In this use case, I have a list of people and I want to make sure they possess certain properties, but these properties aren't exhaustive.
For instance, if we have a person named Bob, I want to make sure that Bob's height, ethnicity and location are set to certain values. But I don't care much about Bob's other properties like hobbies, weight, or relationshipStatus.
There is one caveat: there can be multiple Bobs, so I don't want to check all Bobs. It just so happens that each person has a unique ID, and I want to check a person's properties by that specified id.
Here is an example of all the people that exist:
{
"people": [
{
"name": "Bob",
"id": "ei75dO",
"age": "36",
"height": "68",
"ethnicity": "american",
"location": "san francisco",
"weight": "174",
"relationshipStatus": "married",
"hobbies": ["camping", "traveling"]
},
{
"name": "Leslie",
"id": "UMZMA2",
"age": "32",
"height": "65",
"ethnicity": "american",
"location": "pawnee",
"weight": "139",
"relationshipStatus": "married",
"hobbies": ["politics", "parks"]
},
{
"name": "Kapil",
"id": "HkfmKh",
"age": "27",
"height": "71",
"ethnicity": "indian",
"location": "mumbai",
"weight": "166",
"relationshipStatus": "single",
"hobbies": ["tech", "games"]
},
{
"name": "Arnaud",
"id": "xSiIDj",
"age": "42",
"height": "70",
"ethnicity": "french",
"location": "paris",
"weight": "183",
"relationshipStatus": "married",
"hobbies": ["cooking", "reading"]
},
{
"name": "Kapil",
"id": "fDnweF",
"age": "38",
"height": "67",
"ethnicity": "indian",
"location": "new delhi",
"weight": "159",
"relationshipStatus": "married",
"hobbies": ["tech", "television"]
},
{
"name": "Gary",
"id": "ZX43NI",
"age": "29",
"height": "69",
"ethnicity": "british",
"location": "london",
"weight": "172",
"relationshipStatus": "single",
"hobbies": ["parkour", "guns"]
},
{
"name": "Jim",
"id": "uLqbVe",
"age": "26",
"height": "72",
"ethnicity": "american",
"location": "scranton",
"weight": "179",
"relationshipStatus": "single",
"hobbies": ["parkour", "guns"]
}
]
}
And here is what I specifically want to check for in each person:
{
  "$schema": "https://json-schema.org/draft/2019-09/schema",
  "type": "object",
  "properties": {
    "people": {
      "type": "array",
      "contains": {
        "anyOf": [
          {
            "type": "object",
            "properties": {
              "id": { "const": "ei75dO" },
              "name": { "const": "Bob" },
              "ethnicity": { "const": "american" },
              "location": { "const": "los angeles" },
              "height": { "const": "68" }
            },
            "required": ["id", "name", "ethnicity", "location", "height"]
          },
          {
            "type": "object",
            "properties": {
              "id": { "const": "fDnweF" },
              "name": { "const": "Kapil" },
              "location": { "const": "goa" },
              "height": { "const": "65" }
            },
            "required": ["id", "name", "location", "height"]
          },
          {
            "type": "object",
            "properties": {
              "id": { "const": "xSiIDj" },
              "name": { "const": "Arnaud" },
              "location": { "const": "paris" },
              "relationshipStatus": { "const": "single" }
            },
            "required": ["id", "name", "location", "relationshipStatus"]
          },
          {
            "type": "object",
            "properties": {
              "id": { "const": "uLqbVe" },
              "relationshipStatus": { "const": "married" }
            },
            "required": ["id", "relationshipStatus"]
          }
        ]
      }
    }
  },
  "required": ["people"]
}
Note that for Bob, I only want to check that his name in the records is Bob, his ethnicity is american and that his location and height are set properly.
For Kapil, notice that there are 2 of them in the record. I only want to validate the array object pertaining to Kapil with the id fDnweF.
And for Jim, I only want to make sure that his relationshipStatus is set to married.
So my question is: is there any way in JSON Schema to say, when you come across an array of objects, instead of running validation against every element in the data, only run it against objects that match a specific identifier? In our case the identifier is id. You can imagine that this identifier could be anything; for example, it could have been socialSecurity# if the people in the list were all from America.
The issue with the current schema is that when it tries to validate the objects, it generates a giant list of errors with no clear indication of which object failed with which value.
In an ideal scenario AJV (which I currently use) would generate errors that should look something like:
---------Bob-------------
path: people[0].location
expected: "los angeles"
// Notice how this isn't Kapil at index 2 since we provided the id which matches kapil at index 4
---------Kapil-----------
path: people[4].location
expected: "goa"
---------Kapil-----------
path: people[4].height
expected: "65"
---------Arnaud----------
path: people[3].relationshipStatus
expected: "single"
-----------Jim-----------
path: people[6].relationshipStatus
expected: "married"
Instead, AJV currently spits out errors with no clear indication of where the failure might be. If Bob failed to match the expected value of location, it says that every person, including Bob, has an invalid location, which from our perspective is incorrect.
How can I define a schema that resolves this use case, so that JSON Schema pinpoints which elements in our data aren't in compliance with what the schema states? The goal is to store these schema errors cleanly for reporting purposes and to come back to the reports to see exactly which people (represented by their index in the array) failed which values.
Edit:
Assume that we would also like to check Bob's relatives. For instance, we want to create a schema to check that the relative with one given ID has location set to "los angeles" and the relative with another ID has it set to "orange county".
{
"people": [{
"name": "Bob",
"id": "ei75d0",
"relationshipStatus": "married",
"height": "68",
"relatives": [
{
"name": "Tony",
"id": "UDX5A6",
"location": "los angeles",
},
{
"name": "Lisa",
"id": "WCX4AG",
"location": "orange county",
}
]
}]
}
My question then would be: can the if/then/else be applied to nested elements as well? I'm not having success, but I'll continue trying to get it to work and will post an update here if/once I do.
How can I define a schema that resolves this use case, so that JSON Schema pinpoints which elements in our data aren't in compliance with what the schema states?
It's a little fiddly, but I've gone from "this isn't possible" to "you can just about do this".
If you re-structure your schema to the following...
{
  "$schema": "https://json-schema.org/draft/2019-09/schema",
  "type": "object",
  "properties": {
    "people": {
      "type": "array",
      "items": {
        "allOf": [
          {
            "if": {
              "properties": {
                "id": { "const": "uLqbVe" }
              }
            },
            "then": {
              "type": "object",
              "properties": {
                "id": { "const": "uLqbVe" },
                "relationshipStatus": { "const": "married" }
              },
              "required": ["id", "relationshipStatus"]
            },
            "else": true
          }
        ]
      }
    }
  },
  "required": ["people"]
}
What we're doing here is: for each item in the array, if the object has the specific ID, then do the other validation; otherwise, it's valid.
It's wrapped in an allOf so you can do the same pattern multiple times.
The caveat is that, if you don't include all the IDs, or if you don't carefully check your schema, you will get told everything is valid.
Ideally, you should additionally check that the IDs you are expecting are actually there. (It's fine to do so in the same schema.)
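As a sketch of that extra presence check (my addition, not part of the original schema), a contains clause per expected ID is enough; it is plain JSON Schema, written here as a TypeScript object literal:

// Fails unless at least one element of people has the expected id.
const requireJimById = {
  properties: {
    people: {
      type: "array",
      contains: {
        properties: { id: { const: "uLqbVe" } },
        required: ["id"]
      }
    }
  }
};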
You can see this mostly working if you test it on https://jsonschema.dev by removing the $schema property. (This playground is only draft-07, but none of the keywords you use need anything above draft-07 anyway.)
You can test this working on https://json-everything.net/json-schema, which gives you the full validation response.
AJV by default doesn't give you all the validation results. There's an option to enable that, but I'm not in a position to test the result myself right now.
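For what it's worth, the AJV option in question is allErrors (optionally with verbose); here is a minimal sketch of how running the schema above might look, assuming AJV 8 and its separate entry point for the 2019-09 dialect:

import { readFileSync } from "node:fs";
import Ajv2019 from "ajv/dist/2019"; // AJV 8 ships the 2019-09 dialect separately

const schema = JSON.parse(readFileSync("people.schema.json", "utf8")); // the schema from this answer
const data = JSON.parse(readFileSync("people.json", "utf8"));          // the people document

const ajv = new Ajv2019({
  allErrors: true, // collect every failure instead of stopping at the first
  verbose: true    // attach the offending data to each error object
});

const validate = ajv.compile(schema);
if (!validate(data)) {
  for (const err of validate.errors ?? []) {
    // instancePath looks like "/people/6/relationshipStatus", which maps onto
    // the people[6].relationshipStatus style of report wanted in the question.
    console.log(err.instancePath, err.message);
  }
}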

Filtering on GeoJSON data

I was wondering how filtering is normally done on a FeatureCollection for GeoJSON data. For example, take the following earthquake data:
{
"type": "FeatureCollection",
"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
"features": [
{ "type": "Feature", "properties": { "id": "ak16994521", "mag": 2.3, "time": 1507425650893, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -151.5129, 63.1016, 0.0 ] } },
{ "type": "Feature", "properties": { "id": "ak16994519", "mag": 1.7, "time": 1507425289659, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -150.4048, 63.1224, 105.5 ] } },
{ "type": "Feature", "properties": { "id": "ak16994517", "mag": 1.6, "time": 1507424832518, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -151.3597, 63.0781, 0.0 ] } },
{ "type": "Feature", "properties": { "id": "ci38021336", "mag": 1.42, "time": 1507423898710, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -118.497, 34.299667, 7.64 ] } },
{ "type": "Feature", "properties": { "id": "hv61900626", "mag": 2.91, "time": 1504833891990, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -155.011833, 19.399333, 2.609 ] } }
]
}
Now, if this data is all within a single FeatureCollection, how would anyone filter the data, for example to view earthquakes with magnitude > 2.5? It seems like when dealing with a FeatureCollection the first thing to do would be to extract each of the features into its own item: is that what is usually done, so that individual properties may be queried?
GeoJSON is a transfer format, and as you have noticed, any operation on it requires you to read and parse the whole file every time. If you plan to do anything with the data, you should translate it into a more useful format that supports indexes. If you need to stay with a file-based format, then I recommend GeoPackage, which is supported by the majority of modern GIS software. Alternatively, you could use a spatially enabled database such as PostGIS.
In either case, the easiest way to convert the data is to use ogr2ogr.
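That said, for a quick one-off in code, the in-memory filtering the question describes is simply an array filter over features; a small TypeScript sketch matching the sample above:

interface QuakeFeature {
  type: "Feature";
  properties: { id: string; mag: number; time: number; felt: number | null; tsunami: number };
  geometry: { type: "Point"; coordinates: number[] };
}
interface QuakeCollection { type: "FeatureCollection"; features: QuakeFeature[]; }

// Returns a new FeatureCollection containing only quakes above the given magnitude.
function filterByMagnitude(fc: QuakeCollection, minMag: number): QuakeCollection {
  return {
    type: "FeatureCollection",
    features: fc.features.filter((f) => f.properties.mag > minMag)
  };
}

// e.g. const strong = filterByMagnitude(JSON.parse(rawGeoJson), 2.5);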

How to verify that I get only one key from a JSON response?

I get a response in JSON and need to make sure that I get only one 'signal' in signal_events. I'm using Ruby with RSpec.
"signal_events": [
{
"id": "587e9ae969702d10bd5a0000",
"created_at": 1484692201,
"geo": {
"type": "Point",
"coordinates": [
-153.45703125,
59.67773438
]
},
"expires_at": 1484778601,
"geohashes": [
"bddg",
"bdeh"
],
"signal": {
"id": "587e9ae969702d0911060000",
"created_at": 1484692201.24,
"expires_at": 1484778601.24,
"signal_at": 1484691607,
"source": "usgs",
"updated_at": 1484692144,
"magnitude": 2,
"radius": 6.36107901750268,
"event_name": "earthquake",
"tsunami": "no"
},
"signal_type": "earthquake",
"centroid": {
"type": "Point",
"coordinates": [
-153.45703125,
59.67773438
]
},
"location": {
"country": "United States",
"country_code": "US",
"city": "Kenai Peninsula Borough",
"region": "Kenai Peninsula Borough",
"region_code": "AK"
}
},
A hash cannot contain duplicate keys, so you will always have exactly one value for any given key.
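To make that concrete (an illustration of mine, in TypeScript rather than Ruby/RSpec, since the point is language-independent): the parser itself collapses duplicate keys, so the assertions that remain meaningful are about the array length or the key's presence.

// Duplicate keys collapse during parsing; the later value wins.
console.log(JSON.parse('{"signal": 1, "signal": 2}')); // { signal: 2 }

// Hypothetical checks against a response shaped like the one above.
function expectSingleSignalEvent(rawBody: string): void {
  const response = JSON.parse(rawBody) as { signal_events: Array<{ signal?: unknown }> };
  console.assert(response.signal_events.length === 1, "expected exactly one signal event");
  console.assert(response.signal_events[0]?.signal !== undefined, "expected a 'signal' key");
}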

Fiware Context Broker with geolocated entities

I have a problem retrieving entities using georeferenced queries.
I am using the v2 syntax.
This is my query:
GET /v2/entities?georel=near;maxDistance:1000&geometry=point&coords=13.52,43.61
and this is my entity:
{
"id": "p1",
"type": "pm",
"address": {
"type": "Text",
"value": "Via Roma "
},
"allowedVehicleType": {
"type": "Text",
"value": "car"
},
"category": {
"type": "Text",
"value": "onstreet"
},
"location": {
"type": "geo:json",
"value": {
"type": "Point",
"coordinates": [ 13.5094, 43.6246 ]
}
},
"name": {
"type": "Text",
"value": "p1"
},
"totalSpotNumber": {
"type": "Number",
"value": 32
}
}
What is wrong?
I followed the official documentation, but I cannot get any results.
I also tried to reverse the coordinates, but the result does not change.
Any suggestion is welcome.
Note that longitude comes before latitude in GeoJSON coordinates, while the coords parameter works the opposite way (latitude first).
Thus, assuming that your entity is located in the city of Ancona, I think that using "coordinates": [ 43.6246, 13.5094 ] will solve the problem.

Can't fetch records from nested JSON objects in a Sencha Touch application

I'm stuck on an issue with reading a web-service JSON response in Sencha Touch.
This is the response I'm getting from the server:
{
"type": "FeatureCollection",
"features": [
{
"type": "Feature",
"id": "business_poi_en.1",
"geometry": {
"type": "Point",
"coordinates": [
28.21962354993591,
36.452844361147314
]
},
"geometry_name": "geom",
"properties": {
"status": "M",
"score": 100,
"side": "R",
"ref_id": 0,
"id": null,
"business_p": "AQUARIOUM",
}
},
{
"type": "Feature",
"id": "business_poi_en.2",
"geometry": {
"type": "Point",
"coordinates": [
28.225417523605692,
36.436470953176716
]
},
"geometry_name": "geom",
"properties": {
"status": "M",
"score": 68.44,
"match_type": "A",
"side": "L",
"ref_id": 0,
"id": null,
"business_p": "ZIGOS",
}
},
.... So On ....
I want to fetch the coordinates from the geometry object so I can display them on a map, and I also want to fetch business_p from the properties object to use as the title on the map.
But I can't get both values, coordinates and business_p, at the same time from the same response.
Any idea how to get both values at the same time? Any suggestions would be appreciated.
Thanks in advance.
Your store's proxy's reader should have the correct rootProperty, which is a common parent of both coordinates and business_p. For that, your response should be wrapped in a top-level tag like this:
{
  "featureResponse": [
    {
      "type": "FeatureCollection",
      "features": []
    },
    {
      "type": "Feature",
      "features": []
    }
  ]
}
Then you can define the reader like this:
reader: {
  type: 'json',
  rootProperty: 'featureResponse'
}
Once you get a record from this store, you can go into its data or raw child objects to fetch the required data.
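Alternatively (my sketch, not Sencha-specific), since every feature in the original response already carries both fields, you can map the parsed FeatureCollection yourself and hand the result to your store or map layer:

interface PoiFeature {
  geometry: { type: "Point"; coordinates: [number, number] };
  properties: { business_p: string; [key: string]: unknown };
}

// Pull the title and the coordinates out of each feature in one pass.
function toMarkers(collection: { features: PoiFeature[] }) {
  return collection.features.map((f) => ({
    title: f.properties.business_p,
    longitude: f.geometry.coordinates[0], // GeoJSON order is [lon, lat]
    latitude: f.geometry.coordinates[1]
  }));
}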