Why is Mapbox rejecting my GeoJSON as invalid?

I have a function that parses JSON from the Foursquare API into GeoJSON, which I then pass to the Mapbox API after calling JSON.stringify() on the GeoJSON object, loading it onto the map with the code below. The Mapbox API responds that my GeoJSON is invalid.
I checked the GeoJSON specification and my data matches it exactly!
Can anybody spot the error?
Loading function:
this.map.on("load", () => {
    this.map.addSource("venues", {
        type: "geojson",
        data: geojson
    });
});
GeoJSON:
{
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": [48.31272484668867, 18.092948926236698]
            },
            "properties": {
                "id": "4d0a6cc933d6b60c1c569a85",
                "venueName": "Peciatkaren",
                "address": "Kmetkova 32",
                "distance": 20119,
                "icon": {
                    "iconUrl": "../assets/img/dot_PNG41.png",
                    "iconSize": [25, 25],
                    "iconAnchor": [0, 0]
                }
            }
        },
        {
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": [48.30957732182106, 18.087218856597]
            },
            "properties": {
                "id": "4bd5ad37637ba5933bc3f670",
                "venueName": "Zanzibar",
                "address": "Štefánikova trieda 43",
                "distance": 20030,
                "icon": {
                    "iconUrl": "../assets/img/dot_PNG41.png",
                    "iconSize": [25, 25],
                    "iconAnchor": [0, 0]
                }
            }
        }
    ]
}

There is nothing wrong with your GeoJSON itself, and there are several ways you could use it. One thing to check: the data property of a GeoJSON source accepts either inline GeoJSON or a URL string, so pass the parsed object directly; if you hand it the result of JSON.stringify(), Mapbox treats the string as a URL and the source fails to load. Also, GeoJSON coordinates are [longitude, latitude]; your coordinates appear to be [latitude, longitude], so even once the source loads the points will render in the wrong place.
Docs:
https://www.mapbox.com/mapbox-gl-js/api/#geojsonsource
https://www.mapbox.com/mapbox-gl-js/style-spec/#sources-geojson
One example below:
var yourgeojson = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": [48.31272484668867, 18.092948926236698]
            },
            "properties": {
                "id": "4d0a6cc933d6b60c1c569a85",
                "venueName": "Peciatkaren",
                "address": "Kmetkova 32",
                "distance": 20119,
                "icon": {
                    "iconUrl": "../assets/img/dot_PNG41.png",
                    "iconSize": [25, 25],
                    "iconAnchor": [0, 0]
                }
            }
        },
        {
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": [48.30957732182106, 18.087218856597]
            },
            "properties": {
                "id": "4bd5ad37637ba5933bc3f670",
                "venueName": "Zanzibar",
                "address": "Štefánikova trieda 43",
                "distance": 20030,
                "icon": {
                    "iconUrl": "../assets/img/dot_PNG41.png",
                    "iconSize": [25, 25],
                    "iconAnchor": [0, 0]
                }
            }
        }
    ]
};
map.on('load', function () {
    map.addSource('someid', {
        type: 'geojson',
        data: yourgeojson
    });
    map.addLayer({
        "id": "points",
        "type": "symbol",
        "source": "someid",
        "layout": {
            ...
        }
    });
});
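If you only need plain dots rather than sprite-based symbols, a circle layer avoids having to load icon images first. A minimal sketch against the source id above (the layer id venue-dots and the paint values are made up for illustration):
map.addLayer({
    id: "venue-dots",
    type: "circle",
    source: "someid",
    paint: {
        // simple fixed-size red dots at each venue
        "circle-radius": 6,
        "circle-color": "#e55e5e"
    }
});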

Related

Filtering on GeoJSON data

I was wondering how filtering is normally done on a FeatureCollection for GeoJSON data. For example, take the following earthquake data:
{
"type": "FeatureCollection",
"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
"features": [
{ "type": "Feature", "properties": { "id": "ak16994521", "mag": 2.3, "time": 1507425650893, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -151.5129, 63.1016, 0.0 ] } },
{ "type": "Feature", "properties": { "id": "ak16994519", "mag": 1.7, "time": 1507425289659, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -150.4048, 63.1224, 105.5 ] } },
{ "type": "Feature", "properties": { "id": "ak16994517", "mag": 1.6, "time": 1507424832518, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -151.3597, 63.0781, 0.0 ] } },
{ "type": "Feature", "properties": { "id": "ci38021336", "mag": 1.42, "time": 1507423898710, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -118.497, 34.299667, 7.64 ] } },
{ "type": "Feature", "properties": { "id": "hv61900626", "mag": 2.91, "time": 1504833891990, "felt": null, "tsunami": 0 }, "geometry": { "type": "Point", "coordinates": [ -155.011833, 19.399333, 2.609 ] } }
]
}
Now, if this data is all within a single FeatureCollection, how would one filter it, for example to view only earthquakes with magnitude > 2.5? It seems like the first step when dealing with a FeatureCollection would be to extract each of the features into its own item: is that what is usually done, so that individual properties can be queried?
GeoJSON is a transfer format, and as you have noticed, any operation on it requires you to read and parse the whole file every time. If you plan to do anything serious with the data, you should translate it into a more useful format that supports indexes. If you need to keep to a file-based format, I recommend GeoPackage, which is supported by the majority of modern GIS software. Alternatively, you could use a spatially enabled database such as PostGIS.
In either case the easiest way to convert the data is with ogr2ogr, e.g. ogr2ogr -f GPKG earthquakes.gpkg earthquakes.geojson.
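That said, for a one-off, in-memory filter you can simply filter the features array in JavaScript and rewrap it (a minimal sketch, assuming the FeatureCollection above has been parsed into a variable named data):
// Keep only earthquakes with magnitude > 2.5, as a new FeatureCollection.
var filtered = {
    type: "FeatureCollection",
    features: data.features.filter(function (f) {
        return f.properties.mag > 2.5;
    })
};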

Multiple items for a single coordinate in GeoJSON

I've set up a REST API that supplies GeoJSON data from a database. The issue is that each coordinate can have multiple items associated with it.
I'm currently troubleshooting the application's JavaScript to load and plot the data, but I'm also looking for feedback on best practice. Any information is appreciated. Thanks!
var foo = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "properties": { "item": "A" },
            "geometry": {
                "type": "Point",
                "coordinates": [90, 135]
            }
        },
        {
            "type": "Feature",
            "properties": { "item": "B" },
            "geometry": {
                "type": "Point",
                "coordinates": [90, 135]
            }
        },
        {
            "type": "Feature",
            "properties": { "item": "C" },
            "geometry": {
                "type": "Point",
                "coordinates": [90, 135]
            }
        },
        ...
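One common way to model this (a sketch, not from the original post; the helper name groupByCoordinate is made up) is to merge co-located features into a single feature whose properties carry the list of items, so each point is plotted once:
// Group features that share a coordinate into one feature with an "items" array.
function groupByCoordinate(fc) {
    var byCoord = {};
    fc.features.forEach(function (f) {
        var key = f.geometry.coordinates.join(",");
        if (!byCoord[key]) {
            byCoord[key] = {
                type: "Feature",
                geometry: f.geometry,
                properties: { items: [] }
            };
        }
        byCoord[key].properties.items.push(f.properties.item);
    });
    return {
        type: "FeatureCollection",
        features: Object.keys(byCoord).map(function (k) {
            return byCoord[k];
        })
    };
}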

Azure DataFactory ForEach Copy activity is not iterating through but instead pulling all files in blob. Why?

I have a pipeline in DF2 that has to look at a folder in blob storage and process each of the 145 files in it sequentially into a database table. After each file has been loaded into the table, a stored procedure should be triggered that checks each record and either inserts it or updates an existing record in a master table.
Looking online, I feel as though I have tried every combination of Get Metadata, ForEach, Lookup and Append Variable activities that has been suggested, but for some reason my Copy Data STILL picks up all the files at once and runs 145 times.
I recently found a blog post that I followed to use Append Variable, as it will be useful for multiple file locations, but it does not work for me. I need to read the files as CSVs into tables, not as binary objects, so I think this is part of my issue.
{
"name": "BulkLoadPipeline",
"properties": {
"activities": [
{
"name": "GetFileNames",
"type": "GetMetadata",
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"typeProperties": {
"dataset": {
"referenceName": "DelimitedText1",
"type": "DatasetReference",
"parameters": {
"fileName": "#item()"
}
},
"fieldList": [
"childItems"
],
"storeSettings": {
"type": "AzureBlobStorageReadSetting"
},
"formatSettings": {
"type": "DelimitedTextReadSetting"
}
}
},
{
"name": "CopyDataRunDeltaCheck",
"type": "ForEach",
"dependsOn": [
{
"activity": "BuildList",
"dependencyConditions": [
"Succeeded"
]
}
],
"typeProperties": {
"items": {
"value": "#variables('fileList')",
"type": "Expression"
},
"isSequential": true,
"activities": [
{
"name": "WriteToTables",
"type": "Copy",
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"typeProperties": {
"source": {
"type": "DelimitedTextSource",
"storeSettings": {
"type": "AzureBlobStorageReadSetting",
"wildcardFileName": "*.*"
},
"formatSettings": {
"type": "DelimitedTextReadSetting"
}
},
"sink": {
"type": "AzureSqlSink"
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"mappings": [
{
"source": {
"name": "myID",
"type": "String"
},
"sink": {
"name": "myID",
"type": "String"
}
},
{
"source": {
"name": "Col1",
"type": "String"
},
"sink": {
"name": "Col1",
"type": "String"
}
},
{
"source": {
"name": "Col2",
"type": "String"
},
"sink": {
"name": "Col2",
"type": "String"
}
},
{
"source": {
"name": "Col3",
"type": "String"
},
"sink": {
"name": "Col3",
"type": "String"
}
},
{
"source": {
"name": "Col4",
"type": "String"
},
"sink": {
"name": "Col4",
"type": "String"
}
},
{
"source": {
"name": "DW Date Created",
"type": "String"
},
"sink": {
"name": "DW_Date_Created",
"type": "String"
}
},
{
"source": {
"name": "DW Date Updated",
"type": "String"
},
"sink": {
"name": "DW_Date_Updated",
"type": "String"
}
}
]
}
},
"inputs": [
{
"referenceName": "DelimitedText1",
"type": "DatasetReference",
"parameters": {
"fileName": "#item()"
}
}
],
"outputs": [
{
"referenceName": "myTable",
"type": "DatasetReference"
}
]
},
{
"name": "CheckDeltas",
"type": "SqlServerStoredProcedure",
"dependsOn": [
{
"activity": "WriteToTables",
"dependencyConditions": [
"Succeeded"
]
}
],
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"typeProperties": {
"storedProcedureName": "[TL].[uspMyCheck]"
},
"linkedServiceName": {
"referenceName": "myService",
"type": "LinkedServiceReference"
}
}
]
}
},
{
"name": "BuildList",
"type": "ForEach",
"dependsOn": [
{
"activity": "GetFileNames",
"dependencyConditions": [
"Succeeded"
]
}
],
"typeProperties": {
"items": {
"value": "#activity('GetFileNames').output.childItems",
"type": "Expression"
},
"isSequential": true,
"activities": [
{
"name": "Create list from variables",
"type": "AppendVariable",
"typeProperties": {
"variableName": "fileList",
"value": "#item().name"
}
}
]
}
}
],
"variables": {
"fileList": {
"type": "Array"
}
}
}
}
The Details screen of the pipeline output shows the pipeline looping for the number of items in the blob, but each time the Copy Data and Stored Procedure activities run against every file in the list at once, as opposed to one at a time.
I feel like I am close to the answer but missing one vital part. Any help or suggestions are GREATLY appreciated.
Your payload is not correct.
The GetMetadata activity should not use the same dataset as the Copy activity.
The GetMetadata activity should reference a dataset that points at the folder containing all the files you want to deal with; your dataset, however, has a fileName parameter.
Then use the output of the GetMetadata activity as the input of the ForEach activity.
Also note the "wildcardFileName": "*.*" in your Copy source's storeSettings: a wildcard makes the Copy read every matching file regardless of the dataset's fileName parameter, which matches the behaviour you are seeing.
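A minimal sketch of that wiring (the folder dataset SourceFolder is an assumed name pointing at the blob folder, with no fileName parameter; DelimitedText1 keeps its fileName parameter for the per-file copy, and the wildcard setting is removed):
{
    "name": "GetFileNames",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": {
            "referenceName": "SourceFolder",
            "type": "DatasetReference"
        },
        "fieldList": [ "childItems" ]
    }
},
{
    "name": "CopyDataRunDeltaCheck",
    "type": "ForEach",
    "dependsOn": [
        { "activity": "GetFileNames", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "items": {
            "value": "@activity('GetFileNames').output.childItems",
            "type": "Expression"
        },
        "isSequential": true,
        "activities": [
            {
                "name": "WriteToTables",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "DelimitedText1",
                        "type": "DatasetReference",
                        "parameters": { "fileName": "@item().name" }
                    }
                ]
            }
        ]
    }
}
With this shape the BuildList loop and the fileList variable are no longer needed, and each iteration copies exactly one file.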

Restrict json schema to Polygon from GeoJson

I'm building a JSON Schema that has a property boundary. I'm referencing the GeoJSON schema, which works fine. Now I want to restrict my boundary to be of type Polygon, which is an enum in the GeoJSON schema.
How can I do this?
This is the relevant part of my schema:
{
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
        "plot": {
            "type": "object",
            "properties": {
                "boundary": {
                    "description": "The boundary of the plot",
                    "title": "Plot boundary",
                    "additionalProperties": false,
                    "required": [
                        "type",
                        "coordinates",
                        "crs"
                    ],
                    "TODO": "Restrict to (multi)polygons only.",
                    "$ref": "http://json.schemastore.org/geojson"
                }
            }
        }
    }
}
This is the JSON being validated:
{
    "plot": {
        "boundary": {
            "crs": {
                "type": "name",
                "properties": {
                    "name": "EPSG:3857"
                }
            },
            "coordinates": [],
            "type": "MultiPolygon"
        }
    }
}
It seems I was using the wrong GeoJSON schema. I changed my code and now it works as expected; the new schemas are draft-07 as well.
Here's my updated code:
"boundary": {
"title": "The boundary of the plot",
"anyOf": [
{
"$ref": "http://geojson.org/schema/MultiPolygon.json"
},
{
"$ref": "http://geojson.org/schema/Polygon.json"
}
],
"additionalProperties": false
and
"geoLocation": {
"title": "Front door geolocation",
"$ref": "http://geojson.org/schema/Point.json",
"additionalProperties": false
},
The JSON could be:
"boundary":
{
"type": "Polygon",
"coordinates": [
[
[100.0, 0.0], [101.0, 0.0], [101.0, 1.0], [100.0, 1.0],
[100.0, 0.0]
]
]
}
and
"geoLocation": {
"coordinates": [125.25, 135.255],
"type": "Point"
}
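For completeness, a quick way to check documents against a schema like this is the ajv library (a sketch, assuming ajv v6 for draft-07 support; mySchema and myDocument are hypothetical placeholders, and compileAsync with loadSchema is needed because the geojson.org $refs are remote):
const Ajv = require("ajv");
const fetch = require("node-fetch");

// Remote $refs (the geojson.org schemas) are fetched on demand.
const ajv = new Ajv({ loadSchema: uri => fetch(uri).then(res => res.json()) });

ajv.compileAsync(mySchema).then(validate => {
    const valid = validate(myDocument);
    if (!valid) console.log(validate.errors);
});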

How to change markers colour in geojson

I want to change the marker colour in my GeoJSON, but this code doesn't work as expected.
{
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": [-2.5555, 53.6666]
            },
            "style": {
                "color": "red",
                "weight": 5,
                "opacity": 0.65
            },
            "properties": {}
        }
    ]
}
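The GeoJSON specification defines no standard "style" member, so renderers simply ignore it; styling belongs to whatever library draws the data. With Mapbox GL JS, for example, one option is to move the colour into properties and read it with a data-driven paint expression (a sketch; the source and layer ids are made up, and the feature would carry "properties": { "color": "red" }):
map.addLayer({
    id: "markers",
    type: "circle",
    source: "points",
    paint: {
        // read each feature's colour from its own properties
        "circle-color": ["get", "color"],
        "circle-radius": 5,
        "circle-opacity": 0.65
    }
});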