I have data from which I want to create a GIS-type application with the typical features of adding and removing layers of different types. What is the best architectural approach?
The data consists of property locations in Eastings and Northings.
I also have ordnance survey data in GML and Shapefiles.
I know this is a very broad question but the subject area also seems very broad to me and I am unsure which direction to go.
I was thinking of using SQL Server 2008 spatial and the Bing Maps Silverlight control to visualise the maps. To do this, would I have to convert the eastings and northings to the WGS84 geography datatype? But then if I converted the shapefiles to GML and imported all the GML files into SQL using GeomFromGML, they would be geometry datatypes. Wouldn't the two types be incompatible?
Also, should the ESRI ArcGIS API for Silverlight feature in the equation? Is this a good environment for creating maps that I can point at SQL Server 2008 as the datasource (using a WCF service if necessary)?
Any advice greatly appreciated!
This is something I've done several times, using OS data from SQL Server in both Bing Maps AJAX and Silverlight controls. Some general comments below (in no particular order!):
Don't expect to implement full-blown GIS functionality using Bing Maps. Simple querying, retrieval and display of the data is all fine (plus some simple editing), but beyond that you'll be struggling with what can be achieved in the browser.
All vector shapes supplied to Bing Maps need to be in (geography) WGS84 coordinates, EPSG:4326. However, all data will be projected and displayed using the (projected) Spherical Mercator system, EPSG:3857.
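For reference, a minimal Python sketch of that coordinate handling (assuming the pyproj library; the sample easting/northing is made up):

    # Convert OS National Grid eastings/northings (EPSG:27700) to WGS84
    # lon/lat (EPSG:4326) before handing shapes to Bing Maps.
    from pyproj import Transformer

    osgb_to_wgs84 = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)

    easting, northing = 530000.0, 180000.0   # hypothetical property location
    lon, lat = osgb_to_wgs84.transform(easting, northing)
    print(lon, lat)

    # Bing Maps itself then projects these WGS84 coordinates into Spherical
    # Mercator (EPSG:3857) for display - you don't do that step yourself.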
In terms of vector shapes, you can expect to achieve a similar level of performance to the SSMS spatial results tab - that is, (with careful architecture) you can plot up to about 5,000 features on the map at once, zoom/pan around them, click on them to bring up various properties and attributes, etc. Beyond that you'll find the UI becomes fairly unresponsive (which I imagine is the reason why the spatial results tab itself limits you to displaying 5,000 records at once).
If you want to display more features than this, one approach is to rasterize them by projecting them into the EPSG:3857 projection, creating a .PNG/.JPG image file of the features and then cutting that image into tiles according to the Bing Maps quadkey tile numbering system as explained here:
http://msdn.microsoft.com/en-us/library/bb259689.aspx and displaying them as a tilelayer. Tile layers are significantly faster than displaying the equivalent vector shapes, although it means the data is static.
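As a rough illustration of the quadkey scheme described in that article, here's a small Python sketch (not production code) that converts a WGS84 point to a tile X/Y at a given zoom level and then to its quadkey:

    import math

    def latlon_to_tile_xy(lat, lon, zoom):
        # Standard Web Mercator tiling maths (256px tiles)
        lat = max(min(lat, 85.05112878), -85.05112878)   # clamp to Mercator limits
        n = 2 ** zoom
        x = int((lon + 180.0) / 360.0 * n)
        y = int((1.0 - math.log(math.tan(math.radians(lat)) +
                                1.0 / math.cos(math.radians(lat))) / math.pi) / 2.0 * n)
        return x, y

    def tile_xy_to_quadkey(tile_x, tile_y, zoom):
        # Interleave the bits of X and Y to build the Bing Maps quadkey
        digits = []
        for i in range(zoom, 0, -1):
            digit = 0
            mask = 1 << (i - 1)
            if tile_x & mask:
                digit += 1
            if tile_y & mask:
                digit += 2
            digits.append(str(digit))
        return "".join(digits)

    # Illustrative values only: a point in central London at zoom level 15
    x, y = latlon_to_tile_xy(51.5074, -0.1278, 15)
    print(x, y, tile_xy_to_quadkey(x, y, 15))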
If you do create raster tiles, you can either render them dynamically or pre-render them to improve performance - i.e. you could set up a job to render and update the tileset for slowly-changing data every night/every month, etc.
If you're talking about OS MasterMap data, the sheer level of detail involved means that you need to think more carefully about which features you want to display, and how you want to display them. Take Greater London, for example, which covers an area of about 50km x 40km. To create raster tiles (each of which is 256px x 256px) at zoom level 19 covering this area, you'd need to render and store 1.3 million separate tiles. If each of those is generated from a database query that takes, say, 200ms to run, it's going to take a very long time to prepare all your data. Also, once the files have been generated, you might want to think about storing them in a database rather than saving them on a file system.
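If you want to sanity-check that kind of estimate for your own area of interest, here's a back-of-the-envelope Python sketch (the Greater London bounding box below is only approximate):

    import math

    def lon_to_tile_x(lon, zoom):
        return int((lon + 180.0) / 360.0 * 2 ** zoom)

    def lat_to_tile_y(lat, zoom):
        lat_rad = math.radians(lat)
        return int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi)
                   / 2.0 * 2 ** zoom)

    def tile_count(min_lon, min_lat, max_lon, max_lat, zoom):
        x1, x2 = lon_to_tile_x(min_lon, zoom), lon_to_tile_x(max_lon, zoom)
        y1, y2 = lat_to_tile_y(max_lat, zoom), lat_to_tile_y(min_lat, zoom)  # tile Y grows southwards
        return (x2 - x1 + 1) * (y2 - y1 + 1)

    # Approximate Greater London extent, zoom level 19
    print(tile_count(-0.51, 51.28, 0.33, 51.69, 19))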
As for loading the OS data into SQL Server in the first place - there
are several tools that will import either from GML or shapefile into SQL
Server, and handle the projection from EPSG:27700 (Ordnance Survey
National Grid) to WGS84 along the way. Try GDAL/OGR or Safe FME for
starters.
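As a sketch of the GDAL/OGR route (assuming ogr2ogr is installed with the MSSQLSpatial driver built in; the server, database, file and table names are placeholders):

    import subprocess

    # Load a shapefile into SQL Server, reprojecting from OS National Grid
    # (EPSG:27700) to WGS84 (EPSG:4326) on the way.
    subprocess.run([
        "ogr2ogr",
        "-f", "MSSQLSpatial",
        "-s_srs", "EPSG:27700",
        "-t_srs", "EPSG:4326",
        "-nln", "dbo.Buildings",                                    # target table
        "MSSQL:server=localhost;database=OSData;trusted_connection=yes",
        "TQ_Building.shp",                                          # hypothetical OS shapefile
    ], check=True)

The third blog post linked below covers the equivalent ogr2ogr command line in more detail.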
I have a blog at http://alastaira.wordpress.com with several posts you may find useful on various aspects of integrating Bing Maps and SQL Server. In particular, you might want to look at:
http://alastaira.wordpress.com/2011/02/16/loading-ordnance-survey-open-data-into-sql-server-2008/
http://alastaira.wordpress.com/2011/01/23/the-google-maps-bing-maps-spherical-mercator-projection/
http://alastaira.wordpress.com/2011/02/21/using-ogr2ogr-to-convert-reproject-and-load-spatial-data-to-sql-server/
Related
I'm interested in implementing some data visualizations as map layers, but I want to generate data layers only above land area (land cover). A good example would be plotting population density over a coastal city. What is a good approach for this, in terms of both the data source and how to actually display layers with such detailed boundaries?
Technically, so far I'm using Leaflet.js and tiles based on OpenStreetMap, but the question is not necessarily technology-specific. Also, I'm not interested in plotting this for the whole planet, but for areas of a few hundred square kilometers (e.g. a coastal city).
To give a better idea of what I'm after, this Koordinates map is similar to what I'm interested in. However, I need something a bit more detailed on the borders.
Usually you need a desktop or server-based GIS for such coverage, rather than JS doing the processing on the client side.
How you do the mapping (here: linking statistical data and land areas) depends on your data itself. You can load OSM-based shapefiles into QGIS and do some Python scripting, or use PostGIS commands to link your data and choose a map style.
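If you go the PostGIS route, the "link your data" step might look something like this rough sketch (assuming psycopg2 and a PostGIS database; the table and column names are hypothetical):

    import psycopg2

    conn = psycopg2.connect("dbname=gis user=gis")
    with conn, conn.cursor() as cur:
        # Aggregate point-based statistics onto land-cover polygons
        cur.execute("""
            SELECT land.id,
                   SUM(stats.population) AS population
            FROM   land_polygons AS land
            JOIN   population_points AS stats
                   ON ST_Intersects(land.geom, stats.geom)
            GROUP  BY land.id;
        """)
        for land_id, population in cur.fetchall():
            print(land_id, population)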
Another idea would be http://geocommons.com, which allows easy visualization if you upload CSV files.
Depending on your area of interest, you can obtain some highly detailed shapefiles from numerous sources. Especially if the local area provides GIS data to the public (many larger coastal cities do, e.g. New York, London). From there, you can create a GeoJSON text of the geometries (here's a free tool for that). Parsing the JSON is very simple and it's very easy to add it to leaflet maps. You can even get creative and add more keys to each geometry object with the data you want to visualize.
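For the "add more keys to each geometry object" idea, here is a small Python sketch (file names and the density figures are made up) that attaches a value to each GeoJSON feature before handing it to Leaflet:

    import json

    with open("city_boundaries.geojson") as f:
        collection = json.load(f)

    # Hypothetical lookup of the value you want to visualise
    density_by_district = {"District A": 5400, "District B": 12100}

    for feature in collection["features"]:
        name = feature["properties"].get("name")
        feature["properties"]["pop_density"] = density_by_district.get(name)

    with open("city_boundaries_with_density.geojson", "w") as f:
        json.dump(collection, f)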
We are creating a speed limit map application, using different colors to highlight streets with different speed limits (similar to the ITO speed limit map: http://www.itoworld.com/map/124?lon=-79.37151&lat=43.74796&zoom=12).
The problem we have is that we are conducting our own research and have our own speed limit data, instead of pulling the data from OpenStreetMap or Google Maps like the ITO map does. We also need to create a data store so that we can dynamically update the map as we add more speed limit information in the future.
Is there any way to create our own instance of OpenStreetMap and replace only the speed limit information with our own data? We don't have any vector data and we have no experience working with it.
Are there any suggestions of tools for creating highlighted layers based on the speed limits we have? Is OpenLayers a good option?
Any help is appreciated, thank you very much.
Update 2013/11/20
Thank you very much for your answers; now we have a much better understanding of our problem. This is a university design project so we basically have no budget. We are looking for:
1) A basic "base map" that includes the basic tile information (OpenStreetMap seems a good choice, since the Google Maps API doesn't provide free road information as far as we can find)
2) A geodata server that can host our own street speed limit data (it looks like GeoServer and MapServer are good choices), or a simple database design that can fulfill our needs (we don't know if that is possible yet)
3) A plotting tool that can render our speed limit data as groups of lines on the map, since this data will change frequently (OpenLayers and Leaflet are good candidates).
Is there anything else needed?
What you want to do is a fairly trivial programming task once you have decided a few things. The three questions below are probably the biggest ones you need to answer. I've added some commentary, but look into each of them beyond this post to find what works for you.
Who do you want to use for your map? Since you only have one type of data, you will want to display it on someone else's nice-looking map. The big choices are Bing, Google, OpenLayers/OSM, and ESRI. Your choice will most likely be driven by the licensing of the above services and whether or not you are willing to pay. A need to support mobile devices may also factor into your decision. Since the map is what your users will see, choose the best-looking map you can afford.
How will you serve up your data? You have several options for serving your speed limit data. GeoServer, MapServer and ESRI are some popular mapping software packages. If you are only displaying a few layers of data, any full mapping package will be overkill. The actual software used to render your map data will most likely affect only your pocketbook, so free is usually good here.
Tiles vs Lines
You will serve your data either as a group of lines sent to the browser, or as pre-rendered tiles loaded on top of the map. If your data changes frequently, you will want to serve it dynamically as line data (an array of points). If your data does not change frequently, you should consider tiling it. Tiling involves pre-rendering the entire map at all zoom levels. This allows the map to load very fast, and it is how almost all base maps are rendered. The downside is that tile generation can take a long time and tiles can take up a large amount of space.
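To make the "serve it dynamically as line data" option concrete, here is a rough sketch of a tiny endpoint that returns speed-limit lines as GeoJSON for OpenLayers or Leaflet to draw (assuming Flask, psycopg2 and a PostGIS table; all names are placeholders):

    import json
    import psycopg2
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/speed-limits")
    def speed_limits():
        conn = psycopg2.connect("dbname=traffic user=gis")
        with conn, conn.cursor() as cur:
            cur.execute("""
                SELECT speed_limit, ST_AsGeoJSON(geom)
                FROM   street_segments;
            """)
            features = [
                {
                    "type": "Feature",
                    "properties": {"speed_limit": limit},
                    "geometry": json.loads(geom_json),
                }
                for limit, geom_json in cur.fetchall()
            ]
        return jsonify({"type": "FeatureCollection", "features": features})

    if __name__ == "__main__":
        app.run(debug=True)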
This is a very broad question. There are many components to drawing your own speed limit map.
On the front-end, there is a web browser map interface. OpenLayers is good at that. There are plenty of other tools that can do this as well, such as Leaflet or even Google Maps API.
Next is something to provide the actual speed limit route data. This can be served as a vector layer or a raster layer. There are plenty of tools here, too. UMN Mapserver is free and reasonably good. ESRI makes a whole fleet of products in this area as well.
The speed limit route data also needs to be saved somehow. This can be done in files or in a database such as PostGIS. Again, lots of great options.
It is the role of the system architect to determine which technologies to employ to solve the problem.
I'm a Python/Django developer who is new to GeoDjango (and GIS, in general). I was hoping someone could provide some guidance with respect to the different projection systems offered by the City of Toronto.
The City of Toronto is great with publishing Open Data. Here's a link to their Open Data repository.
All shapefiles are available in both "MTM 3 Degree Zone 10, NAD27" as well as "WGS84" formats. GeoDjango is able to import both formats. What are the consequences of choosing to import data from one format as opposed to another? What factors should I consider when deciding?
Depends on how you are using the data. If you are layering it on top of other datasets, choose a common coordinate system for all datasets so that your mapping framework doesn't have to reproject the data every time a map is drawn. WGS84 is a very popular coordinate system, and would be a good choice for mixing with other data.
If you are just using it to generate maps in the Toronto area, MTM 3 Degree Zone 10 is probably your best bet, as WGS84 will introduce slight distortions when drawing a map on the screen, although casual (and probably advanced) map users will be unlikely to notice it.
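If it helps, here is a rough GeoDjango sketch of the "store everything in WGS84" approach, following the tutorial's models.py / load.py pattern (the model, field and file names are hypothetical):

    # models.py
    from django.contrib.gis.db import models

    class Neighbourhood(models.Model):
        name = models.CharField(max_length=100)
        geom = models.MultiPolygonField(srid=4326)   # common CRS for layering with other data

    # load.py
    from django.contrib.gis.utils import LayerMapping
    from .models import Neighbourhood

    mapping = {"name": "AREA_NAME", "geom": "MULTIPOLYGON"}

    def run(shp_path="neighbourhoods.shp"):
        # transform=True (the default) reprojects whatever CRS the shapefile
        # uses (MTM 3 Degree Zone 10 / NAD27, or WGS84) into the model's SRID 4326
        lm = LayerMapping(Neighbourhood, shp_path, mapping, transform=True)
        lm.save(verbose=True)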
Is there a way I can build terrain data in a simple way from sources like Google Maps? I am not interested in heights; a simple 2D representation will be good enough.
For instance, I am trying to represent a terrain using the roads and buildings inside a map to model a traffic simulation. Representing objects like buildings is necessary so that when my cars are moving on a road, they know when to take a turn, etc. Are there any standards for representing these?
There are dozens of map standards. Also, map data tends to be very expensive, although there are some low-cost and open source map sources, e.g. OpenStreetMap and, for the US, TIGER/Line.
I would also read up on at least some introductory GIS - I think you'll find the field is much bigger and more complex than you are initially thinking.
I am trying to take a shapefile of subdivisions within a county that I have created, and line it up with another shapefile that was given to me by the County Appraisal District (parcel data). When I try to get them to line up, my streets shapefile is not aligned with everything else. They are all on the same coordinate system and I do not want to have to recreate the subdivisions shapefile. Any thoughts?
This is a question with answers that may be simple or may be very complex, depending on your situation. As a GIS developer, I've most commonly seen this as a symptom of an incorrectly defined coordinate system. However, whether this is the case or not, and what the solution is strongly depends on your environment. From here on, I'll assume that you're working in an ESRI package...
I agree with the other posters that your problem is one of mismatching projections and/or datum definitions.
The most important thing to understand as regards projections in ESRI software is this:
Manually setting the projection of a dataset (shapefile, geodatabase feature class, etc) in ArcCatalog does NOT reproject that dataset!!!
In order to reproject your data, you must EXPORT the data from an ArcMap session in which you've been working and where the data is obviously lined up correctly. During the EXPORT, you are given the choice of saving your data with the coordinate system of the underlying map or that of the original dataset.
Your best bet is to follow these steps to create a new dataset with the correct projection and then extrapolate what you need to do to fix your specific problem:
Create a new ArcMap session and set its coordinate system:
Do this in a fresh ArcMap session with NO OTHER DATA. Be sure to explicitly set the coordinate system of the ArcMap mapview to your desired coordinate system (I recommend the one that matches the data you're trying to overlay, or one from another well-established dataset).
Add one other dataset with a known good coordinate system.
Create your new dataset in this ArcMap session. Give your new data the same coordinate system as the ArcMap mapview and the one other dataset in the map. Set the XY domain of the new data to exceed the area defined by your other dataset, but don't go beyond the size that will reduce your desired spatial resolution.
Create your data. It can be any data at this point. Some lines, some polygons, etc. Save your work.
Export your new dataset. When prompted, choose to save with the coordinate system of the underlying mapview.
Create a new ArcMap session and add your new dataset. Then add your parcel dataset. They should now occupy the same space in your map window.
Edit your new data to your heart's content.
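If you'd rather script the distinction than work through the ArcMap dialogs, here is a rough arcpy sketch (the paths and target coordinate system are placeholders):

    import arcpy

    # DefineProjection only rewrites the metadata (.prj) - it does NOT move
    # any coordinates. Use it only when the data really is in this CRS but
    # is mislabelled.
    arcpy.DefineProjection_management(
        r"C:\data\subdivisions.shp",
        arcpy.SpatialReference(4326)          # e.g. WGS84
    )

    # Project, by contrast, actually reprojects the geometry into a new
    # dataset - the scripted equivalent of the export step described above.
    arcpy.Project_management(
        r"C:\data\subdivisions.shp",
        r"C:\data\subdivisions_wgs84.shp",
        arcpy.SpatialReference(4326)
    )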
Some probable issues if this doesn't help:
You didn't follow these steps correctly - check the ESRI documentation; this is a well documented issue.
The parcel data you're trying to match doesn't have a properly defined coordinate system. It's always possible that the keepers of this data don't know what they're doing and have munged it up. I've seen this problem more times than I care to admit.
You've matched the projection but have mis-matched the datum. Many municipalities are still using data in NAD27, which is way out of date. Some have moved to the modern NAD83. The difference can be up to 300 meters, depending on where in the country you are. Also, data that originates from surveying or GPS equipment is usually collected in WGS84 (the typical default for satellite surveying), which is for all practical purposes the same as NAD83, at least at mapping scale resolutions.
Try researching these issues and see how it goes. I'll say it again:
Manually setting the projection does NOT actually project that data!!
Good luck!
Your problem is probably one of projection. The relevant documentation for the projection argument reads:

    projection: character string that names a map projection to use. See
    'mapproject' (in the 'mapproj' library). The default is to use a
    rectangular projection with the aspect ratio chosen so that longitude
    and latitude scales are equivalent at the center of the picture.
Agree that your problem is projection. Is there a .prj file associated with either of your files? If not, the first key is finding out what projections you have. I would guess State Plane of some sort, if you are dealing with U.S.-centric local data from a local government.
Cadastral tools (surveyor tools) usually let you specify a handful of control points and will then "warp" the data to fit your control points. This can be anything from a simple shift to something more complex. If everything is shifted by a few feet, you can also just use your editor to select all the features and move them however many feet are necessary.
If you've verified that both shapefiles are using the same coordinate system, then projection is less likely to be the problem. It's fairly common for parcel data to be "offset" from other data sources (such as roads). This comes from inconsistent collection methods and points of reference.
Another source of error can be that one of the shapefiles has the wrong coordinate system specified. For example, if the roads were actually in WGS 1984 but the .prj says NAD 1983, you will see some significant errors. This usually happens if you had to manually set the coordinate system for a shapefile (i.e. it didn't include a .prj and you created one).
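A quick way to check what each shapefile actually claims as its coordinate system is a few lines of Python against the GDAL/OGR bindings (the paths below are placeholders):

    from osgeo import ogr

    paths = [r"C:\data\subdivisions.shp", r"C:\data\parcels.shp", r"C:\data\streets.shp"]
    for path in paths:
        ds = ogr.Open(path)
        srs = ds.GetLayer().GetSpatialRef()
        if srs is None:
            print(path, "-> no coordinate system defined (missing .prj?)")
        else:
            # PROJCS for projected systems (e.g. State Plane), GEOGCS for geographic ones
            name = srs.GetAttrValue("PROJCS") or srs.GetAttrValue("GEOGCS")
            print(path, "->", name)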