Large shapefile won't load in GeoServer - google-maps

I'm trying to load a 500 MB shapefile into GeoServer and get it to respond to a client request within a reasonable time frame (it currently doesn't respond even after 30 minutes of waiting). I want it to deliver image tiles; I'm using the Google Maps API v3 ImageMapType to automatically request the correct tiles from the GeoServer WMS URL (roughly as in the sketch after this list). The layer consists of hundreds of thousands of polygons for coastal Tasmania, so it is very sparse. I've tried:
Creating a tile cache (but the ETA for seeding zoom levels 13 to 18 is 15 years, and over 95% of the tiles it creates are blank)
Removing all attributes from the layer before loading it into GeoServer (after an hour it still hadn't begun seeding the tile cache and showed no progress)
Merging the polygons so there are only 10 polygons in the layer (same behaviour)
Using the bounds options in the tile cache (same behaviour)
[edit] Reprojecting the layer into EPSG:900913 (same behaviour)
Cutting the layer into 12 sections to reduce empty space, loading them as a Layer Group, and seeding the tile cache from that (even one of these layers wouldn't begin seeding - still too big?)
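For reference, the tiles are requested roughly like this (a minimal sketch; the GeoServer host and the layer name ws:coastal_polygons stand in for our actual ones):
// Minimal sketch of the tile requests via ImageMapType.
var TILE_SIZE = 256;
function tileBBox(coord, zoom) {
  // Convert a Google tile coordinate to a BBOX in EPSG:900913 metres.
  var half = 20037508.34;
  var tileSpan = (half * 2) / Math.pow(2, zoom);
  var minX = -half + coord.x * tileSpan;
  var maxY = half - coord.y * tileSpan;
  return [minX, maxY - tileSpan, minX + tileSpan, maxY].join(',');
}
var wmsLayer = new google.maps.ImageMapType({
  name: 'GeoServer WMS',
  tileSize: new google.maps.Size(TILE_SIZE, TILE_SIZE),
  getTileUrl: function (coord, zoom) {
    return 'http://myserver/geoserver/wms' +
      '?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap' +
      '&LAYERS=ws:coastal_polygons&STYLES=' +
      '&SRS=EPSG:900913&BBOX=' + tileBBox(coord, zoom) +
      '&WIDTH=' + TILE_SIZE + '&HEIGHT=' + TILE_SIZE +
      '&FORMAT=image/png&TRANSPARENT=true';
  }
});
map.overlayMapTypes.push(wmsLayer);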
The next option we're looking at is breaking the layer into 1 km grids and loading all 8,000 layers as a Layer Group. I'm doubtful this will work. However, one of these layers DID work when seeding the cache - and it only took a few seconds for all zoom levels.
How do I get GeoServer to serve this large, sparse data? Surely other people have this issue? Do I need to do something special with the layer itself? Or is there something in GeoServer that I should be configuring?

To start: a 500MB map should be peanuts for GeoServer, unless you bought your hardware well over a decade ago. I work with much larger datasets on a daily basis.
Are you perhaps letting GeoServer access the shapefile directly from disk?
I'd recommend the following setup:
Make sure you have enough RAM installed. I just saw that I could buy 24GB for less than 80 euros. That should be enough to cache your database entirely;
Install Postgres with PostGIS extensions;
To make sure that no re-projection is necessary, you can pre-convert all coordinates to Google's Mercator projection (EPSG:900913);
Make sure you have a spatial index on the geometry column (see the sketch after this list);
If your map is static, you can pre-render the tiles. This will really give a big boost in performance. Try to find out up to which zoom level you can pre-render within a reasonable time. The zoomed-in images are usually faster anyway, because fewer elements are involved in creating each image;
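Regarding the spatial index, here is a minimal sketch run from Node.js with the pg client (the database name gisdb and the table/column names coastal_polygons/geom are assumptions; adapt them to your schema):
// Create a GiST index on the geometry column; PostGIS uses it for the
// bounding-box lookups GeoServer issues while rendering WMS tiles.
const { Client } = require('pg');
async function createSpatialIndex() {
  const client = new Client({ database: 'gisdb' }); // assumed DB name
  await client.connect();
  await client.query(
    'CREATE INDEX coastal_polygons_geom_idx ' +
    'ON coastal_polygons USING GIST (geom)'
  );
  // Refresh planner statistics so the index actually gets used.
  await client.query('VACUUM ANALYZE coastal_polygons');
  await client.end();
}
createSpatialIndex().catch(console.error);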
Furthermore, I doubt that you're ever going to get a result if it has already taken 30 minutes; the web server will have timed out long before that. Download the image manually by pasting the WMS URL into your browser. If you don't see a picture, open the downloaded file in a text editor: there might be a textual error message instead of binary image data, and that message usually describes what the problem is.

Related

Free heatmapping or similar?

I have a database with around 1 million records, each with a lat and a long. I need to be able to plot, at a maximum, around half a million points onto a map with some sort of weighting.
Google Maps is great but only up to 100,000 points. Is there anything else open source that can do this?
Your browser won't be able to handle so many markers; it will crash or simply won't load the page. Did you ever think about how much memory 1 million markers would use? Every marker is at least two floats plus other data.
You need to separate your markers into groups/sections and then load each section's data via AJAX or similar.
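A minimal sketch of that idea, assuming a hypothetical /markers endpoint that accepts a bounding box:
// Load only the markers inside the current viewport.
google.maps.event.addListener(map, 'idle', function () {
  var b = map.getBounds();
  var url = '/markers?south=' + b.getSouthWest().lat() +
            '&west=' + b.getSouthWest().lng() +
            '&north=' + b.getNorthEast().lat() +
            '&east=' + b.getNorthEast().lng();
  fetch(url)
    .then(function (resp) { return resp.json(); })
    .then(function (points) {
      points.forEach(function (p) {
        new google.maps.Marker({
          position: new google.maps.LatLng(p.lat, p.lng),
          map: map
        });
      });
    });
});
In practice you would also remove markers that have scrolled out of view and cluster whatever remains in each section.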

How to use our own data to create map layer dynamically?

We are creating a speed limit map application that uses different colors to highlight streets with different speed limits (similar to the ITO speed limit map: http://www.itoworld.com/map/124?lon=-79.37151&lat=43.74796&zoom=12).
The problem is that we are conducting our own research and have our own speed limit data, instead of pulling it from OpenStreetMap or Google Maps as the ITO map does. We also need to create a data store so we can dynamically update the map as we add more speed limit information in the future.
Is there any way to create our own instance of OpenStreetMap and replace only the speed limit information with our own data? We don't have any vector data and have no experience working with it.
Is there any suggestion of tools to use for creating highlighting layers based on the speed limit we have? Is OpenLayers a good option?
Any help is appreciated, thank you very much.
Update 2013/11/20
Thank you very much for your answers; we now have a much better understanding of our problem. This is a university design project, so we basically have no budget. We are looking for:
1) A basic "base map" that includes the basic tile information (OpenStreetMap seems a good choice, since as far as we can tell the Google Maps API doesn't provide free road information)
2) A geo data server that can host our own street speed limit data (GeoServer and MapServer look like good choices), or a simple database design that can fulfill our needs (we don't know whether that's possible yet)
3) A plotting tool that can render our speed limit data as a "group of lines" on the map, since these data will change frequently (OpenLayers and Leaflet are good candidates).
Is there anything else needed?
What you want to do is a trivial programming task once you have decided a few things:
These are probably the three biggest questions you need to answer. I've added some commentary, but research each of these questions beyond this post to find what works for you.
Whose map do you want to use? Since you only have one type of data, you will want to display it on someone else's nice-looking base map. The big choices are Bing, Google, OpenLayers/OSM, and ESRI. Your choice will most likely be driven by the licensing of these services and whether you are willing to pay. A need to support mobile devices may also factor into your decision. Since the map is what your users will see, choose the best-looking map you can afford.
How will you serve up your data? You have several options for serving your speed limit data; GeoServer, MapServer, and ESRI's offerings are some popular mapping software packages. If you're only displaying a few layers of data, all of them will be overkill. The software that renders your map data will most likely affect only your pocketbook, so free is usually good here.
Tiles vs Lines
You will serve your data either as a group of lines sent to the browser, or as pre-rendered tiles loaded on top of the map. If your data changes frequently, you will want to serve it dynamically as line data (an array of points); see the sketch below. If your data does not change frequently, you should consider tiling it. Tiling involves pre-rendering the entire map at all zoom levels. This allows the map to load very fast and is how almost all base maps are rendered. The downside is that tile generation can take a long time, and the tiles can consume a lot of storage.
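As a rough sketch of the dynamic line-data route (the /speed-limits endpoint and its JSON shape are assumptions, not part of any particular product):
// Render speed-limit segments as polylines, coloured by limit.
// Assumes the server returns JSON like:
//   [{ "limit": 50, "path": [{ "lat": ..., "lng": ... }, ...] }, ...]
var colours = { 30: '#d73027', 50: '#fee08b', 80: '#1a9850' };
fetch('/speed-limits')
  .then(function (resp) { return resp.json(); })
  .then(function (segments) {
    segments.forEach(function (seg) {
      new google.maps.Polyline({
        path: seg.path,
        strokeColor: colours[seg.limit] || '#888888',
        strokeWeight: 4,
        map: map
      });
    });
  });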
This is a very broad question. There are many components to drawing your own speed limit map.
On the front-end, there is a web browser map interface. OpenLayers is good at that. There are plenty of other tools that can do this as well, such as Leaflet or even Google Maps API.
Next is something to provide the actual speed limit route data. This can be served as a vector layer or a raster layer. There are plenty of tools here, too. UMN Mapserver is free and reasonably good. ESRI makes a whole fleet of products in this area as well.
The speed limit route data also needs to be saved somehow. This can be done in files or in a database such as PostGIS. Again, lots of great options.
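For example, a minimal sketch of serving the routes out of PostGIS as GeoJSON, using Node.js with express and pg (the table speed_limits and column limit_kmh are hypothetical names):
// Serve speed-limit routes as GeoJSON features; an OpenLayers or
// Leaflet GeoJSON layer can consume the response directly.
const express = require('express');
const { Pool } = require('pg');
const app = express();
const pool = new Pool({ database: 'gisdb' }); // assumed DB name
app.get('/speed-limits', async (req, res) => {
  const { rows } = await pool.query(
    'SELECT limit_kmh, ST_AsGeoJSON(geom) AS geometry FROM speed_limits'
  );
  res.json(rows.map(r => ({
    type: 'Feature',
    properties: { limit: r.limit_kmh },
    geometry: JSON.parse(r.geometry)
  })));
});
app.listen(3000);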
It is the role of the system architect to determine which technologies to employ to solve the problem.

Google Maps API - Slow Max Zoom Service

I am using the Max Zoom Service of the Google Maps API to get the maximum zoom level for a given coordinate. Most of the time it is fast, and each request takes only around 150 ms. However, there have been a few occasions when the service became extremely slow, around 20 seconds.
var maxZoomService = new google.maps.MaxZoomService();
maxZoomService.getMaxZoomAtLatLng(center, function (response) {
  // my process
});
Have you experienced similar issue?
Yes, we have experienced the same problem.
We're using wkhtmltoimage to generate map images for inclusion into PDF files using wkhtmltopdf.
We have to set a maximum time in which the image is generated. If it's too short, you often won't have all the map tiles downloaded in time to create the image; too long, and there's a really noticeable delay for users (with our connection, 5 seconds seems optimal).
We wanted to make sure that generated satellite maps do not exceed the maximum zoom level using the MaxZoomService and that's when we ran into problems with the service.
As it is an asynchronous service, ideally we wait for the service to report the max zoom level BEFORE creating the map, thereby triggering the correct map tile downloads.
Setting a default "fallback" zoom for the map in case the service is being really slow is not really an option, as subsequently updating the zoom level when the service finally returns will in most cases cause new tiles to be reloaded, adding more delay...
If, like us, you're interested in specific, repeatable locations (e.g. from a database), then one option might be to cache (store) the max zoom levels in advance and periodically check for updated values.
In our specific case the only other option would be to allow the service a fixed amount of time (say 2 seconds) and, if it does not respond, fall back to a default zoom.
Not ideal, but it can handle the service's "bad hair days"...
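A minimal sketch of that timeout-plus-fallback approach (the 2-second budget and the fallback zoom of 17 are just the values we would pick):
// Race the MaxZoomService against a fixed timeout.
function getMaxZoomWithFallback(center, fallbackZoom, timeoutMs, done) {
  var settled = false;
  var timer = setTimeout(function () {
    if (!settled) { settled = true; done(fallbackZoom); }
  }, timeoutMs);
  new google.maps.MaxZoomService().getMaxZoomAtLatLng(center, function (response) {
    if (settled) { return; } // the timeout already fired
    settled = true;
    clearTimeout(timer);
    if (response.status === google.maps.MaxZoomStatus.OK) {
      done(response.zoom);
    } else {
      done(fallbackZoom);
    }
  });
}
getMaxZoomWithFallback(center, 17, 2000, function (zoom) {
  map.setZoom(Math.min(map.getZoom(), zoom));
});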

GIS with Bing Silverlight and SQL 2008?

I have data from which I want to create a GIS-type application with the typical features of adding and removing layers of different types. What is the best architectural approach?
The data consists of property locations in Eastings and Northings.
I also have Ordnance Survey data in GML and shapefiles.
I know this is a very broad question but the subject area also seems very broad to me and I am unsure which direction to go.
I was thinking of using SQL Server 2008 spatial and the Bing Silverlight control to visualise the maps. To do this, would I have to convert the eastings and northings to the WGS84 geography datatype? But then, if I converted the shapefiles to GML and imported all the GML files into SQL Server using GeomFromGML, they would be geometry datatypes. Wouldn't the two types be incompatible?
Also, should the ESRI ArcGIS API for Silverlight feature in the equation? Is this a good environment for creating maps that I can point at SQL Server 2008 as the datasource (using a WCF service if necessary)?
Any advice greatly appreciated!
This is something I've done several times, using OS data from SQL Server in both Bing Maps AJAX and Silverlight controls. Some general comments below (in no particular order!):
Don't expect to implement full-blown GIS functionality using Bing Maps. Simple querying, retrieval and display of the data is all fine (plus some simple editing), but after that you'll be struggling with what can be achieved in the browser.
All vector shapes supplied to Bing Maps need to be in (geography) WGS84 coordinates, EPSG:4326. However, all data will be projected and displayed using the (projected) Spherical Mercator system, EPSG:3857.
In terms of vector shapes, you can expect a similar level of performance to the SSMS spatial results tab - that is, with careful architecture you can plot up to about 5,000 features on the map at once, zoom/pan around them, click on them to bring up various properties and attributes, etc. After that, you'll find the UI becomes fairly unresponsive (which I imagine is why the spatial results tab itself limits you to displaying 5,000 records at once).
If you want to display more features than this, one approach is to rasterize them: project them into EPSG:3857, create a .PNG/.JPG image of the features, and then cut that image into tiles according to the Bing Maps quadkey tile numbering system, as explained here: http://msdn.microsoft.com/en-us/library/bb259689.aspx, displaying them as a tile layer. Tile layers are significantly faster than displaying the equivalent vector shapes, although it means the data is static.
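For reference, the tile-to-quadkey conversion from that article is only a few lines; a sketch:
// Convert tile x/y at a given detail level to a Bing Maps quadkey.
function tileToQuadKey(x, y, level) {
  var quadKey = '';
  for (var i = level; i > 0; i--) {
    var digit = 0;
    var mask = 1 << (i - 1);
    if ((x & mask) !== 0) { digit += 1; }
    if ((y & mask) !== 0) { digit += 2; }
    quadKey += digit;
  }
  return quadKey;
}
tileToQuadKey(3, 5, 3); // returns "213", matching the article's example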
If you do create raster tiles, you can either render them dynamically or pre-render them to improve performance - i.e. you could set up a job to render and update the tileset for slowly-changing data every night/every month etc.
If you're talking about OS MasterMap data, the sheer level of detail involved means that you need to think more carefully about which features you want to display, and how you want to display them. Take Greater London, for example, which covers an area of about 50 km x 40 km. To create raster tiles (each of which is 256 px x 256 px) at zoom level 19 covering this area, you'd need to render and store 1.3 million separate tiles. If each of those is generated from a database query that takes, say, 200 ms to run, it's gonna take a looooonggggg time to prepare all your data. Also, once the files have been generated, you might want to think about storing them in a DB rather than saving them on a file system.
As for loading the OS data into SQL Server in the first place - there are several tools that will import either GML or shapefiles into SQL Server, and handle the reprojection from EPSG:27700 (Ordnance Survey National Grid) to WGS84 along the way. Try GDAL/OGR or Safe FME for starters.
I have a blog at http://alastaira.wordpress.com that has several blog posts that you may find useful describing various aspects of integrating Bing Maps and SQL Server. Particularly, you might want to look at:
http://alastaira.wordpress.com/2011/02/16/loading-ordnance-survey-open-data-into-sql-server-2008/
http://alastaira.wordpress.com/2011/01/23/the-google-maps-bing-maps-spherical-mercator-projection/
http://alastaira.wordpress.com/2011/02/21/using-ogr2ogr-to-convert-reproject-and-load-spatial-data-to-sql-server/

Visualizing large quantities of data on google maps / visualizations

I have a JSON file that's roughly 480 MB of geolocation points. I was wondering if someone knows of a good 'pattern' to use when trying to visualise the data. The problem I'm encountering is that the data has to be loaded into Google Maps from the get-go, which causes all kinds of obvious issues.
I don't have to do this through google. It just seemed like the obvious choice.
With that much data, it may make more sense to handle it on the server side instead of the client side. You could set up GeoServer with your data points. Using OpenLayers, you could overlay the points from GeoServer on top of Google Maps, or potentially even on top of your own map if you want to cut out Google Maps altogether. The heavy-duty processing then happens on the server, and only images are displayed in the browser. This cuts down on network traffic and the amount of processing the browser has to do. If you set up GeoServer to do caching, the server won't even have to work very hard.
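A minimal sketch of that overlay with the OpenLayers 2 API (the GeoServer URL and the layer name ws:geo_points are placeholders):
// Google base layer plus a GeoServer WMS overlay in OpenLayers 2.
var map = new OpenLayers.Map('map');
var googleLayer = new OpenLayers.Layer.Google('Google Streets');
var pointsLayer = new OpenLayers.Layer.WMS(
  'Points',
  'http://myserver/geoserver/wms',
  { layers: 'ws:geo_points', transparent: true, format: 'image/png' },
  { isBaseLayer: false }
);
map.addLayers([googleLayer, pointsLayer]);
map.zoomToMaxExtent();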
It really depends on what kind of data this is.
If these are points for polylines or polygons, you might try encoding them (http://code.google.com/apis/maps/documentation/utilities/polylinealgorithm.html and http://code.google.com/apis/maps/documentation/utilities/polylineutility.html); there are also API functions you can use to encode the points, as sketched below. This will significantly reduce the size of your data.
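If you load the Maps API with the geometry library (libraries=geometry), the encoding helpers look roughly like this:
// Encode a path to the compact string format described in the links above.
var path = [
  new google.maps.LatLng(38.5, -120.2),
  new google.maps.LatLng(40.7, -120.95),
  new google.maps.LatLng(43.252, -126.453)
];
var encoded = google.maps.geometry.encoding.encodePath(path);
// encoded === "_p~iF~ps|U_ulLnnqC_mqNvxq`@" (the documented example)
// Decode back to an array of LatLngs when rendering:
var decoded = google.maps.geometry.encoding.decodePath(encoded);
new google.maps.Polyline({ path: decoded, map: map });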
You might also want to consider loading data depending on the zoom level of the map. (I am not sure what you mean by "data has to be loaded from the get-go" - you can load the data into the map depending on events, etc.)
Fusion Tables, mentioned in another answer, will only accept 100 MB of data.
I can be more specific if you explain the nature of your data and what you're trying to do in more detail. Hope this helps.
Try Google Fusion Tables