Google Maps and data storage

I don't have a good understanding of big-data libraries or spatial data. Can someone help demystify how Google Maps, Bing Maps (or any map service) store data internally? If I want an image at some latitude x and longitude y at some zoom factor, how is this data stored for efficient retrieval? Also, there is quite a bit of information associated with every point in a cell; how is that data stored? Links to relevant pages or blogs would be a great help.
Thanks

Most highly scalable map providers start out with the spatial data stored in a spatial database, including road lines, rail lines, polygonal boundaries, hydrography, etc. However, they don't render the maps directly from that data for every user request because that would take far too long for the millions of requests they get every day.
Instead, tiles of the map are rendered offline from the spatial data and stored as images on disk so they can be retrieved quickly. Tiles are generated for each zoom level so that more detail shows up as you zoom in. Since the deepest zoom levels may contain millions of tiles but are rarely visited, those may be generated at viewing time instead of being pre-generated.
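The "efficient retrieval" part of the question comes down to tile addressing: at zoom level z the world is divided into 2^z × 2^z tiles, so a tile image can be looked up directly by its (x, y, z) coordinates with no spatial query at all. A minimal sketch of the standard Web Mercator ("slippy map") tile math, in Python for illustration:

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Convert WGS84 lat/lon to slippy-map tile (x, y) at a zoom level."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom  # number of tiles per axis at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# At zoom 0 the whole world is a single tile; each extra zoom level
# quadruples the tile count, which is why deep levels explode in size.
print(latlon_to_tile(51.5074, -0.1278, 12))  # central London
```

The returned (x, y, zoom) triple maps directly to a file path or object-store key, which is what makes serving pre-rendered tiles so fast.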
Then they set up their applications (JavaScript-based or otherwise) to request all the tiles over a given viewing area and zoom level, and piece the tiles together into a complete map.
WMTS is one open standard for delivering map tiles over the web, although it doesn't define the process for generating the tiles. GeoWebCache is a service which, paired with a WMS server such as GeoServer or MapServer, can generate and cache tiles from spatial data.
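For reference, a WMTS KVP GetTile request is just an HTTP GET with a fixed set of parameters defined by the OGC WMTS 1.0 spec; a sketch (the base URL, layer, and tile-matrix-set names here are hypothetical placeholders):

```python
from urllib.parse import urlencode

def wmts_gettile_url(base, layer, matrix_set, zoom, row, col):
    """Build a WMTS 1.0 KVP GetTile URL for one tile."""
    params = {
        "SERVICE": "WMTS",
        "REQUEST": "GetTile",
        "VERSION": "1.0.0",
        "LAYER": layer,
        "TILEMATRIXSET": matrix_set,
        "TILEMATRIX": str(zoom),  # tile matrix identifiers vary per set
        "TILEROW": str(row),
        "TILECOL": str(col),
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = wmts_gettile_url("https://example.com/wmts", "roads", "default", 12, 1361, 2046)
```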

You can also check out Elasticsearch. It's an open-source search engine used extensively in big-data systems, and it can perform geospatial queries over huge datasets with high efficiency.
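For example, a bounding-box search in Elasticsearch uses its geo_bounding_box query. A sketch of the request body, assuming documents with a geo_point field named "location" (a hypothetical field name):

```python
def bounding_box_query(top, left, bottom, right):
    """Build an Elasticsearch geo_bounding_box filter for the viewport."""
    return {
        "query": {
            "bool": {
                "filter": {
                    "geo_bounding_box": {
                        "location": {  # assumed geo_point field name
                            "top_left": {"lat": top, "lon": left},
                            "bottom_right": {"lat": bottom, "lon": right},
                        }
                    }
                }
            }
        }
    }

# This body would be POSTed to the index's _search endpoint; running it as a
# filter (rather than a scoring query) lets Elasticsearch cache the result.
body = bounding_box_query(52.0, -1.0, 51.0, 1.0)
```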

Related

Google maps - 200K + polygons on the map

I have to plot 200K+ polygons (building shapes) on a Google map in my project. I have already plotted them, but when moving the map around it sometimes takes a while to render the polygons, and sometimes the browser even freezes. I am looking for either an alternative technique (something better than Google Maps) or a better way to manage the polygons.
These are the constraints that I have:
- The polygon information is generated on the fly and cannot be pre-prepared.
- Any viewer should be able to see all the polygons (buildings) in a bird's-eye view, so I can't restrict them to a particular zoom level.
You should use some form of server-side rendering to generate overlay tiles. We do something similar, with thousands of geospatial objects on a Google map, using the https://carto.com/ service, but there are others available and I encourage you to do the research.
Carto stores your data in a PostgreSQL database, and you can then use cartodb.js to overlay your rendered data on a map of your choosing (Leaflet, Google). You can also access click/mouseover events via cartodb.js.
https://carto.com/docs/carto-engine/carto-js
I should add that this is easiest for static datasets, but data manipulation on the fly is certainly possible via many of their APIs.

How do I create a web mapping application that renders a million points or vectors?

I have to develop a web-based application with models (cars, trucks), each having a longitude and latitude, totalling around 1 million vector points. My users should be able to see their models on a map in the browser, and possibly edit them client-side. Which tools or architecture should I use to render more than a million vectors on a map? As I am new to web mapping, I would like to know how to create an interactive map where a user can retrieve old and current vector data in any format (stored in a database), render it, and display it on the map.
You may want to think about how to generalize the display of information for your users here. I've blogged about an approach for visualizing tens of millions of latitude/longitude pairs using Leaflet and Apache Solr. Solr's facet heatmap feature offers high-performance generalization of geospatial data for visualization and query.
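A Solr facet heatmap request is just a set of query parameters; the server returns a 2D grid of counts that the client renders as a heatmap instead of a million individual markers. A sketch assuming a spatial field named "coords" (a hypothetical field name), roughly following the parameter names in Solr's spatial faceting documentation:

```python
def heatmap_params(field, min_lon, min_lat, max_lon, max_lat, dist_err_pct=0.15):
    """Build Solr query params for a facet.heatmap over a bounding box."""
    return {
        "q": "*:*",
        "rows": 0,  # we only want the facet grid, not the documents
        "facet": "true",
        "facet.heatmap": field,
        "facet.heatmap.geom": f'["{min_lon} {min_lat}" TO "{max_lon} {max_lat}"]',
        "facet.heatmap.distErrPct": dist_err_pct,  # controls grid resolution
    }

# These params would be sent to the core's /select handler; the response
# contains a counts_ints2D grid sized to the requested resolution.
params = heatmap_params("coords", -180, -85, 180, 85)
```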

What would be the best Web Mapping API for the following requirements?

I have a fairly simple, and what I would think to be common, web mapping project to complete. I'm struggling with my selection of a web mapping API; thus far, I've not been able to find one that meets the following requirements.
Able to display thousands of points in one view without crippling the browser. To be specific, I'd like to display roughly 30,000 points at one time and still be able to navigate around a slippy map without degraded performance.
Local maps. The web server will run on the local client, so being able to display a map without reaching out to the internet (even if it's a very basic map) is an absolute requirement.
Render dynamic data from a database onto a map (most APIs meet this requirement).
Draw polygons directly on the map, and export the lat/lon values of all vertices.
In your experience working with map APIs, do any of them meet the requirements above?
I've looked at OpenLayers 3, Leaflet, and Polymaps. Short of reading every piece of documentation ahead of time, I can't discern whether any of these would fulfill all the requirements. Again, I'm hoping someone with experience with any of these APIs can point me in the right direction.
Thanks!
As for Leaflet:
Thousands of points: you could use one of these plugins:
Leaflet.markercluster, which clusters your points into "groups of points" at low zoom levels.
Leaflet MaskCanvas, which replaces all your points with a single canvas layer.
Local maps: as long as you provide a way to create image tiles (even on the local machine), most mapping libraries should work.
Dynamic data: depending on what you call "dynamic", all mapping libraries should provide you with built-in methods to display your data.
Drawing polygons and exporting lat/lon vertices: use the Leaflet.draw plugin.
OpenLayers 3 would very probably provide you with all these functionalities as well.
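If clustering in the browser ever becomes the bottleneck, the grid-based approach that Leaflet.markercluster applies client-side can be approximated on the server with a few lines of code; a minimal sketch in Python for illustration (a real implementation would use pixel distances rather than raw degrees):

```python
from collections import defaultdict

def grid_cluster(points, zoom):
    """Group (lat, lon) points into grid cells; cells shrink as zoom grows.
    Returns a list of (centroid, point_count) pairs, one per occupied cell."""
    cell_deg = 360.0 / (2 ** zoom)  # roughly one cell per map tile
    buckets = defaultdict(list)
    for lat, lon in points:
        key = (int(lat // cell_deg), int(lon // cell_deg))
        buckets[key].append((lat, lon))
    clusters = []
    for pts in buckets.values():
        lat_c = sum(p[0] for p in pts) / len(pts)
        lon_c = sum(p[1] for p in pts) / len(pts)
        clusters.append(((lat_c, lon_c), len(pts)))
    return clusters

# At a low zoom nearby points collapse into one cluster marker; zooming in
# shrinks the cells until individual points separate out again.
```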

Speeding up a very large Bing Maps polyline based layer

I am writing a Bing Maps-based Universal App for Windows Phone and Windows 8 that shows some quite large map layers.
Writing the initial app was no problem (the tutorial I followed is at http://blogs.msdn.com/b/rbrundritt/archive/2014/06/24/how-to-make-use-of-maps-in-universal-apps.aspx), however I now am experiencing major problems rendering a layer that contains thousands of polylines, with tens of thousands of co-ordinates.
The data is just too big - on Windows 8.1, the map crashes the application, while on Windows Phone 8.1, the layer takes a very long time to render.
According to http://blogs.msdn.com/b/bingdevcenter/archive/2014/04/23/visualize-large-complex-data-with-local-tile-layers-in-bing-maps-windows-store-apps-c.aspx, I should speed it up by converting it to a local tile layer; however, the program mentioned in the article (MapCruncher) requires a PNG as input. The question is, how do I convert my map data to a PNG? I have the data as a shapefile, KML file, or CSV file. Is there another way I should be doing this? I know I can do this via GeoServer, but my app has to have offline support and so cannot download the appropriate files from a web server as needed.
If anyone has any other ways I could approach this speed issue with large layers, that would be greatly appreciated. I am aware that rendering a layer in Bing Maps can be sped up via quadtrees, but most of what I have found is theoretical. If anyone has some code I can plug into this, that would be very helpful.
Local tile layers are fine if you only have data in a small area, or only want to show the data for a few zoom levels. Otherwise the number of tiles grows drastically and will make your app huge. If your data changes regularly, or you want to support all zoom levels of the map, you should store your data on a server and expose it as a dynamic tile layer. A dynamic tile layer is a web service that generates a tile on demand from your data; you can add caching of the tiles for performance. This is the best way to handle large data sets, and one I have used a lot. In fact I have a demo here: http://onsbingmapsdemo.cloudapp.net/ This data set consists of 175,000 complex polygons, which equates to about 2 GB of data.
I have an old blog post on how to do this here: http://rbrundritt.wordpress.com/2009/11/26/dynamic-tile-layers-in-the-bing-maps-silverlight-control/
If you prefer working with MVC, you might find these projects useful:
https://ajaxmapdataconnector.codeplex.com/
https://dataconnector.codeplex.com/
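For anyone implementing the quadtree approach mentioned in the question: Bing Maps addresses its tiles with quadkeys, built by interleaving the bits of a tile's x and y coordinates at each level, as described in the public Bing Maps tile-system documentation. A sketch in Python for illustration:

```python
def tile_to_quadkey(tile_x, tile_y, level):
    """Build a Bing Maps quadkey by interleaving the bits of x and y.
    Each digit (0-3) picks one quadrant at one level of the quadtree."""
    digits = []
    for i in range(level, 0, -1):
        mask = 1 << (i - 1)
        digit = 0
        if tile_x & mask:
            digit += 1
        if tile_y & mask:
            digit += 2
        digits.append(str(digit))
    return "".join(digits)

print(tile_to_quadkey(3, 5, 3))  # -> "213"
```

A useful property of quadkeys is that a tile's key is a prefix of all its children's keys, so spatial containment checks reduce to string prefix tests.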

Poor rendering time when plotting large shape files on Google Maps

First, I'd like to let it be known that this is my first journey into developing mapping applications with such a large set of data, so I am in no way locked into the techniques or technologies listed below...
The Overview
What I am trying to accomplish is the ability to load my company's market information for the United States into some form of web-based mapping software. Currently I am trying to accomplish this with the Google Maps JS API and GeoJSON data, but am not against other alternatives. There are roughly 163 files (ranging from 200 KB to 6 MB) that I have exported from ArcGIS into GeoJSON files and then loaded into a database (currently as GeoJSON strings). We have developed a UI that loads the data based on the current map bounds and max/min calculations on the corresponding records.
The problem I'm running into is the render time of the map itself, which is annoying when switching between regions, states, or zoom levels. The API calls to load the data are acceptable given the size of the data being retrieved. My boss says it is great as a proof of concept, but would like to see it much, much faster.
The Questions
What suggestions could you offer to increase the render time?
Are there better options (3rd party libs, techniques, etc) for what I'm trying to accomplish?
It was suggested by a co-worker to export the map shapes and just use the images for overlaying the information based on the coords. Any input on this?
One solution to your problem could be reducing the number of vertices in the polygon or polyline layers (files) while preserving each feature's shape. That way the GeoJSON files would be smaller and easier to render.
You can use tools available in ArcGIS for Desktop.
See
How to Simplify Line or Polygon
or
Simplify Polygon
I'm sure there are similar tools in QGIS or any other open source GIS software.
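The algorithm behind most of these "simplify" tools is Douglas-Peucker: recursively keep only the vertices that deviate from a straight-line approximation by more than a tolerance. A minimal sketch, in Python for illustration:

```python
def perpendicular_distance(pt, start, end):
    """Distance from pt to the infinite line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return ((x - x1) ** 2 + (y - y1) ** 2) ** 0.5
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, tolerance):
    """Simplify a polyline, dropping vertices within `tolerance` of the chord."""
    if len(points) < 3:
        return points
    # find the vertex farthest from the chord between the endpoints
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]  # everything in between is noise
    # otherwise keep that vertex and recurse on both halves
    left = douglas_peucker(points[: index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right
```

The tolerance trades fidelity for size; tools like the ArcGIS ones above expose essentially this same knob.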
There are other solutions to your problem, like pre-rendering the data into map tiles that can be overlaid on the Google Maps basemap(s). That is probably the fastest option, but more complicated.
Maybe this could be a good starting point.
Just a couple of random thoughts.
On one project I worked on, I noticed terrible performance with GeoJSON loaded via a REST service.
I found that the browser's JavaScript engine had a hard time parsing large GeoJSON responses, so I decided to load the data in batches (multiple requests). Performance increased enormously.
I noticed you use 163 files (layers?). Do you display them all at the same time (no no no), or do you have a control panel to switch layers on and off? If you display them all at once, performance will certainly suffer, and you should look into generating map tiles with thematically grouped layers. That way you'd end up with a couple of tile layers that you can switch on and off.
Also, if the data used to generate the tiles changes frequently, then tiles aren't a good idea; in that case, look into Web Map Services. A WMS generates maps on the fly from data stored in a database. GeoServer supports WMS, and it's easy to configure and use.