I am working on creating a tool to select a place via Google Maps API. This part is simple and I have no problems with it, but it is too specific. I want to do something similar to this, but need to accommodate the entire world, not just zip codes.
I would think the same effect should be achievable by highlighting an administrative level. Does this require a KML file, and if so, can anyone suggest a provider that might have this level of detail?
You can get administrative boundaries for the world here http://www.gadm.org/ or here http://www.unsalb.org/. You will not be able to achieve this with a single KML file, though, as it would be one big file (I am talking at least four hundred MB, and there is a 4MB limit for KML files on Maps).
You will need to store the data in a database, pull out only the relevant boundaries, and draw them as polygons on the map. In other words, it sounds like you might need a GIS server (open-source or ArcGIS), although you could build it from scratch using any geo-enabled database such as MySQL or MS SQL Server.
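As a sketch of the client side (the /boundary endpoint and its JSON format are assumptions, not a real service): the server does the spatial query and returns a single boundary, which the browser then draws as a polygon.

// Minimal sketch, assuming a hypothetical /boundary endpoint that returns
// one administrative area as a JSON array of [lat, lng] pairs.
function showBoundary(map, areaId) {
  $.getJSON('/boundary', { id: areaId }, function (coords) {
    var path = [];
    for (var i = 0; i < coords.length; i++) {
      path.push(new google.maps.LatLng(coords[i][0], coords[i][1]));
    }
    var polygon = new google.maps.Polygon({
      paths: path,
      strokeColor: '#FF0000',
      strokeWeight: 2,
      fillColor: '#FF0000',
      fillOpacity: 0.2
    });
    polygon.setMap(map);
  });
}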
Related
We are creating a speed limit map application that uses different colors to highlight streets with different speed limits (similar to the ITO speed limit map: http://www.itoworld.com/map/124?lon=-79.37151&lat=43.74796&zoom=12).
The problem we have is that we are conducting our own research and have our own speed limit data, instead of pulling the data from OpenStreetMap or Google Maps like the ITO map does. We also need to create a data store so we can dynamically update the map as we add more speed limit information in the future.
Is there any way to create our own instance of OpenStreetMap and replace only the speed limit information with our own data? We don't have any vector data, and we have no experience working with it.
Are there any suggestions for tools to use for creating highlight layers based on the speed limits we have? Is OpenLayers a good option?
Any help is appreciated, thank you very much.
Update 2013/11/20
Thank you very much for your answers; we now have a much better understanding of our problem. This is a university design project, so we basically have no budget. We are looking for:
1) A basic "base map" that includes the basic tile information (OpenStreetMap seems a good choice, since the Google Maps API doesn't provide free road information as far as we can find)
2) A geo data server that can host our own street speed limit data (GeoServer and MapServer look like good choices), or a simple database design that can fulfill our needs (we don't know whether that is possible yet)
3) A plotting tool that can render our speed limit data as a "group of lines" on the map, since these data will change frequently (OpenLayers and Leaflet are good candidates)
Is there anything else needed?
What you want to do is a trivial programming task once you have decided a few things:
These are probably the three biggest questions you need to answer. I added some commentary, but look at each of these questions beyond this post to find what works for you.
Who do you want to use for your map? Since you only have one type of data, you will want to display it on someone else's nice-looking map. The big choices are Bing, Google, OpenLayers/OSM, and ESRI. Your choice will most likely be driven by the licensing of the above services and whether you are willing to pay. A need to support mobile devices may also factor into your decision. Since the map is what your users will see, choose the best-looking map you can afford.
How will you serve up your data? You have several options for serving your speed limit data. GeoServer, MapServer, and ESRI's products are popular mapping software packages. If you are only displaying a few layers of data, all of this mapping software will be overkill. The actual software that renders your map data will most likely affect only your pocketbook, so free is usually good here.
Tiles vs Lines
You will serve your data either as a group of lines sent to the browser, or as pre-rendered tiles loaded on top of the map. If your data changes frequently, you will want to serve it dynamically as line data (an array of points); a sketch of that approach follows below. If your data does not change frequently, you should consider tiling it. Tiling involves pre-rendering the entire map at all zoom levels. This allows the map to load very fast, and it is how almost all base maps are rendered. The downside is that tile generation can take a long time, and the tiles can take up a large amount of space.
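A minimal sketch of the dynamic line approach (the /speedlimits endpoint, its JSON shape, and the color thresholds are assumptions; map is an initialized google.maps.Map):

// Fetch the segments visible in the current viewport and color each
// polyline by its speed limit.
function speedColor(limit) {
  if (limit <= 30) return '#FF0000';   // slow streets in red
  if (limit <= 60) return '#FFA500';   // medium in orange
  return '#00AA00';                    // fast roads in green
}

$.getJSON('/speedlimits', { bbox: map.getBounds().toUrlValue() }, function (segments) {
  for (var i = 0; i < segments.length; i++) {
    var path = segments[i].points.map(function (p) {
      return new google.maps.LatLng(p.lat, p.lng);
    });
    new google.maps.Polyline({
      map: map,
      path: path,
      strokeColor: speedColor(segments[i].limit),
      strokeWeight: 3
    });
  }
});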
This is a very broad question. There are many components to drawing your own speed limit map.
On the front-end, there is a web browser map interface. OpenLayers is good at that. There are plenty of other tools that can do this as well, such as Leaflet or even Google Maps API.
Next is something to provide the actual speed limit route data. This can be served as a vector layer or a raster layer. There are plenty of tools here, too. UMN Mapserver is free and reasonably good. ESRI makes a whole fleet of products in this area as well.
The speed limit route data also needs to be saved somehow. This can be done in files or in a database such as PostGIS. Again, lots of great options.
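If you pick PostGIS, a minimal sketch of the storage side (the table name, columns, and connection string are made up for illustration), using the node-postgres client:

// Create a table for speed-limit segments and insert one street as a
// WGS84 linestring.
var pg = require('pg');
var client = new pg.Client('postgres://user:pass@localhost/speeddb');

client.connect(function (err) {
  if (err) throw err;
  client.query(
    "CREATE TABLE IF NOT EXISTS speed_limits (" +
    "  id SERIAL PRIMARY KEY," +
    "  limit_kmh INTEGER," +
    "  geom GEOMETRY(LINESTRING, 4326))",
    function (err) {
      if (err) throw err;
      client.query(
        "INSERT INTO speed_limits (limit_kmh, geom) " +
        "VALUES ($1, ST_GeomFromText($2, 4326))",
        [50, 'LINESTRING(-79.3715 43.7479, -79.3702 43.7485)'],
        function (err) {
          if (err) throw err;
          client.end();
        });
    });
});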
It is the role of the system architect to determine which technologies to employ to solve the problem.
I would like to be able to tell whether a GPS location is in an inhabited or uninhabited zone.
I have tried some of the reverse geocoding services out there, but all of them proved useless for this, because they select the nearest possible address. (I understand why that is so; it is useful for the purpose of reverse geocoding.)
I have noticed that in Google Maps, when I search for a city, its boundary is outlined with a well-defined red dotted line. I would love to use this, or something similar.
Is there any way that Google Maps can provide such a service, or something else that can solve my problem?
Are there any other web solutions or databases that you know of that can give me this information?
Or maybe I can use one of the reverse geocoding solutions with some parameters (such as restricting the search radius) to determine whether or not the location is in a populated area?
If you cannot find a public service, then it gets interesting, and expensive in terms of development effort.
Worldwide public data is only available from OpenStreetMap; I think they have such a layer (it could be named land_use: rural, etc.). This layer is usually used to color a map; look at the OpenStreetMap web page to see if you can find a suitable coloring that corresponds to your task (e.g. look at green or gray areas).
These data are stored as polygons; you would have to extract those polygons (I assume millions of them). Then you need a fast spatial search index, like a region quadtree.
Then you do a "point(lat, lon) in polygon" call, and get the polygon related to your position.
Probably not all of those polygons will fit into main memory, so you may have to load them on demand (e.g. by country).
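For the point-in-polygon test itself, here is a minimal sketch of the standard ray-casting algorithm (nothing OSM-specific is assumed; the polygon is just an array of [lon, lat] pairs):

// Ray casting: count how many polygon edges a horizontal ray from the
// point crosses; an odd count means the point is inside.
function pointInPolygon(lon, lat, polygon) {
  var inside = false;
  for (var i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    var xi = polygon[i][0], yi = polygon[i][1];
    var xj = polygon[j][0], yj = polygon[j][1];
    var intersects = ((yi > lat) !== (yj > lat)) &&
        (lon < (xj - xi) * (lat - yi) / (yj - yi) + xi);
    if (intersects) inside = !inside;
  }
  return inside;
}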
A variant of this approach is to use a geospatial database like Postgres with PostGIS to store the polygons, and do a DB query.
With that approach, most of the work will be extracting the polygons from the OpenStreetMap DB file.
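A sketch of that database variant (the table and column names are assumptions about how you might import the OSM land-use polygons), letting PostGIS do the point-in-polygon test with its spatial index:

// Ask PostGIS whether any inhabited land-use polygon contains the point.
var pg = require('pg');
var client = new pg.Client('postgres://user:pass@localhost/osm');
client.connect();

function isInhabited(lon, lat, callback) {
  client.query(
    "SELECT 1 FROM landuse_polygons " +
    "WHERE landuse IN ('residential', 'commercial', 'industrial') " +
    "AND ST_Contains(geom, ST_SetSRID(ST_MakePoint($1, $2), 4326)) " +
    "LIMIT 1",
    [lon, lat],
    function (err, result) {
      if (err) return callback(err);
      callback(null, result.rows.length > 0);
    });
}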
More accurate data is available from TomTom, but it can be really expensive.
I have a KML file that was originally 400MB: 1 week of travel and 5000km logged. It's already stripped of non-essential data; the GPX file from that trip is around 80MB (2.5MB zipped), and the stripped KML file is around 30MB. I am not sure how many waypoints there are, but it must be hundreds of thousands.
How would I go about mapping it on Google Maps? Is there a way to reduce the file without compromising the quality of my tracks? I am using jQuery and Google Maps API v2 or v3.
Thanks.
Well, it is very easy to use a KML file with the Google Maps API. An example would be the following:
var layer = new google.maps.KmlLayer('http://myserver/kmlfile.kml');
layer.setMap(map);
See the API 3.0 Reference on the KmlLayer here.
But you have a problem with your large KML file. The current maximum for an uncompressed KML file is 10MB, a lot smaller than your 30MB. See the reference on KML support here.
In my experience, adding large KML files to a map is also very slow, so loading even a 9MB file will take a lot of time. What I suggest is that you reduce the size of the file.
Do you really need all those hundreds of thousands of waypoints? What was your logging interval? Every 10 seconds? Do you need your location down to the meter; isn't every 100 meters enough? By the way: are you working on an internet-based thing? I remember Google Earth supporting arbitrarily large KML files (certainly larger than the Maps API allows).
I don't know of any track-smoothing programs or "waypoint removers", but there surely are such things out there, and that is what you will need to make your KML file smaller. (Also, 30MB still sounds very large to me; are you sure it is only waypoints, or do you have descriptions and the like in the KML? Could you shorten those? Colors? Use global KML styles instead of defining the color for each waypoint!)
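If you end up rolling your own waypoint remover, the Douglas-Peucker line-simplification algorithm is the standard approach. A minimal sketch, assuming points are plain {lat, lng} objects and the tolerance is in raw degrees (crude, but fine for thinning a dense track):

// Douglas-Peucker: keep only points that deviate more than `tolerance`
// from the straight line between their neighbours.
function simplify(points, tolerance) {
  if (points.length < 3) return points;
  var first = points[0], last = points[points.length - 1];
  var maxDist = 0, index = 0;
  for (var i = 1; i < points.length - 1; i++) {
    var d = distanceToSegment(points[i], first, last);
    if (d > maxDist) { maxDist = d; index = i; }
  }
  if (maxDist <= tolerance) return [first, last];
  // Split at the farthest point and simplify both halves recursively.
  var left = simplify(points.slice(0, index + 1), tolerance);
  var right = simplify(points.slice(index), tolerance);
  return left.slice(0, -1).concat(right);
}

// Perpendicular distance from p to segment a-b, in plain lat/lng units.
function distanceToSegment(p, a, b) {
  var dx = b.lng - a.lng, dy = b.lat - a.lat;
  var lenSq = dx * dx + dy * dy;
  if (lenSq === 0) {
    return Math.sqrt(Math.pow(p.lng - a.lng, 2) + Math.pow(p.lat - a.lat, 2));
  }
  var t = ((p.lng - a.lng) * dx + (p.lat - a.lat) * dy) / lenSq;
  t = Math.max(0, Math.min(1, t));
  var nx = a.lng + t * dx, ny = a.lat + t * dy;
  return Math.sqrt(Math.pow(p.lng - nx, 2) + Math.pow(p.lat - ny, 2));
}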
I split a much smaller kml/kmz file into several chunks. However I still noticed MASSIVE slowdowns when I tried to get near the limit. Typically Google Maps would not show the markers when I was at full extent of the map, and would only do so if I zoomed in (and thus it had much fewer markers to deal with).
Often it would only show the markers on a portion of the map tiles, and the other map tiles would be empty.
Even Chrome, which is the fastest browser for this, could barely handle it.
It would be worth considering importing your KML file into a Fusion Table:
http://www.google.com/fusiontables/public/tour/index.html
The subsequent performance will be spectacular compared to adding overlays based on a KML file.
For some magical reason, when you create a KMZ using GeoSetter > Export to Google Earth, the KMZ files it creates load into Google Earth no problem, although they are much larger than 5MB.
I reduced a 5 MB kml file to about 1.3 MB. Step one is to remove all track points and many path points. Only two points are needed for a straight line but GPS receivers might record dozens.
There are automated tools to filter tracks but I prefer to hand-edit.
Step two is to optimize the style information in the kml file. Google Earth Desktop (GED) is useful for editing paths and setting styles, but it never seems to delete a style so you end up with a megabyte (in my case) of duplicate identical styles.
The kml file is in plain text so any text editor can be used to merge styles and delete duplicates. This task is tedious and likely to result in errors, so I found a workaround:
Make a duplicate kml file as a backup and remove all the style information. This is simple and quick. Save the file and open it in GED.
I organised all elements that share a style into kml folders, then set the style of each folder, resulting in one definition for hundreds of paths or waypoints.
There is a final foible (bug?) in GED; when you share a folder style, all the folders below it, including those that are not sub-folders, inherit the style.
This is an odd behavior - a bug in my opinion!
So drag the folder to be altered to the bottom of the list. Set the shared style and then drag it back to the top or middle.
I have folders for walks, public-transport-routes, long-distance-paths, bus-stops, train-stations, parking, pubs, points-of-interest and hazards so I need to set about nine styles.
This only takes a minute or two so it's an efficient solution not needing additional software.
As I add new walks and pubs to the file, the duplicate styles start to accumulate again so from time to time the optimization has to be re-done. This is the optimized kml file.
I have a JSON file that's roughly 480MB of geolocation points. I was wondering if someone knows of a good 'pattern' to use when trying to visualise the data. The problem I'm encountering is that the data has to be loaded into Google Maps from the get-go, which is causing all kinds of obvious issues.
I don't have to do this through google. It just seemed like the obvious choice.
With that much data, it may make more sense to handle it on the server side instead of the client side. You could set up GeoServer with your data points. Using OpenLayers, you could overlay your points from GeoServer on top of Google Maps, or potentially even on top of your own map if you want to cut out Google Maps altogether. The heavy-duty processing then happens on the server, and only images are displayed in the browser. This cuts down on network traffic and the amount of processing the browser has to do. If you set up GeoServer to do caching, the server won't even have to work very hard.
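A minimal sketch with the OpenLayers 2 API (the GeoServer URL and layer name are assumptions): a Google base layer plus a transparent WMS overlay rendered by GeoServer.

// Google base map with a GeoServer WMS layer drawn on top of it.
var map = new OpenLayers.Map('map');
var base = new OpenLayers.Layer.Google('Google Streets');
var points = new OpenLayers.Layer.WMS(
  'My points',
  'http://myserver/geoserver/wms',
  { layers: 'myworkspace:points', transparent: true },
  { isBaseLayer: false }
);
map.addLayers([base, points]);
map.zoomToMaxExtent();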
It really depends on what kind of data this is.
If these are points for polylines or polygons, you might try encoding the points (http://code.google.com/apis/maps/documentation/utilities/polylinealgorithm.html and http://code.google.com/apis/maps/documentation/utilities/polylineutility.html). There are also API functions you can use to encode the points. This will significantly reduce the size of your data.
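For example, with the Maps API v3 geometry library (loaded via the libraries=geometry script parameter), and assuming an initialized map variable:

// Encode a path into a compact string, then decode it back into points.
var path = [
  new google.maps.LatLng(43.74796, -79.37151),
  new google.maps.LatLng(43.74810, -79.37000)
];
var encoded = google.maps.geometry.encoding.encodePath(path);
// Ship the short string instead of raw coordinates, then decode it:
var decoded = google.maps.geometry.encoding.decodePath(encoded);
new google.maps.Polyline({ map: map, path: decoded });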
You might also want to consider loading data depending on the zoom level of the map. (I am not sure what you mean by "the data has to be loaded from the get-go"; you can load data into the map in response to events, etc.)
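A sketch of that event-driven approach (the /points endpoint and the drawPoints helper are assumptions for illustration): refetch only what is visible whenever the map settles.

// Whenever panning/zooming stops, ask the server for the points in the
// current viewport at the current zoom level.
google.maps.event.addListener(map, 'idle', function () {
  var bounds = map.getBounds();
  $.getJSON('/points', {
    bbox: bounds.toUrlValue(),
    zoom: map.getZoom()
  }, function (points) {
    // The server decides how much detail to return for this zoom level.
    drawPoints(points);
  });
});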
Fusion Tables, mentioned above, will only accept 100MB of data.
I can be more specific if you explain the nature of your data and what you are trying to do in more detail. Hope this helps.
Try Google Fusion Tables
How do I take shapefiles and extract lat/lng coords so I can plot polygons on Google Maps?
http://www2.census.gov/cgi-bin/shapefiles/national-files
I asked this question here:
http://groups.google.com/group/Google-Maps-API/browse_thread/thread/18763b4b0cb996c7
and they told me WHAT to do, but not HOW to do it =P
Thx!
It depends on how you need to accomplish this. If you just need a few shapes, you can look up the coordinates in those files yourself. You can use those coordinates to create a GPolygon in Google Maps.
If you need lots of shapes, you'll need to do it programmatically. I would suggest using your favorite language to parse the shapefiles and retrieve the coordinates for each shape.
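As a sketch of the last step, here is what building the Maps API v2 GPolygon from extracted coordinates could look like (the coordinates are made-up placeholders, and map is an initialized GMap2):

// Build a polygon from extracted lat/lng pairs and add it to the map.
var points = [
  new GLatLng(38.89, -77.04),
  new GLatLng(38.90, -77.02),
  new GLatLng(38.88, -77.01),
  new GLatLng(38.89, -77.04)  // close the ring
];
var polygon = new GPolygon(points, '#FF0000', 2, 0.8, '#FF0000', 0.2);
map.addOverlay(polygon);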
I had a similar problem last year when I was developing a screensaver to render presidential polling data. I didn't really want to invest the time to parse the Shapefiles data on the census site (The spec is here if you missed it).
Not sure if I actually saved any time here, but I ended up writing a Python app to render the 50 states on screen, trace the edges, and then store the data in a simple text format. Not sure if my data is high-res enough for your application, but you can grab the data I generated here:
http://www.cannonade.net/pnt.zip
N.B. The data I generated are not latitudes/longitudes, but with some scaling you should be able to translate them.
Good luck.
I had better luck using the ARC files at http://www.census.gov/geo/www/cob/index.html
I can't find the webpage right now, but I did find one that had actual code. Google something like "arc to kml" and go from there.