I have a raw track log that's 400MB in size: one week of travel and 5000km logged. It's already stripped of non-essential data. The GPX file generated from it is around 80MB, or 2.5MB zipped, and the KML file is around 30MB. I am not sure how many waypoints there are, but it must be hundreds of thousands.
How would I go about mapping it on Google Maps? Is there a way to reduce the file without compromising the quality of my tracks? I am using jQuery and Google Maps API v2 or v3.
Thanks.
Well, it is very easy to use a KML file with the Google Maps API. An example would be the following:

    var kmlLayer = new google.maps.KmlLayer('http://myserver/kmlfile.kml');
    kmlLayer.setMap(map);
See the KmlLayer entry in the API 3.0 reference.
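For context, a minimal v3 setup around that call might look like the sketch below (the div id and the KML URL are placeholders; note that Google's servers fetch and render the file, so it must be publicly reachable):

    function initialize() {
      var map = new google.maps.Map(document.getElementById('map'), {
        center: new google.maps.LatLng(0, 0),
        zoom: 2,
        mapTypeId: google.maps.MapTypeId.ROADMAP
      });
      // Google fetches the KML server-side, so the URL must be publicly accessible.
      var kmlLayer = new google.maps.KmlLayer('http://myserver/kmlfile.kml');
      kmlLayer.setMap(map);
    }
    google.maps.event.addDomListener(window, 'load', initialize);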
But you have a problem with your large KML file. The current maximum size for an uncompressed KML file is 10MB, a lot smaller than your 30MB. See the reference on KML support.
In my experience, adding large KML files to a map is also very slow, so even loading a 9MB file, say, will take a lot of time. What I suggest is that you reduce the size of the file.
Do you really need all those hundreds of thousands of waypoints? What was your logging interval? Every 10 seconds? Do you need your location down to the meter, or isn't every 100 meters enough? By the way: are you building something web-based? I remember Google Earth supporting arbitrarily large KML files (certainly larger than the Maps API does).
I don't know of any track-smoothing programs or "waypoint removers", but such things surely exist, and that is what you will need to make your KML file smaller. (Also, 30MB still sounds very large to me. Are you sure it is only waypoints, or do you have descriptions and the like in the KML? Could you shorten these? Colors? Use shared KML styles instead of defining the color for each waypoint!)
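If you end up scripting the thinning yourself, here is a minimal sketch, assuming the track is already loaded as an array of google.maps.LatLng points and the API's geometry library is available (the function name and threshold are illustrative):

    // Keep only points at least minDist meters from the last kept point.
    // Requires the Maps API geometry library (...?libraries=geometry).
    function thinTrack(points, minDist) {
      var kept = [points[0]];
      for (var i = 1; i < points.length; i++) {
        var last = kept[kept.length - 1];
        if (google.maps.geometry.spherical.computeDistanceBetween(last, points[i]) >= minDist) {
          kept.push(points[i]);
        }
      }
      return kept;
    }
    // e.g. thinTrack(trackPoints, 100) keeps roughly one point per 100m travelled.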
I split a much smaller kml/kmz file into several chunks. However, I still noticed MASSIVE slowdowns when I tried to get near the limit. Typically Google Maps would not show the markers when I was at the full extent of the map, and would only do so if I zoomed in (so that it had far fewer markers to deal with).
Often it would only show the markers on a portion of the map tiles, and the other map tiles would be empty.
Even Chrome, which is the fastest browser for this, could barely handle it.
It would be worth considering importing your KML file into a Fusion Table:
http://www.google.com/fusiontables/public/tour/index.html
The subsequent performance will be spectacular compared to adding overlays based on a KML file.
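Once the KML is imported, displaying it is nearly a one-liner; a minimal sketch, assuming Maps API v3 and a table whose geometry sits in a column called location (the table ID is a placeholder):

    var layer = new google.maps.FusionTablesLayer({
      query: {
        select: 'location',   // the column holding your imported KML geometry
        from: '1234567'       // placeholder: your Fusion Table ID
      },
      map: map
    });

The layer is rendered as tiles on Google's servers, which is where the performance gain over client-side KML parsing comes from.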
For some magical reason, when you create a KMZ using GeoSetter > Export to Google Earth, the KMZs it creates load into Google Earth with no problem, even though they are much larger than 5MB.
I reduced a 5 MB kml file to about 1.3 MB. Step one is to remove all track points and many path points. Only two points are needed for a straight line but GPS receivers might record dozens.
There are automated tools to filter tracks but I prefer to hand-edit.
Step two is to optimize the style information in the kml file. Google Earth Desktop (GED) is useful for editing paths and setting styles, but it never seems to delete a style, so you end up with a megabyte (in my case) of duplicate styles.
The kml file is plain text, so any text editor can be used to merge styles and delete duplicates. This task is tedious and likely to result in errors, so I found a workaround:
Make a duplicate kml file as a backup and remove all the style information. This is simple and quick. Save the file and open it in GED.
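If you would rather script that stripping step, a minimal Node.js sketch might look like this (the file names are placeholders, and the regexes assume flat, machine-generated KML with no nested style elements):

    var fs = require('fs');

    var kml = fs.readFileSync('walks.kml', 'utf8');     // placeholder input file

    // Drop the shared style definitions and the references pointing at them.
    kml = kml.replace(/<Style\b[\s\S]*?<\/Style>/g, '')
             .replace(/<StyleMap\b[\s\S]*?<\/StyleMap>/g, '')
             .replace(/<styleUrl>[\s\S]*?<\/styleUrl>/g, '');

    fs.writeFileSync('walks-unstyled.kml', kml);        // re-style this copy in GED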
I organised all elements with shared styles into kml folders, then set the style of each folder, resulting in one definition for hundreds of paths or waypoints.
There is a final foible (a bug, in my opinion!) in GED: when you share a folder style, all the folders below it, including those that are not sub-folders, inherit the style. So drag the folder to be altered to the bottom of the list, set the shared style, and then drag it back to the top or middle.
I have folders for walks, public-transport-routes, long-distance-paths, bus-stops, train-stations, parking, pubs, points-of-interest and hazards so I need to set about nine styles.
This only takes a minute or two so it's an efficient solution not needing additional software.
As I add new walks and pubs to the file, the duplicate styles start to accumulate again, so from time to time the optimization has to be re-done. This is the optimized kml file.
Related
Is there any good library that will enable me to see the coarse location distribution?
I have some million data points to plot on a map. Doing it on Google Maps is going to be heavy. I want this to be like a dot plot.
The closest thing I can think of to the functionality you are looking for would be to implement a heatmap layer: https://developers.google.com/maps/documentation/javascript/examples/layer-heatmap.
This would allow you to see the distribution of points without plotting them individually, thus saving you some client-side work.
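A minimal sketch, assuming the points have been loaded as an array of google.maps.LatLng and the visualization library was requested in the API script URL (the variable names are illustrative):

    // Requires ...maps/api/js?libraries=visualization in the script tag.
    var heatmap = new google.maps.visualization.HeatmapLayer({
      data: latLngArray,   // your array of google.maps.LatLng points
      radius: 20           // pixel radius of influence per point
    });
    heatmap.setMap(map);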
How well this will work with a million-plus points, however, I am not sure. You might be better off looking into a Fusion Tables combination.
I'm trying to load a 500MB shapefile into GeoServer and get it to respond to a client request within a reasonable time frame (it currently doesn't respond even after 30 minutes of waiting). I want it to deliver image tiles; I'm using the Google Maps API v3 ImageMapType to automatically request the correct tiles using the GeoServer WMS URL. The layer consists of hundreds of thousands of polygons for coastal Tasmania, so the layer is very sparse. I've tried:
Creating a tile cache (but the ETA is 15 years in zoom range 13 to 18) and it creates a lot of blank tiles (est. >95%)
Removing all attributes in the layer before loading into GeoServer (still waited an hour for it to begin seeding tile cache and still gave no progress)
Merging the polygons so there are only 10 polygons in the layer (same behaviour)
Using the bounds options in the tile cache (same behaviour)
[edit] Reprojecting the layer into EPSG:900913 (same behaviour)
Cutting the layer into 12 sections to reduce empty space, loading them as a Layer Group, and seeding the tile cache from this (even 1 of these layers wouldn't begin seeding - too big still?)
The next option we're looking at is breaking the layer into 1km grids and loading all 8000 layers as a Layer Group. I'm doubtful this will work. However, one of these layers DID work when seeding the cache, and it only took a few seconds for all zoom levels.
How do I get GeoServer to serve this large, sparse data? Surely other people have this issue? Do I need to do something special with the layer itself? Or is there something in GeoServer that I should be configuring?
To start: a 500MB map should be peanuts for GeoServer, unless you bought your hardware well over a decade ago. I work with much larger datasets on a daily basis.
Perhaps you are letting GeoServer access the shapefile directly from disk?
I'd recommend the following setup:
Make sure you have enough RAM installed. I just saw that I could buy 24GB for less than 80 euros. That should be enough to cache your database entirely;
Install Postgres with PostGIS extensions;
To make sure that no re-projection is necessary, you can pre-convert all coordinates to Google's Mercator projection (EPSG:900913);
Make sure you have a spatial index on the geometry column (see the sketch after this list);
If your map is static, you can pre-render the tiles. This is really going to give a big boost in performance. Try to find out down to which zoom level you can pre-render within a reasonable time. The zoomed-in images are usually faster anyway, because fewer elements are involved in creating the image;
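A minimal sketch of the re-projection and indexing steps, run as SQL from Node.js (the connection string, table name coastline, and column name geom are all assumptions; requires the pg module):

    var pg = require('pg');   // npm install pg

    var client = new pg.Client('postgres://user:pass@localhost/gisdb'); // placeholder
    client.connect(function (err) {
      if (err) throw err;
      // Pre-convert the geometry to Google's Mercator once,
      // so GeoServer never has to reproject per request.
      client.query(
        'ALTER TABLE coastline ' +
        'ALTER COLUMN geom TYPE geometry(MultiPolygon, 900913) ' +
        'USING ST_Transform(geom, 900913);',
        function (err) {
          if (err) throw err;
          // Make sure bounding-box lookups hit a spatial index.
          client.query('CREATE INDEX coastline_geom_idx ON coastline USING GIST (geom);',
            function (err) {
              if (err) throw err;
              client.end();
            });
        });
    });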
Furthermore, I doubt that you're ever going to get a result if it took 30 minutes; the web server will have timed out long before that. Download the image manually by pasting the URL into the browser. If you don't see a picture, open the downloaded file in a text editor: there might be a textual error message in it instead of binary image data. That error message usually describes what the problem is.
I am working on creating a tool to select a place via the Google Maps API. This part is simple and I have no problems with it, but it is too specific. I want to do something similar to this, but I need to accommodate the entire world, not just zip codes.
I would think the same effect should be achievable by highlighting an administrative level. Does this require a KML file, and if so, can anyone point me to a provider that might have this level of detail?
You can get administrative boundaries for the world here: http://www.gadm.org/ or here: http://www.unsalb.org/. You will not be able to achieve this with a single KML file though, as it would be one big file (I am talking about at least four hundred MB, and there is a 4MB limit for KML files on Maps).
You will need to store the data in a database, pull out only the relevant boundaries, and draw them as polygons on the map; a sketch of that idea follows below. In other words, it sounds like you might need a GIS server (open-source or ArcGIS), although you could build it from scratch using any geo-enabled database such as MySQL or MSSQL.
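Client-side, the pull-only-what's-visible idea might look like this minimal sketch, assuming a hypothetical /boundaries endpoint that returns rings of {lat, lng} objects for the requested bounding box (jQuery used for the request):

    // Fetch only the boundaries for the visible area and draw them.
    function loadBoundaries(map) {
      var b = map.getBounds();
      $.getJSON('/boundaries', {            // hypothetical server endpoint
        south: b.getSouthWest().lat(), west: b.getSouthWest().lng(),
        north: b.getNorthEast().lat(), east: b.getNorthEast().lng()
      }, function (rings) {
        rings.forEach(function (ring) {     // each ring: [{lat: ..., lng: ...}, ...]
          new google.maps.Polygon({ paths: ring, strokeWeight: 1, fillOpacity: 0.2, map: map });
        });
      });
    }

    // Re-query whenever the user stops panning or zooming.
    google.maps.event.addListener(map, 'idle', function () { loadBoundaries(map); });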
I have a JSON file that's roughly 480MB of geolocation points. I was wondering if someone knows of a good 'pattern' to use when trying to visualise the data. The problem I'm encountering is that the data has to be loaded into Google Maps from the get-go. This is causing all kinds of obvious issues.
I don't have to do this through google. It just seemed like the obvious choice.
With that much data, it may make more sense to handle it on the server side instead of the client side. You could set up GeoServer with your data points. Using OpenLayers, you could overlay your points from GeoServer on top of Google Maps, or potentially even on top of your own map if you want to cut out Google Maps altogether. The heavy-duty processing then happens on the server and only images are displayed in the browser. This cuts down on network traffic and the amount of processing the browser has to do. If you set up GeoServer to do caching, the server won't even have to work very hard.
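A minimal sketch of that overlay with OpenLayers 2 (the WMS endpoint and the workspace:layer name are placeholders; assumes the OpenLayers and Google Maps scripts are already loaded):

    var map = new OpenLayers.Map('map');
    map.addLayer(new OpenLayers.Layer.Google('Google Streets'));  // base map
    map.addLayer(new OpenLayers.Layer.WMS(
      'points',
      'http://myserver/geoserver/wms',                            // placeholder WMS endpoint
      { layers: 'myworkspace:points', transparent: true, format: 'image/png' },
      { isBaseLayer: false }                                      // draw on top of Google
    ));
    map.zoomToMaxExtent();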
It really depends on what kind of data this is.
If these are points for polylines or polygons, you might try encoding the points (http://code.google.com/apis/maps/documentation/utilities/polylinealgorithm.html and http://code.google.com/apis/maps/documentation/utilities/polylineutility.html). There are also API functions you can use to encode the points. This will significantly reduce the size of your data.
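In Maps API v3 the geometry library exposes this encoding directly; a minimal sketch, assuming ...?libraries=geometry in the script URL and an existing array of LatLngs:

    var encoded = google.maps.geometry.encoding.encodePath(trackPoints); // compact string
    var path = google.maps.geometry.encoding.decodePath(encoded);        // back to LatLngs
    new google.maps.Polyline({ path: path, map: map });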
You might also want to consider loading data depending on the zoom level of the map. (I am not sure what you mean by "the data has to be loaded from the get-go"; you can load data into the map in response to events, etc.)
The Fusion Tables mentioned above will only accept 100MB of data.
I can be more specific if you explain the nature of your data and what you are trying to do in more detail. Hope this helps.
Try Google Fusion Tables
How do I take shapefiles and extract lat/lng coords so I can plot polygons on Google Maps?
http://www2.census.gov/cgi-bin/shapefiles/national-files
I asked this question here:
http://groups.google.com/group/Google-Maps-API/browse_thread/thread/18763b4b0cb996c7
and they told me WHAT to do, but not HOW to do it =P
Thx!
It depends on how you need to accomplish this. If you just need a few shapes, you can look up the coordinates in those files yourself. You can use those coordinates to create a GPolygon in Google Maps.
If you need lots of shapes, you'll need to do it programmatically. I would suggest using your favorite language to parse the files and retrieve the coordinates for each shape.
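The last step in Maps API v2 would look something like this minimal sketch, once you have extracted the coordinates (the values below are made up):

    // Coordinates extracted from the shape data (illustrative values).
    var points = [
      new GLatLng(42.35, -71.06),
      new GLatLng(42.36, -71.05),
      new GLatLng(42.34, -71.04)
    ];
    // GPolygon(points, strokeColor, strokeWeight, strokeOpacity, fillColor, fillOpacity)
    map.addOverlay(new GPolygon(points, '#ff0000', 2, 0.8, '#ff0000', 0.3));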
I had a similar problem last year when I was developing a screensaver to render presidential polling data. I didn't really want to invest the time to parse the shapefile data on the census site (the spec is here if you missed it).
Not sure if I actually saved any time here, but I ended up writing a Python app to render the 50 states onscreen, trace the edges, and then store the data in a simple text format. Not sure if my data is high-res enough for your application, but you can grab the data I generated here:
http://www.cannonade.net/pnt.zip
N.B. The data I generated are not latitudes/longitudes, but with some scaling you should be able to translate them.
Good luck.
I had better luck using the ARC files at http://www.census.gov/geo/www/cob/index.html
I can't find the webpage right now, but I did find one that had actual code. Google something like "arc to kml" and go from there.