Alternative to Google Maps - google-maps

My client wants some of the functionality of Google Maps, namely:
- geocoding
- generating maps with points based on postal code or longitude/latitude
- optimal trip mapping
Their issues with Google Maps:
- cannot control outages
- postal codes are sometimes inaccurate or not updated frequently for Canada/UK
- they have no way to correct inaccurate information
They would prefer to host the mapping application themselves, but will require postal code updates.
Can anyone suggest such a product?
Thanks.

"cannot control outages - postal codes are sometimes inaccurate or not updated frequently for Canada/UK - they have no way to correct inaccurate information"
Outages
Hosting your own mapping is the only way to control this, but you would be very hard pressed to beat Google Maps / Bing Maps uptime over the last 5 years. Take a look at the following:
OpenStreetMap for the road data: this is open source data, very good in the UK (I'm not sure about Canada), and you can make your own changes and submit them (or just change the data you have downloaded).
GeoServer, Mapnik or MapServer will read OpenStreetMap data and create the image tiles needed to build your own maps in whatever style you wish. If you don't need all countries and all zoom levels, these products can create all the tiles you will need in advance; usually, though, tiles have to be created in real time and cached. You need a BIG, fast server to manage tile crunching.
OpenLayers or Leaflet are open source JavaScript mapping platforms that will display your tiles for you (a minimal display sketch follows below).
Obviously this is just for road maps; aerial imagery would cost you an absolute fortune.
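As an illustration of the display step, here is a minimal sketch using folium, a Python wrapper around Leaflet (folium is my suggestion, not something named above); the tile URL is a hypothetical placeholder for whatever GeoServer/Mapnik/MapServer endpoint you end up running.

```python
# Minimal sketch: point a Leaflet map (via folium) at a self-hosted tile server.
# The tile URL below is a placeholder for your own rendering stack.
import folium

m = folium.Map(
    location=[51.5074, -0.1278],                          # initial centre (London here)
    zoom_start=12,
    tiles="https://tiles.example.com/{z}/{x}/{y}.png",    # hypothetical tile server URL
    attr="Map data (c) OpenStreetMap contributors",       # attribution is required for OSM data
)
m.save("map.html")                                        # open map.html in a browser to view
```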
Post Code Data
Many people do not realize that UK postcode data for latitude and longitude is now completely free and available to download every quarter from the official source (Ordnance Survey): http://www.ordnancesurvey.co.uk/oswebsite/products/code-point-open/index.html.
This is the same data source Google will use, and there is none better, but it will always contain inaccuracies and always be a few months out of date.
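For a sense of what consuming that download looks like, here is a hedged Python sketch. The file path and column positions are assumptions based on the usual Code-Point Open CSV layout (postcode first, OSGB36 eastings/northings in the third and fourth columns), so check the headers shipped with your own download; pyproj converts British National Grid coordinates to latitude/longitude.

```python
# Sketch of building a postcode -> (lat, lon) lookup from a Code-Point Open CSV.
# Column positions and the file path are assumptions; verify against the header
# document in your download. pyproj converts EPSG:27700 (eastings/northings)
# to EPSG:4326 (WGS84 latitude/longitude).
import csv
from pyproj import Transformer

to_wgs84 = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)

postcodes = {}
with open("codepoint/Data/CSV/ab.csv", newline="") as f:   # one file per postcode area
    for row in csv.reader(f):
        postcode, easting, northing = row[0], float(row[2]), float(row[3])
        lon, lat = to_wgs84.transform(easting, northing)   # x = easting, y = northing
        postcodes[postcode.replace(" ", "")] = (lat, lon)

print(postcodes.get("AB101AB"))
```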
Finally
Hopefully that answers the question you asked and gives you information to inform your client. Now for the question you didn't ask: "Is this approach good value to my client?".
I won't presume to know your business or client; however, what I described above is possible, but with one to many months of work involved to get it all working together, and even then it won't have anywhere near the performance or uptime of something like Google/Bing Maps, and it only offers a small subset of their features.

I think you're looking for something like Caliper. It's a very custom, and I would expect expensive, solution. Not suggested.
http://www.caliper.com/GISMappingSoftwareDevelopment.htm
One solution could be to use two different mapping services and compare their results; this way there's a much better chance the data is accurate. You can also fix inaccurate data by creating a system which acts as a barrier between the API and your user, where data you know is inaccurate is corrected before it's displayed. Not sure exactly what you're doing though, so this might not work for you.
Is trip mapping/routing the basic functionality you want to do?

Before rushing into rolling your own, I'd suggest a good think about the consequences of doing so. The first that springs to mind is that whilst the pro is that you now control your data, the con is that you now control your data.
So you are going to have to consider where and when you get updates and the processes you will have to employ to keep your maps in sync with the rest of the world. There are a lot of headaches involved in these things, which is why so many people use externally hosted solutions such as Google's.

Related

Should I use Google Maps API/Geocoding to power a store finder

I'm new to geocoding, so I'm not certain this is even the question I should be asking, but all of the other discussions I've seen on this topic (here and on the Google API forum) are so application-specific that I feel like I might be missing a very elementary step - I don't need to know how to implement a store finder - I need to know if I should.
Here is my specific situation - I have been contracted to design an application wherein we will build a database of shops (say, independently owned bars and pubs). This list will continually grow and change as shops close and new ones open. The user can enter his/her point of origin (zip code or address) and be shown a list or map containing all the various shops within a given radius in order of proximity.
I know how to deliver these results from a static database:
One would store the longitude and latitude as columns for each row and then just use that information to check distances.
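For illustration, a minimal sketch of that distance check (the shop names, coordinates and radius below are invented):

```python
# Minimal sketch of the "check distances" step: great-circle (haversine)
# distance between each stored shop and the user's point of origin.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Approximate distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))        # 6371 km = mean Earth radius

# e.g. keep shops within a 5 km radius, sorted by proximity (invented sample data)
shops = [("The Anchor", 51.5033, -0.1196), ("The Crown", 51.5155, -0.0922)]
origin = (51.5074, -0.1278)
nearby = sorted(
    (s for s in shops if haversine_km(*origin, s[1], s[2]) <= 5.0),
    key=lambda s: haversine_km(*origin, s[1], s[2]),
)
print(nearby)
```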
But I have inherited an (already fairly large) database of shops which have addresses but not coordinates - so I'm not sure what the best way to get those coordinates is. I could write a script to query them one at a time against Google geocoding, I could have a data entry person manually look up the coordinates for each one and populate the data that way, or maybe there is a third option I'm not aware of.
Is this the right place to be asking this question? Google Maps Geocoding doesn't host a forum of its own, but refers people to Stack Overflow. Other forums on the net dealing with this topic all relate to a specific technical question, but no one seems to be talking about it from a top-down perspective (i.e. the big picture).
Google imposes a 2,500 queries per day limit on free users and a 100,000 queries per day limit on paid ones - neither of these seems up to the task of a site with even moderate traffic if, every time a user makes a request, the entire database (perhaps thousands of shops) is being checked against Google's data. It seems certain we must store the coords locally, but even storing them locally, there will have to be checks against Google in order to plot them on a map. If I had a finite number of locations (if, for example, I had six hardware shops) and I wanted to make a store locator, there would be a wealth of discussions, tutorials, and Stack Overflow questions available to point the way for me, but I'm dealing with a potentially vast number of records and not sure how to proceed or where to begin.
Any advice would be welcome - Additionally, if this is not the best place to be asking this question, a helpful response would be to indicate a better place to post it. I've searched for three days but haven't found what looks like a good resource for asking such subjective questions.
The best way, of course, would be to use a geocoding service to get the coordinates and store them in your DB. But that's not possible with Google's geocoding service, because it's not permitted to store geocoded data permanently.
There are free services without this restriction; some keywords to search for: MapQuest, Nominatim, GeoNames (but these services are less accurate than Google).
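As a hedged sketch of backfilling the inherited addresses with one of those free services, here is what a Nominatim run via the geopy library might look like; the addresses, user agent string and storage step are placeholders for your own setup.

```python
# Hedged sketch: batch-geocode inherited addresses with Nominatim via geopy.
# Nominatim's usage policy asks for a descriptive user agent and roughly one
# request per second, hence the sleep. The address list is a placeholder for
# rows pulled from your own database.
import time
from geopy.geocoders import Nominatim

geocoder = Nominatim(user_agent="shop-finder-backfill")   # use your own identifier

addresses = ["123 Example St, Toronto, ON", "1 High St, Oxford, UK"]  # from your DB
for address in addresses:
    location = geocoder.geocode(address)
    if location is not None:
        print(address, location.latitude, location.longitude)  # store these in your DB
    time.sleep(1)   # be polite to the free service
```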
Another option would be to use a FusionTable. The geocoding would run automatically (but the daily limits are the same as for the geocoding service). The benefit: the geocoding is permanent (although you can't access the locations directly, e.g. by downloading the DB dump), but you may use the coordinates for plotting markers (via a FusionTablesLayer) or filtering (e.g. by distance).
The number of entries shouldn't be an issue; 100k is no problem for a database.

How to use our own data to create a map layer dynamically?

We are creating a speed limit map application using different colors to highlight street with different speed limits (similar to ITO speed limit map: http://www.itoworld.com/map/124?lon=-79.37151&lat=43.74796&zoom=12).
The problem we have is that we are conducting our own research and have our own speed limit data, instead of pulling the data from OpenStreetMap or Google Maps like the ITO map does. We also need to create a data store in order to dynamically update the map as we add more speed limit information in the future.
Is there any way to create our own instance of OpenStreetMap and replace only the speed limit information with our own data? We don't have any vector data and we have no experience working with it.
Is there any suggestion of tools to use for creating highlighting layers based on the speed limit we have? Is OpenLayers a good option?
Any help is appreciated, thank you very much.
Update 2013/11/20
Thank you very much for your answers; now we have a much better understanding of our problem. This is a university design project, so we basically have no budget. We are looking for:
1) A basic "base map" that includes the basic tile information (OpenStreetMap seems a good choice, since the Google Maps API doesn't provide free road information as far as we can find)
2) A geo data server that can host our own street speed limit data (it looks like GeoServer and MapServer are good choices), or a simple database design that can fulfill our needs (we don't know if that is possible yet)
3) A plotting tool that can render our speed limit data as a "group of lines" on the map, since these data will change frequently (OpenLayers and Leaflet are good candidates).
Is there anything else needed?
What you want to do is a trivial programming task once you have decided a few things.
These are probably the three biggest questions you need to answer. I added some commentary, but look at each of these questions beyond this post to find what works for you.
Who do you want to use for your map? Since you only have one type of data, you will want to display it on someone else's nice-looking map. The big choices are Bing, Google, OpenLayers/OSM, and ESRI. Your choice will most likely be driven by the licensing of the above services and whether you are willing to pay or not. A need to support mobile devices may also factor into your decision. Since the map is what your users will see, choose the best-looking map you can afford.
How will you serve up your data? You have several options to serve your speed limit data. GeoServer, MapServer and ESRI's products are some popular mapping software packages. If you are only displaying a few layers of data, all of this mapping software will be overkill. The actual software used to render your map data will most likely affect only your pocket book, so free is usually good here.
Tiles vs Lines
You will serve your data either as a group of lines sent to the browser, or as pre-rendered tiles loaded on top of the base map. If your data changes frequently, you will want to serve it dynamically as line data (an array of points). If your data does not change frequently, you should consider tiling your data. Tiling involves pre-rendering the entire map at all zoom levels. This allows the map to load very fast and is how almost all base maps are rendered. The downside is that tile generation can take a long time, and tiles can take up a large amount of space.
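To make the "group of lines" option concrete, here is a hedged sketch of packaging speed limit segments as GeoJSON, which both OpenLayers and Leaflet can consume; the street names, coordinates and property names are invented.

```python
# Sketch: package speed limit segments as a GeoJSON FeatureCollection, the kind
# of line payload a browser map can style by speed limit. All values invented.
import json

segments = [
    {"name": "Main St", "speed_limit": 40, "coords": [[-79.3715, 43.7480], [-79.3690, 43.7495]]},
    {"name": "King Rd", "speed_limit": 60, "coords": [[-79.3800, 43.7400], [-79.3755, 43.7422]]},
]

feature_collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "LineString", "coordinates": seg["coords"]},
            "properties": {"name": seg["name"], "speed_limit": seg["speed_limit"]},
        }
        for seg in segments
    ],
}

with open("speed_limits.geojson", "w") as f:
    json.dump(feature_collection, f)       # serve this file (or the dict) to the client
```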
This is a very broad question. There are many components to drawing your own speed limit map.
On the front-end, there is a web browser map interface. OpenLayers is good at that. There are plenty of other tools that can do this as well, such as Leaflet or even Google Maps API.
Next is something to provide the actual speed limit route data. This can be served as a vector layer or a raster layer. There are plenty of tools here, too. UMN MapServer is free and reasonably good. ESRI makes a whole fleet of products in this area as well.
The speed limit route data also needs to be saved somehow. This can be done in files or in a database such as PostGIS. Again, lots of great options.
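As a hedged sketch of the PostGIS option (the table name, columns and connection string below are invented, and ST_GeomFromText/ST_AsGeoJSON are standard PostGIS functions):

```python
# Hedged sketch: store speed limit lines in PostGIS via psycopg2.
# Connection details and table layout are invented examples.
import psycopg2

conn = psycopg2.connect("dbname=speedmap user=mapper")     # adjust for your setup
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS speed_limits (
            id serial PRIMARY KEY,
            name text,
            speed_limit integer,
            geom geometry(LineString, 4326)   -- requires the PostGIS extension
        )
    """)
    cur.execute(
        "INSERT INTO speed_limits (name, speed_limit, geom) "
        "VALUES (%s, %s, ST_GeomFromText(%s, 4326))",
        ("Main St", 40, "LINESTRING(-79.3715 43.7480, -79.3690 43.7495)"),
    )
    cur.execute("SELECT name, speed_limit, ST_AsGeoJSON(geom) FROM speed_limits")
    for row in cur.fetchall():
        print(row)
```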
It is the role of the system architect to determine which technologies to employ to solve the problem.

Compiling a list of all colonies/neighborhoods for a particular city

I want a list of locations (coordinates) for all possible colonies/neighborhoods of some Indian cities. Take for example Delhi. Can this data be obtained with the Places API?
The only thing that comes to my mind is to use a query like -
https://maps.googleapis.com/maps/api/place/search/xml?location=28.540346,77.210026&radius=500&types=administrative_area_level_1|administrative_area_level_2|administrative_area_level_3|locality|neighborhood|street_address|sublocality|sublocality_level_4|sublocality_level_5|sublocality_level_3|sublocality_level_2|sublocality_level_1|subpremise&sensor=false&key=MYKEY
and then keep changing the radius in steps of 500 until the whole city is covered.
Is there a better way of doing this?
Given how often you would need to do this for your map, and since caching that data goes against the terms of service, this is not a great approach. If your map gets any decent usage, you'll rapidly hit your quota. Plus, you only get the center points of the colonies/neighborhoods. I'd recommend trying to find another source for that data which you can download. The Places API was not designed with this in mind.

How Transport for London Website Works

Let me clarify: I have no intention of peeping into TfL's database and other related information, nor any evil intentions towards them.
But, of course, millions of users benefit greatly from the way it serves up the information.
http://journeyplanner.tfl.gov.uk/
So, if we want to create a site like TfL's Journey Planner, what are the basic things we need to keep in mind?
Which architecture should we use?
Can we create this website using ASP.NET? (We should be able to.)
Is TfL integrating its website with Google Maps or any other GPS service?
Edit:
When you enter the ZIP/PIN code or station name, it automatically creates a map from source to destination and also calculates the distance.
My question here is: how do they calculate the distance? Do they use maps or GPS, or have they created their own web service?
To answer the points, in order:
Which architecture should we use?
One you know and understand; there is more than one possible approach for doing something similar.
Can we create this website using ASP.NET? (We should be able to.)
You could. Similarly, you could do it as a Java Servlet or PHP application. If you were feeling particularly warped, you could probably make something work in pure Javascript (but your clients might hate you)
Is TfL integrating its website with Google Maps or any other GPS service?
They're more likely using Ordnance Survey data, that they've rendered their own maps from (certainly if you pan right out, coverage runs out quite quickly).
From a routing perspective, they're probably using something like Dijkstra's Algorithm, although it's probably very optimised to cope with timetabling.
There are numerous algorithms for routing, which boil down to "relative cost" (where that cost may be distance, time, money, or a combination). Not taking timetables into consideration, you can precalculate the costs between connected nodes (e.g. Liverpool St -> Bank via the Central Line is ~5 mins); this would give a baseline for something like Dijkstra, although you'd still need to factor in the cost of interchanging between modes of transport, waiting for connections to arrive, etc.
You might want to look into routing algorithms in general (there's even info over on OpenStreetMap's wiki) before looking into the complexities introduced with timetabled services.
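For a flavour of the routing side, here is a minimal Dijkstra sketch over a toy graph of travel times in minutes; the stations and edge costs are invented, and a real journey planner would layer timetables and interchange penalties on top.

```python
# Minimal Dijkstra's algorithm over a toy graph of travel times (minutes).
# Stations and edge costs are invented for illustration.
import heapq

graph = {
    "Liverpool St": {"Bank": 5, "Moorgate": 3},
    "Bank":         {"Liverpool St": 5, "Waterloo": 4},
    "Moorgate":     {"Liverpool St": 3, "Bank": 2},
    "Waterloo":     {"Bank": 4},
}

def shortest_path(source, target):
    """Return (total_cost, path) using Dijkstra's algorithm."""
    queue = [(0, source, [source])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == target:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, edge_cost in graph[node].items():
            if neighbour not in seen:
                heapq.heappush(queue, (cost + edge_cost, neighbour, path + [neighbour]))
    return float("inf"), []

print(shortest_path("Liverpool St", "Waterloo"))   # -> (9, ['Liverpool St', 'Bank', 'Waterloo'])
```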

Optimal map routing with Google Maps

Is there a way using the Google Maps API to get back an "optimized" route given a set of waypoints (in other words, a "good-enough" solution to the traveling salesman problem), or does it always return the route with the points in the specified order?
There is an option in Google Maps API DirectionsRequest called optimizeWaypoints, which should do what you want. This can only handle up to 8 waypoints, though.
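For the web service flavour of the API, the same behaviour is exposed as an optimize:true prefix on the waypoints parameter; a hedged Python sketch with placeholder locations and API key might look like this:

```python
# Hedged sketch: ask the Directions web service to reorder waypoints.
# "optimize:true" corresponds to optimizeWaypoints in the JS API; the
# locations and the API key below are placeholders.
import requests

params = {
    "origin": "Toronto, ON",
    "destination": "Toronto, ON",
    "waypoints": "optimize:true|Hamilton, ON|Kingston, ON|Ottawa, ON",
    "key": "YOUR_API_KEY",
}
resp = requests.get("https://maps.googleapis.com/maps/api/directions/json", params=params).json()

if resp.get("routes"):
    # waypoint_order gives the optimized visiting order of the waypoints above
    print(resp["routes"][0]["waypoint_order"])
```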
Alternatively, there is an open source (MIT license) library that you can use with the Google Maps API to get an optimal (up to 15 locations) or pretty close to optimal (up to 100 locations) route.
See http://code.google.com/p/google-maps-tsp-solver/
You can see the library in action at www.optimap.net
It always gives them in order.
So I think you'd have to find the distance (or time) between each pair of points, one at a time, then solve the traveling salesman problem yourself. Maybe you could convince Google Maps to add that feature though. I guess what constitutes a "good enough" solution depends on what you're doing and how fast it needs to be.
Google has a ready-made solution for the Traveling Salesman Problem. It is OR-Tools (Google's Operations Research tools), which you can find here: https://developers.google.com/optimization/routing/tsp
What you basically need to do is two things:
Get the distances between each two points using Google Maps API: https://developers.google.com/maps/documentation/distance-matrix/start
Then feed the distances, as an array, to OR-Tools and it will find a very good solution for you (for certain instances with millions of nodes, solutions have been found that are guaranteed to be within 1% of an optimal tour). A sketch follows below.
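A hedged sketch of the OR-Tools step, using a small invented distance matrix in place of real Distance Matrix API output:

```python
# Sketch: solve a small TSP with OR-Tools. The 4x4 distance matrix is an
# invented stand-in for distances fetched from the Distance Matrix API.
from ortools.constraint_solver import pywrapcp, routing_enums_pb2

distance_matrix = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]

manager = pywrapcp.RoutingIndexManager(len(distance_matrix), 1, 0)  # 1 vehicle, depot = node 0
routing = pywrapcp.RoutingModel(manager)

def distance_callback(from_index, to_index):
    # Convert internal routing indices back to node indices in the matrix.
    return distance_matrix[manager.IndexToNode(from_index)][manager.IndexToNode(to_index)]

transit_index = routing.RegisterTransitCallback(distance_callback)
routing.SetArcCostEvaluatorOfAllVehicles(transit_index)

search_params = pywrapcp.DefaultRoutingSearchParameters()
search_params.first_solution_strategy = routing_enums_pb2.FirstSolutionStrategy.PATH_CHEAPEST_ARC

solution = routing.SolveWithParameters(search_params)
if solution:
    index, order = routing.Start(0), []
    while not routing.IsEnd(index):
        order.append(manager.IndexToNode(index))
        index = solution.Value(routing.NextVar(index))
    print("Tour:", order, "cost:", solution.ObjectiveValue())
```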
You can also note that:
In addition to finding solutions to the classical Traveling Salesman Problem, OR-Tools also provides methods for more general types of TSPs, including the following:
- Asymmetric cost problems: the traditional TSP is symmetric (the distance from point A to point B equals the distance from point B to point A), but the cost of shipping items from point A to point B might not equal the cost of shipping them from point B to point A. OR-Tools can also handle problems that have asymmetric costs.
- Prize-collecting TSPs, where benefits accrue from visiting nodes
- TSP with time windows
Additional links:
OR-Tools on GitHub: https://github.com/google/or-tools
Get Started: https://developers.google.com/optimization/introduction/get_started
In a typical TSP, the assumption is that one can travel directly between any two points. For surface roads, this is never the case. When Google calculates a route between two points, it runs a heuristic optimization over the road network and usually comes up with a path fairly close to optimal.
To calculate a TSP route, one would first have to ask Google to calculate the pair-wise distance between every node in the graph. I think this requires n*(n-1)/2 calculations (for symmetric distances). One could then take those distances and perform a TSP optimization on them.
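As a rough sketch of that pair-wise step using the Distance Matrix API (locations and API key are placeholders, and real usage has to respect the API's per-request element limits and quotas):

```python
# Hedged sketch: fetch pair-wise driving distances for a small set of points
# from the Distance Matrix API. Locations and the API key are placeholders.
import requests

points = ["Toronto, ON", "Hamilton, ON", "Kingston, ON", "Ottawa, ON"]
params = {
    "origins": "|".join(points),
    "destinations": "|".join(points),
    "key": "YOUR_API_KEY",
}
resp = requests.get("https://maps.googleapis.com/maps/api/distancematrix/json", params=params).json()

# rows[i].elements[j] holds the distance/duration from points[i] to points[j]
matrix = [
    [element["distance"]["value"] for element in row["elements"]]   # metres
    for row in resp["rows"]
]
print(matrix)
```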
OpenStreetMap.org has a Java WebStart application which may do what you want. Of course, the calculations are run client-side. The project is open source and may be worth a look.
Are you trying to find an optimal straight line path between locations, or the optimal driving route? If you just want to order the points, if you can get the GPS coordinates, it becomes a very easy problem.
Just found http://gebweb.net/optimap/ - it looks nice and easy. It's an online version using Google Maps.