Script to estimate itinerary duration based on traffic - google-maps

To get to work I can take two different buses. I need a script that tells me which one is faster, based on current traffic.
Is it possible, with Google Maps API, to get a duration estimation of an itinerary, provided a list of waypoints?

The DirectionsRequest object has a durationInTraffic property:
Whether or not we should provide trip duration based on current traffic conditions. Only available to Maps API for Work customers.
As you can read in the documentation, this is not available in the free API version.
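Independently of the traffic restriction, the overall shape of such a script could look like the sketch below. The route and stop names are hypothetical; the actual DirectionsService call is left commented out because it needs a loaded API and a valid key, and note that the Directions service only supports waypoints in driving/walking/bicycling modes, not transit.

```javascript
// Build a DirectionsRequest for one candidate route.
// Stop names are placeholders. DRIVING is used because waypoints are not
// supported in TRANSIT mode; travelMode accepts the plain string "DRIVING".
function requestFor(origin, destination, waypoints) {
  return {
    origin: origin,
    destination: destination,
    waypoints: waypoints.map((w) => ({ location: w, stopover: true })),
    travelMode: "DRIVING",
  };
}

// Pure helper: given { routeName: durationInSeconds }, pick the fastest.
function fastestRoute(durations) {
  return Object.keys(durations).reduce((best, name) =>
    durations[name] < durations[best] ? name : best
  );
}

// Usage sketch (requires the Maps JavaScript API to be loaded):
// const svc = new google.maps.DirectionsService();
// svc.route(requestFor("Home", "Work", ["Bus stop A"]), (result, status) => {
//   if (status === "OK") {
//     console.log(result.routes[0].legs[0].duration.value); // seconds
//   }
// });
```

You would issue one such request per candidate route, collect the returned durations, and feed them to `fastestRoute`.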

Related

SKU Usage Per Session Or Per Object/Point On My Map?

I wish Google would break this down with a clear explanation.
Say I were a manager with no development experience: there is no way I could get the answer from the information Google support provides for everyone.
I just don't understand how SKU usage works. Is a SKU counted per point on a map? Per whole set of points? If I have 30 points on my route, does that mean I used 30 SKUs?
Can someone explain this?
If you are using Google Maps Platform APIs, be aware that they use a pay-as-you-go pricing model, meaning all APIs under Google Maps Platform are billed by SKU. A SKU is a combination of the product API and the service or function you used. Each combination has a price, and cost is calculated as SKU usage × price per use.
So you will be billed depending on which services you are using. You can learn more about this on this link.
If you have a more specific question about your project and want a more personalized channel, I suggest you create a support case to reach out to the support team.
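As a toy illustration of that formula (the SKU names and prices below are invented placeholders, not real rates; always check the current pricing sheet):

```javascript
// cost = sum over SKUs of (usage x price per use).
// A SKU here is just a string key like "geocoding" or "dynamic-maps".
function monthlyCost(usageBySku, pricePerUseBySku) {
  let total = 0;
  for (const sku of Object.keys(usageBySku)) {
    total += usageBySku[sku] * (pricePerUseBySku[sku] || 0);
  }
  return total;
}
```

The important point is that usage is counted per billable request or map load as defined for each SKU, not per individual point you draw on the map.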

Google API Future Prediction of Traffic = Typical Traffic shown on Google Maps (maps.google.com)

I just wanted to ask if the future traffic information given in the Google API (Google Distance Matrix API to be exact) is the same as the traffic data used in the typical traffic overlay in the Google maps app. Aren't they both based on historical data, and thus, should be similar if not the same?
Thank you and I will appreciate your help!
Yes, as long as you don't specify a traffic_model other than best_guess.
They both use historical traffic data to predict "likely" traffic conditions in the near future, based on the time of day and day of week. However, your API requests may not always carry exactly the same meaning as your input on Google Maps, so there may be cases where each returns slightly different results. Should you find big differences, please file a bug.
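For reference, a sketch of how the traffic_model parameter appears in a Distance Matrix web-service request. The endpoint and the departure_time/traffic_model parameters are real; the key is a placeholder:

```javascript
// Build a Distance Matrix web-service URL that asks for a traffic forecast.
// traffic_model=best_guess matches what the Maps app shows by default.
function distanceMatrixUrl(origin, destination, trafficModel) {
  const params = new URLSearchParams({
    origins: origin,
    destinations: destination,
    departure_time: "now", // or a future Unix timestamp for a prediction
    traffic_model: trafficModel || "best_guess",
    key: "YOUR_API_KEY",
  });
  return "https://maps.googleapis.com/maps/api/distancematrix/json?" + params;
}
```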

Google Javascript API Geocoding Limits

What are the limits for client side geocoding with Google Maps JavaScript API v3?
My research:
Google Maps PHP API has a limit of 2500 geocode requests per day (https://developers.google.com/maps/documentation/geocoding/#Limits)
Google Maps Javascript API v3 has a limit of 25000 map loads per day (https://developers.google.com/maps/documentation/javascript/usage)
Google suggests using the JavaScript API for geocoding to avoid the 2,500-request limit of the PHP API. It states that when "running client-side geocoding, you generally don't have to worry about your quota" (https://developers.google.com/maps/articles/geocodestrat#client)
However, nowhere does it state in any of the documentation what the geocoding limits are through the Google Maps JavaScript API v.3.
(This has been bothering me for a while, and I have researched it on more than one occasion and failed to find a solid answer)
I work in Maps for Business support at Google. The following is my personal opinion, not Google's, but let's just say I'm rather familiar with this topic!
First, it's important to distinguish between client-side geocoding (JavaScript calls to google.maps.Geocoder) and server-side geocoding (HTTP requests to /maps/api/geocode). This question and answer are specifically about client-side geocoding; for server-side limits, see here. In particular, the oft-mentioned 2,500 requests per IP per day limit applies only to server-side geocoding, not client-side.
So the short answer is, there is no documented query limit specifically for client-side geocoding. Once a client has loaded the Google Maps JavaScript API library, it can make as many geocode requests as it likes, provided they're done at a sane speed. Two rules of thumb to ensure you don't run into problems:
Initiate geocoding in response to user interaction, and you'll be fine even if there are short request bursts (e.g. clicking a button to geocode several addresses).
Catch any OVER_QUERY_LIMIT errors and handle them gracefully (e.g. exponential backoff). Here is some sample code in Python.
But please do not busy-loop the geocoder or hammer away at it for hours on end; there are protections against abuse.
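A minimal JavaScript sketch of that backoff idea. The OVER_QUERY_LIMIT status string is the one the Geocoder really returns; `geocodeOnce` stands in for `geocoder.geocode`, and the retry delays are arbitrary choices:

```javascript
// Retry a geocode call with exponential backoff on OVER_QUERY_LIMIT.
// geocodeOnce(request, callback) invokes callback(results, status).
function geocodeWithBackoff(geocodeOnce, request, maxRetries, done) {
  let attempt = 0;
  function tryOnce() {
    geocodeOnce(request, (results, status) => {
      if (status === "OVER_QUERY_LIMIT" && attempt < maxRetries) {
        attempt += 1;
        // 500 ms, 1 s, 2 s, ... before retrying
        setTimeout(tryOnce, Math.pow(2, attempt) * 250);
      } else {
        done(results, status);
      }
    });
  }
  tryOnce();
}
```

Usage would look like `geocodeWithBackoff((req, cb) => geocoder.geocode(req, cb), { address: "1600 Amphitheatre Parkway" }, 5, handleResult);` with a real `google.maps.Geocoder` instance.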
UPDATE
As of 2017 the client side quota is calculated against corresponding web service quota. Please have a look at Christophe Roussy's answer.
Always read the latest you can find on an official Google website for the proper API version; a dated Stack Overflow answer is not an up-to-date reference!
https://developers.google.com/maps/documentation/geocoding/usage-limits
Here is what it said when I wrote this answer:
Google Maps Geocoding API Usage Limits
2,500 free requests per day, calculated as the sum of client-side and server-side queries.
50 requests per second, calculated as the sum of client-side and server-side queries.
Enable pay-as-you-go billing to unlock higher quotas: $0.50 USD / 1,000 additional requests, up to 100,000 daily.
In general I think Google is happy as long as the queries come from real human users as this is of some value to them.
There are also ways to save a few requests by caching results for up to 30 days:
https://developers.google.com/maps/premium/optimize-web-services#optimize
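A sketch of such a cache: an in-memory Map with an injectable clock (so the expiry logic is testable); in practice you would swap in localStorage or a database, keeping the TTL within the 30 days the terms allow.

```javascript
// Cache geocode results for up to 30 days, keyed by the address string.
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;

function makeGeocodeCache(now) {
  const store = new Map();
  return {
    // Return the cached result, or null if absent or older than 30 days.
    get(address) {
      const hit = store.get(address);
      if (!hit || now() - hit.at > THIRTY_DAYS_MS) return null;
      return hit.value;
    },
    set(address, value) {
      store.set(address, { value, at: now() });
    },
  };
}
```

Before issuing a geocode request, check the cache first; only on a miss do you spend a billable request.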
Based on https://developers.google.com/maps/articles/geocodestrat,
Client-side geocoding through the browser is rate limited per map session, so the geocoding is distributed across all your users and scales with your userbase.
So what are the rate limits per map session?
As geocoding limits are per user session, there is no risk that your application will reach a global limit as your userbase grows. Client-side geocoding will not face a quota limit unless you perform a batch of geocoding requests within a user session. Therefore, running client-side geocoding, you generally don't have to worry about your quota.
OK, so what is the limit? Here we go:
https://developers.google.com/maps/documentation/business/articles/usage_limits
With a rate limit of 10 QPS (queries per second), on sending the 11th request your application should check the timestamp of the first request and wait until 1 second has passed. The same should be applied to daily limits.
Still not clear? Me neither.
This is the closest answer I found in their documentation, and it says "rate limit of 10 requests per second":
https://developers.google.com/maps/documentation/business/webservices/quota
I don't think a single user session can send 10 requests per second, so I consider it effectively limitless.
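The pacing rule quoted above ("check the timestamp of the first request and wait until 1 second has passed") can be sketched as a small sliding-window limiter; the clock is injectable so the logic is testable:

```javascript
// Sliding-window rate limiter: allow at most maxPerSecond requests in any
// 1-second window. Returns 0 if a request may be sent now (and records it),
// otherwise the number of milliseconds to wait.
function makeRateLimiter(maxPerSecond, now) {
  const sent = []; // timestamps of requests sent in the last second
  return function waitTime() {
    const cutoff = now() - 1000;
    while (sent.length && sent[0] <= cutoff) sent.shift();
    if (sent.length < maxPerSecond) {
      sent.push(now());
      return 0;
    }
    return sent[0] + 1000 - now();
  };
}
```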
I asked this recently about the Places library (or rather, researched it and came to a conclusion):
Regarding Places Library for Google Maps API Quota limits
Basically, it seems that map loads with a key are limited to 25,000 per day per domain (that might not be right, but I'm pretty sure that's the case).
Geocoding limits are limited in the following way:
2,500 if you're using an API key without setting up billing
100,000 if you're using an API key and have set up billing
If you're doing server-side geocoding (pre-caching results or similar), then your geocoding is per domain, and you're going to run out of requests pretty quickly (i.e. similar to making Places requests via the REST API like so: https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=-33.8670522,151.1957362&radius=500&types=food&name=harbour&sensor=false&key=XXXX)
If you're using the JavaScript library, e.g. the autocomplete functionality or the Geocoder like so:
service = new google.maps.Geocoder();
then the limit (2,500 or 100,000 respectively) is per end user, so your limit scales with your user base.
According to the article you linked (under When to Use Server-Side Geocoding):
The 2,500 request limit is per IP address.
So - what exactly do you mean by limit? Limit for the hosting application, or for a client running it?
The statement above tells me that there is no limit for the "host"; without a key or enforceable referrer URLs, there is no (good) way for Google to determine from whence a geocoding request originates. That's kind of the beauty of REST, among other things.
It also tells me that even if they could enforce it, they haven't set a limit. This would likely be counter-productive to their business model.

Geocoding for a directory

I need guidance on how to approach this problem.
I have a local directory with listings that have addresses. I have been able to geocode the addresses and save the latitudes and longitudes. So far, no problem.
Now when users search the directory, they will type in an address and then search for listings. The problem is that I will need to geocode the search term (an address or part of one) before I can search for listings.
Considering that the free geocoding limit is only 2,500 requests per day, I will exhaust it very quickly every day.
Is there any other approach to this problem?
You could just use a different geocoder which doesn't have this limit.
For example there is Nominatim which uses data from OpenStreetMap. Although OSM's official Nominatim instance has a rather strict usage policy because it runs on donated servers you can just use a different instance, for example the one run by MapQuest. And if you want no restrictions at all then you can choose to run your own Nominatim instance.
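As a sketch, a request to a Nominatim instance is just an HTTP GET against its /search endpoint. The q, format, and limit parameters are real Nominatim parameters; the base URL is whichever instance you are allowed to use, and per the usage policy you should also send a descriptive User-Agent:

```javascript
// Build a Nominatim search URL for a free-form address query.
function nominatimSearchUrl(baseUrl, query) {
  const params = new URLSearchParams({
    q: query,       // free-form address or partial address
    format: "json", // JSON results with lat/lon fields
    limit: "1",     // only the best match
  });
  return baseUrl.replace(/\/$/, "") + "/search?" + params;
}
```

Fetching that URL returns a JSON array whose entries carry `lat` and `lon` fields (as strings), which you can then use for your listing search.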
Purchase a Maps API for Business License
http://www.google.com/enterprise/mapsearth/products/mapsapi.html
If you're dealing with US addresses, SmartyStreets might be a good fit. We specialize in address verification and standardization, and we also return geocoding data (lat/lon) along with 40+ other data points, should you need them. Our Terms of Service are pretty open, unlike Google's (which don't let you use the geocode data for anything other than displaying it on one of their maps), so you would be able to store and use the lat/lon in your own application. We have a jQuery plugin that can handle the address at point of entry, or you could use our API or our Lists service (especially useful if you have an accumulated list you want to process up front).
Our service also has an autocomplete feature that could cut down the time your users spend typing in addresses.

Storing Vs Calling Google Places API

My application queries Google Geocoding API once per user.
It stores their lat lon for analysis later.
I am considering a feature that calls the Google Places API and shows the user some "Places" near them, on a map.
I do not want to store nearby Google Places. There is considerable overhead in this process.
Many people on message boards stress the importance of not bombarding the Places API from within my application. However, the Google Places API allows a huge volume of daily traffic, much higher than the Maps API and much higher than my website will use.
Can anyone definitively settle this?
The only real reason to stress reducing the number of calls to the Places API is if you may be in danger of exceeding the stated quota limits (currently set at 100,000 requests per day for licensed sites). If you are confident that you will not approach or exceed that limit, then your assessment is correct: the overhead of storing results to avoid repeat Places API requests is not worth the added complexity. Simply call the Places API, process the results, and place them on your Google Map to display to your users.