Getting historical data from Google Maps traffic (google-maps)

Is it possible to get historical data for the traffic layer in google maps, i.e. specify a time and location and receive whether traffic was normal or slow etc? I don't need the typical traffic conditions given a time and day, I need concrete data as described above.

Some APIs provide historical data only for traffic incidents like congestion and accidents.

Related

Is it possible to retrieve historical traffic data?

I'm trying to acquire historical traffic data that occurred during an emergency evacuation. Is it possible to retrieve this kind of data with Google Maps Platform Roads API? I'm specifically interested in this API, not others like the ones in this link. Thank you.

Way to use Fusion Tables with the Google Maps API while maintaining privacy?

It's my understanding that the only way to use a private Fusion Table with the Maps API is if you're using the Business version of the API. Only public and unlisted tables can be used normally. I would really like to realize the performance benefits of Fusion Tables but am not interested in paying for a business license.
My question is this: How secure could an unlisted table be? The data I'll be displaying is sensitive, but not critical. It is unlikely to be sought out specifically, or scraped by a bot. (no addresses, names of people, phone numbers, etc).
If Fusion Tables really won't be an option for me and my sensitive data, with MySQL at what point would I start to see serious degradation based on the number of markers in an average browser? I estimate the maximum number of points in the table to be somewhere around 1000-2000.
The privacy setting (public or unlisted) is only required for a FusionTableLayer.
You can use two tables: one FusionTable (public or unlisted) to store the geometry and plot the markers, and a second, private table where you store the sensitive data. Use a common key for the rows, so you can request the sensitive data from table #2 based on the key returned by table #1.
Which kind of table you use for table #2 is up to you. Data in a private FusionTable is accessible after authentication, but I would prefer my own (MySQL) database for sensitive data. It has happened to me that the data of a Fusion Table was accessible via the FusionTables API even though the download option was disabled, so I currently wouldn't rely too much on the security. Note that Fusion Tables are still experimental.
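The two-table pattern above can be sketched as follows. Plain Python dicts stand in for the public FusionTable query and the private database lookup, and all keys, field names, and values here are hypothetical:

```python
# Public table: only what's needed to plot markers (a row key + coordinates).
public_rows = [
    {"key": "site-001", "lat": 40.7128, "lng": -74.0060},
    {"key": "site-002", "lat": 34.0522, "lng": -118.2437},
]

# Private store (e.g. your own MySQL DB): sensitive attributes keyed
# by the same row key. A dict stands in for the database here.
private_data = {
    "site-001": {"notes": "restricted", "budget": 12000},
    "site-002": {"notes": "confidential", "budget": 8500},
}

def details_for_marker(key):
    """Fetch sensitive data only when needed (e.g. on marker click),
    via an authenticated request to the private store."""
    return private_data.get(key)

marker = public_rows[0]
details = details_for_marker(marker["key"])
```

This way the publicly readable table never contains anything sensitive; only the join key leaks, and the private lookup can sit behind your own authentication.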

How many data points can the Google Maps API show at one time?

I have a large volume of data (100,000 points). Is it possible to show all the data points at the same time? Someone said that the Google Maps API 3 cannot load more than 3000 data points. Is this true? How many data points can it show at one time?
You might want to take a look at this article from the Google Geo APIs team, discussing various strategies to display a large number of markers on a map.
Storing the data using Fusion Tables in particular might be an interesting solution.
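One strategy that article discusses is clustering: instead of rendering every point, collapse nearby points into one representative marker with a count. A minimal grid-based clustering sketch (pure Python, with made-up coordinates) shows the idea:

```python
from collections import defaultdict

def grid_cluster(points, cell_deg=1.0):
    """Group (lat, lng) points into grid cells of cell_deg degrees and
    return one (centroid, count) pair per occupied cell."""
    cells = defaultdict(list)
    for lat, lng in points:
        cells[(int(lat // cell_deg), int(lng // cell_deg))].append((lat, lng))
    clusters = []
    for members in cells.values():
        n = len(members)
        centroid = (sum(p[0] for p in members) / n,
                    sum(p[1] for p in members) / n)
        clusters.append((centroid, n))
    return clusters

points = [(40.71, -74.2), (40.73, -74.3), (34.05, -118.24)]
clusters = grid_cluster(points)
# Two clusters: the two nearby points merge, the distant one stands alone.
```

With 100,000 raw points reduced to a few hundred cluster markers, the browser only ever draws what it can handle; zooming in shrinks the cell size and reveals more detail.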

Given a location's coordinates, locally find out the zip code without any Web API

I'm working on the visualization of a large spatial data set (around 2.5 million records) using Google Maps on a web site. The data points correspond to geographic coordinates within a city. I'm trying to group the data points based on their zip codes so that I can visualize some form of statistical information for each of the zip codes within the city.
The Google API provides an option for "reverse geocoding" a point to get the zip code, but it enforces a usage limit of 2,500 requests per IP per day. Given the size of my data set, I don't think Google or any other rate-limited Web API is practical.
After a few rounds of Google searching, I've found that it is possible to set up a spatial database if we have the boundary data for the zip codes. I've downloaded the said boundary data, but it is in what is known as "shapefile" format. I'm trying to use PostGIS to set up a spatial database that I can query for each data point to find out which zip code it belongs to.
Am I headed in the right direction?
If yes, can someone please share any known examples that have done something similar as I seem to have reached a dead end with PostGIS.
If no, can you please point me to the right course of action given my requirements.
Thanks.
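For what it's worth, the geometric core of the PostGIS approach is a point-in-polygon test per data point (in SQL, something along the lines of `ST_Contains(zip_boundary, point)`). A pure-Python ray-casting sketch of that test, with a toy square standing in for a real zip-code boundary, illustrates what the database computes for you:

```python
def point_in_polygon(pt, polygon):
    """Ray-casting point-in-polygon test. `polygon` is a list of
    (x, y) vertices; a horizontal ray is cast from `pt` and the
    point is inside if it crosses an odd number of edges."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's y level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A toy square "zip-code boundary", vertices in (lng, lat) order.
boundary = [(-74.1, 40.6), (-73.9, 40.6), (-73.9, 40.8), (-74.1, 40.8)]
hit = point_in_polygon((-74.0, 40.7), boundary)
miss = point_in_polygon((-75.0, 40.7), boundary)
```

In practice you would load the shapefile into PostGIS (e.g. with `shp2pgsql`), add a spatial index on the boundary geometry, and let the database run this test efficiently over all 2.5 million points rather than looping in application code.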

Best Practice around Geocoding?

Can anyone weigh in on best practice around geocoding location data?
We have several hundred "locations" (and growing) that are regularly inserted into the database.
We will occasionally run queries to plot these on a Google Map using the JavaScript API.
Do you think it's better to geocode these addresses when they're inserted into the database (ie, add lng and lat fields and populate and store these--addresses won't be modified manually) or call a geocoding service whenever we are generating our maps?
Storing the lng/lat values will be much faster when it comes time to use them. As long as you're happy that they won't be moving around on their own between data entry and map drawing, then do that.
We call the geocoding api and store the latitude/longitude upon insert of a new record or update of any address field. Not only is this quicker when adding the location to a map, but it reduces repeated calls to the API. Given that there is a limit to the number of calls you may make in a day (enforced as calls per second) you don't want to do it any more than you have to.
Storing the geocodes (but updating them occasionally) is a recommended best practice by Google.
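The geocode-on-insert pattern described in these answers can be sketched as follows. Here `geocode` is a stand-in for a real Geocoding API call, and the sample address and coordinates are illustrative only; the counter just demonstrates that the service is hit once per address, not once per map render:

```python
calls = {"geocode": 0}  # counts service calls, to show the caching effect

def geocode(address):
    """Placeholder for a real geocoding service call (hypothetical data)."""
    calls["geocode"] += 1
    fake_results = {"1600 Amphitheatre Pkwy": (37.422, -122.084)}
    return fake_results.get(address)

locations = {}  # address -> stored record with cached lat/lng

def insert_location(address):
    """Geocode once at insert time and store lat/lng with the record,
    so rendering the map later never repeats the API call."""
    if address not in locations:
        record = {"address": address, "lat": None, "lng": None}
        coords = geocode(address)
        if coords:
            record["lat"], record["lng"] = coords
        locations[address] = record
    return locations[address]

insert_location("1600 Amphitheatre Pkwy")
insert_location("1600 Amphitheatre Pkwy")  # served from the stored record
```

The same hook would run on address updates so the stored coordinates never go stale, which is the "updating them occasionally" part of Google's recommendation.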