Best Practice around Geocoding? - sql-server-2008

Can anyone weigh in on best practice around geocoding location data?
We have several hundred "locations" (and growing) that are regularly inserted into the database.
We will occasionally run queries to plot these locations on a Google Map using the JavaScript API.
Is it better to geocode these addresses when they're inserted into the database (i.e., add lat and lng fields, populate them on insert, and store them; the addresses won't be modified manually), or to call a geocoding service every time we generate our maps?

Storing the lat/lng values will be much faster when it comes time to use them. As long as you're confident the locations won't be moving around on their own between data entry and map drawing, do that.

We call the geocoding API and store the latitude/longitude on insert of a new record or update of any address field. Not only is this quicker when adding the location to a map, but it also reduces repeated calls to the API. Given that there is a limit to the number of calls you may make in a day (enforced as calls per second), you don't want to geocode any more often than you have to.
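For illustration, a minimal sketch of the storage side in T-SQL, assuming a hypothetical Locations table (the application calls the geocoding API and fills the columns on insert or whenever an address field changes):

    -- Add lat/lng columns; DECIMAL(9,6) is roughly 0.1 m of precision.
    ALTER TABLE Locations
        ADD Latitude  DECIMAL(9, 6) NULL,
            Longitude DECIMAL(9, 6) NULL;

    -- Columns stay NULL until the geocoding call succeeds, so rows that
    -- still need geocoding (or need a retry) are easy to find:
    SELECT LocationId, StreetAddress
    FROM Locations
    WHERE Latitude IS NULL;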

Storing the geocodes (but updating them occasionally) is a best practice recommended by Google.

Related

Way to use Fusion Tables with the Google Maps API while maintaining privacy?

It's my understanding that the only way to use a private Fusion Table with the Maps API is if you're using the Business version of the API. Only public and unlisted tables can be used normally. I would really like to realize the performance benefits of Fusion Tables but am not interested in paying for a business license.
My question is this: How secure could an unlisted table be? The data I'll be displaying is sensitive, but not critical. It is unlikely to be sought out specifically, or scraped by a bot. (no addresses, names of people, phone numbers, etc).
If Fusion Tables really won't be an option for my sensitive data, at what point with MySQL would I start to see serious degradation, based on the number of markers, in an average browser? I estimate the maximum number of points in the table to be somewhere around 1,000-2,000.
The privacy setting (public or unlisted) is only required for a table used with a FusionTablesLayer.
You may use two tables: one Fusion Table (public or unlisted) to store the geometry and plot the markers, and a second, private table where you store the sensitive data. Use a common key for the rows, so you'll be able to request the sensitive data from table#2 based on the key returned by table#1.
Which kind of table you use for table#2 is up to you. Data in a private Fusion Table is accessible after authentication, but I would prefer my own MySQL database for sensitive data. (It has happened to me that the data of a Fusion Table was accessible via the FT API even though the download option was disabled, so I currently wouldn't rely too much on the security; note that Fusion Tables are still experimental.)
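As a rough sketch of that keyed split (all names here are hypothetical), the Fusion Table would hold only the key plus the geometry, while the private MySQL table holds everything sensitive:

    -- table#2: private MySQL table, keyed to the public Fusion Table rows
    CREATE TABLE location_details (
        location_key INT PRIMARY KEY,  -- same key stored in each Fusion Table row
        contact_name VARCHAR(100),
        notes        TEXT
    );

    -- After a marker click returns its key from table#1:
    SELECT contact_name, notes
    FROM location_details
    WHERE location_key = 42;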

How many data points can the Google Maps API show at one time?

I have a large volume of data (100,000 points). Is it possible to show all the data points at the same time? Someone said that the Google Maps API 3 cannot load more than 3000 data points. Is this true? How many data points can it show at one time?
You might want to take a look at this article from the Google Geo APIs team, discussing various strategies to display a large number of markers on a map.
Storing the data using Fusion Tables in particular might be an interesting solution.

Given a location's coordinate, locally find out the zip-code without any Web API

I'm working on the visualization of a large spatial data set (around 2.5 million records) using Google Maps on a web site. The data points correspond to geographic coordinates within a city. I'm trying to group the data points by zip code so that I can visualize some form of statistical information for each of the zip codes within the city.
The Google API provides an option for "reverse geocoding" a point to get its zip code, but it enforces a usage limit of 2,500 requests per IP per day. Given the size of my data set, I don't think Google or any other rate-limited Web API is practical.
After a few rounds of Google searching, I've found that it is possible to set up a spatial database if we have the boundary data for the zip codes. I've downloaded said boundary data, but it's in what is known as "shapefile" format. I'm trying to use PostGIS to set up a spatial database that I can query against for each data point to find out which zip code it belongs to.
Am I headed in the right direction?
If yes, can someone please share any known examples that have done something similar, as I seem to have reached a dead end with PostGIS?
If not, can you please point me toward the right course of action given my requirements?
Thanks.
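As a hedged sketch of the approach the question describes (table, file, and column names are hypothetical, and the geometry column created by shp2pgsql may be named geom or the_geom depending on version):

    -- Load the zip-code boundary shapefile from the shell first, e.g.:
    --   shp2pgsql -s 4326 zip_boundaries.shp zip_boundaries | psql -d citydb

    -- A spatial index makes the point-in-polygon join feasible at 2.5M points:
    CREATE INDEX zip_boundaries_geom_idx
        ON zip_boundaries USING GIST (geom);

    -- Tag every point with its zip code in one set-based pass,
    -- rather than issuing one lookup per point:
    SELECT z.zipcode, COUNT(*) AS points_in_zip
    FROM data_points p
    JOIN zip_boundaries z ON ST_Contains(z.geom, p.geom)
    GROUP BY z.zipcode;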

SQL db vs. Fusion Tables performance in Google Maps

I'm developing an application that stores geolocation data in a SQL table to generate a map of all points/addresses entered by users. I would like this to scale to a large number of points, possibly 50,000+, and still have great performance. However, the Google Maps API articles suggest performance can be greatly improved by using Fusion Tables instead.
Does anyone have experience with this? Would performance suffer if I have thousands of markers loaded onto a map from a SQL table? Does KML or some other strategy seem a better fit?
Once I'm zoomed out far enough I could use marker clustering, but I'm not sure whether that helps performance either, since I'm still loading all the geocodes to the page.
You can't directly compare the two technologies.
When you load thousands of markers from a SQL database, you have to create each single marker, which of course performs badly: you need to send the data for thousands of markers to the client and create the markers on the client side.
When you use Fusion Tables, you don't load markers, you load tiles. It doesn't matter how many markers are visible on the tiles; the performance will always be the same.
KML is not an option, because the number of features is limited (currently to 1,000).
Perhaps the only advantage of a SQL table over Fusion Tables is that your data records may be kept private, but only if that is a concern in your project.

MySQL GIS and Spatial Extensions - how to map regions and query against them

I am trying to make a smartphone app that will return a list of users within a certain proximity, say 100 m. It's easy to get the coordinates of my BlackBerry and write them to a database, but in order to return a list of other users within 100 m, I need to pull every other record from the database and compute the distance between the two points, checking whether it's within range, before outputting that user's information.
This is going to be time-consuming if there are many users involved. So I would like to map areas (countries, cities; I'm not yet sure of the resolution I'll need) so that I can first target a smaller subset of all users. This will save on processing time.
I have read the basics of GIS and spatial querying on the MySQL website, but to be honest the query is over my head and I hate copying and pasting code without understanding it. Plus, it only checks for proximity; I want to first check whether a coordinate falls within a certain area.
Does anyone have experience with such matters and feel like giving me some pointers? Resources such as any pre-existing databases describing countries as polygons would be really helpful too.
Many thanks to anyone who takes the time :)
I would recommend against using MySQL for spatial analysis. It has only implemented bounding-box analysis, and exact-geometry versions of most spatial functions are not implemented. I would recommend using PostGIS, or perhaps spatialCouch, or extending MongoDB. These would be much better suited to what you're looking to do.
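For illustration, a hedged PostGIS sketch of both steps the question asks about (table and column names are hypothetical): ST_DWithin on the geography type works directly in metres and can use a spatial index, and ST_Contains handles the point-in-area check.

    -- Users within 100 m of a given point (WGS84 lng/lat order):
    SELECT user_id
    FROM user_locations
    WHERE ST_DWithin(
        loc::geography,
        ST_SetSRID(ST_MakePoint(-0.1276, 51.5072), 4326)::geography,
        100);  -- metres, thanks to the geography cast

    -- First narrow by region: does the coordinate fall inside a stored polygon?
    SELECT region_name
    FROM regions
    WHERE ST_Contains(geom,
        ST_SetSRID(ST_MakePoint(-0.1276, 51.5072), 4326));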