How to use Google Fusion Tables - google-maps

I am a beginner to the Fusion Tables concept. I have already tried some basic features of the Fusion Tables SQL API and liked it. Below are some questions I have; I would appreciate your help.
What is the benefit of using the Fusion Tables API to store data and present it on a map via the Google Maps V3 API, when the same could otherwise be done with the V3 API over our own data source?
What is the way forward for private Fusion Tables? Since the technology is still in beta, I am worried that it could be removed altogether, or that usage limits could be imposed that would constrain the end application. Also, can the data security be trusted?
What is the difference between the Table API and the SQL API?

I can only answer question 1. I can think of several advantages, which will depend on the nature of your data:
Greatly improved map display performance when dealing with thousands of markers (points, lines and polygons). Tiles are generated dynamically by Google for quick display via the Maps API, with no marker-creation overhead.
Geocoding. Addresses and/or existing KML can be used when you lack lat/lon locations.
There is a fairly simple SQL-like filtering mechanism via the Maps API, which includes spatial queries.
Fusion Tables also has an undocumented but extremely useful JSONP option which uses the same SQL-like syntax (see the sketch after this list).
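As a rough illustration of that SQL-like syntax, here is a minimal sketch that queries the REST query endpoint from Python rather than via JSONP. It assumes a public table; the API key, table ID and column names are placeholders, not values from the question.

```python
# Minimal sketch: Fusion Tables SQL-like query over the REST endpoint.
# Assumes a *public* table; API_KEY, TABLE_ID and the 'name'/'location'
# columns are placeholders.
import requests

API_KEY = "YOUR_API_KEY"   # hypothetical
TABLE_ID = "1AbCdEf"       # hypothetical Fusion Table ID

# Spatial filter: rows whose location lies within 5 km of a point.
sql = (
    f"SELECT name, location FROM {TABLE_ID} "
    f"WHERE ST_INTERSECTS(location, CIRCLE(LATLNG(37.42, -122.08), 5000))"
)

resp = requests.get(
    "https://www.googleapis.com/fusiontables/v2/query",
    params={"sql": sql, "key": API_KEY},
)
resp.raise_for_status()
print(resp.json().get("rows", []))
```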


Export my analytics data and put it in a database

I am looking to export my analytics data to a SQL database. Do you know of a tool that could help me?
Also, do you know how I can see, in Google Analytics, the traffic coming from a particular URL?
Thank you all!
You have several options:
UI export: in the top/right corner of your reports you should have an option to download data in various formats (XLS, CSV...)
API: you can use the reporting API to get it out in a programmatic/automated way
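For the API route, here is a minimal sketch using the Reporting API v4 Python client. It assumes a service account with read access to the view; the key file path and view ID are placeholders.

```python
# Minimal sketch: pull sessions by country/device with the Reporting API v4.
# Assumes a service account JSON key with read access to the GA view;
# KEY_FILE and VIEW_ID are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
KEY_FILE = "service-account.json"  # hypothetical path
VIEW_ID = "123456789"              # hypothetical view ID

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": VIEW_ID,
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:country"}, {"name": "ga:deviceCategory"}],
    }]
}).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"], row["metrics"][0]["values"])
```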
One thing you won't be able to do with the free version no matter what you try:
Reconstruct the entire analytics data set: whether with the UI or the API, you're limited to querying at most 7 dimensions at a time (e.g. ga:country, ga:deviceCategory, etc.), and certain dimensions cannot be combined (there is no official list; it's trial and error to find out), whereas there are dozens of dimensions available.
So the question for you becomes:
How many resources do I want to invest in partially reverse-engineering Google Analytics, versus the value it brings me, versus what it would cost to adopt an alternative analytics solution?
I found a cloud-based solution which exports raw Google Analytics data to a MySQL database. Setup is simple: all you need to do is add your Google Analytics connection and the database to which the data should be exported.
MySQL, PostgreSQL, SQL Server and BigQuery are the supported destinations. It creates a few custom dimensions in your Google Analytics account and a tag in Google Tag Manager to send hits to Google Analytics. Data is exported from Google Analytics to the selected destination every day.
I have been using it for the last three months now. Hope this helps.
Exporting the analytics data is a thorny one.
My understanding is that paid GA usage allows the export of all collected GA data.
But free usage does not.
For free usage, all you are going to be able to do, realistically, is create a report over your GA data (in Data Studio or Google Sheets) that contains the rows and columns you want, and then collect this information and squirt it into a SQL table (a rough sketch follows). You are also liable to run up against sampling.
Re traffic from a particular URL, the news is better: just filter on Hostname and Page.
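To make the "report into a SQL table" step concrete, here is a rough sketch that loads a CSV exported from GA or Data Studio into a table. The file name and column names are assumptions, and sqlite3 stands in for whatever SQL database you actually use.

```python
# Rough sketch: load an exported GA report (CSV) into a SQL table.
# 'ga_report.csv' and its columns (date, pagePath, sessions) are
# assumptions; swap sqlite3 for your MySQL/Postgres driver as needed.
import csv
import sqlite3

conn = sqlite3.connect("analytics.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS ga_report (
        date TEXT,
        page_path TEXT,
        sessions INTEGER
    )
""")

with open("ga_report.csv", newline="") as f:
    rows = [
        (r["date"], r["pagePath"], int(r["sessions"]))
        for r in csv.DictReader(f)
    ]

conn.executemany(
    "INSERT INTO ga_report (date, page_path, sessions) VALUES (?, ?, ?)", rows)
conn.commit()
conn.close()
```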

Loading data from JSON into Google Cloud Datastore

I have recently pulled multiple JSON files from a SQL database and I would like to load them into Google Cloud Datastore. Can anyone suggest the best way to go about this? I have read the docs, and they detail how to create entities, but I cannot determine how to do a bulk data load. Any tips or tricks would be welcome.
Two years later and no answer! The key to doing this right now, in 2017, seems to be the new Dataflow service in Google Cloud. There are SDKs for both Java and Python, but since it's still so new I'm using the Java SDK, version 1.9. I've adapted two of the examples and have it putting data into the Datastore. It seems to play nicely with namespaces so far, but it's a little difficult to model fields with parent/child relationships.
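For moderate data sizes, a plain client-library batch load (rather than Dataflow) may also be enough. A minimal Python sketch, where the file name and entity kind are assumptions:

```python
# Minimal sketch: batched bulk load into Cloud Datastore with the
# google-cloud-datastore client. 'records.json' (a JSON array of flat
# objects) and the 'Record' kind are assumptions.
import json
from google.cloud import datastore

client = datastore.Client()  # uses your default project/credentials

with open("records.json") as f:
    records = json.load(f)

BATCH_SIZE = 500  # a single Datastore commit is limited to 500 entities
for i in range(0, len(records), BATCH_SIZE):
    entities = []
    for rec in records[i:i + BATCH_SIZE]:
        entity = datastore.Entity(key=client.key("Record"))
        entity.update(rec)
        entities.append(entity)
    client.put_multi(entities)
```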

Integrating CSV person data on Google Maps through Bluemix?

I am working on a project with a large CSV file that contains the locations and movements of users. I would like to place this on a custom map in Google Maps via Bluemix and use Bluemix services to explore the data.
The primary goals are:
Getting the CSV data onto the custom Google map. When running, the display should progress through time and show the movement of users.
Clustering the CSV points for better UX (so that points near each other stack together).
My primary question is how to get started on this. Do you recommend I work on this locally and then connect Bluemix to my project, or can I create all of this in Bluemix? I would much prefer the latter option if possible.
If you have any suggestions for Watson services or other Bluemix features that might improve the app, that is also greatly appreciated.
Thank you for your time.
PS: I realize Google Maps integrates best with JavaScript. Do you recommend converting the CSV to JSON when working with Bluemix?
This is quite a broad question and maybe not well suited for Stack Overflow (Stack Overflow is not a discussion forum; it is a competition for the most accurate answer to a very specific question), so maybe this is a question for https://developer.ibm.com/answers/ instead.
That said, Bluemix works well with JS, Java, Ruby, Python, Go, PHP, etc. (probably JS and Java better than the others), so I'd go in that direction. Also, I think you should investigate Bluemix Geospatial Analytics (https://console.ng.bluemix.net/catalog/services/geospatial-analytics/) for your application.
For data storage, I suggest you take a look at Cloudant (https://console.ng.bluemix.net/catalog/services/cloudant-nosql-db/), which is a very popular option in Bluemix and suits most cloud apps well. If you want a more traditional approach, you can also consider a relational DB such as DB2 (https://console.ng.bluemix.net/catalog/services/ibm-db2-on-cloud/).
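On the PS: converting the CSV to JSON is reasonable, since both the Google Maps JavaScript API and Cloudant consume JSON directly. A rough sketch of turning the rows into GeoJSON features, where the file name and column names are assumptions about your data layout:

```python
# Rough sketch: convert a CSV of user movements into a GeoJSON
# FeatureCollection. 'movements.csv' and its columns (user_id, timestamp,
# lat, lng) are assumptions.
import csv
import json

features = []
with open("movements.csv", newline="") as f:
    for row in csv.DictReader(f):
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                # GeoJSON uses [longitude, latitude] order.
                "coordinates": [float(row["lng"]), float(row["lat"])],
            },
            "properties": {
                "user": row["user_id"],
                "timestamp": row["timestamp"],
            },
        })

with open("movements.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f)
```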

Are there security issues with Fusion Tables

I'm contemplating using Google Fusion Tables rather than a MySQL database to populate a Google map. The data I'm displaying is somewhat proprietary, not top secret, but I'm concerned that if I use a Fusion Table, people will be able to scrape the data or just grab it directly from the Fusion Table. I may be misunderstanding how this works, but from some of the threads I've read, it seems there is really no way to protect your data.
As usual the answer is: it depends.
If you want to use the Google Maps FusionTablesLayer feature to display the data, you'll have to be a Google Maps Premier customer to display data from private tables. For "normal" users, this is only possible with public tables.
When you access the data via the API, however, you can select and display only the data you want, and as long as you keep the table private there is no way for others to get at the rest of the original data.
Last but not least you have to trust Google ;-)

What's the best way to save KML in MySQL?

I have a few maps with certain areas (zones) which I want to capture in KML. Within those areas I need to pinpoint addresses.
How do I save those maps and their values as efficiently as possible so that I can query them later?
Maybe you could use MySQL spatial extensions.
Take a look at GIS and Spatial Extensions and at GIS on MySQL.
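As a rough sketch of what that can look like, assuming the KML polygons are first converted to WKT (e.g. with ogr2ogr) and that mysql-connector-python is installed; the connection details, table and coordinates below are placeholders:

```python
# Rough sketch: store zones as MySQL geometries and find the zone that
# contains a given address point. Connection details, the 'zones' table
# and the WKT values are placeholders.
import mysql.connector

conn = mysql.connector.connect(user="app", password="secret", database="maps")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS zones (
        id INT PRIMARY KEY AUTO_INCREMENT,
        name VARCHAR(100),
        area GEOMETRY NOT NULL,
        SPATIAL INDEX(area)
    )
""")

# One zone polygon, expressed as WKT (lng lat pairs here).
cur.execute(
    "INSERT INTO zones (name, area) VALUES (%s, ST_GeomFromText(%s))",
    ("Zone A",
     "POLYGON((4.88 52.36, 4.92 52.36, 4.92 52.38, 4.88 52.38, 4.88 52.36))"))

# Which zone contains this address location?
cur.execute(
    "SELECT name FROM zones WHERE ST_Contains(area, ST_GeomFromText(%s))",
    ("POINT(4.90 52.37)",))
print(cur.fetchall())

conn.commit()
conn.close()
```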