I am working on a project with a large CSV file that contains the locations and movements of users. I would like to place this data on a custom map in Google Maps via Bluemix and use Bluemix services to explore the data.
The primary goals are:
Getting the CSV data onto the custom Google map. When the app runs, the data should progress through time and show the movement of users.
Clustering the CSV points for better UX (so that points near each other stack together).
My primary question is how to get started on this. Do you recommend I work on this locally and then connect Bluemix to my project, or can I create all of this in Bluemix? I would much prefer the latter option if possible.
If you have any suggestions for Watson services or other Bluemix features that might improve the app, that would also be greatly appreciated.
Thank you for your time.
P.S. I realize Google Maps integrates best with JavaScript. Do you recommend converting the CSV to JSON when working with Bluemix?
This is quite a broad question and maybe not well suited for Stack Overflow (Stack Overflow is not a discussion forum; it is a competition for the most accurate answer to a very specific question), so maybe this is a question for https://developer.ibm.com/answers/ instead.
That said, Bluemix works well with JavaScript, Java, Ruby, Python, Go, PHP, etc. (probably JavaScript and Java better than the others), so I'd go in this direction. I also think you should investigate Bluemix Geospatial Analytics (https://console.ng.bluemix.net/catalog/services/geospatial-analytics/) for your application.
For data storage, I suggest you take a look at Cloudant (https://console.ng.bluemix.net/catalog/services/cloudant-nosql-db/), which is a very popular option on Bluemix and suits most cloud apps well. If you want to take a more traditional approach, you can also consider a relational DB such as DB2 (https://console.ng.bluemix.net/catalog/services/ibm-db2-on-cloud/).
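On the P.S.: yes, converting the CSV to JSON is a natural fit, since Cloudant stores JSON documents. Below is a minimal sketch of loading the CSV into Cloudant with the official Python client; the file name, column names, and credentials are placeholders, not details from your question.

```python
import csv
from cloudant.client import Cloudant

# Placeholder credentials; use the values from your Bluemix Cloudant service instance.
client = Cloudant("USERNAME", "PASSWORD", url="https://USERNAME.cloudant.com")
client.connect()
db = client.create_database("user_movements", throw_on_exists=False)

# Convert each CSV row into a JSON document (column names are assumptions).
with open("users.csv", newline="") as f:
    for row in csv.DictReader(f):
        db.create_document({
            "user_id": row["user_id"],
            "timestamp": row["timestamp"],
            "lat": float(row["lat"]),
            "lon": float(row["lon"]),
        })

client.disconnect()
```

Your front-end JavaScript can then fetch these documents, sort them by timestamp, and animate them on the map.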
Related
With a colleague of mine, we're building an app written in Dart with Flutter in Android Studio. We've arrived at the point where we need to integrate a database to collect and send user-filled data, so we chose MongoDB, which will run in Docker so that our app is ready to work across multiple devices. Since we will have many users, each entering their own data, we have a lot of parameters to take into account, so we're creating a JSON skeleton to map out the structure of what data goes where. The obstacle is that we have no clue what the best way is to approach the MongoDB-Docker integration with our Android Studio code, as it is our first time using MongoDB and Docker. Any good tips or resources that could put us on the right track? Thank you.
Hi, your real question should be: what will you use in between those two? You should (I guess) create an API to simplify and secure your user-DB interactions.
If you're not familiar with those principles, a quick bit of research should help.
If you want to continue without any API, you should put a lot of effort into writing VERY clean code, as your app will carry a lot of information inside its source code, and you should write documentation for it. You could also use an ORM.
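To illustrate the API-in-between idea, here is a minimal sketch of such a layer. The answer above doesn't prescribe a stack, so Flask and pymongo are just assumptions used to show the shape of it, and the endpoint, collection name, and required field are hypothetical. Your Flutter app would call this over HTTP instead of talking to MongoDB directly.

```python
from flask import Flask, jsonify, request
from pymongo import MongoClient

app = Flask(__name__)
# "mongo" is assumed to be the hostname of the MongoDB container in your Docker setup.
client = MongoClient("mongodb://mongo:27017")
collection = client["appdb"]["user_entries"]

@app.route("/entries", methods=["POST"])
def create_entry():
    # The app posts JSON here; it never touches the database directly.
    data = request.get_json()
    if not data or "user_id" not in data:  # hypothetical required field
        return jsonify({"error": "user_id is required"}), 400
    result = collection.insert_one(data)
    return jsonify({"id": str(result.inserted_id)}), 201

@app.route("/entries/<user_id>", methods=["GET"])
def list_entries(user_id):
    docs = list(collection.find({"user_id": user_id}, {"_id": 0}))
    return jsonify(docs)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

The API container and the MongoDB container would then sit in the same Docker network, with only the API exposed to the app.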
I work for a web hosting company looking to integrate different data sources with BigQuery, and the question now is what would be an ideal reporting/BI tool for getting the data out of BigQuery so that retrieval, analysis, and reporting can be done properly, quickly, and easily.
I'm looking into the options suggested by Google here: https://cloud.google.com/bigquery/partners/ but I was wondering if someone out there has more hands-on experience and could make a recommendation.
The company works with a MySQL-based billing system (with client, support, and service data), which is the main source of information, along with chat, CMS, and in-house-developed systems that provide other sources of information used to maintain the web infrastructure the business depends on.
Thank you.
It's really hard to answer this; it depends on the personnel you have at hand.
For idea validation we mostly use Data Studio.
Some personnel know Tableau, but once you are outside GCP everything becomes a slow process, with queries and interface updates taking 30-60 seconds, as those tools relay and store the data on their own.
We have wired some data into Elasticsearch as well, and we use Kibana.
But once it's all validated, we consolidate the reports into our own dashboards, mainly because we are mostly developers and can do the programming. If you have a data analyst or data scientist with their own tools, let them use what they are comfortable with.
Always iterate and version; as a developer you should be driven by a good product manager who tells you exactly which charts to build.
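For the "build your own dashboards" route, pulling data out of BigQuery is only a few lines with the official google-cloud-bigquery client; the project, dataset, and query below are made up purely for illustration.

```python
from google.cloud import bigquery

# Uses application default credentials; the project is inferred from the environment.
client = bigquery.Client()

# Hypothetical billing table and metric, purely for illustration.
query = """
    SELECT DATE(created_at) AS day, COUNT(*) AS new_invoices
    FROM `my_project.billing.invoices`
    WHERE created_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    GROUP BY day
    ORDER BY day
"""

for row in client.query(query).result():
    print(row.day, row.new_invoices)  # feed these rows into whatever charting layer you use
```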
I have recently pulled multiple JSON files from a SQL database and I would like to load them into my Google Datastore. Can anyone suggest the best way to go about this? I have read the docs, and they detail how to create entities, but I cannot determine how to do a bulk data load. Any tips or tricks would be welcome.
Two years later and no answer! The key to doing this right now, in 2017, seems to be the new Dataflow service in Google Cloud. There are SDKs for both Java and Python, but it's still so new that I'm using the Java SDK, version 1.9. I've adapted two of the examples and have it putting data into the Datastore. It seems to play nicely with namespaces so far, but it's a little difficult to model fields with parent/child relationships.
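If the export is small enough that a full Dataflow pipeline feels like overkill, a plain batched write with the google-cloud-datastore Python client is a simpler alternative. This is not the Dataflow approach described above; the file name, kind, and key field here are assumptions. Datastore allows at most 500 entities per commit, hence the batching.

```python
import json
from google.cloud import datastore

client = datastore.Client()

# Hypothetical export: a JSON file containing a list of row objects, each with an "id" field.
with open("export.json") as f:
    rows = json.load(f)

BATCH_SIZE = 500  # Datastore limit: at most 500 entities per commit.
for i in range(0, len(rows), BATCH_SIZE):
    entities = []
    for row in rows[i:i + BATCH_SIZE]:
        entity = datastore.Entity(key=client.key("ImportedRow", row["id"]))
        entity.update(row)
        entities.append(entity)
    client.put_multi(entities)
```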
At first, this question appeared too trivial to me to actually require a Stack Overflow post. However, after many Google searches for the information, I am at a loss trying to figure this out about Couchbase.
In Couchbase (I am using the 2.2 Community version), how do I share views among developers? Is there some sort of import/export functionality available? If not, how does Couchbase intend for developers to share the views they are using without manual copying and pasting? Obviously, the code a development team writes for querying Couchbase will require accurate view names. Without a way to send a developer a view file to accurately set up a Couchbase DB, how can it even be possible to develop with Couchbase locally as a team?
I'm sorry if I sound a little desperate or harsh here, but if it isn't possible to share views among multiple developers, then I don't see how Couchbase can be a viable DB solution for a team of developers trying to share database configuration, similar to how a team using a SQL DB would share schema files to set up the DB.
There are several ways you can approach this:
1) Create views programmatically, as demonstrated here in Java:
http://tugdualgrall.blogspot.com.es/2012/12/couchbase-101-create-views-mapreduce.html
or here in node.js:
http://www.tuicool.com/articles/RvYbQn
2) Store all your views in your version control system (this is the option I use). If you are developing locally, you only need your personal view code; once the views are working and your tests are all passing, you can check them in. (An example of what such a versioned view file looks like is shown below.)
I assume you'd then be deploying to a testing environment, so yes, sadly, there you'd have to update the views either by hand or by using option 1.
You could also take a look at this tool, though only for views: http://www.couchbase.com/communities/q-and-a/how-bulk-import-design-docs-and-views-couchbase-server
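What goes into version control for option 2 is just the design document JSON that Couchbase stores; a minimal example (design document, view name, and map function are hypothetical) could live in the repo as by_type.ddoc.json:

```json
{
  "views": {
    "by_type": {
      "map": "function (doc, meta) { if (doc.type) { emit(doc.type, null); } }"
    }
  }
}
```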
This functionality is currently not available in the admin UI.
There is an open defect/enhancement, "Ability to import/export views" (MB-8436). You can leave your feedback there and vote for it so that it gets included in a future release.
In the meantime you can use the Design Document REST API.
There is also a blog post describing a workaround.
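As a sketch of the REST API route, a design document JSON file kept in version control can be pushed to a Couchbase 2.x node with a single PUT to the view port (8092). The bucket name, design document name, credentials, and file name below are placeholders.

```python
import json
import requests

# Placeholders: adjust host, bucket, design document name, credentials, and file name.
with open("by_type.ddoc.json") as f:
    ddoc = json.load(f)

resp = requests.put(
    "http://localhost:8092/default/_design/dev_by_type",  # port 8092 serves the design document API
    auth=("Administrator", "password"),
    headers={"Content-Type": "application/json"},
    data=json.dumps(ddoc),
)
resp.raise_for_status()
print(resp.json())
```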
I see many tutorials on downloading GA data into a Google spreadsheet and refreshing it automatically,
but none on exporting the data into MySQL table(s).
Is it possible?
If yes, where can I find tutorials or a freelancer with skills in this area?
If yes, is it possible to store the data of several UA properties in the same table?
Thanks.
Yes, you can do that with the Google Analytics Core Reporting API, and you can use the Query Explorer to see what data you can pull from it.
Here are all their client libraries and sample code, which you can use as a tutorial. As far as hiring someone, I would check out the job board here or odesk.com.
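A rough sketch of what the coded route can look like with the v3 Core Reporting API and MySQL; the view IDs, metrics, table, and credentials are placeholders, and each row is tagged with its view ID so that several UA properties can share one table, as asked above.

```python
import mysql.connector
from googleapiclient.discovery import build
from google.oauth2 import service_account

# A service account with read access to the GA views (placeholder key file).
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analytics", "v3", credentials=credentials)

VIEW_IDS = ["ga:11111111", "ga:22222222"]  # one entry per UA property/view (placeholders)

db = mysql.connector.connect(host="localhost", user="ga", password="secret", database="reporting")
cursor = db.cursor()

for view_id in VIEW_IDS:
    result = analytics.data().ga().get(
        ids=view_id,
        start_date="7daysAgo",
        end_date="today",
        metrics="ga:sessions,ga:users",
        dimensions="ga:date",
    ).execute()

    # Store the view ID alongside each row so several properties can share one table.
    for date, sessions, users in result.get("rows", []):
        cursor.execute(
            "INSERT INTO ga_daily (view_id, date, sessions, users) VALUES (%s, %s, %s, %s)",
            (view_id, date, int(sessions), int(users)),
        )

db.commit()
cursor.close()
db.close()
```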
If you don't want to play around with code and development, just go ahead and use Analytics Canvas, which provides easy data import from Google Analytics via the API and lets you easily export the data to SQL / MySQL / any other database, all through a simple and intuitive graphical interface.
I usually use another solution that does not require coding: the Skyvia tool (Google Analytics and MySQL integration). It allows me to create a copy of Google Analytics report data in MySQL and keep it up to date with little to no configuration effort. I don't even need to prepare the schema; Skyvia can automatically create a table for my report data. You can load 10,000 records per month for free, which is enough for me.