How to connect properly to Google Cloud SQL using Pentaho Kettle - mysql

Good day,
I have just started using Pentaho's Kettle to try and connect to a Google Cloud SQL instance. I've noticed that after making the connection, the Test in Spoon says it's working. However, even though the connection 'connects', it cannot pull the metadata about the tables or schemas inside.
Yes, I did enable the Google SQL tool beforehand, which, according to the documentation, allows third-party programs to access the Google App cloud. However, as I said, it can't seem to find any tables despite being able to 'connect'.
Has anyone else encountered this?

I found the problem with the help of someone from work. He posted our findings here. :) I hope it helps. It involves adding JARs to your Spoon installation:
http://lorenzo-dee.blogspot.com/2013/04/kettle-pentaho-data-integration.html
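In case it helps to verify things outside of Spoon: below is a minimal JDBC sketch (assuming the MySQL Connector/J JAR is on the classpath; the instance address, database, and credentials are placeholders) that lists tables through DatabaseMetaData, which is essentially what Spoon does when it explores a connection. If this fails while a plain query works, the driver JAR in Spoon's lib folder is the first thing to check.

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class CloudSqlMetadataCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder host, database and credentials -- replace with your Cloud SQL details.
        String url = "jdbc:mysql://<instance-ip>:3306/<database>";
        try (Connection conn = DriverManager.getConnection(url, "<user>", "<password>")) {
            // Same metadata call Spoon relies on when it lists tables and schemas.
            DatabaseMetaData meta = conn.getMetaData();
            try (ResultSet tables = meta.getTables(null, null, "%", new String[] {"TABLE"})) {
                while (tables.next()) {
                    System.out.println(tables.getString("TABLE_NAME"));
                }
            }
        }
    }
}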

Related

MySQL or PostgreSQL on AWS

I am trying to understand the trade-offs between MySQL and PostgreSQL on AWS.
One consideration is that I am an amateur database user, so I need to be sure resources are available that allow me to overcome problems quickly. Along these lines, I bought the book 'PostgreSQL on the Cloud' and was all set to go with PostgreSQL, since the book laid out a great use case.
One thing held me back, though: it is important for my work to be able to easily use Excel as a front end for importing and exporting data into and out of the database on AWS.
It looks like MySQL has an open extension that is fully integrated with Excel and is also well documented. My research into PostgreSQL uncovered a much more uneven integration with Excel, and a lot of long-standing frustration that a closer integration has not already happened.
Right now I am leaning toward MySQL, but want to make sure I am not missing something.
Thanks!
Microsoft touts a PostgreSQL plugin as well: https://support.office.com/en-us/article/connect-to-a-postgresql-database-power-query-bf941e52-066f-4911-a41f-2493c39e69e4. Never used it, so can't comment on it.
You mention you are a beginner, so I'll add: be careful about security with either of these options. There are options to encrypt the channel between the client and the server, which you indicate is running on AWS. If the channel is not secured, anyone in a position to monitor the connection could extract credentials and do whatever they want to your AWS-hosted DB. Generally, cloud-hosted DBs should be behind an authentication/authorization login process.
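To illustrate the encrypted-channel point, here is a minimal sketch for MySQL Connector/J 8.x that refuses to connect unless TLS is negotiated; the RDS endpoint, database, and credentials are placeholders (for the PostgreSQL JDBC driver the analogous property is sslmode=require).

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class EncryptedConnectionExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "<user>");          // placeholder credentials
        props.setProperty("password", "<password>");
        // MySQL Connector/J 8.x: fail the connection unless TLS is negotiated.
        props.setProperty("sslMode", "REQUIRED");

        // Placeholder RDS endpoint -- substitute your own host and database name.
        String url = "jdbc:mysql://<your-endpoint>.rds.amazonaws.com:3306/<database>";
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected with TLS required: " + !conn.isClosed());
        }
    }
}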

Adding MySQL functionality in LabView

I am new to LabVIEW and trying to make a small project. In LabVIEW I am measuring some values from a device and then need to store them in a database. Initially I used Excel to store the data, but now I need to add MySQL functionality to store the data and retrieve it later for analysis.
I looked at the NI toolkit, but it is expensive. I need a free and open-source solution for my project.
I searched SO and Google for examples to start from, but couldn't find any.
If someone could suggest some resources or share example code that I can use to achieve my goal, I would appreciate it. Thanks in advance.
Take a look at LabSQL. This works in LabVIEW 2017, allowing connection to a MySQL database without NI's LabVIEW Database Connectivity Toolkit.
I normally use the Database Connectivity Toolkit, but I did confirm I could get this to work in 2017 as well (though connecting to a MSSQL database instead of MySQL).
The only thing that tripped me up at first was not using the Create Connection before Open Connection (because I was used to the aforementioned toolkit). I didn't try anything complicated; I just ran a simple selection query. But it looks like everything should work pretty similarly to the toolkit. As adambro said, if you have a more specific question, maybe we can help with an answer.
I would suggest using SQLite. It is a fairly easy toolkit, by Dr. James Powell, and you can download it via the VI Package Manager. SQLite is excellent for storing data locally.
Use the SQLite browser from sqlitebrowser.org.
Also a nice way to learn SQL!

Google Cloud SQL import size bigger than original DB

I'm confronting a strange situation here with Google Cloud SQL.
I'm migrating a 15.7 GB MySQL database to Google Cloud. I've followed the migration process exactly as the doc says, and everything worked perfectly: absolutely no issue during the process, and my application works just fine. The only problem is that the size used by the DB shown on Google Cloud is much bigger than the original DB. Right now I have a 39 GB SQL database, from a 15.7 GB database.
After some research and testing I've come to the conclusion that it's the way that Google counts the data on their side.
I just wanted to know if somebody has any idea, or can confirm what I'm saying.
Thank you for your answers.
Thanks to #Vadim.
The huge database size was due to the binary logs option that was enabled by default on Google Cloud SQL.
I've pruned them by disabling and re-enabling the binary logs option. The database went from 39 GB to 24 GB.
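For anyone who wants to check this before touching the flag: summing the File_size column of SHOW BINARY LOGS shows how much of the reported usage is binary logs. A rough JDBC sketch, with the instance address and credentials as placeholders (the user needs the privilege to run SHOW BINARY LOGS):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class BinlogSizeCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder Cloud SQL connection details.
        String url = "jdbc:mysql://<instance-ip>:3306/";
        try (Connection conn = DriverManager.getConnection(url, "<user>", "<password>");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW BINARY LOGS")) {
            long totalBytes = 0;
            while (rs.next()) {
                totalBytes += rs.getLong("File_size");
            }
            System.out.printf("Binary logs are using %.1f GiB%n",
                    totalBytes / (1024.0 * 1024 * 1024));
        }
    }
}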

Unable to find all issues through SonarQube WS API

Goal: Export all SonarQube issues for a project to JSON/CSV.
Approach 1: Mine the sonar mysql database
Approach 2: Use the SonarQube WS API
At first I was inclined to go with approach 1, but after discussing with the SonarQube core developer community I got the impression that the database should not be touched under any circumstances.
So I proceeded with approach 2 and developed scripts to fetch issues. However, I later found that through the WS API I can only get up to 10,000 issues, which does not meet my goal.
Now I am convinced that approach 1, i.e., mining the database, is best for me. Looking at the "issues" table in the Sonar DB, I have the following question.
Question: What is the format/encoding of the "location" field, and how can I decode it from Python/Java?
Extracting data from the database is not recommended at all. The schema and content change frequently, and each upgrade may break your SQL request. Moreover, it contains binary data (issue location) which can't be parsed as-is.
The only way to get data is through the web services. If api/issues/search faces a limitation that you consider critical, then you should explain your functional need to the SonarQube Google group.
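For reference, here is a rough sketch of exporting issues page by page through api/issues/search; the server URL, project key, and token are placeholders, the maximum page size is 500, and the server will not return results past the 10,000th issue, so larger projects usually have to be split into several narrower queries (for example by creation date or component).

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class IssueExport {
    public static void main(String[] args) throws Exception {
        // Placeholder server, project key and token -- adjust to your setup.
        String server = "https://sonarqube.example.com";
        String projectKey = "my:project";
        String token = "<api-token>";
        // SonarQube tokens are sent as the Basic-auth username with an empty password.
        String auth = Base64.getEncoder()
                .encodeToString((token + ":").getBytes(StandardCharsets.UTF_8));

        HttpClient client = HttpClient.newHttpClient();
        int pageSize = 500;                        // largest page size the API accepts
        for (int page = 1; page <= 20; page++) {   // 20 * 500 = 10,000, the server-side ceiling
            String url = server + "/api/issues/search?componentKeys=" + projectKey
                    + "&ps=" + pageSize + "&p=" + page;
            HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                    .header("Authorization", "Basic " + auth)
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            // Each page is one JSON document; pipe these into your JSON/CSV converter.
            System.out.println(response.body());
        }
    }
}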

ASP.NET Web API 2 local MySQL code first

I have done a simple Web API project using a SQL Server database with the code-first approach, see here. But I cannot find any helpful, complete tutorial explaining how to create a Web API using the code-first approach with MySQL as the database. Could you please give me a sample example?
There isn't much difference since both are relational databases. You need to install the corresponding NuGet package (Install-Package MySql.Data.Entity), update the connection string, and you should be good to go.
Here is a detailed example: http://www.asp.net/identity/overview/getting-started/aspnet-identity-using-mysql-storage-with-an-entityframework-mysql-provider