Migrate from MySQL to PostgreSQL in Google Cloud SQL

Does anyone know how to migrate from a MySQL database to a PostgreSQL database in Google Cloud SQL?
I tried browsing the web but I can't really find any instructions on how to accomplish this.
The Database Migration Service only lets you upgrade the major version within the same database engine, not switch to a different engine.

According to the documentation, DMS currently supports only homogeneous database migrations; see the best practices guide for details.
There are currently no other Google Cloud tools that perform the MySQL to PostgreSQL migration you are looking for.
Nevertheless, to migrate from MySQL to PostgreSQL, a conversion is necessary, as the two databases are not entirely compatible.
There is a possible workaround in a Stack Overflow thread that collects multiple conversion approaches; please keep in mind that the information is community-supported, meaning Google Cloud Platform cannot vouch for it.
With the aforementioned, you have two options for doing the migration. For the first one, you would need to follow these steps:
1. Export your data in a supported format (SQL dump file or CSV), as described in the export documentation.
2. Convert the exported data into a format PostgreSQL accepts.
3. Import the converted data, as described in the import documentation.
On the other hand, the second solution is to use the third-party tool "pgloader", which may handle the whole migration for you; see the sketch below.
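For illustration, here is a minimal sketch of driving pgloader from Python; it assumes pgloader is installed, and the connection strings below are hypothetical placeholders for the addresses and credentials of your actual Cloud SQL instances:

```python
import subprocess

# Hypothetical connection strings -- substitute your Cloud SQL instances.
SOURCE = "mysql://migration_user:secret@203.0.113.10/appdb"
TARGET = "postgresql://migration_user:secret@203.0.113.11/appdb"

# pgloader reads the MySQL schema and data and loads both into PostgreSQL,
# applying its default MySQL-to-PostgreSQL type-casting rules.
subprocess.run(["pgloader", SOURCE, TARGET], check=True)
```

The same two connection strings can be passed to the pgloader command line directly; the Python wrapper is only useful if the migration is one step in a larger script.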

Related

How to migrate from an Oracle 10g database to a MySQL 8 database

We are currently using an Oracle 10g database for the backend of our application. We need to migrate the entire Oracle database schema into a MySQL database, including all existing tables, views, procedures, triggers, sequences, etc.
Can anyone kindly guide me through the steps of the migration, without breaking any schema definitions, keys, constraints, etc.?
Also, I came to know that MySQL does not support sequences. In that case, how can we convert the sequences that are present in the Oracle database?
Please don't just mention a tool name, because I found a few tools online but they are really lengthy and cumbersome processes to follow. Kindly describe it step-wise, so that it's easily understandable.
I used the SQL Developer IDE earlier, but it supports migration in the reverse direction, that is, from MySQL to Oracle, not the one I need. Hence, I could not use it.
There is an Oracle Doc ID 1477151.1 for that case.
Though you asked not to mention any tool name, in that document Oracle advises using the MySQL Migration Wizard, and it provides some script examples for manual migration in case the automation doesn't work.
Check those out. I hope that helps.
UPD: Again, I'm aware you asked not to mention any tool, but here's another excerpt from that doc where even Oracle clearly says you have to use a third-party tool:
Migration of Stored Procedures, Functions, Packages, Triggers, Views, Sequences must be performed using third party tools and needs manual effort. This document highlights method to perform data migration.
There is a host of third-party tools, some of which are open source. For example:
http://www.sqlines.com/oracle-to-mysql
http://kettle.pentaho.com/
http://www.convert-in.com/ora2sql.htm
http://www.ispirer.com/products/oracle-to-mysql-migration
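On the sequences part of the question: MySQL has no CREATE SEQUENCE statement, and the usual replacements are AUTO_INCREMENT columns or, when a sequence is shared across tables, a one-row counter table bumped with LAST_INSERT_ID(). A minimal sketch of the counter-table pattern, assuming the mysql-connector-python package and hypothetical credentials and names:

```python
import mysql.connector

# Hypothetical credentials -- substitute your own.
conn = mysql.connector.connect(user="app", password="secret", database="appdb")
cur = conn.cursor()

# One-time setup: a one-row counter table standing in for Oracle's ORDER_SEQ.
cur.execute("CREATE TABLE IF NOT EXISTS order_seq (id BIGINT NOT NULL)")
cur.execute("INSERT INTO order_seq VALUES (0)")
conn.commit()

# Equivalent of ORDER_SEQ.NEXTVAL: LAST_INSERT_ID(expr) atomically records
# the bumped value for this connection, so concurrent clients don't collide.
cur.execute("UPDATE order_seq SET id = LAST_INSERT_ID(id + 1)")
cur.execute("SELECT LAST_INSERT_ID()")
next_val = cur.fetchone()[0]
conn.commit()
print(next_val)  # 1 on the first call, 2 on the next, and so on
```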

SaaS application on shared server with multiple databases

I am going to create a SaaS application in PHP. In the application, users can create and manage multiple tables to extend functionality. After a user finishes with the application, they can download the PHP code and database.
We will also provide SQL import functionality, so the user can create a schema from a (.sql) file.
I searched on Google but did not find any proper solution. You can think of SQLFiddle-like functionality here.
I have 2 options in mind but need a better solution:
1) To simulate multiple databases and their tables, use table prefixes
2) Convert MySQL to SQLite; at download time, export it back as a MySQL (.sql) file
It could have approximately 10,000 users/databases. Please suggest a solution that provides each user a separate database, if there is one.
If a shared server will not work, I will purchase a VPS. The main requirement is to provide each user their own database.
I am going to choose SQLite as the database. After doing some benchmarks, SQLite seems a good option for DDL and DML operations.
I will use a MySQL-to-SQLite .sql converter: https://github.com/sutara79/convert-mysql-to-sqlite
To improve the insert speed, I followed this Stack Overflow post:
Improve INSERT-per-second performance of SQLite?
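The main takeaway from that post is to batch inserts into a single transaction instead of letting SQLite auto-commit each row. A minimal sketch with Python's built-in sqlite3 module (the file name, table, and row count are hypothetical):

```python
import sqlite3

conn = sqlite3.connect("user_42.db")  # hypothetical per-user database file
conn.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT)")

rows = [(i, f"item-{i}") for i in range(100_000)]

# One transaction around all 100,000 INSERTs: the 'with' block commits once
# at the end, which is orders of magnitude faster than per-row auto-commit.
with conn:
    conn.executemany("INSERT INTO items (id, name) VALUES (?, ?)", rows)
conn.close()
```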

Unable to find all issues through SonarQube WS API

Goal: Export all SonarQube issues for a project to JSON/CSV.
Approach 1: Mine the Sonar MySQL database
Approach 2: Use the SonarQube WS API
At first I was motivated to go for approach 1, but after discussing it with the SonarQube core developer community I got the impression that the database should not be touched in any situation.
Thus I proceeded with approach 2 and developed scripts to get the issues. However, I later found that through the WS API I can get up to 10,000 issues, which does not meet my goal.
Now I am convinced that approach 1, i.e., mining the database, is best for me. Looking at the "issues" table in the Sonar DB, I have the following question.
Question: What is the format/encoding of the "location" field, and how can I decode it from Python/Java?
Extracting data from the database is not recommended at all. The schema and content change frequently, and each upgrade may break your SQL queries. Moreover, the database contains binary data (the issue locations) that can't be parsed as-is.
The only way to get the data is through the web services. If api/issues/search has a limitation that you consider critical, you should explain your functional need to the SonarQube Google group.
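For reference, here is a minimal sketch of paging through api/issues/search with Python's requests library; the server URL, project key, and token are hypothetical, and the 10,000-result cap mentioned in the question is checked explicitly:

```python
import requests

BASE_URL = "https://sonarqube.example.com"  # hypothetical server
TOKEN = "my-user-token"                     # hypothetical user token
PAGE_SIZE = 500                             # the largest page size the API accepts

issues, page = [], 1
while True:
    resp = requests.get(
        f"{BASE_URL}/api/issues/search",
        params={"componentKeys": "my_project", "ps": PAGE_SIZE, "p": page},
        auth=(TOKEN, ""),  # the token is passed as the username
    )
    resp.raise_for_status()
    data = resp.json()
    issues.extend(data["issues"])
    # The server rejects any request where p * ps exceeds 10,000, so stop
    # there and narrow the query (by severity, rule, date...) if more remain.
    if page * PAGE_SIZE >= min(data["paging"]["total"], 10_000):
        break
    page += 1
```

Splitting one large query into several narrower ones (per severity, per rule, per creation-date range) and concatenating the results is the usual way to export everything while staying under the cap.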

Configuring mysql database for psiturk experiment

I recently picked up a task at a lab I volunteer at, and my PI said their experiment wasn't working because no workers were able to complete the task...
My hypothesis is that their SQLite implementation doesn't allow for proper recording of the experimental data, due to SQLite's ineffectiveness at concurrent operations (as stated in the psiTurk documentation).
My question is: how can I properly set up a MySQL database to work with their already-made experiment?
I created a new database called "participants" from the MySQL interpreter. Then I started the MySQL server successfully...
Next, I changed the database_url in the "config.txt" file from sqlite://participants.db (a local database file) to mysql://aweeeezy@localhost:3306/participants, but I cannot connect to the database when I try to start the psiTurk server.
I also tried mysql://aweeeezy@localhost/participants... I can't figure out how to format this database_url string so that the experiment works with MySQL, and I haven't found anything helpful when searching through MySQL-related and/or psiTurk-related posts.
Please help a databases noob!
The format for the database_url field of config.txt when using MySQL is mysql://username:password@hostname/database. psiTurk 2.0 depends on SQLAlchemy, and the relevant docs are here: http://docs.sqlalchemy.org/en/rel_1_0/core/engines.html
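A quick way to validate the URL before putting it into config.txt is to hand the same string to SQLAlchemy directly (the credentials here are hypothetical, and a MySQL driver such as mysqlclient must be installed):

```python
from sqlalchemy import create_engine, text

# The same URL format that config.txt expects; substitute real credentials.
engine = create_engine("mysql://aweeeezy:mypassword@localhost:3306/participants")

# If this prints 1, the URL is well-formed and the server accepts the login,
# so psiTurk should be able to connect with the same string.
with engine.connect() as conn:
    print(conn.execute(text("SELECT 1")).scalar())
```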

Conversion from Microsoft SQL Server to MySQL

How do I convert a Microsoft SQL Server database backup file so that it can be imported into a MySQL database? Is there any way or free tool available for this?
Not sure how complex a database you have, but if it's just some tables and data, there is a free script here that will automagically convert Microsoft SQL Server tables and data over to MySQL.
If you need something more sophisticated, MySQL has a migration toolkit that allows you to migrate from Microsoft SQL Server to MySQL; in addition, here is a tutorial on how to use it. Note that the toolkit has since been discontinued in favor of MySQL Workbench, which has data migration built in.
In addition, this converter will convert everything except stored procedures from MSSQL to MySQL, for a price of only $50, which isn't bad.
You may also want to check out this whitepaper from MySQL's website on how to plan a migration from SQL Server to MySQL, along with some other resources.
ms2my (Pre-Alpha, free)
http://sourceforge.net/projects/ms2my/
"A tool that helps with MSSQL to MySQL converting/replicating (both csv&dump) under *nix.Possible to use it with crontab for regular data fetching.Keeping mySQL-based data warehouse refreshed could also be one of the possibilities of using this script."
MSSQL to MySQL Converter (free trial download, for purchase $49)
http://www.convert-in.com/mss2sql.htm
I've looked for quite a while, and if you don't want to try ms2my, the above is about the only other option. And it isn't free.
Best of luck finding a free one; hopefully there is one hidden away out there on the internet that I couldn't find.
If you are using a live MS SQL Server and a live MySQL server, then I think your best and most accurate option is to use an ETL/data-transformation tool like Pentaho Data Integration (Kettle).
With Kettle you can visually design (using easy-to-learn data-flow steps) almost any data transformation from single or multiple data sources to single or multiple data destinations. One of the features you may be interested in is the database/tables migration wizard.
If the community distribution of Kettle is not enough for you, you can use the Enterprise Edition, which comes with more features, support, etc.
Take a look at Omega Sync; it supports export, import and synchronisation between different DBMSs, including schema and table data.