How to migrate DB2 z/OS data to MySQL?

As part of a data migration from DB2 z/OS (mainframe) to Google Cloud SQL, I don't see a direct service/connector provided by Google or IBM. So I am exploring the option of moving the data to MySQL first and then to Cloud SQL.
I could find solutions to migrate from MySQL to Cloud SQL, but not from DB2 to MySQL.
I searched Google for this but could not find a resolution.
Would this be done over a JDBC connection or something else?

The approach of migrating first to CSV and then to Cloud SQL for MySQL sounds good. As you said, you will need to create a Cloud Storage bucket [1], upload your CSV file to the bucket, and follow the steps here [2] for MySQL.
[1] - https://cloud.google.com/storage/docs/creating-buckets
[2] - https://cloud.google.com/sql/docs/mysql/import-export/import-export-csv#import
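A minimal sketch of those steps with the gsutil and gcloud CLIs, assuming the CSV has already been exported from DB2; the bucket, file, instance, database, and table names are all placeholders:

# Create a bucket and upload the CSV exported from DB2
gsutil mb gs://my-migration-bucket
gsutil cp employees.csv gs://my-migration-bucket/employees.csv

# Import the CSV into an existing Cloud SQL for MySQL database and table
gcloud sql import csv my-cloudsql-instance gs://my-migration-bucket/employees.csv \
    --database=mydb --table=employees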

Related

How to sync an MS SQL Server database on Azure with a WordPress database?

I want to establish a two-way sync between an MS SQL database and a WordPress MySQL database. I have tried MySQL Workbench and Microsoft SQL Server Migration Assistant for MySQL, but nothing worked; every time I get a "Connection to MySQL Failed" error.
I have tried every option I have seen online, but nothing worked.
You have to tell us more about what you are trying to do.
As far as I can tell, you have three options:
Use PolyBase: very impractical, close to retirement, lacking in documentation, and personally I was never able to install it successfully.
Use Linked Servers: as I posted in the reply, you can follow the step-by-step guide and create a linked server to MySQL (a sketch follows this list). From there you can query the linked server and eventually create a stored procedure that imports/exports data, but forget about real-time sync.
Use Azure Data Factory: if you are on Azure SQL Database you can set up Azure Data Factory and create a pipeline to sync data, or use SSIS if you are on-premises.
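A minimal sketch of the linked-server option, assuming a MySQL ODBC DSN named MySQLWordPress has already been created on the SQL Server machine; the server name, DSN, user, and password are placeholders:

-- Create a linked server to MySQL through the MSDASQL (ODBC) provider
EXEC master.dbo.sp_addlinkedserver
    @server = N'WORDPRESS_MYSQL',
    @srvproduct = N'MySQL',
    @provider = N'MSDASQL',
    @datasrc = N'MySQLWordPress';

-- Map SQL Server logins to the MySQL credentials
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'WORDPRESS_MYSQL',
    @useself = N'False',
    @rmtuser = N'wp_user',
    @rmtpassword = N'********';

-- Query the WordPress database through the linked server
SELECT * FROM OPENQUERY(WORDPRESS_MYSQL, 'SELECT ID, post_title FROM wordpress.wp_posts');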

Amazon Redshift Create External Schema MySQL not enabled

We are planning to use Amazon Redshift and to use federated queries to connect to and insert data from our Aurora RDS (MySQL, provisioned) instance. I followed this documentation to set up the Secrets Manager secret, the IAM role for Redshift, the security group, etc.:
https://docs.aws.amazon.com/redshift/latest/dg/getting-started-federated-mysql.html
So now when I try to run the query CREATE EXTERNAL SCHEMA .... FROM MYSQL, I get this error message:
ERROR: CREATE EXTERNAL SCHEMA .. FROM MYSQL is not enabled.
I am running these examples in the Redshift Query Editor.
Example query from documentation:
CREATE EXTERNAL SCHEMA <schema_name>
FROM MYSQL
DATABASE '<database_name>'
URI '<endpoint_of_rds>'
IAM_ROLE '<iam_role_attached_to_redshift>'
SECRET_ARN '<secret_arn_contains_rds_credentials>';
I looked around the Redshift console to check for any setting to enable it but could not find any. Is anyone familiar with this setup?
The only related question asked so far is this one, but it is about Redshift Spectrum: What are the steps to use Redshift Spectrum?
Please advise. Thanks!
Federated queries are in preview at the moment:
The following is prerelease documentation for the federated query to MySQL feature for Amazon Redshift, which is in preview release.
This means that you have to explicitly set your cluster to enable preview features.
You do this by setting the maintenance track of your cluster to sql_preview.
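A minimal sketch with the AWS CLI; the cluster identifier is a placeholder, and the same setting is available in the console under the cluster's maintenance settings:

# Move the cluster onto the preview maintenance track so preview features are enabled
aws redshift modify-cluster \
    --cluster-identifier my-redshift-cluster \
    --maintenance-track-name sql_preview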
It's worth noting that this feature is coming out of preview very soon, or already has, according to the AWS Redshift forums:
https://forums.aws.amazon.com/ann.jspa?annID=8900
Although I should add that I get the same 'not enabled' error and my cluster version is now 1.0.31186, so I'm not sure what's going on with that.
Also, this should be a reply to Marcin's answer, but I lack the reputation to use such features.

Can we set up a sync between an on-premises MySQL server and Azure Database for MySQL?

Using mysqldump, I am able to migrate data from on-premises to Azure Database for MySQL. But now, after the dump and restore, I want to create a continuous sync between them. How is that possible?
I have followed the link below but haven't achieved it. Is it possible or not?
https://learn.microsoft.com/en-us/azure/mysql/howto-data-in-replication#other-useful-stored-procedures-for-data-in-replication-operations
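For reference, the Data-in Replication flow in that doc is driven by stored procedures run on the Azure server. A minimal sketch of the sequence, with placeholder host and credentials, and a binlog file/position taken from SHOW MASTER STATUS on the on-premises server:

-- Run on the Azure Database for MySQL server (all values are placeholders)
CALL mysql.az_replication_change_master(
    'onprem-host.example.com',  -- source host
    'repl_user',                -- replication user
    'repl_password',            -- replication password
    3306,                       -- port
    'mysql-bin.000002',         -- binlog file from SHOW MASTER STATUS
    120,                        -- binlog position from SHOW MASTER STATUS
    '');                        -- CA certificate; empty string to skip SSL

CALL mysql.az_replication_start;

-- Check replication health
SHOW SLAVE STATUS;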
Looks like there is no prebuilt service available in Azure.
You can find more information at this link.

Can I import my MySQL data into Google Spanner?

I tried to import a mysqldump-generated .sql file, but Google Spanner didn't accept the syntax, which makes sense.
With that said, we're trying to migrate our data, which is in a MySQL database, into Google Cloud Spanner. Is this possible?
Here's a project that I used to upload my (test) data to Cloud Spanner from a PostgreSQL database: https://github.com/olavloite/spanner-jdbc-converter. It uses standard JDBC functionality, so it should be easy to adapt it to work with MySQL as well.
Another possibility, if the database you're trying to upload is not very big, would be to use a standard database tool that allows you to copy data from one JDBC-compliant database to another. DBeaver supports this feature. Have a look here for how to set up DBeaver with Google Cloud Spanner: http://www.googlecloudspanner.com/2017/10/using-standard-database-tools-with.html
You can use HarbourBridge to migrate both schema and data from a MySQL source to Cloud Spanner.
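A minimal sketch of the HarbourBridge flow, piping a mysqldump into the tool; the database name is a placeholder, the target project/instance come from your gcloud configuration, and exact flags vary by version, so check the project's README:

# Dump the MySQL database and feed it to HarbourBridge, which creates a
# Spanner database with a mapped schema and copies the data into it
mysqldump --user=root --password mydb | harbourbridge --driver=mysqldump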

GCE instance to copy data from a different GCE MySQL database to Google Cloud SQL

This may sound stupid, and it is. But I'm wondering if it is possible to create a GCE instance whose sole purpose is to copy data from another GCE instance's MySQL database to a Google Cloud SQL instance every few minutes, essentially keeping the Cloud SQL instance up to date.
Essentially, I'm trying to work around the fact that GAE can't connect to a GCE MySQL database, while it can connect to a Google Cloud SQL database.
I have tried FEDERATED tables; however, Google Cloud SQL doesn't support them, so this is my last resort.
Thanks
Why do you need the GCE database at all? That is, why can't you just use a Cloud SQL database for all of your database needs?
You could try manually replaying the binary log to your Cloud SQL instance, i.e.:
Enable binary logging on your GCE MySQL instance.
Use mysqlbinlog to periodically extract the latest contents of the log as SQL statements, using the positioning options to make sure each run starts where the last one finished.
Send the SQL output by mysqlbinlog to your Cloud SQL instance. A sketch of one run follows.
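A minimal sketch of a single replay run, assuming binary logging is already enabled; the hosts, users, passwords, log file name, and start position are placeholders (the start position would be recorded at the end of the previous run):

# Pull statements from the GCE server's binary log, starting where the last run ended,
# and pipe them straight into the Cloud SQL instance
mysqlbinlog --read-from-remote-server \
    --host=gce-mysql-host --user=repl --password=REPL_PASSWORD \
    --start-position=154 mysql-bin.000042 \
  | mysql --host=CLOUD_SQL_IP --user=root --password=ROOT_PASSWORD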