Can I import my MySQL data into Google Spanner?

I tried to import a mysqldump-generated .sql file, but Google Spanner didn't accept the syntax, which makes sense.
Given that, we're trying to migrate our data, which lives in a MySQL database, into Google Cloud Spanner. Is this possible?

Here's a project that I used to upload my (test) data to Cloud Spanner from a PostgreSQL database: https://github.com/olavloite/spanner-jdbc-converter. It uses standard JDBC functionality, so it should be easy to adapt it to work with MySQL as well.
Another possibility, if the database you're trying to upload is not very big, would be to use a standard database tool that lets you copy data from one JDBC-compliant database to another. DBeaver supports this feature. Have a look here for how to set up DBeaver with Google Cloud Spanner: http://www.googlecloudspanner.com/2017/10/using-standard-database-tools-with.html
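If you'd rather roll your own, the core JDBC copy loop is small. Below is a minimal sketch of the idea, assuming a Cloud Spanner JDBC driver is on the classpath; the connection URLs, table, and column names are placeholders, and the exact Spanner URL format depends on which driver version you use:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

// Copies rows from a MySQL table into Cloud Spanner over plain JDBC.
// Table and column names are placeholders; real code needs per-column
// type mapping, not just longs and strings.
public class MySqlToSpanner {
    public static void main(String[] args) throws Exception {
        try (Connection src = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/mydb", "user", "password");
             Connection dst = DriverManager.getConnection(
                     "jdbc:cloudspanner:/projects/my-project/instances/my-instance/databases/my-db")) {
            dst.setAutoCommit(false);
            try (Statement read = src.createStatement();
                 ResultSet rs = read.executeQuery("SELECT id, name FROM singers");
                 PreparedStatement write = dst.prepareStatement(
                         "INSERT INTO singers (id, name) VALUES (?, ?)")) {
                int batched = 0;
                while (rs.next()) {
                    write.setLong(1, rs.getLong("id"));
                    write.setString(2, rs.getString("name"));
                    write.addBatch();
                    if (++batched % 500 == 0) { // commit in small batches
                        write.executeBatch();
                        dst.commit();
                    }
                }
                write.executeBatch();
                dst.commit();
            }
        }
    }
}
```

Committing every few hundred rows keeps each Spanner transaction small, since Spanner caps the number of mutations per commit.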

You can use HarbourBridge to migrate both schema and data to Cloud Spanner from a MySQL source.

Related

How to migrate DB2 z/OS data to MySQL?

As part of a data migration from DB2 z/OS (mainframe) to Google Cloud SQL, I don't see a direct service/connector provided by Google or IBM. So I am exploring the option of moving the data to MySQL first and then to Cloud SQL.
I could find solutions for migrating from MySQL to Cloud SQL, but not from DB2 to MySQL.
I searched Google for this but could not find a solution.
Would this be done over a JDBC connection or something else?
The approach of migrating first to CSV and then into Cloud SQL for MySQL sounds good. As you said, you will need to create a Cloud Storage bucket [1], upload your CSV file to the bucket, and then follow the MySQL import steps here [2].
[1] - https://cloud.google.com/storage/docs/creating-buckets
[2] - https://cloud.google.com/sql/docs/mysql/import-export/import-export-csv#import
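For the upload step, here's a short sketch using the google-cloud-storage Java client; the bucket and file names are placeholders:

```java
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.nio.file.Files;
import java.nio.file.Paths;

// Uploads a local CSV export to a Cloud Storage bucket so Cloud SQL can
// import it. Bucket and object names are placeholders.
public class UploadCsv {
    public static void main(String[] args) throws Exception {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        BlobId blobId = BlobId.of("my-migration-bucket", "db2-export.csv");
        BlobInfo blobInfo = BlobInfo.newBuilder(blobId)
                .setContentType("text/csv")
                .build();
        // readAllBytes is fine for a sketch; stream very large files instead
        storage.create(blobInfo, Files.readAllBytes(Paths.get("db2-export.csv")));
        System.out.println("Uploaded to gs://my-migration-bucket/db2-export.csv");
    }
}
```

Once the file is in the bucket, the import described in [2] can read it straight from the gs:// path (gcloud sql import csv does the same thing from the command line).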

How to save Amazon Redshift output to local csv through SQL Developer

I am trying to save Amazon Redshift output to a local CSV through SQL Developer. Please suggest a solution other than SQL Developer's export wizard for generating the CSV, as the data volume is in the tens of millions of rows and the extract is required frequently. The spool command creates a local file, but I am unable to format it as CSV.
The only reason we support Redshift's JDBC driver in SQL Developer is so you can migrate that data warehouse to Oracle Autonomous Data Warehouse Cloud Service.
Yes, you can run queries and spool CSV to local files, and that might be faster than using the wizard, but what you're describing isn't really what Oracle SQL Developer is built for.
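That said, if the wizard is the bottleneck, a small JDBC program that streams the result set to a file skips SQL Developer entirely. A rough sketch, assuming Amazon's Redshift JDBC driver is on the classpath; the cluster URL, credentials, and query are placeholders, and real CSV output should also escape quotes and embedded newlines (or use a CSV library):

```java
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

// Streams a Redshift query result to a local CSV file via JDBC.
public class RedshiftToCsv {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:redshift://examplecluster.abc123.us-west-2"
                   + ".redshift.amazonaws.com:5439/dev";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PrintWriter out = new PrintWriter("extract.csv")) {
            conn.setAutoCommit(false); // Postgres-derived drivers need this for cursor-based fetching
            try (Statement stmt = conn.createStatement()) {
                stmt.setFetchSize(10_000); // stream rows instead of buffering them all
                try (ResultSet rs = stmt.executeQuery("SELECT * FROM sales")) {
                    ResultSetMetaData md = rs.getMetaData();
                    int cols = md.getColumnCount();
                    for (int i = 1; i <= cols; i++) { // header row
                        out.print(md.getColumnLabel(i));
                        out.print(i < cols ? "," : "\n");
                    }
                    while (rs.next()) {
                        for (int i = 1; i <= cols; i++) {
                            String v = rs.getString(i);
                            out.print(v == null ? "" : v);
                            out.print(i < cols ? "," : "\n");
                        }
                    }
                }
            }
        }
    }
}
```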

Importing .bak from MSSQL into MySQL database

My company's site uses a MySQL database. One of our clients, trying to take advantage of our API, is only able to give us their data in the form of an MSSQL .bak file.
I have been trying to import the file using the migration tool built into MySQL Workbench, but have had no luck.
On top of that, I am trying to see whether this can be done in PowerShell, as I would like to automate the process in the future.
Any suggestions or help would be appreciated.
You cannot. MS SQL Server backups are proprietary to MS SQL Server and cannot be used with any other RDBMS. You will need to restore this backup to SQL Server, then use an additional tool to transfer the data from SQL Server into MySQL.
Can you do that second portion through PowerShell? Probably, though SSIS would likely be a better method.
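For what it's worth, the restore portion is straightforward to script; here's a minimal sketch doing it over JDBC (the same T-SQL works from PowerShell with Invoke-Sqlcmd). It assumes the Microsoft JDBC driver, and the server, credentials, and file paths are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Restores the client's .bak file into a SQL Server instance so the data
// can then be copied out to MySQL. Server, credentials, and paths are
// placeholders; add WITH MOVE clauses if the data/log file layout differs.
public class RestoreBak {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://localhost:1433;databaseName=master;"
                   + "user=sa;password=YourPassword;encrypt=false";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            stmt.execute("RESTORE DATABASE ClientDb "
                       + "FROM DISK = N'C:\\backups\\client.bak' WITH REPLACE");
        }
    }
}
```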

Convert Azure SQL to MySQL

I'm a fairly new developer working on a database for a university research project. I created the database in Microsoft Access, then used the SSMA Access-to-SQL migration tool to export it to Azure SQL. I'm now building a Ruby on Rails implementation of the database's front end on a Debian VPS and would like to migrate the Azure SQL database to MySQL for testing purposes, with a view to eventually converting all of the database front ends to connect to the MySQL database.
I've been able to find plenty of articles discussing moving MySQL to Azure SQL, but very little that details the process in reverse. Any and all help would be appreciated!
Thanks,
Mike
MySQL allows you to import full database dumps (table definitions and content) as long as they are in SQL format, in a text file.
So as long as you manage to generate a dump of your database as a text file of SQL statements, you should be OK.
The only catch is that Azure may not give you a way to generate that dump,
but there are probably third-party tools that will let you do it.
If not, since you initially built your database in Access, you will certainly find free tools that do Access-to-MySQL migrations (a quick Google search turns up Bullzip and mdbtools as two free tools that do just that).
Once you have your SQL dump file, just import it into MySQL from the command line (e.g. mysql -u user -p dbname < dump.sql) or using the source command in the mysql client.
Wouldn't it be better to write code that is database-agnostic? That is, your code shouldn't care which database you are using.

Load Yago files into MySQL

I want to load the files of the Yago database into my MySQL database. I tried what is written on Yago's website (running the Postgres.sql script), but since I work on Windows, it does not recognize the psql command. I also tried to open the script directly in MySQL, but it reports an SQL syntax error. What can I do? Thanks, helpers!
You are trying to load a Postgres SQL script into a MySQL database.
Postgres and MySQL speak different dialects of SQL and can't necessarily understand scripts meant for each other.
If you want to load a Postgres SQL script, use a Postgres database.
Installers for the Windows version of Postgres are available here:
https://www.postgresql.org/download/windows/
Once you have Postgres installed, you can then use an admin tool, e.g.
pgAdmin (https://www.pgadmin.org/), to run the script on your DB and import the information within, or run it from the command line (e.g. psql -U postgres -d yago -f Postgres.sql).
Feel free to comment if you want more detailed instructions.