I have an existing MySQL database that I want to refresh with data from an MS Access database. I already have the SQL files created from the Access database with all of the insert statements. There are 3 SQL files, the largest of which is 8 MB.
The database is on an AWS server. In the past I've imported data using Sequel Pro from my Mac. This is very slow and subject to session failure.
Now I've figured out how to create the SQL files on my Windows VM and FTP them directly to the AWS server. My intent is to have a stored procedure truncate all tables and SOURCE the SQL files:
SOURCE /home/me/file1.sql ;
SOURCE /home/me/file2.sql ;
etc...
The stored procedure would also do any prep work on the tables and any post-import things needed like fixing foreign keys, etc.
The first problem is that this command doesn't work and causes a syntax error:
set autocommit=0 ; source /home/me/CBD.sql ; commit ;
"source" is squiggly underlined and it says "missing colon". This happens whether or not I use the auto commit stuff.
Any ideas how I could do this?
Thanks...
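One note on the SOURCE approach: SOURCE is a built-in command of the mysql command-line client, not a SQL statement, so it can't be placed inside a stored procedure (or parsed by most GUI editors, hence the squiggly underline). If you have shell access on the AWS server, one alternative is to feed each file to the mysql client directly. A rough sketch, where mydb and the prep/post scripts are placeholders for your database and for the prep and foreign-key work described above:
mysql -u me -p mydb < /home/me/prep.sql    # truncate tables and other prep work
mysql -u me -p mydb < /home/me/file1.sql
mysql -u me -p mydb < /home/me/file2.sql
mysql -u me -p mydb < /home/me/post.sql    # fix foreign keys and other post-import cleanup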
Related
I previously used MySQL Workbench to do this, in an environment that was already set up.
How do I set up a minimal working environment to just create and join tables on my own computer? (Connections???)
More details:
I downloaded and installed MySQL Workbench, and I can't even run SELECT sysdate();. There's a red X next to it. If I try CREATE DATABASE MY_DATABASE; there's a green check, but the execute button is grey.
Doing some reading I apparently need "connections." Reading about that, I apparently need to also install MySQL Database Server. Who knows what else.
So, again, the question is: how do I set up a minimal working environment to just create tables from .csv files, join them with MySQL commands, and export the results to another .csv file? (I know the syntax of the command to import a .csv file, and how to join tables.)
Thanks.
Install MySQL Workbench AND MySQL Server, then:
1. From the command line, in the directory where MySQL Server is installed, execute mysqld --initialize (one time only; see the sketch after this list).
2. Execute mysqld from the command line, after the initialization in step 1 and after any reboots. (It runs in the background and doesn't exit when you exit MySQL Workbench. It can optionally be installed as an automatically running Windows service during installation.)
3. Execute Database -> Connect to Database upon starting MySQL Workbench (each time you start the application). The default local host connection works fine.
4. After doing File -> New Model and setting up your table(s), do Database -> Forward Engineer. This will place your new database in the Schemas section of the home/main window.
5. Double-click the schema you created (the default name is mydb) and it changes to bold font. Now scripts you run from that main window will run against the database you created.
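For steps 1 and 2, the commands run from a command prompt in the MySQL Server bin directory look roughly like this (only a sketch; the path and any extra options depend on your install):
REM one-time initialization of the data directory (step 1)
mysqld --initialize
REM start the server; run this again after any reboot unless installed as a Windows service (step 2)
mysqld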
I'm moving a SQL Server table from SQL Server 2008 R2 to MySQL on my c:\ drive by using SSIS.
In SSIS, I have created ADO.NET source and destination connections with a 64-bit Unicode driver I downloaded, and I have managed to successfully transfer the data from SQL Server to a newly created MySQL table.
However, even though all rows were copied successfully, the table doesn't show up in the MySQL database with my other tables. SSIS put the table in a different directory from the one where the database tables are located, and I don't see how to change the location. When I'm selecting a table in SSIS for the data to go to, it doesn't show the right list of tables.
It's showing tables located in c:\xampp\mysql\data\mysql while my database tables are located in c:\xampp\mysql\mydb. How can I access these new tables, or direct SSIS to use a different path?
Thanks for your response. I have a local instance of MySQL, and as Ivan suggested, it was a problem with the connection I set up. I didn't see any way to change that: when you look at the connections offered in SSIS after picking "New" to create a new connection in the ADO.NET setup, there's a drop-down menu with my existing connection and a few others, but that's it. What you have to do is opt to create a new connection string, and in that set of menus I was able to add a new connection (the string is built for you). When I redid the connection and moved the next table, it went into the database. So, that worked.
Just as an aside: I found that the transformation will only succeed if, once you've started MySQL from the XAMPP control panel, you run the following statement in MySQL: SET GLOBAL sql_mode='ANSI';
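One way to run that, assuming the mysql client bundled with XAMPP is on your PATH (note that a GLOBAL setting made this way does not survive a server restart):
mysql -u root -p -e "SET GLOBAL sql_mode='ANSI';"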
I have a local database that's about 1 GB, and my remote host is a free host that I am using for testing. I want to make sure everything works before I spend money on a paid host. The problem is that phpMyAdmin on the remote server only allows 50 MB files, which just doesn't cut it, especially since the restore usually fails due to execution time limits. Below is a list of everything I've tried.
LOCAL
phpmyadmin -----> backing up tables no longer works because of timeouts, even with modified php.ini settings, due to the sheer size of the DB
mysqldumper -----> the program creates dumps with plain inserts; there is no option to make it create INSERT IGNOREs. I'll explain the problem this causes below.
mysqlworkbench -----> creates the database using the database name of my local server (the problem is that my remote server has a different database name, and I can't open a 1 GB .sql file to edit the database name at the very top; the computer just craps out and I have to force quit Workbench)
sqlsplitter (Mac program) -----> cuts up large .sql or .sql.gz files
REMOTE
phpmyadmin with .gz/.sql files cut up into 20 MB chunks -----> timeout. phpMyAdmin's resume function doesn't work either; it just overwrites the old data
mysqldumper -----> the process randomly ends in an error midway through my restore on the remote server, using a backup created with mysqldumper on my local computer (single file or multipart, neither works). It could be at 10% completion, or at 50%.
bigdump -----> used single and multipart dumps from mysqldumper; same problem, it randomly quits halfway through. Some multiparts completed successfully, but when one failed and I tried the failed part again, it would give me an error saying a unique key already exists in the table. I don't want to drop all my unique keys and have to go through and delete all the duplicates later.
mysqldumper -----> does not work with a dump from mysqlworkbench
bigdump -----> gives me an SQL error about being denied permission to create the database when using a dump from mysqlworkbench (I cannot open a 1 GB file to delete the one line that says CREATE DATABASE)
Does anybody know of a better method to upload to my host? I have no command-line access on there and only a 500 MB space limit (no limit on SQL space, though).
Thanks
Use mysqldump. Figure out what the error you're seeing is, and fix it. The mysqldump utility works. I've restored dumpfiles with hundreds of gigabytes of data to servers, and never use anything else. If it doesn't work for you, you're doing something incorrectly.
You can prevent it from writing a USE database-name; statement at the top of the file by invoking it with the database name as the last argument, without using the --databases option before it.
You can add the --insert-ignore command line option to write all the INSERT statements as INSERT IGNORE to work around your partial insert issues.
You can use --no-data to extract a dump file that contains table definitions, not data, and get all of the tables declared, first.
You can use the --no-create-info option to extract a dump file with just the inserts, not the table definitions.
http://dev.mysql.com/doc/refman/5.6/en/mysqldump.html
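Putting those options together, the dump side might look roughly like this (a sketch; the user name, database name, and file names are placeholders). Because the database name is passed as the last argument without --databases, neither file will contain a CREATE DATABASE or USE statement:
mysqldump -u local_user -p --no-data local_db > schema.sql        # table definitions only
mysqldump -u local_user -p --no-create-info --insert-ignore local_db > data.sql   # data only, written as INSERT IGNORE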
You can also use a simple bash loop to extract each table into its own file, so you have smaller files to work with:
for TABLE in `mysql [args] -N -e 'show tables in database-name'`; do mysqldump [args] database-name "$TABLE" > "$TABLE.sql"; done
(The -N flag suppresses the column-name header row so it isn't mistaken for a table name.)
When restoring the files, add the --compress option to the mysql command line arguments for a faster transfer, and specify your (new) database name as the last argument, so the client will use the correct database before applying the file, which no longer contains the database name.
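For example, the restore side of the sketch above might be (assuming the host allows MySQL connections from your machine; the host, user, and database names are placeholders):
mysql --compress -h remote_host -u remote_user -p new_db < schema.sql
mysql --compress -h remote_host -u remote_user -p new_db < data.sql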
I have been working on this all day and now I'm stuck, so hopefully someone out there can help me :)
The challenge.
Migrate data from MS SQL to MySQL.
I received the MS SQL database as a .bak file, which I restored using SQL Server Management Studio on a PC with Windows 7 Home Edition.
I have created a destination MySQL database on a web server running LAMP.
The solution (maybe)
I'm currently trying to convert the database, initially just one table for testing, using MySQL Workbench with the database->migrate wizard, but now I'm stuck at the Bulk Data Transfer. I would expect this step to create the table in my MySQL database and transfer the data, but that never happens.
For the source I choose Connection Method = ODBC (native)
No problems connecting to the source and destination databases
I choose to keep schema info as table prefix, so imported tables look like: database.dbo_table_name
Migration step succeeds (migrate selected objects & Generate SQL Create Statements)
The Create statements look like this if I don't edit them
CREATE TABLE IF NOT EXISTS `restored_database_name`.`dbo_table_name` … I think the `restored_database_name` part causes a permission error. It does if I type this in the SQL tab directly in phpMyAdmin. Therefore I change it to:
CREATE TABLE IF NOT EXISTS `dbo_table_name` …
Also per default this is part of the SQL:
DROP SCHEMA IF EXISTS `restored_database_name`;
CREATE SCHEMA IF NOT EXISTS `restored_database_name` …
I think this also causes some permission issues, so I commented these out.
In the next step I uncheck the 'Create schema in target RDBMS' since I don't think I want this.
The problem:
Nothing interesting happens in the next steps, but then at the "Bulk Data Transfer" step I get this error:
ERROR: 'restored_database_name'.'dbo_database_name': mysql_stmt_prepare: SELECT command denied to user 'mysql_user_name'@'host' for table 'dbo_database_table_name'
Finished copying 0 rows.
I think the error is somehow related to permissions on the destination database. I wonder if it is possible to make an SQL file, not just with the CREATE TABLE commands but also with the INSERT commands, so I could just take the SQL and import it using the command line or phpMyAdmin.
I'm using Workbench 5.2 CE
Any help is appreciated
I've seen that you have made your way into the Workbench Migration Wizard. Maybe you're just missing something, so I suggest you review this blog post so you can verify your steps: How-To: Guide to Database Migration from Microsoft SQL Server using MySQL Workbench.
Unfortunately you can't use the Migration Wizard data copy command line utility to generate the SQL file with all the inserts, but I'm pretty sure you can get this from MS SQL Server Management Studio and it should pretty much work for MySQL without modifications (or with minor modifications).
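Once you have such a script (for example from Management Studio's Generate Scripts task, with data included), importing it into MySQL from the command line is a one-liner. A sketch, assuming the file is saved as export.sql (a placeholder name) and the target schema already exists on the LAMP server:
mysql -h your_lamp_host -u mysql_user_name -p restored_database_name < export.sql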
I had copied the data folder under wamp/bin/mysql/mysql(v)/data before formatting my computer; then, after installing the new OS and then WAMP, I replaced the data folder.
Now when I open phpMyAdmin the list of databases is showing, but under each database the tables are not showing.
When I use myadminer it shows the list of tables, but not the tables' data.
When I use sqlbuddy, a warning is shown in place of the table listing. The warning is:
Warning: array_key_exists() expects parameter 2 to be array, boolean given in E:\wamp\apps\sqlbuddy1.3.3\dboverview.php on line 215
You should have taken proper backups using mysqldump. If you used the InnoDB storage engine and it was configured to keep its log files or tablespace files somewhere else, I'm afraid you've lost your data.
I'm sorry, you have lost your data. In the future, create backups frequently and create a MySQL dump whenever you want to migrate your database.
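For reference, a routine dump and restore of a local WAMP database is a pair of one-liners (a sketch; the user and database names are placeholders):
mysqldump -u root -p mydb > mydb_backup.sql    # back up
mysql -u root -p mydb < mydb_backup.sql        # restore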