mysqldump restore issues - mysql

I am trying to create a backup of my DB.
I have separated the schema and the data into two different SQL files from mysqldump.
When I try to restore, the schema works fine, but when I execute the data.sql file I get too many errors.
SQL error: please check the link for the error.

Your problem might be that you are trying to insert rows into tables whose foreign keys reference tables that are not populated yet, so the inserts fail integrity checks. I suggest you create one SQL dump containing both the schema and the data and run it in one go. That may save you this headache.
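Alternatively, if you want to keep schema and data separate, a common workaround is to disable foreign key checks while the data loads. A minimal sketch, assuming your data file is named data.sql as in the question, is to wrap its contents like this:
-- add at the very top of data.sql:
SET FOREIGN_KEY_CHECKS = 0;
-- ... all the INSERT statements ...
-- add at the very bottom of data.sql:
SET FOREIGN_KEY_CHECKS = 1;
This lets the inserts run in any order; the checks are re-enabled once everything is loaded.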

Related

How to import Oracle sql file into MySQL

I have a .sql file from Oracle which contains create table/index statements and a lot of insert statements (around 1M inserts).
I can manually modify the create table/index part (there is not too much of it), but the insert part uses some Oracle functions like TO_DATE.
I know MySQL has a similar function, STR_TO_DATE, but its parameters are used differently.
I can connect to MySQL, but the .sql file is the only thing I got from Oracle.
Is there any way I can import this Oracle .sql file into MySQL?
Thanks.
Although the above job can be done by manually editing the script appropriately, there are products available that can help. Refer to the link for more information on one such product.
P.S. I am not affiliated in any way with the product.
Since you mention insert scripts, you will basically be inserting data, and for that you can use any ETL tool, e.g. an open source one like Pentaho Data Integration; it is pretty simple to do. Just search YouTube for a table-to-table transformation between different database connections to learn how. This only works if you can connect to both the MySQL and Oracle databases; otherwise it won't help. You still have to create all the table structures manually in the target database, but the data you can just load with the ETL tool. There is no need to edit every single insert line, which would be very painful if there are more than a hundred of them.
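If you do end up editing the inserts, note that the conversion between the two date functions is mostly mechanical; only the format specifiers change. A small sketch, with a hypothetical orders table:
-- Oracle original:
-- INSERT INTO orders VALUES (1, TO_DATE('2011-03-15 14:30:00', 'YYYY-MM-DD HH24:MI:SS'));
-- MySQL equivalent:
INSERT INTO orders VALUES (1, STR_TO_DATE('2011-03-15 14:30:00', '%Y-%m-%d %H:%i:%s'));
A scripted search-and-replace over the 1M inserts can apply this pattern far more reliably than hand editing.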

Accessing data from one MySQL database to another in MySQL Workbench

I have two different databases. I have to access data from one database and insert it into another (with some data processing involved; it is not just copying data). Also, the schema is really complex and each table has many rows, so manually copying the data into the schema of the second database is not an option. I have to do this using MySQL Workbench, so I have to do it with SQL queries. Is there a way to create a connection from one database to another and access its data?
While MySQL Workbench can be used to transfer data between servers (e.g. as part of a migration process), it is not useful when you have to process the data first. Instead you have two other options:
Use a dedicated tool you write yourself (as eddwinpaz mentioned).
Use the capabilities of your server. That is, copy the data to the target server into a temporary table (using dump and restore), then use queries to modify the data as you need, and finally copy it to the target table.
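Note that if the two databases happen to live on the same MySQL server, no connection between databases is needed at all: you can qualify table names with the database name and do the processing in a single statement. A minimal sketch, with hypothetical database, table, and column names:
INSERT INTO target_db.customers (id, full_name)
SELECT id, CONCAT(first_name, ' ', last_name) -- example of processing the data in flight
FROM source_db.customers
WHERE active = 1;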

Pulling data from MySQL and dropping it into SQL Server

I was wondering if anybody had any advice or references that explain how to pull data from MySQL and drop it into SQL Server. Any input would be greatly appreciated! Thanks!
You can add MySQL as a "linked server":
http://forums.mysql.com/read.php?60,123221,123221
Once you have done that, you can reference the MySQL tables, views, etc. with their fully qualified names in regular SQL Server queries.
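For example, a query through the linked server could look like this (a sketch; MYSQL_LINK and the table names are hypothetical, and OPENQUERY is often the most reliable way to reach MySQL through ODBC):
SELECT * FROM OPENQUERY(MYSQL_LINK, 'SELECT id, name FROM mydb.users');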
I could not tell whether you want to access a MySQL database because you are migrating to SQL Server, or whether you want to access a MySQL database on an ongoing basis. I am answering as if you want to migrate, because you already have an excellent answer on accessing MySQL from SQL Server.
If you want to migrate, mysqldump is one way to go. Its format is easily parsed.
You can also write the contents of any MySQL table -- and any SQL Server table, for that matter -- to .csv format, which is nearly a universal transfer format for data.
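For example, MySQL can write a table straight to CSV with SELECT ... INTO OUTFILE (a sketch; the table and path are hypothetical, and the MySQL server itself writes the file, so the location must be writable by it and permitted by secure_file_priv):
SELECT * FROM mydb.mytable
INTO OUTFILE '/tmp/mytable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';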

copy tables between databases

I am using MySQL v5.1.
I am developing a Rails app and writing a Ruby script to copy a database. So far, I have an array of table names; the number of tables is 2090. I need to create all the tables in a new database. My code looks like:
# "table_names" was fetched by executing a SHOW TABLES query
table_names.each do |tbl_name|
  ActiveRecord::Base.connection.execute(
    "CREATE TABLE #{new_db_name}.#{tbl_name} LIKE #{old_db_name}.#{tbl_name}"
  )
end
This code works, but it takes a long time to complete, because it has to execute the CREATE TABLE command one by one for all 2090 tables.
I am wondering: is there any way to bulk-create tables (like bulk inserting of data) in SQL to save time? If not, how can I improve the speed of creating the tables? The goal is to copy all 2090 tables from one database to another.
P.S. I don't want to hard-code all 2090 table names in an SQL file.
The simplest method in MySQL is to do a mysqldump of the database in question, then restore it into the new database, e.g.:
mysqldump -uUSERNAME -pPASSWORD name_of_db > name_of_db.sql
mysql -uUSERNAME -pPASSWORD name_of_new_db < name_of_db.sql
The dump file will contain all the necessary DDL/DML queries to recreate the database, plus statements disabling foreign keys and whatnot, so that the dump can be loaded without causing any foreign key problems while the restored DB is in a halfway state.
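Since your loop only copies table structures (CREATE TABLE ... LIKE), you can also restrict the dump to the schema with the --no-data flag. A sketch, with placeholder credentials and database names:
mysqldump -uUSERNAME -pPASSWORD --no-data old_db > schema.sql
mysqladmin -uUSERNAME -pPASSWORD create new_db
mysql -uUSERNAME -pPASSWORD new_db < schema.sql
This recreates all 2090 tables in one pass without transferring any rows.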
Sounds like what you're looking for is a schema compare tool as opposed to a data compare tool (for SQL Server this is built into Visual Studio). This tool will do that for MySQL: http://toadformysql.com/index.jspa

question about MySQL database migration

I have a MySQL database with several tables on a live server, and I would like to migrate this database to another server. The migration I mean here involves some changes to the database tables, for example: adding some new columns to several tables, adding some new tables, etc.
Now, the only method I can think of is to use a PHP or Python script (the two languages I know) to connect to both databases, dump the data from the old database, and then write it into the new database. However, this method is not efficient at all. For example: in the old database, table A has 28 columns; in the new database, table A has 29 columns, with the extra column having a default value of 0 for all the old rows. My script still needs to dump the data row by row and insert each row into the new database.
Using mysqldump etc. won't work. Here is the detail. I have FOUR old databases, which I will call 'DB_a', 'DB_b', 'DB_c', 'DB_d'. The old table A has 28 columns, and I want to add each row of table A into the new database with a new column 'DB_x' (x indicating which database the row came from). Since I can't deduce the database ID from a row's content, the only way to identify them is through some user input parameters.
Are there any tools, or a better method than writing a script yourself? Here, I don't need to worry about multithreaded writing problems etc.; the old database will be down (not open to public usage, only for the upgrade) for a while.
Thanks!!
I don't entirely understand your situation with the columns (wouldn't it be more sensible to add any new columns after migration?), but one of the arguably fastest methods to copy a database across servers is mysqlhotcopy. It can copy MyISAM tables only and has a number of other requirements, but it's awfully fast because it skips the create-dump / import-dump step completely.
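For reference, the invocation is a single command, run on the database host (a sketch with placeholder credentials; the copied files then have to be moved into the new server's data directory):
mysqlhotcopy --user=USERNAME --password=PASSWORD db_name /path/to/backup_dir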
Generally when you migrate a database to new servers, you don't apply a bunch of schema changes at the same time, for the reasons that you're running into right now.
MySQL has a dump tool called mysqldump that can be used to easily take a snapshot/backup of a database. The snapshot can then be copied to a new server and installed.
You should figure out all the changes that have been made to your "new" database, and write out a script of all the SQL commands needed to "upgrade" the old database to the new version you're using (e.g. ALTER TABLE a ADD COLUMN x, etc.). After you're sure it's working, take a dump of the old one, copy it over, install it, and then apply your change script.
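Such a change script is just a plain .sql file of ALTER statements. A minimal sketch, reusing the hypothetical table and column names from above:
-- upgrade.sql: run after restoring the old dump on the new server
ALTER TABLE a ADD COLUMN x INT NOT NULL DEFAULT 0; -- old rows get 0 automatically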
Use mysqldump to dump the data, then load the dump on the new server with mysql < output.txt. Now the old data is on the new server. Manipulate as necessary.
Sure, there are tools that can help you achieve what you're trying to do. mysqldump is a prime example of such a tool. Just take a glance here:
http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
What you could do is:
1) Make a dump of the current DB using mysqldump with the --no-data option to fetch the schema only.
2) Alter the schema you have dumped, adding the new columns.
3) Create your new schema (mysql < dump.sql; just google "mysql backup restore" for more help on the syntax).
4) Dump your data using mysqldump's --complete-insert option (see the link above).
5) Import your data using mysql < data.sql.
This should do the job for you. Good luck!
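As for the 'DB_x' column from the question: once the new schema has that extra column, each of the four old databases can be loaded with a constant tag in a single statement per table. A sketch with hypothetical table and column names:
INSERT INTO new_db.table_a (id, name, db_id)
SELECT id, name, 'DB_a' FROM DB_a.table_a;
-- repeat with 'DB_b', 'DB_c' and 'DB_d' for the other source databases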
Adding extra columns can be done on a live database:
ALTER TABLE [table-name] ADD [column-name] MEDIUMINT(8) DEFAULT 0;
MySQL will apply the default value to all existing rows.
So here is what I would do:
1) Make a copy of your old database with the mysqldump command.
2) Run the resulting SQL file against your new database; now you have an exact copy.
3) Write a migration.sql file that modifies your database with ALTER TABLE commands and, for complex conversions, some temporary MySQL procedures.
4) Test your script (when it fails, go back to step 2).
5) If all is OK, go back to step 1 for a fresh copy and go live with your new database.
These are all valid approaches, but I believe what you want is to write an SQL statement that generates the INSERT statements supporting the new columns you have.
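A sketch of that idea, with hypothetical table and column names: a SELECT that emits one ready-to-run INSERT per row, tagged with its source database (QUOTE() escapes the string values):
SELECT CONCAT('INSERT INTO new_db.table_a (id, name, db_id) VALUES (',
id, ', ', QUOTE(name), ', ''DB_a'');') AS stmt
FROM DB_a.table_a;
Save the output to a file and run it against the new database.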