Importing 200k table [duplicate] - mysql

During development, our local WAMP servers get up-to-date data from the test server by way of a database dump, which we load using the source command to run the .sql file.
Recently, at the very end of the import, we have been getting errors about the #old variables, which store the original settings (such as foreign key checks) before the dump changes them; foreign key checks are turned off so the import doesn't throw errors when it recreates tables and adds foreign keys while a referenced table has yet to be created. I have worked out that the cause is that the product table keeps growing, and at some point the session times out during the import.
I'm wondering what setting I can set (either as part of the SQL session or in the my.ini file) that will stop all timeouts, in effect making a session last forever while we are signed in.

Strategies for importing large MySQL databases
phpMyAdmin Import
Chances are, if you're reading this, phpMyAdmin was not an option for your large MySQL database import. Nonetheless, it is always worth a try, right? The most common cause of failure for phpMyAdmin imports is exceeding the import limit. If you're working locally or have your own server, you can try changing the MySQL settings, usually found in the my.ini file in the MySQL install folder. If you're working with WAMP on Windows, you can access that file from the WAMP control panel under MySQL > my.ini. Remember to restart WAMP so your new settings take effect. Settings you may want to increase here include the following (a sample my.ini fragment follows the list):
max_allowed_packet
read_buffer_size
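For illustration, a minimal my.ini sketch; the values below are hypothetical starting points rather than recommendations, so size them to your machine and your dump:
[mysqld]
# hypothetical: largest single packet/statement the server will accept
max_allowed_packet=256M
# hypothetical: per-session buffer for sequential table scans
read_buffer_size=2M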
Even with enhanced MySQL import settings, you may still find that imports time out due to PHP settings. If you have access to php.ini, you can edit the maximum execution time and related settings. In WAMP, access the php.ini file from the WAMP control panel at PHP > php.ini. Consider raising the limits on the following settings while trying large MySQL imports (a sample php.ini fragment follows the list):
max_execution_time
max_input_time
memory_limit
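And a matching php.ini sketch, again with hypothetical values sized for a long-running local import:
; hypothetical: let the import script run up to an hour
max_execution_time = 3600
; hypothetical: allow equally long input parsing/uploads
max_input_time = 3600
; hypothetical: generous memory ceiling for the PHP process
memory_limit = 1024M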
Using the Big Dump staggered MySQL dump importer
If basic phpMyAdmin importing does not work, you may want to try the Big Dump script from Ozerov.de for staggered MySQL imports. What this useful script does is run your import in smaller blocks, which is often exactly what is needed to import a large MySQL dump successfully. It is a free download available at http://www.ozerov.de/bigdump/.
The process of using Big Dump is fairly simple: you basically place your SQL import file and the Big Dump script together on the server, set a few configuration values in the Big Dump script, and then run the script. Big Dump handles the rest!
One key point about this otherwise great option is that it will not work at all on MySQL exports that contain extended inserts. So if you have the option to prevent extended inserts, try it (see the sketch below); otherwise, you will have to use another method for importing your large MySQL file.
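If your dump comes from mysqldump, it has a real option for this: --skip-extended-insert writes one INSERT per row, producing a larger and slower dump that Big Dump can stagger. A sketch, with the user and database names as placeholders:
mysqldump -u your_user -p --skip-extended-insert your_database > your_database.sql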
Go command line with MySQL console
If you're running WAMP (and even if you're not), there is always the option to cut to the chase and import your large MySQL database using the MySQL console. I'm importing a 4GB database this way as I write this post, which is actually why I have some time to spend writing: even this method takes time when you have a 4GB SQL file to import!
Some developers (usually me) are intimidated by opening up a black screen and typing cryptic commands into it. But it can be liberating, and when it comes to MySQL databases it is often the best route to take. In WAMP, we access the MySQL console from the WAMP control panel at MySQL > MySQL Console. Now let's learn the two simple MySQL console commands you need to import a MySQL database, command-line style:
use `db_name`
Command use followed by the database name will tell the MySQL console which database you want to use. If you have already set up the database to which you are importing, then you start by issuing the use command. Suppose your database is named my_great_database. In this case, issue the following command in the MySQL Console. Note that commands must end with a semi-colon.
mysql> use my_great_database;
source sql_import_file.sql
Command source followed by the location of a SQL file will import the SQL file into the database you previously specified with the use command. You must provide the path, so if you're using WAMP on your local server, start by putting the SQL file somewhere easy to get at, such as C:\sql\my_import.sql. The full command with this example path would be:
mysql> source C:\sql\my_import.sql;
After you run that command, the import should begin. Let the queries run and allow the import to complete before closing the MySQL console.
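Equivalently, if you prefer not to use the interactive console, you can pipe the file into the mysql client from an ordinary command prompt; this sketch reuses the database name and path from the examples above:
mysql -u root -p my_great_database < C:\sql\my_import.sql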
Further documentation for MySQL command line can be found here: http://dev.mysql.com/doc/refman/5.5/en/mysql.html.
Another solution is to use MySQL Workbench.

This solution worked for me:
max_allowed_packet: increased to 8M
read_buffer_size: increased from 256 to 512
Using the XAMPP control panel on localhost. After making the changes to the my.ini file in the MySQL config, don't forget to quit XAMPP (or WAMP) and restart it for the changes to take effect.
(Four days of head-banging and I finally got it fixed!)
Symptoms on import were: #2006 MySQL server has gone away. However, only 10 of the 87 tables were being imported.
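If editing my.ini is not an option, a session-level alternative (a sketch; the value is illustrative and the statement requires a privileged account) is to raise the packet limit at runtime before running source:
SET GLOBAL max_allowed_packet = 268435456; -- 256M; applies to connections opened after this statement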

Consider using MySQL Workbench; it's free and handles very large scripts very well (from the menu choose File -> Open SQL Script; if the file is large, it will ask whether you'd like to run it directly). It has served me well over the years when working with large SQL dumps.

Related

Read MYD, MYI, FRM files

I've read several posts on this forum and others and still can't read some MYD/MYI/FRM files which contain data that I want to import into SQL Server.
I stopped the MySQL service at the client location and copied all the files (the same way that we would do with SQL Server).
I have installed the most recent MySQL ODBC connector, MySQL engine and MySQL Workbench on my Windows 10 PC. The engine is running and the ODBC connector is configured with a username and password. Using the workbench, I have been able to successfully log in.
I created a MyTest database which created a C:\ProgramData\MySQL\MySQL Server 8.0\Data\mytest folder.
I stopped the MySQL service and copied MyData.MYD, MyData.MYI and MyData.FRM to the mytest folder.
I restarted the MySQL service.
But if I run a "Select * from mytest.MyData" query in Workbench or in the MySQL 8.0 Command Line Client, it keeps indicating that the table mytest.MyData doesn't exist. I read that it might be appropriate to run a CHOWN command, but I don't know whether the syntax would be different on a Windows system and whether the command would be executed directly from a command prompt.
Any assistance would be greatly appreciated. Thanks!
Older versions of MySQL let you get away with that for MyISAM tables, but MySQL 8.0 introduced a whole new implementation of the "data dictionary", a meta-database maintained internally, and it has a bias toward InnoDB over MyISAM. So I believe that moving files around ad hoc no longer works the way it once did. To move tables from one instance to another, you should really do the export and import steps properly.
For InnoDB, you may use transportable tablespaces (sketched below). At least you don't have to do a full data import that way, but it takes a few more steps than a plain file copy.
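For reference, a sketch of the transportable-tablespace steps, reusing the mytest.MyData name from the question and assuming the table is InnoDB with the same definition on both servers:
-- destination: create the table, then detach its empty tablespace
ALTER TABLE mytest.MyData DISCARD TABLESPACE;
-- source: make the .ibd file consistent and safe to copy
FLUSH TABLES mytest.MyData FOR EXPORT;
-- (copy MyData.ibd and MyData.cfg out of the source datadir, then release the lock)
UNLOCK TABLES;
-- destination: place the copied files in its datadir, then attach them
ALTER TABLE mytest.MyData IMPORT TABLESPACE;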
P.S.: I haven't used MyISAM in many years anyway. I prefer my database to support the ACID properties, and MyISAM supports none of them.

Export table from XAMPP MySQL from the file directory

I have an error with XAMPP where it will not connect to my database. I have had this error several times in the past, and no solutions have helped. I find that uninstalling and reinstalling ends up being the fastest method; however, by doing so I lose my database tables. As I have no access to the default export and import functions, I was wondering how to transfer the tables across.
Last time, I copied and pasted these files into the same directory when I reinstalled XAMPP; the tables transferred, but the data within them did not. The tables had also lost some functionality.
If anyone has any methods to do this, I would greatly appreciate it.
The best way to back up and restore your data is to dump the database to an SQL file. You can use the provided mysqldump tool for that, which is the best tool for the job; a sketch follows.
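A minimal sketch, assuming a database named my_database and the default root account of a local XAMPP install:
mysqldump -u root -p my_database > my_database_backup.sql
(reinstall XAMPP, recreate the empty database, then restore:)
mysql -u root -p my_database < my_database_backup.sql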
Copying the entire MySQL "datadir" (which is probably one level up from the screenshot you've posted) and completely replacing that of your new, stopped MySQL instance will probably work, but it is neither reliable nor supported. Since other files (such as ibdata1) govern the overall health and functioning of the entire MySQL instance, you can't copy over individual database directories alone. Furthermore, you should only do this when both the original and replacement MySQL servers/daemons/services are completely stopped; otherwise you risk file corruption or concurrency issues.
Since you're running the server on your own computer, you should grant yourself sufficient access to use mysqldump and run it regularly as a backup, since file-level backups of the data directory are not reliable.

Browse and modify an SQL dump in MySQL Workbench

I have a mysqldump from a MySQL server and I want to open it in a program like MySQL Workbench or DBeaver so that I can easily search it, remove some values, etc.
I'm trying to use MySQL Workbench, but I am unsure whether I can import this SQL dump directly from the file like this. I have created a new model and clicked import, and it seems to show all my tables, but they are empty.
Is this possible and how would i go about this?
MySQL Workbench can restore a dump without loading it into an editor first (and hence can handle even gigabyte-sized dumps). For this, go to the Data Import/Restore admin section, select your dump, set options (e.g. which parts of the dump to restore) and click Start Import to start the process.
However, this doesn't allow you to change the dump, and because of typical dump sizes even browsing them is often not possible. You can try to load the dump file into an editor if it is not too large (say, around 250MB, depending on system RAM). If it is much larger, you can only try special tools like hex editors (which load large files piece by piece).
As far as I remember, mysqldump exports the database to an SQL script with a .sql file extension.
In MySQL Workbench, open this file using File -> Open SQL Script, or alternatively Ctrl+Shift+O.
Running the script should then create the database.

How to migrate a large MySQL database when the remote server has stringent limits?

I have a local database that's about 1GB, and my remote host is a free host that I am using for testing; I want to make sure everything works before I spend money on a paid host. The problem is that phpMyAdmin on the remote server only allows 50MB files, which just doesn't cut it, especially since the restore usually fails due to execution time limits. Below is a list of everything I've tried.
LOCAL
phpMyAdmin -----> backing up tables no longer works because of timeouts, even with modified php.ini settings, due to the sheer size of the DB
MySQLDumper -----> the program creates dumps with plain inserts; there is no option to make it create INSERT IGNOREs. I'll explain the problem below.
MySQL Workbench -----> creates the database using the database name of my local server (the problem is that my remote server has a different database name, and I can't open a 1GB .sql file to edit the database name at the very top; the computer just craps out and I have to force-quit Workbench)
SQLSplitter (a Mac program) -----> cuts up large .sql or .sql.gz files
REMOTE
phpMyAdmin with .gz/.sql files cut up into 20MB chunks
-----> timeout. phpMyAdmin's resume function doesn't work either; it just overwrites old data
MySQLDumper -----> the process ends in an error at a random point midway through my restore on the remote server, using a backup created with MySQLDumper on my local computer (single file or multipart, neither works). It could be at 10% completion or at 50%.
BigDump -----> used single and multipart dumps from MySQLDumper, same problem: it randomly quits halfway through. Some multiparts completed successfully, but when one failed and I tried the failed part again, it would give me an error saying a unique key already exists in the table. I don't want to drop all my unique keys and have to go through and delete the duplicates later.
MySQLDumper -----> does not work with a dump from MySQL Workbench
BigDump -----> gives me an SQL error (permission denied for creating the database) when using a dump from MySQL Workbench (I cannot open a 1GB file to delete the one line that says CREATE DATABASE)
Does anybody know of a better method to upload to my host? I have no command-line access there and only a 500MB space limit (no limit on SQL space, though).
Thanks
Use mysqldump. Figure out what the error you're seeing is, and fix it. The mysqldump utility works. I've restored dump files with hundreds of gigabytes of data to servers, and I never use anything else. If it doesn't work for you, you're doing something incorrectly.
You can prevent it from writing a USE database-name; statement at the top of the file by invoking it with the database name as the last argument, without using the --databases option before it.
You can add the --insert-ignore command-line option to write all the INSERT statements as INSERT IGNORE, to work around your partial-insert issues.
You can use --no-data to extract a dump file that contains table definitions, not data, and get all of the tables declared, first.
You can use the --no-create-info option to extract a dump file with just the inserts, not the table definitions (a combined sketch follows the documentation link below).
http://dev.mysql.com/doc/refman/5.6/en/mysqldump.html
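Putting those options together, a hedged sketch; the user and database names are placeholders:
mysqldump -u local_user -p --no-data local_db > schema.sql
mysqldump -u local_user -p --no-create-info --insert-ignore local_db > data.sql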
You can also use a simple bash loop to extract each table into its own file, so you have smaller files to work with (the -N -s flags suppress the column-header row so it isn't mistaken for a table name):
for TABLE in `mysql [args] -N -s -e 'show tables in database-name'`; do mysqldump [args] database-name $TABLE > $TABLE.sql; done
When restoring the files, add the --compress option to the mysql command-line arguments for a faster transfer, and specify your (new) database name as the last argument, so the client uses the correct database before applying the file, which no longer contains the database name.
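A matching restore sketch; the names are placeholders again, and -h points the locally-run client at the remote host since you have no shell access there:
mysql -u remote_user -p -h remote_host --compress remote_db < schema.sql
mysql -u remote_user -p -h remote_host --compress remote_db < data.sql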
