MySQL server has gone away exception while importing a huge database - mysql

I am trying to import a new database from an SQL dump file that is over 2.6 GB in size. To do this I am running the command below to import the database from the SQL file.
mysql -u root -proot --database=test_db < test_db.sql 
This is the error I get:
ERROR 2006 (HY000) at line 51: MySQL server has gone away
I think the problem here is that a timeout is happening somewhere. Only one table has been created in the new database from the SQL file. Is there any way to overcome this issue?

Try changing the max_allowed_packet setting to a larger value on the server.
Open my.ini (or my.cnf), located in your MySQL installation folder, and under the [mysqld] section set max_allowed_packet = 64M, then don't forget to restart the server. You can check the current value by executing:
SHOW VARIABLES LIKE 'max_allowed_packet';
Refer: http://dev.mysql.com/doc/refman/5.0/en/gone-away.html
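If restarting the server is not convenient right away, the global value can also be raised at runtime from a client session with sufficient privileges (a sketch; the value falls back to whatever my.ini/cnf says on the next restart, and already-open connections keep their old limit):
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;
Reconnect and run the SHOW VARIABLES statement above to confirm the new value before retrying the import.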

Related

MySQL server has gone away upon changes

I'm trying to import a ~1 GB .sql file which contains INSERTs. I'm using the following command to do so: mysql -u root -p databae < database.sql, but I keep receiving the error MySQL server has gone away.
I've tried increasing max_allowed_packet to 64M but no change.
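Worth double-checking in a case like this: the 64M has to go under the [mysqld] section of the configuration file the server actually reads, and the server has to be restarted afterwards. A quick sketch for confirming the running server picked it up:
mysql -u root -p -e "SHOW VARIABLES LIKE 'max_allowed_packet';"
If the reported value (in bytes) is not 67108864, the edit went into a file or section the server does not read.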

MySQL gone away error running mysql command large file

I want to load a large .sql file (1.5 GB), but every time it gets stuck at 623.9 MB with the error: MySQL server has gone away.
Running the following command: mysql -u {DB-USER-NAME} -p {DB-NAME} < {db.file.sql path}
I already changed my my.cnf file with the following values, but that did not help:
wait_timeout = 3600
max_allowed_packet = 100M
innodb_lock_wait_timeout = 2000
What am I missing here?
Refer to this link: MySQL Server has gone away when importing large sql file, which lists various combinations that have worked for this problem.
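Those fixes typically boil down to raising both the packet limit and the network timeouts in the [mysqld] section and then restarting the server. A sketch with illustrative, untuned values:
[mysqld]
max_allowed_packet = 512M
wait_timeout = 3600
net_read_timeout = 600
net_write_timeout = 600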

Error dumping a large SQL database in MAMP

I'm trying to dump a large database (126 MB) with extended inserts in MAMP (localhost on my computer) and I get errors whatever I do.
First I tried to dump it via terminal with
/applications/MAMP/library/bin/mysql -u root -p databasename < /path/file.sql
but I got this error (line 44 is where the first INSERT INTO is):
ERROR 2006 (HY000) at line 44: MySQL server has gone away
so I copied my-large.cnf into /applications/MAMP/conf/, renamed it to my.conf, and set a new value for max_allowed_packet:
[mysqldump]
quick
max_allowed_packet = 32M
and also put the line skip-character-set-client-handshake after [mysqld]:
# The MySQL server
[mysqld]
skip-character-set-client-handshake
I saved the file, restarted the server, tried the import again from the command line, and still got the same error.
I also tried to import it with MY MAMP DUMP, but I get an error message after a few seconds: An error occurred while processing SQL file.
I then tried with bigdump, where I set $max_query_lines = 6000; but the script doesn't even seem to run (yes, I put the file and the script in the same directory and yes, the mysql server is running).
I really don't know what else to do. What could be the problem?
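One thing that stands out in this setup: a max_allowed_packet placed under [mysqldump] only affects the mysqldump client, not the server that receives the import. For the mysql ... < file.sql import to benefit, the setting has to live under [mysqld] (a sketch; the 512M value is illustrative):
[mysqld]
skip-character-set-client-handshake
max_allowed_packet = 512M
Then restart the MAMP servers and retry the /applications/MAMP/library/bin/mysql import.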

Error 2013 in MySQL export database

My MySQL server was working fine, but today I can't connect to it. I get the error "Can't connect to MySQL server on 'localhost' (10061)", and I found that the Mysql55 service on my Windows 7 x64 machine was stopped. I start the service, but when I open one specific database the server stops. With other databases I don't have any problem.
I try to dump the database to a .sql file and I get the error "mysqldump: Error 2013: Lost connection to MySQL server during query when dumping table table_name at row: 43795". Then the server stops. I tried the suggestions in Error Code: 2013. Lost connection to MySQL server during query, but they did not work.
When I try to export the database at other times, I get the same error, but with a different row value.
It seems that while dumping that particular table, too much data is being sent and received from it at once.
Use --max_allowed_packet=500M and then dump it. Also check the max_allowed_packet size you have specified in your my.ini config file.
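For reference, a sketch of passing that option straight to mysqldump (user, database, and table names are placeholders):
mysqldump --max_allowed_packet=500M -u root -p database_name table_name > table_name.sql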

Importing a large SQL file

I've seen a lot of questions regarding this topic, but going through them didn't help, so I'm asking here.
I have a large SQL file (around 50 MB) which I cannot import using phpMyAdmin, as it's limited to 2.5 MB.
I've tried to use bigdump and I'm getting an error that says I'm using "extended inserts or very long procedure definitions".
I've also tried using the source command from the console, which also gives me an error message saying that the defined max_allowed_packet is too low. After changing it to 128M (it was 16M before), I'm getting another issue where, during the source command, I lose the connection to the DB server (hosted locally):
ERROR 2013 (HY000): Lost connection to MySQL server during query
ERROR 2006 (HY000): MySQL server has gone away
No connection. Trying to reconnect...
ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (111)
ERROR:
Can't connect to the server
The solution I found to work was to further increase max_allowed_packet to 512M.
Then the following will work:
mysql -u username -ppassword databasename < file.sql
Does this work from the console?
mysql -u username -ppassword databasename < file.sql
(Yes, there is no space between the -p and password)
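The bigdump complaint about "extended inserts" comes from mysqldump packing many rows into a single INSERT statement. If the dump can be regenerated, one workaround is to re-export without extended inserts, at the cost of a much larger and slower-to-import file (a sketch; names are placeholders):
mysqldump --skip-extended-insert -u username -p databasename > file.sql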
In my case the problem ("Lost connection to MySQL Server during query") was a corrupted dump file caused by misbehaving HDDs:
First, I made a dump on the main server and then copied that dump to the replication server. But it seems the replication server had some problems with its HDDs and the dump became corrupted, i.e. the MD5 of the original dump file on the main server was different from the MD5 of the copy on the replication server.
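A quick way to rule that out is to compare checksums of the dump on both machines before importing (a sketch; md5sum on Linux, md5 on macOS):
md5sum database.sql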
Are you exporting or importing? If you are importing, try CSV format.
You can export to CSV and then import that. If it still does not import because the CSV is too large, you can split the CSV yourself into smaller CSV files, as in the sketch below.
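If the CSV route is taken, a minimal sketch of the round trip for one table (table and file names are placeholders; SELECT ... INTO OUTFILE is subject to the FILE privilege and the server's secure_file_priv setting):
SELECT * FROM my_table INTO OUTFILE '/tmp/my_table.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
LOAD DATA INFILE '/tmp/my_table.csv' INTO TABLE my_table FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
If the resulting CSV is still too large, something like split -l 500000 my_table.csv can break it into smaller pieces first.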