I've seen a lot of questions on this topic, but going through them didn't help, so I'm asking here.
I have a large SQL file (around 50 MB) which I cannot import using phpMyAdmin, as its upload limit is 2.5 MB.
I've tried to use BigDump and I'm getting an error saying that I'm using "extended inserts or very long procedure definitions".
I've also tried using the source command from the console, which gave me an error saying that the configured max_allowed_packet is too low. After raising it to 128M (it was 16M before), I ran into another issue: during the source command I lose the connection to the DB server (hosted locally):
ERROR 2013 (HY000): Lost connection to MySQL server during query
ERROR 2006 (HY000): MySQL server has gone away
No connection. Trying to reconnect...
ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (111)
ERROR:
Can't connect to the server
The solution I found to work was to increase max_allowed_packet even further, to 512M.
Then the following will work:
mysql -u username -ppassword databasename < file.sql
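For reference, a minimal sketch of making that change persistent (the path and section shown are the usual Linux defaults and may differ on your setup):
# /etc/mysql/my.cnf (my.ini on Windows)
[mysqld]
max_allowed_packet = 512M
After editing, restart the server. Alternatively, the value can be raised at runtime without a restart (requires the SUPER privilege, takes the value in bytes, and does not survive a server restart):
SET GLOBAL max_allowed_packet = 536870912;
Reconnect after changing it, since each session picks the value up at connect time.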
Does this work from the console?
mysql -u username -ppassword databasename < file.sql
(Yes, there is no space between -p and the password.)
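If you'd rather not put the password on the command line, a bare -p makes the client prompt for it instead:
mysql -u username -p databasename < file.sql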
In my case the problem ("Lost connection to MySQL server during query") was a corrupted dump file, caused by misbehaving HDDs:
First, I made a dump on the main server and then copied that dump to the replication server. But it seems the replication server had some problems with its HDDs and the dump became corrupted, i.e. the MD5 of the original dump file on the main server was different from the MD5 of the copy on the replication server.
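A quick way to verify this (standard coreutils; dump.sql stands in for the actual file name) is to hash the file on both machines and compare the output:
md5sum dump.sql
Run it on the source and on the destination; if the two checksums differ, the copy is corrupted.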
Are you exporting or importing? If you are importing, try CSV format.
You can export to CSV and then import that. If it still fails because the CSV is too large, you can split it yourself into smaller CSV files, as sketched below.
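A rough sketch of that approach, with a hypothetical table mytable and file mytable.csv (note that LOAD DATA LOCAL requires the local_infile option to be enabled):
# split the CSV into 100000-line chunks named chunk_aa, chunk_ab, ...
split -l 100000 mytable.csv chunk_
and then, for each resulting chunk:
LOAD DATA LOCAL INFILE 'chunk_aa' INTO TABLE mytable FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';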
Related
I'm trying to import a ~1 GB .sql file which contains INSERTs. I'm using the following command to do so: mysql -u root -p database < database.sql, but I keep receiving the error MySQL server has gone away.
I've tried increasing max_allowed_packet to 64M, but nothing changed.
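(Note that the client and the server each enforce their own max_allowed_packet, so raising it on the client alone is not enough; the server-side value can also be raised at runtime, in bytes, for example:
SET GLOBAL max_allowed_packet = 1073741824;
which is 1G, matching the file size here; new connections then pick it up.)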
I have a problem with my SQL database.
My RAID failed, but I recovered the data from the drive, and now I have my old database back, though it is full of errors. I want to export it to an SQL file and import it onto the new RAID and drives.
But when I tried to dump it with this command:
root#LFCZ:/home# mysqldump -u root -password mc | gzip -9 > mc.sql.gz
It gives me this error:
mysqldump: Got error: 2013: Lost connection to MySQL server during query when using LOCK TABLES
Can you help with that? The only thing I need is to get an .sql file. It is a very big database (approx. 13 GB), but it is running on an OVH dedicated server, so the machine is powerful enough.
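A commonly suggested workaround for this particular LOCK TABLES failure (only a sketch, and it may still fail if the data itself is damaged) is to dump without table locks, using a consistent snapshot for InnoDB tables:
mysqldump -u root -p --single-transaction --skip-lock-tables mc | gzip -9 > mc.sql.gz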
I'm trying to import a large database dump (126 MB) with extended inserts into MAMP (localhost on my computer), and I get errors whatever I do.
First I tried to import it via the terminal with
/applications/MAMP/library/bin/mysql -u root -p databasename < /path/file.sql
but I got this error (line 44 is where the first INSERT INTO statement appears):
ERROR 2006 (HY000) at line 44: MySQL server has gone away
so I copied my-large.cnf into /applications/MAMP/conf/, renamed it to my.cnf, and set a new value for max_allowed_packet:
[mysqldump]
quick
max_allowed_packet = 32M
and also put the line skip-character-set-client-handshake after [mysqld]
# The MySQL server
[mysqld]
skip-character-set-client-handshake
saved the file, restarted the server, tried again to import the database from the command line, and still got the same error.
I also tried to import it with MY MAMP DUMP, but I got an error message after a few seconds: "An error occurred while processing SQL file."
I then tried BigDump, where I set $max_query_lines = 6000;, but the script doesn't even seem to run (yes, I put the file and the script in the same directory, and yes, the MySQL server is running).
I really don't know what else to do. What could be the problem?
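(A side note on the config shown above: the [mysqldump] section only applies to the mysqldump tool. An import through the mysql client is limited by the server's own setting, which belongs under [mysqld], for example:
[mysqld]
max_allowed_packet = 128M
followed by a server restart.)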
My MySQL server was working fine, but today I can't connect to it. I get the error "Can't connect to MySQL server on 'localhost' (10061)", and I found that the Mysql55 service on my Windows 7 x64 machine was stopped. I start the service, but when I open one specific database the server stops. With other databases I don't have any problem.
I try to dump the database to a .sql file and I get the error "mysqldump: Error 2013: Lost connection to MySQL server during query when dumping table table_name at row: 43795". Then the server stops. I tried the suggestions from "Error Code: 2013. Lost connection to MySQL server during query", but they did not work.
When I try the export at other times, I get the same error, but with a different row value.
It seems that while dumping that particular table, the amount of data being sent and received is too large.
Use --max_allowed_packet=500M and then run the dump. Also check the max_allowed_packet size you have specified in your my.ini config file.
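For example (the database name is a placeholder; a bare -p will prompt for the password):
mysqldump -u root -p --max_allowed_packet=500M databasename > dump.sql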
I am trying to import a new database from an SQL dump file that is over 2.6 GB in size. To do this, I am running the command below to import the database from the SQL file.
mysql -u root -proot --database=test_db < test_db.sql
and this is the error response I got:
ERROR 2006 (HY000) at line 51: MySQL server has gone away
I think the problem here is that a timeout is happening somewhere. Only one table has been created in the new database from the SQL file. Is there any way to overcome this issue?
Try changing the max_allowed_packet setting to a larger value on the server.
Open "my.ini/cnf", located in your MySQL installation folder, and under [mysqld] section change "max_allowed_packet = 64M" and don't forget to restart the server. You can check the value by executing:
SHOW VARIABLES LIKE 'max_allowed_packet';
Refer: http://dev.mysql.com/doc/refman/5.0/en/gone-away.html
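As a sketch, the lines in my.ini/my.cnf would look like this:
[mysqld]
max_allowed_packet = 64M
If a restart is not convenient, the same value can be set at runtime (in bytes; it applies to new connections and is lost on the next restart):
SET GLOBAL max_allowed_packet = 67108864;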