Error importing a large SQL database in MAMP - mysql

I'm trying to import a large database dump (126 MB, with extended inserts) into MAMP (localhost on my computer), and I get errors whatever I do.
First I tried to import it from the terminal with
/applications/MAMP/library/bin/mysql -u root -p databasename < /path/file.sql
but I got this error (line 44 is where the first INSERT INTO appears):
ERROR 2006 (HY000) at line 44: MySQL server has gone away
so I copied my-large.cnf into /applications/MAMP/conf/, renamed it to my.cnf, and set a new value for max_allowed_packet:
[mysqldump]
quick
max_allowed_packet = 32M
and also added the line skip-character-set-client-handshake under [mysqld]:
# The MySQL server
[mysqld]
skip-character-set-client-handshake
I saved the file, restarted the server, tried the import again from the command line, and still got the same error.
I also tried to import it with MY MAMP DUMP, but I get an error message after a few seconds: An error occurred while processing SQL file.
I then tried with BigDump, where I set $max_query_lines = 6000;, but the script doesn't even seem to run (yes, I put the file and the script in the same directory, and yes, the MySQL server is running).
I really don't know what else to do. What could be the problem?
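For what it's worth, max_allowed_packet under [mysqldump] only applies to exports made with the mysqldump tool; to raise the limit for the server that receives the import, the setting has to go under [mysqld]. A minimal my.cnf sketch (the 256M value is an arbitrary example, pick whatever comfortably exceeds your longest INSERT line):

```ini
# my.cnf -- the [mysqld] section configures the server itself,
# which is what matters when importing with "mysql ... < file.sql"
[mysqld]
max_allowed_packet = 256M

# the [mysqldump] section only affects the mysqldump export tool
[mysqldump]
quick
max_allowed_packet = 256M
```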

Related

MySQL server has gone away upon changes

I'm trying to import a ~1 GB .sql file which contains INSERTs. I'm using the following command to do so: mysql -u root -p database < database.sql, but I keep receiving the error MySQL server has gone away.
I've tried increasing max_allowed_packet to 64M, but nothing changed.

MySQL gone away error running mysql command large file

I want to load a large .sql file (1.5 GB), but every time it gets stuck at 623.9 MB with the error: MySQL server has gone away.
Running the following command: mysql -u {DB-USER-NAME} -p {DB-NAME} < {db.file.sql path}
I've already changed my my.cnf file with the following values, but that did not help:
wait_timeout = 3600
max_allowed_packet = 100M
innodb_lock_wait_timeout = 2000
What am I missing here?
Refer to this link -
MySQL Server has gone away when importing large sql file -
for various combinations that have worked for this problem.
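One way to estimate how large max_allowed_packet actually needs to be is to measure the longest line in the dump, since an extended INSERT is written as a single long line and must fit through in one piece. A rough sketch (database.sql is a placeholder filename):

```shell
# Print the length of the longest line in the dump;
# max_allowed_packet must be at least this large for the import to succeed.
awk '{ if (length($0) > max) max = length($0) } END { print max }' database.sql
```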

phpMyAdmin #2006 error: finding the proper my.cnf configuration file

I'm working on a CentOS server and I'm trying to import a database from another server. While importing from a file in phpMyAdmin, I'm receiving the #2006 error 'MySQL server has gone away'.
From the information I found, I need to change max_allowed_packet, but I had to add that line to /etc/my.cnf myself because the file didn't have any variables in it. I can't find another my.cnf (I tried find / -name my.cnf, which returned only this file), yet when I run SHOW VARIABLES in phpMyAdmin I get plenty of variables, with max_allowed_packet at 1 MB.
I located my.cnf at /etc/my.cnf and /etc/my.cnf.d/server.cnf; neither had a max_allowed_packet line, so I added one, restarted with service mariadb restart, and the value was still 1 MB.
What can I do?
The solution that worked for me was to log in to MySQL over SSH and run SET GLOBAL max_allowed_packet = 3344558823;, then restart the database with service mariadb restart.
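For reference, the same change expressed as SQL. Note that a value set with SET GLOBAL lasts only until the server restarts, so restarting afterwards would revert it unless the value is also written into my.cnf (the 256 MB figure below is an arbitrary example):

```sql
-- Requires a privileged account; affects new connections only.
SET GLOBAL max_allowed_packet = 268435456;  -- 256 MB
SHOW VARIABLES LIKE 'max_allowed_packet';
```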

MySQL server has gone away exception while importing a huge database

I am trying to import a new database from an SQL dump file over 2.6 GB in size. To do this I am running the command below:
mysql -u root -proot --database=test_db < test_db.sql
and this is the error I got:
ERROR 2006 (HY000) at line 51: MySQL server has gone away
I think the problem here is that a timeout is happening somewhere. Only one table has been created in the new database from the SQL file. Is there any way to overcome this issue?
Try changing the max_allowed_packet setting to a larger value on the server.
Open my.ini (or my.cnf), located in your MySQL installation folder, and under the [mysqld] section set max_allowed_packet = 64M; don't forget to restart the server afterwards. You can check the value by executing:
SHOW VARIABLES LIKE 'max_allowed_packet';
Refer: http://dev.mysql.com/doc/refman/5.0/en/gone-away.html

Importing a large SQL file

I've seen a lot of questions regarding this topic, but going through them didn't help, so I'm asking.
I have a large SQL file (around 50 MB) which I cannot import using phpMyAdmin, as it's limited to 2.5 MB.
I've tried to use BigDump and I'm getting an error that says I'm using "extended inserts or very long procedure definitions".
I've also tried using the source command from the console, which gave me an error saying the defined max_allowed_packet is too low. After changing it to 128M (it was 16M before), I'm getting another issue: during the source command I lose the connection to the (locally hosted) DB server:
ERROR 2013 (HY000): Lost connection to MySQL server during query
ERROR 2006 (HY000): MySQL server has gone away
No connection. Trying to reconnect...
ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (111)
ERROR:
Can't connect to the server
The solution I found to be working is to further increase the max_allowed_packet to 512M.
Then the following will work:
mysql -u username -ppassword databasename < file.sql
Does this work from the console?
mysql -u username -ppassword databasename < file.sql
(Yes, there is no space between the -p and password)
In my case the problem ("Lost connection to MySQL server during query") was a corrupted dump file caused by misbehaving HDDs:
First, I made a dump on the main server and then copied it to the replication server. But it seems the replication server had some problems with its HDDs and the dump became corrupted, i.e. the MD5 of the original dump file on the main server differed from the MD5 of the copy on the replication server.
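That checksum comparison can be done with standard tools. A sketch (test_db.sql is a placeholder filename; md5sum assumes Linux, on macOS use md5 instead):

```shell
# On the source server: record a checksum of the dump before copying it.
md5sum test_db.sql > test_db.sql.md5

# On the destination, after copying both files: verify the copy is intact.
md5sum -c test_db.sql.md5
```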
Are you exporting or importing? If you are importing, try CSV format.
You can export to CSV and then import that. If it still does not import because the CSV is too large, you can split the CSV yourself into smaller files.
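Splitting a CSV line by line is safe because each row is one line (unlike an extended-insert SQL dump, where cutting mid-statement would corrupt it). A sketch using the standard split utility; the filename and chunk size are arbitrary:

```shell
# Break data.csv into pieces of 100000 lines each, named chunk_aa, chunk_ab, ...
# (if the CSV has a header row, note that only the first chunk will contain it)
split -l 100000 data.csv chunk_
```

Each chunk can then be imported separately, e.g. through phpMyAdmin.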