I'm trying to upload a 3.8 GB SQL dump into phpMyAdmin, in WAMP. I set the max file size settings, among other things, to 5 GB and even restarted the server multiple times. phpinfo() also reports 5 GB as the file size limit.
If uploading the entire thing isn't possible, I would like to break the file into smaller chunks so that I can upload it bit by bit, and I have looked for tools that do this. I tried one, http://www.rusiczki.net/2007/01/24/sql-dump-file-splitter/, but it gives me an error at 2 GB.
Could anyone suggest anything I could do to get my dump into MySQL?
Thanks a lot
Navigate to your MySQL bin directory (e.g. below):
D:\wamp\bin\mysql\mysql5.1.36\bin>
Then you can import any large database from your dump file with the following command (e.g. below):
mysql -u username -ppassword database_name < dump.sql
You should really skip phpMyAdmin and do it through the command line. The correct command is:
$ mysql -u username -p -h localhost DATA-BASE-NAME < data.sql
If you are still unsure about that one (although try it, you'll be surprised how easy it is), you can have a go at MySQL Workbench.
In my XAMPP, I could not start the MySQL service, so I reinstalled XAMPP. Before reinstalling, I kept a database backup from the path C:\xampp\mysql\data.
After reinstallation I pasted the data folder contents back into this path. Then I tried accessing phpMyAdmin: the database and table names are listed there, but there is no content in any table. When I click a database table, it shows an error that the table does not exist. Please suggest a solution for this issue. Thanks in advance.
Well, dumping DBs is pretty easy. I assume you are on localhost and your user name is root.
Make sure your MySQL server is up and running.
Open a command line and, to dump ALL your DBs, type:
mysqldump -u root -p --all-databases > dump.sql
Hit Enter. The system will ask you for your MySQL password; type it and hit Enter. If there's no password set for your server, just hit Enter. Your dump file will be saved in the current directory.
To restore the dump (e.g. after you reinstalled MySQL), run the command line again and make sure you are in the same directory where your dump.sql is. Type:
mysql -u root -p < dump.sql
It will ask you for a password. You know what to do. Depending on how much data you have, it can take quite some time. Hope it will help!
P.S. Dumping/restoring a single DB is a little bit different; let me know if you need it and I will update this post.
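For reference, the single-DB variant just names the database explicitly; a minimal sketch, with database_name as a placeholder (the database must already exist before restoring):
mysqldump -u root -p database_name > dump.sql
mysql -u root -p database_name < dump.sql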
I have a 5 GB database that needs to be uploaded to phpMyAdmin, on a shared server where I cannot access the shell. Is there any solution that takes less time to upload? Please help me with the steps to upload the SQL file. I have searched the internet but could not find an answer.
Do not use phpMyAdmin.
Assuming you have shell access, upload the file and feed it directly to the mysql command.
Your shell command will look like:
cat file.sql | mysql -uuser -ppassword database
or you can feed a gzipped file:
zcat file.sql.gz | mysql -uuser -ppassword database
Before doing this, check that (a quick sketch of these checks follows the list):
the database connection works (correct database, user and password)
the database is empty :)
the MySQL max packet size (max_allowed_packet) is large enough
you have enough disk space
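A minimal pre-flight sketch for those checks (user and mydb are placeholders):
# connection works, database exists, and the table list is empty
mysql -uuser -p -e "SHOW TABLES IN mydb;"
# server-side packet limit
mysql -uuser -p -e "SHOW VARIABLES LIKE 'max_allowed_packet';"
# free disk space
df -h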
* UPDATE *
You said you do not have shell access.
Then you have the following options:
upload the file and contact support; let them do it for you.
feed it remotely; cPanel has a special menu where you can enable remote access, and other panels have the same ability too.
in this case the command will be executed on your computer and look like:
cat file.sql | mysql -uroot -phipopodil -hwebsite.com
or, on Windows:
/path/to/mysql -uroot -phipopodil -hwebsite.com < file.sql
do some "hack" - feed it through crontab, at or via php system() command.
If you choose "hack" option, note following:
php have max_execution_time - even if you set it to zero, there could be some limit "imposed" from hosting.
usually hosts have limited mysql updates per hour.
there could be some ulimit restrictions.
if you execute feeding of 5 GB on shared server, server will slow down and administrator will check what you are doing.
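A minimal sketch of the at variant of that hack, assuming the host allows at jobs (user, password, database and the path are placeholders):
# schedule the import to run detached from your session
echo "mysql -uuser -ppassword database < /home/user/file.sql" | at now + 1 minute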
This depends on your database; you tagged it with three different database types: mysql, sql-server, and postgresql. I know MySQL and PostgreSQL have import features, and I'd be surprised if SQL Server didn't as well. You could import the database file via the command line instead of having to use phpMyAdmin.
Incidentally, the phpMyAdmin tool also has an import feature, but that again depends on the format of your database. If it's a compatible SQL file, you could upload it to phpMyAdmin and import it there, but I'd recommend the method I mentioned previously: upload it to your host, then use the appropriate database tool (mysqlimport for MySQL). Or, if it's the result of a pg_dump command, you can just run:
psql <dbname> < <yourfile>
i.e.
psql mydatabase < inputfile.sql
I'm transferring a complete database from an online server to a localhost server.
But not all records are transferring. Is there any way to transfer the complete data with the same rows?
I tried via Navicat, exporting and importing single tables, and importing/exporting .sql and gzip, but all the results are different.
My hosting is shared.
The software on localhost is XAMPP.
You can try mysqldump.
mysqldump -h hostname -u user -pPassWord --skip-triggers --single-transaction --complete-insert --extended-insert --quote-names --disable-keys dataBaseName > DUMP_dataBaseName.sql
then move your file DUMP_dataBaseName.sql to your localhost, and:
mysql -hHost -uUser -pPass -DBase < DUMP_dataBaseName.sql
The data is probably not missing. Your MySQL tables are likely InnoDB, and the row counts InnoDB reports in table overviews are only estimates; browse any table and count how many rows you actually see to get the exact result. :)
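If you want an exact count rather than the estimate, a quick sketch from the command line (mydb and mytable are placeholders):
mysql -u root -p -e "SELECT COUNT(*) FROM mydb.mytable;"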
One issue I have run into countless times when moving WordPress sites is special characters (specifically single quotes and double quotes). The database will export fine, but upon import it breaks at an "illegal" quote. My workflow now consists of exporting the database, running a find-and-replace on the SQL file to filter out the offending characters, and then importing.
Without knowing more about your specific situation, that's just something I would look into.
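As a sketch of that find-and-replace step, assuming the offenders are curly quotes in a UTF-8 dump (this simply strips them, which is destructive, but matches the filtering described above):
# remove curly single and double quotes in place
sed -i "s/[’‘“”]//g" dump.sql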
What is a way to import a 40 GB+ .sql file into a MySQL database that actually works?
I tried BigDump. I tried loading the .sql into Workbench and executing it.
None of the methods I have tried so far have worked.
I was wondering what is the best way, or even a way that works, of importing such a large .sql file into MySQL? Note that splitting the SQL file into chunks may not work, as some tables are larger than 5 GB.
I appreciate any suggestions.
I imported a big dump file like this: mysql -u username -p database_name < your_dump.sql
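On a dump this size it helps to watch progress; a hedged variant using pv, assuming pv is installed (mysql reads the password from the terminal, so the pipe still works):
pv your_dump.sql | mysql -u username -p database_name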
Step 1: Make changes to the configuration file my.cnf (located in the C:\ProgramData directory on Windows if running MySQL v5.6; note that this directory is hidden by default, so you must enable visibility of hidden files in Folder Options).
max_allowed_packet=2000M
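A sketch of the surrounding context, assuming a stock my.cnf layout (the value must sit under the [mysqld] section to affect the server):
[mysqld]
max_allowed_packet=2000M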
Step 2: Restart MySQL service
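On Windows this can be done from an elevated command prompt; the service name varies by install, so MySQL56 below is an assumption:
net stop MySQL56
net start MySQL56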
Step 3: Use the following import command:
mysql -hhostname -uusername -ppassword db_name < C:\sql_input_path\db.sql > C:\error_log_output_path\error.txt
I'm executing this command on a CentOS Linux server:
mysql -u root -p mydatabase < dump.sql
I enter the password. It seems that all is OK because I get absolutely no errors and no messages that anything happened. But sadly, the file is not imported!
I tried different ways:
- Putting the SQL file in another location.
- Creating the DB first, and then trying without the DB.
- Omitting the DB name.
- Adding max_allowed_packet=800M in /etc/my.cnf (because the file is 490 MB).
- Restarting MySQL.
But nothing works. No errors, no import. I'm stuck and in a panic. What should I do?
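(A quick way to check whether anything was imported at all: run the import verbosely, then list the tables. mydatabase is the database name used above.)
mysql -u root -p -v mydatabase < dump.sql
mysql -u root -p -e "SHOW TABLES IN mydatabase;"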
If you have already created the database, use the steps below to import data from the .sql file (these commands run inside the mysql client):
Select the database to use:
use DataBaseName;
Give the source file path:
source /path/to/dump.sql;
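Put together, a minimal session sketch (DataBaseName and the path are placeholders):
mysql -u root -p
mysql> use DataBaseName;
mysql> source /path/to/dump.sql;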
Hope this will help