Importing 40GB+ .sql file - mysql

What is a way to import a 40GB+ .sql file into a MySQL database that actually works?
I tried BigDump. I tried loading the .sql into Workbench and executing it.
None of the methods I have tried so far has worked.
What is the best way, or even just a way that works, to import such a large .sql file into MySQL? Note that splitting the file into chunks may not work, as some individual tables are larger than 5GB.
I appreciate any suggestions.

I imported a big dump file like this: mysql -u username -p database_name < your_dump.sql
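For a file this size, it can also help to turn off some integrity checks for the duration of the load; a sketch, assuming the dump is self-consistent so the checks can safely be deferred:
mysql -u username -p database_name -e "SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0; SOURCE your_dump.sql; COMMIT;"
The source command runs the dump inside a single session, so the relaxed settings apply to the whole import and everything is committed at the end.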

Step 1: Make changes to the configuration file my.cnf (located in the C:\ProgramData directory on Windows if running MySQL 5.6; note that this directory is hidden by default, so you must enable visibility of hidden files in Folder Options).
max_allowed_packet=2000M
Step 2: Restart the MySQL service.
Step 3: Use the following import command:
mysql -hhostname -uusername -ppassword db_name
< C:\sql_input_path\db.sql 2> C:\error_log_output_path\error.txt
(Note that 2> captures error output; a plain > would capture standard output instead.)
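For Step 1, the max_allowed_packet setting belongs under the [mysqld] section of my.cnf; a minimal fragment (your file will contain other entries as well):
[mysqld]
max_allowed_packet=2000M
The mysql client has its own copy of this limit, so if the import still fails you can also pass --max_allowed_packet=2000M to the client on the command line.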

Related

mysql 5.7.19 dumping data does not work (input)

I am trying to import a data dump into MySQL 5.7.19.
I run a command like the one below:
mysql -uroot -p temp < temp201708.sql
My my.cnf sends errors to /var/log/mysqld.log, but I do not see anything logged while importing.
However, when the job is done, I cannot find any tables or data in the temp schema. I think I have seen articles saying that differing MySQL versions can be a problem. Is that right?
Currently I do not know what is wrong, since there is no error log. What should I look for to solve this problem?
Thanks.
FYI, I do not know which version of MySQL produced the dump file; I just received it from the client.
mysqldump -u root -p DatabaseName > /Path-To-put-The-File/fileName.sql
When doing a dump in MySQL:
Run cmd.
Change to the directory where your MySQL server is installed, e.g.:
cd C:\Program Files\MySQL\MySQL Server 5.6\bin
Then enter:
mysqldump -uUsernameHere -pYourPasswordHere dbnamehere > "D:\sample.sql" (make sure you have a D: drive on your computer, or change the path).
This works and has been tested.
You can use it by calling a .bat file to execute the command.
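A minimal .bat along those lines might look like this (a sketch reusing the example paths and credentials from above; adjust them to your setup):
@echo off
cd /d "C:\Program Files\MySQL\MySQL Server 5.6\bin"
mysqldump -uUsernameHere -pYourPasswordHere dbnamehere > "D:\sample.sql"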
Otherwise, if you want to load data into your database, use mysql (not mysqldump) and change > to <,
like this:
mysql -uUsernameHere -pYourPasswordHere dbnamehere < "D:\sample.sql"
Thanks for the answers.
I found the cause.
In the dump file there are statements like USE mysql; and USE temp;.
So even though I indicated on the command line which schema to import into,
the import ignored the command line and followed the script.
I found all of my data in the mysql schema.
==========================
Conclusion
If you dump with a command like the one below:
$ mysqldump -u[user] -p[passwd] [scheme] > dump.sql
then it is fine to import with:
$ mysql -u[user] -p[passwd] [target_scheme] < dump.sql
If not, you should check the contents of the .sql file.
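A quick way to make that check, assuming your dump is named dump.sql, is to search it for schema-switching statements:
grep -n '^USE ' dump.sql
grep -n '^CREATE DATABASE' dump.sql
If the first command prints matches, the dump will switch schemas itself and override whatever target you name on the command line.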

Can i import a .sql file from wamp to xampp?

I have a .sql file that was exported from WAMP; my friend only uses XAMPP. Is it possible to import my .sql file into XAMPP?
Yes, because the .sql file is most likely independent of the server stack.
Of course it is possible. When WAMP gives you a .sql export, the file can be used with any standard web server stack, including XAMPP.
I think this may help.
Depending on the tool and parameters used to create the .sql dump file of multiple databases, the file will normally have
CREATE DATABASE DBn...;
and
USE DBn;
statements that will allow your import to proceed without hiccups. For example, both the mysqldump command and phpMyAdmin's Export function for multiple databases insert them.
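With mysqldump specifically, it is the --databases option that makes those statements appear; a sketch, with DB1 and DB2 standing in for your database names:
mysqldump -u username -p --databases DB1 DB2 > dumpfile.sql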
Assuming you have exported with a sensible tool like those above, you can import the database with a command line like this:
mysql -u username -p < dumpfile.sql
Your mysql username account needs to have appropriate privileges to, for example, create databases.
You can even run this command in a more familiar way by naming the startingDB database to use before running the commands in dumpfile.sql:
mysql -u username -p startingDB < dumpfile.sql

Uploading Big MySql Files in Phpmyadmin

I'm trying to upload a 3.8 GB SQL dump into phpMyAdmin under WAMP. I set the maximum file sizes, among other things, to 5 GB and even restarted the server multiple times. phpinfo() also reports 5 GB as the file size limit.
I would like to break the file into smaller chunks so that I can upload it bit by bit, if the whole thing isn't possible, and have looked for such tools. I used one, http://www.rusiczki.net/2007/01/24/sql-dump-file-splitter/, but it gives me an error at 2 GB.
Could anyone suggest anything I could do to get my dump into MySQL?
Thanks a lot.
Go to your MySQL bin directory (e.g. below):
D:\wamp\bin\mysql\mysql5.1.36\bin>
Then you can import any large database from your dump file with the following command (e.g. below):
mysql -u username -ppassword database_name < dump.sql
You should really skip phpMyAdmin and do it through the command line. The correct command is:
$ mysql -u username -p -h localhost DATA-BASE-NAME < data.sql
If you are still unsure about that one (although try it, you'll be surprised how easy it is), you can have a go at MySQL Workbench.
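If the lack of feedback during a multi-gigabyte import bothers you, one option, assuming the pv utility is installed, is to pipe the file through it to get a progress bar:
pv data.sql | mysql -u username -p -h localhost DATA-BASE-NAME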

Mysql command returns no errors and doesn't import the sql file. Why?

I'm executing this command on a CentOS Linux server:
mysql -u root -p mydatabase < dump.sql
I enter the password. It seems that all is OK because I get absolutely no errors and no messages that anything happened. But sadly, the file is not imported!
I tried it in different ways:
- Putting the .sql file in another location.
- Creating the DB first, and then without creating the DB.
- Omitting the db name.
- Adding max_allowed_packet=800M in /etc/my.cnf (because the file is 490 MB).
- Restarting MySQL.
But nothing works. No errors, no import. I'm stuck and in panic. What should I do?
If you have already created the database, then use the steps below to import data from the .sql file.
Select the database to use:
use DataBaseName;
Give the source file path:
source /path/to/dump.sql;
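Put together, a session might look like this (DataBaseName and the path are the placeholders from the steps above):
mysql -u root -p
mysql> use DataBaseName;
mysql> source /path/to/dump.sql;
One advantage of source over the shell redirect is that the client echoes the result of each statement, so errors become visible as they happen.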
Hope this will help

How to import a mysqldump file locally

I want to import an SQL dump file into my database "dbname", into the table "data", without using the network interface.
When I import the file via
mysql dbname -u databaseuser -pdatabasepass < data.sql
it is really slow on my Ubuntu 12.04 machine,
but when I use this instead:
mysqlimport -u databaseuser -pdatabasepass --local data.sql
it is as fast as normal.
I think that is because, when using "mysql" and "<", each line is parsed separately on my machine.
Is there a way to use the mysqlimport syntax for importing an .sql file with CREATE and DROP TABLE statements?
Maybe the upper part of the SQL dump (with the DROP TABLE and CREATE TABLE statements) could be split off automatically and only the rest of the file, the INSERT statements, sent to mysqlimport; that should work.
There is no way to do that directly. What you need to do is run mysqldump --tab instead of the normal mysqldump; that will create both a .sql file containing the table definition and a tab-separated text file containing the data, which can be imported with mysqlimport.
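A sketch of that round trip, assuming the server is allowed to write to /tmp/dump (secure_file_priv permitting) and reusing the table "data" from the question:
mysqldump -u databaseuser -p --tab=/tmp/dump dbname
mysql -u databaseuser -p dbname < /tmp/dump/data.sql
mysqlimport -u databaseuser -p --local dbname /tmp/dump/data.txt
mysqlimport derives the target table from the file name, so data.txt is loaded into the table "data".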
You can specify the path to a UNIX socket file with --socket=/path/to/... (or -S), which may be faster. You can find the socket path in the server's my.cnf. So, for example (and I'm just making up the path here):
mysql dbname -u databaseuser -pdatabasepass -S /var/run/mysqld/mysqld.sock < data.sql
You can use the script from How do I split the output from mysqldump into smaller files?
and modify it so that you have only the CREATE statements in one file and the data statements in another.
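A rough one-liner in that spirit (a sketch, assuming mysqldump's usual layout where every INSERT begins at the start of a line):
awk '/^INSERT/ { print > "data.sql"; next } { print > "schema.sql" }' dump.sql
Afterwards schema.sql holds the DROP/CREATE statements and data.sql holds the inserts.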