We need to run a test import before doing anything in our live environment, but we need to know how long the import takes.
The file is 10 GB at max compression; uncompressed it is around 170 GB. In past test runs it has taken around 9 hours, which is too long to sit in front of a PC and watch :-)
Is there a way to log how long the import takes / when it finished?
We are running this command to import the script:
zcat /import_file.sql.gz | mysql -u 'root' -p database
If we were able to log a time to a file once the script is completed that would be perfect.
I am aware of "show table status;", but we don't know what the last table is called, so that doesn't really help.
Thanks in advance.
I found the answer: you can use the time command and log its output to a file.
Here is the command: { time zcat /import_file.sql.gz | mysql -u 'root' -p database ; } 2> time.txt
The 2> will store the output in the time.txt file (time writes its report to stderr). I hope this helps anyone with this issue.
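If you would rather log wall-clock timestamps than a duration, a minimal alternative sketch (import.log is just an example file name) is to bracket the import with date:
date '+start: %F %T' >> import.log
zcat /import_file.sql.gz | mysql -u 'root' -p database
date '+finish: %F %T' >> import.log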
I used this command to get the backup data: mysqldump -u root -p etl_db > ~/backup.sql
Now I want to import into a new remote MySQL database.
I saw there are 2 ways to do it.
https://dev.mysql.com/doc/refman/8.0/en/mysql-batch-commands.html
I wonder which one would be faster?
shell> mysql < dump.sql
or
mysql> source dump.sql
I saw some people say that source is for small data sets, while others say it's good for large data. I couldn't find much documentation on this.
Thanks!
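For what it's worth, since the target database is remote, either form needs the host and credentials passed to the mysql client; a sketch, assuming a hypothetical host db.example.com and that the remote schema is also named etl_db:
shell> mysql -h db.example.com -u root -p etl_db < ~/backup.sql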
I am running a web application in a shared hosting environment which uses a MySQL database that is about 3 GB in size.
For testing purposes I have set up a XAMPP environment on my local macOS machine. To copy the online DB to my local machine I used mysqldump on the server and then directly imported the dump file into mysql:
// Server
$ mysqldump -alv -h127.0.0.3 --default-character-set=utf8 -u dbUser -p'dbPass' --extended-insert dbName > dbDump.sql
// Local machine
$ mysql -h 127.0.0.3 -u dbUser -p'dbPass' dbName < dbDump.sql
The only optimization here is the use of --extended-insert. However, the import takes about 10 hours!
After some searching I found that disabling foreign key constraint checks during the import should speed up the process, so I added the following line at the beginning of the dump file:
// dbDump.sql
SET FOREIGN_KEY_CHECKS=0;
...
However, this did not make any significant difference... The import now took about 8 hours. Faster, but still pretty long.
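For reference, the fuller set of session-level toggles that is often suggested wraps the whole dump like this (a sketch; I have only tried the FOREIGN_KEY_CHECKS line and have not benchmarked the rest):
// dbDump.sql
SET autocommit=0;
SET unique_checks=0;
SET FOREIGN_KEY_CHECKS=0;
...
COMMIT;
SET unique_checks=1;
SET FOREIGN_KEY_CHECKS=1;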
Why does it take so much time to import the data? Is there a better/faster way to do this?
The server is not the fastest (shared hosting...), but it takes just about 2 minutes to export/dump the data. That exporting is faster than importing (no syntax checks, no parsing, just writing...) is not surprising, but 300 times faster (10 hours vs. 2 minutes)? That is a huge difference...
Isn't there any other solution that would be faster? Copying the binary DB files instead, for example? Anything would be better than using a text file as the transfer medium.
This is not just about transferring the data to another machine for testing purposes. I also create daily backups of the database, and if it were ever necessary to restore the DB, it would be pretty bad if the site were down for 10 hours...
I am using XAMPP, but I needed a new PHP version, so I had to install the newest XAMPP available. As I have many websites, it is very hard to export every single database and then import it again. That's why I exported all the databases into one single SQL file, which is around 8 GB.
My question is: how do I import that big file from the command line when it includes all my databases?
Thanks in advance!
C:\xampp\mysql\bin\mysql -u {username} -p {databasename} < file_name.sql
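If the file was created with --all-databases, it already contains the CREATE DATABASE and USE statements for each schema, so the database argument can be dropped (a sketch):
C:\xampp\mysql\bin\mysql -u {username} -p < file_name.sql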
I use a mysql import script like this:
sudo mysql -u root -p < /var/tmp/db.sql
I see that my data is being imported, but the console is frozen.
root#****:/var/tmp# sudo mysql -u root -p < /var/tmp/db.sql
Enter password:
I have to press Ctrl+C to get the console back. At first I thought the import just needed that much time, but I can wait for hours and the console doesn't come back.
I am on Ubuntu and the MySQL dump file is around 1 GB.
Do you have any idea why the script freezes the console?
Thanks for helping!
Importing 1 GB took hours? That's not normal. You need to find out what the process is spending its time on.
Try this:
$ ps -ef|grep [m]ysql
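(The brackets in [m]ysql keep the grep process itself out of the results.)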
Identify the process ID, then:
$ strace -cp <pid>
Leave it for 10 seconds or a minute, then press ^C. That will tell you where the process is spending its time; for example, it could just be waiting for the disk if you see read and write dominate.
I think I found the answer on my own.
Setting this in /etc/mysql/my.cnf:
[mysqld]
init_connect='SET autocommit=0'
Now the import takes about 2 minutes.
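A per-session alternative (a sketch) that avoids a server-wide config change is to disable autocommit only for the import:
mysql> SET autocommit=0;
mysql> SOURCE /var/tmp/db.sql;
mysql> COMMIT;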
I have a problem with a shell script:
I am trying to inject data from a source file containing MySQL queries into a database. Here are the relevant lines of my shell script:
mysql -u root
source /usr/local/insert.sql;
quit;
For example, I run the file as ./insertfile and it runs smoothly, but when it comes to the data insertion it only logs into MySQL via the mysql -u root command; the remaining operations (source /usr/local/insert.sql; and quit;) are not executed. When I quit MySQL manually, the shell then tries to execute the remaining lines of the script itself.
So please help me with the right shell script so that I can insert the queries from the source file.
One way to do that would be
mysql -u root --execute="source /usr/local/insert.sql; quit;"
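Another common pattern, for reference, is a here-document that feeds the statements to mysql's standard input (a sketch using the same file path):
mysql -u root <<'EOF'
source /usr/local/insert.sql;
EOF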
It seems that your import hangs!
Check for locks on your database:
show processlist;
Run FLUSH TABLES to release any possible locks and then run your import command.
If the source command hangs again:
Enter your MySQL server and run:
drop database insert;
create database insert;
Then exit the MySQL server and run:
mysql -u root -p database-name < dump.sql
Thanks for your help. I tried adding your line to my script; it initially gave some errors, so I changed the command as below:
mysql -u root --execute="source /usr/local/insert.sql; \q"
The above line helped me execute my command.
Thanks to all for being so helpful.
Regards,
Shah9il