I download a backup of my database from my hosting every week. It generates a .sql file that is currently about 800 MB and contains 44 tables.
Is there any way, with some software, to split the .sql file so that each table is exported individually?
That way, if I ever had to restore the backup, I could do it table by table instead of restoring the entire database.
I wouldn't split it afterwards. If you have SSH access to the server yourself, you could (and in my eyes should) do something like this:
for table in `mysql -u [USER] -p[PASSWORD] -N -B -e 'show tables from [DATABASE]'`;
do
mysqldump --skip-comments --compact -u [USER] -p[PASSWORD] [DATABASE] $table > $table.sql \
&& tar -czvf $table.tar.gz $table.sql \
&& rm $table.sql
done
That should generate one archive per table. Then just tar/gzip whatever directory you put the files in and you have your backup in the way you want it.
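For the restore side, a minimal sketch (same [USER]/[PASSWORD]/[DATABASE] placeholders as above, with a hypothetical table name my_table; note that --compact omits the DROP TABLE statement, so drop the table first if it still exists):
tar -xzf my_table.tar.gz
mysql -u [USER] -p[PASSWORD] [DATABASE] < my_table.sql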
I have a mysqldump of all my databases from all projects created like this:
mysqldump -u username -h localhost --all-databases | gzip -9 > alldb.sql.gz
I now want to copy a specific database of a specific project (project1) out of this alldb.sql file (which I have already gunzipped). I need to copy the project1_production database into the anotherproject_development database of an application I'm currently developing.
What is the easiest way to copy the project1_production database from the alldb.sql file into the anotherproject_development database (overwriting entries that already exist)?
mysql -D mydatabase -o < dump.sql
(-o is short for --one-database: while the dump is replayed, statements are ignored except those for the database named with -D.)
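That only restores into a database with the same name, though. To get project1_production's contents into anotherproject_development, here's one sketch; it assumes the dump carries mysqldump's usual "-- Current Database:" comment markers, and it strips the CREATE DATABASE/USE statements so the data lands in the database named on the command line:
sed -n '/^-- Current Database: `project1_production`/,/^-- Current Database:/p' alldb.sql | grep -v '^CREATE DATABASE' | grep -v '^USE ' > project1.sql
mysql -u username -p anotherproject_development < project1.sql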
I have a large SQL file (500 MB) and want to split it into chunks.
I used the shell command split, but it isn't context-aware: it doesn't split before a given pattern (e.g. INSERT), so it breaks SQL statements in half.
The aim is to have two 250 MB files that both still contain only valid SQL commands. Is this possible?
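One sketch, assuming mysqldump's default output where each INSERT statement sits on a single line: have awk copy lines into numbered chunk files and start a new chunk only when the size threshold has been passed and the current line begins a fresh INSERT:
awk -v max=$((250 * 1024 * 1024)) '
BEGIN { part = 1; out = sprintf("chunk_%03d.sql", part) }
/^INSERT/ && written > max { close(out); part++; written = 0; out = sprintf("chunk_%03d.sql", part) }
{ print > out; written += length($0) + 1 }
' dump.sql
Keep in mind that only the first chunk carries the CREATE TABLE headers, so the chunks have to be restored in order.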
Use:
mysqldump -u admin -p database1 > /backup/db/database1.sql
or
mysqldump -u admin -p --all-databases > /backup/db/all_databases.sql
If you have only MyISAM tables you can use:
mysqlhotcopy -u admin -p password123 database1 /backup
for faster backups. mysqlhotcopy doesn't generate SQL; it copies the database's files directly.
For recovery of mysqldumped databases use:
mysql -u admin -p database1 < database.sql
or
mysql -u admin -p < all_databases.sql
For mysqlhotcopy:
To restore a mysqlhotcopy backup, simply copy the files from the backup directory back into the /var/lib/mysql/{db-name} directory. Just to be on the safe side, make sure to stop the MySQL server before you restore (copy) the files, and start it again once they are in place.
See here: http://www.thegeekstuff.com/2008/07/backup-and-restore-mysql-database-using-mysqlhotcopy/
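A sketch of that restore for the database1 example above (the paths and service name are assumptions for a stock Linux install; adjust for your system):
service mysql stop
cp -R /backup/database1 /var/lib/mysql/
chown -R mysql:mysql /var/lib/mysql/database1
service mysql start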
The standard mysqldump command that I use is
mysqldump --opt --databases $dbname --host=$dbhost --user=$dbuser --password=$dbpass | gzip > $filename
To dump multiple databases
mysqldump --opt --databases $dbname1 $dbname2 $dbname3 $dbname_etc --host=$dbhost --user=$dbuser --password=$dbpass | gzip > $filename
My question is how do you dump multiple databases from different MySQL accounts into just one file?
UPDATE: When I say one file, I mean one gzipped file with the different SQL dumps for the different sites inside it.
Nobody seems to have clarified this, so I'm going to give my 2 cents.
Going to note here that my experience is with Bash and may be exclusive to it, so variables and looping might work differently in your environment.
The best way to get an archive with separate files inside it is to use either ZIP or TAR; I prefer tar due to its simplicity and availability.
Tar itself doesn't do compression, but bundled with bzip2 or gzip it can provide excellent results. Since your example uses gzip I'll use that in my demonstration.
First, let's attack the problem of the MySQL dumps: the mysqldump command does not write separate files per database (to my knowledge, anyway). So let's make a small workaround that creates one file per database.
mysql -s -r -p$dbpass --user=$dbuser -e 'show databases' | while read db; do mysqldump -p$dbpass --user=$dbuser $db > ${db}.sql; done
So now we have a one-liner that exports each database to its own file; to change where the files are written, simply edit the part after the > symbol.
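One caveat (my addition, not part of the original): show databases also lists the system schemas, which you usually don't want to dump and which mysqldump may refuse to dump without extra options. A refinement that filters them out:
mysql -s -r -p$dbpass --user=$dbuser -e 'show databases' | grep -Ev '^(information_schema|performance_schema|mysql|sys)$' | while read db; do mysqldump -p$dbpass --user=$dbuser $db > ${db}.sql; done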
Next, let's look at the syntax for tar:
tar -czf <output-file> <input-file-1> <input-file-2>
This form lets us specify any number of files to archive.
The options break down as follows (a bzip2 example follows the list).
c - create an archive
z - compress with gzip
f - write the archive to the given file
j - compress with bzip2 (use instead of z)
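For example, the bzip2 variant of the command above would be (db1.sql and db2.sql standing in for whatever dump files you created):
tar -cjf databases.tar.bz2 db1.sql db2.sql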
Our next problem is keeping a list of all the newly created files. We'll expand the while statement to append each file name to a variable as it runs through the databases found in MySQL. One Bash gotcha: piping into while runs the loop in a subshell, so the variable would be lost when the loop ends; reading from process substitution instead keeps it alive.
DBLIST=""; mysql -s -r -p$dbpass --user=$dbuser -e 'show databases' | while read db; do mysqldump p$dbpass --user=$dbuser $db > ${db}.sql; DBLIST="$DBLIST $DB"; done
Now we have a DBLIST variable holding the names of all the files that were created, so we can extend the one-liner to run the tar command once everything has been dumped ($DBLIST is deliberately left unquoted so it expands into separate file names).
DBLIST=""; mysql -s -r -p$dbpass --user=$dbuser -e 'show databases' | while read db; do mysqldump p$dbpass --user=$dbuser $db > ${db}.sql; DBLIST="$DBLIST $DB"; done && tar -czf $filename "$DBLIST"
This is a very rough approach and doesn't let you pick databases by hand, so to achieve that, the following command will create a TAR file containing exactly the databases you specify.
DBLIST=""; for db in "<database1-name> <database2-name>"; do mysqldump -p$dbpass --user=$dbuser $db > ${db}.sql; DBLIST="$DBLIST $DB.sql"; done && tar -czf $filename "$DBLIST"
The trick of looping through the databases that MySQL itself reports comes from the stackoverflow.com question "mysqldump with db in a separate file", simply modified to fit your needs.
And to have the script clean up after itself in the same one-liner, simply add the following at the end of the command:
&& rm $DBLIST
making the command look like this:
DBLIST=""; for db in <database1-name> <database2-name>; do mysqldump -p$dbpass --user=$dbuser $db > ${db}.sql; DBLIST="$DBLIST ${db}.sql"; done && tar -czf $filename $DBLIST && rm $DBLIST
For every MySQL server account, dump the databases into separate files
Then concatenate the dump files and gzip the result:
cat dump_user1.sql dump_user2.sql | gzip > super_dump.gz
There is a similar post on Superuser.com website: https://superuser.com/questions/228878/how-can-i-concatenate-two-files-in-unix
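To load such a combined dump back in later, a sketch (assuming the individual dumps contain the CREATE DATABASE/USE statements, as --databases dumps do):
gunzip -c super_dump.gz | mysql -u username -p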
Just in case "multiple DBs" literally means "all DBs" for you:
mysqldump -u root -p --all-databases > all.sql
I am attempting to restore my MySQL database for my website. All of the tables in my database were dumped into individual .sql files, so I am trying to figure out how I can restore all of them over SSH with a single (or simple) command instead of restoring all 100 tables individually.
cat *.sql > data.sql
mysql -u <username> -p < data.sql
It depends on how you created the individual files. If they contain all the instructions for recreating the tables (i.e., "DROP TABLE IF EXISTS ...", "CREATE TABLE ...", and "INSERT INTO ..."), then you can either concatenate them into mysql:
cat *.sql | mysql -u xxx -pxxx dbname
or write a script to do it:
#!/bin/sh
mysql -u xxx -pxxx dbname < file001.sql
mysql -u xxx -pxxx dbname < file002.sql
The second choice lets you more easily control the order of files processed.
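Ordering matters mainly when tables reference each other through foreign keys, since a child table can't be loaded before its parent exists. A shortcut worth knowing (my addition, not part of the original answer) is to disable the checks for the duration of the load:
(echo "SET FOREIGN_KEY_CHECKS=0;"; cat *.sql) | mysql -u xxx -pxxx dbname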
Finally, you might want to create your backups in a more convenient way - check out mysqldump for how to dump a database (or several!) into one file (basically, "mysqldump -u xxx -pxxx dbname > dbname.sql", but there are some helpful flags you might want to add).
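For instance (my suggestion of commonly used flags, not something spelled out above), --single-transaction gives a consistent snapshot of InnoDB tables, and --routines/--triggers make sure stored routines and triggers end up in the dump:
mysqldump -u xxx -pxxx --single-transaction --routines --triggers dbname > dbname.sql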
Yes, it's Windows, sorry.
I'm using mysqldump with the -T option, which creates a .sql and a .txt file per table.
mysqldump -u user -ppass db -T path
I use that option to be able to easily restore a single table.
Now I'd like to restore all the tables.
mysql -u user -ppass db < path/*.sql
Obviously doesn't work.
Also, I don't know where my functions/procedures end up.
You could use a FOR loop with the file wildcard (*.sql) to process each one, like this:
FOR /R %F in (*.sql) DO (
mysql -u user -ppass database < "%F"
)
(Note that if you're running this from a batch file, the variable should be shown as %%F instead of just %F.)
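Two caveats worth adding (my additions, not covered above): with -T, each .sql file holds only the CREATE TABLE statement; the row data lives in the .txt files, which need a second pass with mysqlimport. It derives the table name from each file name, and you may need --local depending on where the files sit relative to the server:
FOR /R %F in (*.txt) DO (
mysqlimport -u user -ppass database "%F"
)
And -T doesn't write out functions/procedures at all; the usual workaround is a separate routines-only dump, e.g. "mysqldump -u user -ppass --routines --no-create-info --no-data --no-create-db --skip-opt db > db_routines.sql".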