Is there a simple way to do an automated backup of an entire website on a host like GoDaddy via the command-line?
So far, I know I need to back up all the files in my home directory recursively. I could possibly automate SFTP to connect and issue a get -R * command to get the full file dump, or just use SCP.
The other half of the puzzle is getting all of the tables, mostly WordPress tables. My guess is that there's a command I could issue which dumps the database contents to a flat file, which I could then also pull via SFTP. If such a command exists, my plan is to use a combination of Telnet and expect scripts to log in to the GoDaddy site, issue the commands, then disconnect back to my local shell.
The end result should be that I have a folder with all of my server content in it, plus the flat file backup of the SQL database from the server. I know there are WordPress backup plugins, but they tend to provide a slew of ZIP files, when all I want is the raw data directly so I can put it in my private SVN server for backup and versioning.
So my question: how do I extract all of the databases on my GoDaddy server via the command-line to a file?
Thank you.
In the end, I found a working solution.
First, I used two separate expect scripts:
1. Telnet into the server, delete the old backups, use mysqldump to extract all tables to a flat file via mysqldump -u db_owner -p --all-databases > output.sql, and create a massive tarball of everything. Log out.
2. Use SCP to pull the newly created tarball and extract it to a local SVN-controlled working copy folder.
3. Use a second expect script to log in to the server and delete the backup. Log out.
From there, I just manually svn add and svn commit as needed.
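For reference, here is a minimal sketch of what the first expect script might look like. The hostname, prompts, credentials, and paths are all hypothetical placeholders, and it assumes telnet access and mysqldump are available on the host:

#!/usr/bin/expect -f
# Sketch only -- host, prompts, credentials, and paths are hypothetical placeholders.
set timeout 120
spawn telnet myaccount.example.com
expect "login:"
send "my_user\r"
expect "Password:"
send "my_password\r"
expect "$ "                 ;# assumes the shell prompt ends in "$ "
send "rm -f /tmp/backup.tar.gz output.sql\r"
expect "$ "
send "mysqldump -u db_owner -p --all-databases > output.sql\r"
expect "Enter password:"
send "my_db_password\r"
expect "$ "
send "tar czf /tmp/backup.tar.gz output.sql html\r"   ;# 'html' stands in for the web root
expect "$ "
send "exit\r"
expect eof

The SCP pull and the final cleanup login follow the same pattern in their own scripts.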
I am using phpMyAdmin on my Windows OS. I have a database with one table which has 100M records, about 20GB in size. I want to export this table and end up with a table.sql file. Whenever I try to do this, the size of the exported file is 0 bytes. When I check the Apache error log, the following shows up:
Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 1066139648 bytes)
Any idea how to solve this problem?!
Thanks :)
I would suggest trying the command line and the mysqldump.exe utility, as suggested here.
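For example (a sketch only; the install path, database name, and table name are placeholders for your own values):

"C:\Program Files\MySQL\MySQL Server 5.7\bin\mysqldump.exe" -u root -p --single-transaction [database] [table] > table.sql

Because mysqldump streams rows to the file instead of buffering them in PHP, it avoids the memory limit that phpMyAdmin runs into.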
If you have shared hosting and you are using cPanel, it provides an option to back up your database under the following section:
Files => Backup => Download a MySQL Database Backup.
If you are on shared hosting or you don't have shell access, then use the MySQLDumper script; copy it to your server and open it in your browser under "yourDomain.com/path_to_mysqldumper/"
MySQLDumper is a PHP and Perl based tool for backing up MySQL databases. You can easily dump your data into a backup file and - if needed - restore it. It is especially suited for shared hosting webspaces, where you don't have shell access.
If you have shell access to your host's servers (not all shared hosting providers offer this), you can use SSH as in this tutorial: install and configure PuTTY, then import or export your databases as shown in this third tutorial.
I tried mysqldump for many hours, but it didn't work until I started a superuser console.
First, start a superuser console
sudo su
Then, try the complete command
/opt/lampp/bin/mysqldump -u root -p [DATABASE NAME] > [PATH_FOR_BACKUPFILE]/[FILE_NAME].sql
In my case, it was something like /opt/lampp/bin/mysqldump -u root -p database > /home/user/backup.sql
MySQLDumper worked like a charm for me at my hosted website. I had to copy one database and "paste" it into a new database. In MySQLDumper, it isn't apparent right away how to do this, but the key is to create a new configuration file in MySQLDumper, which will let you copy/restore to different databases.
On the home screen in MySQLDumper, click Configuration, then Configuration Files. There is a text box at the top allowing you to create a new configuration file. In there, put the information for the second database you need (you created a connection to the first database when you installed MySQLDumper). Save it. Then you can click Restore, where you can select the dump of the first database and restore it into the second one.
This was a lifesaver. Thanks!
Increase the post_max_size directive in your php.ini file. Then you will be able to download it.
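For example, in php.ini (illustrative values only; note that the "Allowed memory size exhausted" error above suggests memory_limit is the setting actually being hit):

; illustrative values only -- size these to your server
post_max_size = 2048M
memory_limit = 2048M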
I had a different issue: when I was downloading from phpMyAdmin, a download of around 180MB would stop in the middle with a "network error" message.
So I used an SSH connection, which you can find in your cPanel; sometimes they provide a browser-based terminal, and sometimes you have to access it using PuTTY.
In the terminal, I went into my public_html folder, where all my files are stored, and ran this command:
mysqldump -u [username] -p [database-you-want-to-dump] > [path-to-place-data-dump.sql]
This did the job in a few minutes and saved a .sql file in my public_html folder. Then I opened the folder in the File Manager and downloaded the file from there.
You can also use FTP, or you can download it directly by accessing its URL.
Make sure you delete it after your download finishes.
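If the dump itself is large, you could also compress it on the fly so there is less to download (same placeholders as above):

mysqldump -u [username] -p [database-you-want-to-dump] | gzip > [path-to-place-data-dump.sql].gz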
On my shared webhost, I have very limited SSH access (only via the i-MSCP InstantSSH plugin). I want to set up a script to download my whole MySQL database as an .sql file, but I can't figure it out.
I can't use mysqldump, so I tried the mysql command, which is available, but it isn't working.
I have to specify the username, password, host, and database, and my password contains special characters.
Can anyone help me?
If your shared host allows remote MySQL connections, then you can use any MySQL client to connect to the database and extract whatever information you need. Tools like these include HeidiSQL, Navicat GUI, etc.
Another way would be the one suggested by Akshay Khale.
A 3rd one would be to use phpMyAdmin (most shared web hosting have this installed by default).
A 4th one would be to create a simple PHP script that runs mysqldump locally and saves the .sql dump file either locally on your shared hosting or remotely via FTP/SFTP or any other protocol. The same script can also email you that file (inline or as an attachment). This kind of automation can be configured using a cron job, as sketched below.
There are multiple ways to achieve this. It all depends on which one suits you best.
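For the 4th option, the cron entry itself could look something like this (a sketch only; the schedule, credentials, database name, and backup path are placeholders, and it assumes mysqldump is on cron's PATH):

# run daily at 03:00; all names and paths are placeholders
0 3 * * * mysqldump -u db_user -p'db_password' my_database | gzip > /home/user/backups/my_database-$(date +\%F).sql.gz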
As the author of the InstantSSH plugin for i-MSCP, I can tell you that you should ask your hosting provider to make the mysqldump command available from your chrooted shell. The host administrator can add any command to the chroot by editing the InstantSSH plugin configuration file.
If there are two machines, a client and a server, how do you run mysqldump from the client against the server so that the dump ends up on the client and is not stored on the server?
Thanks.
Here is a PHP script that generates a mysqldump. It outputs directly to the client, and does not create any files on the server.
https://github.com/tylerl/web-scripts/tree/master/mysqldump
Do this in two steps:
1. Dump the data on the server.
2. Transfer it to the client (possibly compressing it first).
If you need to do it often, write a script on the server that dumps, compresses, and copies the data to the client (don't forget to archive/delete old backups on the server as needed), as sketched below.
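A minimal sketch of such a server-side script (hostnames, credentials, and paths are placeholders; it assumes key-based SSH access to the client):

#!/bin/sh
# Sketch only -- hostnames, credentials, and paths are placeholders.
STAMP=$(date +%F)
DUMP=/tmp/my_database-$STAMP.sql.gz
# dump and compress in one pass
mysqldump -u db_user -p'db_password' my_database | gzip > "$DUMP"
# push the compressed dump to the client (assumes key-based SSH auth)
scp "$DUMP" user@client.example.com:/backups/
# remove the local copy so dumps don't pile up on the server
rm -f "$DUMP"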
You could write a simple script that runs from your crontab to create such a dump and move it to a particular area of your file system, like an HTTP-accessible folder or an FTP folder.
Then you could write a script that runs on your clients to fetch those dumps, if you need that part to be automatic too.
Either you do the backup server-side (if you have access to the server): use mysqldump to dump the database, gzip or bzip2 to compress the file, and ftp/sftp/scp to transfer the file to the client afterwards. You can then script this and add it to a crontab to have it run automatically every X hours/days. Check out logrotate to avoid storing too many backups.
Or you use a tool on the client to fetch the data. The free MySQL Workbench can back up an entire database, or you can select which tables to back up (and, interestingly, afterwards which tables to restore - nice if you only need to reset one table).
See the answer to similar question elsewhere:
https://stackoverflow.com/a/2990732/176623
In short, you can use mysqldump on the client to connect to and dump the server data directly on the client.
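In practice that looks something like this, run from the client (host and names are placeholders; the MySQL server must allow remote connections):

mysqldump -h db.example.com -u remote_user -p my_database > my_database.sql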
I have a test database on a separate remote server from my production DB. Every once in a while, I want to test things by uploading a copy of my production DB to my testing DB. Unfortunately, the backup file is now half a gig and I'm having trouble transferring it via FTP or SSH. Is there an easy way to use the mysql restore command between servers? Also, is there another way to move large files that I'm not considering? Half a gig doesn't seem that big; I would imagine people run into this issue frequently.
Thanks!
Are the servers accessible to each other?
If so, you can just pipe the data from one db to another without using a file.
ex: mysqldump [options] | mysql -h test -u username -ppasswd
0. Please consider whether you really need production data (especially if it contains sensitive information).
1. The simplest solution is to compress the backup on the source server (usually with gzip), transfer it across the wire, then decompress it on the target server, as sketched below.
http://www.techiecorner.com/44/how-to-backup-mysql-database-in-command-line-with-compression/
2. If you don't need an exact replica of the production data (e.g. you don't need application logs, errors, or other technical data), you can consider creating a backup, restoring it on the source server under a different DB name, deleting all the unnecessary data, and THEN taking the backup that you will actually use.
3. Restore the full backup once on a reference server in your dev environment, then copy over the transaction logs only (to replay them on the reference server). Depending on the usage pattern, the transaction logs may take a lot less space than the whole database.
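A sketch of option 1 (hostnames, credentials, and database names are placeholders):

# on the production server: dump and compress
mysqldump -u db_user -p production_db | gzip > production_db.sql.gz
# transfer to the test server
scp production_db.sql.gz user@test.example.com:/tmp/
# on the test server: decompress and restore into the test database
gunzip < /tmp/production_db.sql.gz | mysql -u db_user -p test_db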
MySQL allows you to connect to a remote database server to run SQL commands. Using this feature, we can pipe the output from mysqldump and ask mysql to connect to the remote database server to populate the new database.
mysqldump -u root -prootpass SalesDb | mysql --host=185.32.31.96 -C SalesDb
(Note there is no space after -p; with a space, mysqldump would prompt for a password and treat rootpass as the database name.)
Use an efficient transfer method, rather than FTP.
If you have a dump file created by mysqldump on the test DB server and you update it every so often, I think you could save time (if not disk space) by using rsync to transfer it. Rsync will use SSH and compress the data for the transfer, but I think both the local and remote files should/could be uncompressed.
Rsync will only transfer the changed portions of the file. It may take some time to work out what, precisely, has changed in a dump file, but the transfer itself should be quick.
I must admit, though, I've never done it with a half-gigabyte dump file.
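For example (a sketch; the host and paths are placeholders, and an older copy of the dump should already exist on the receiving side for the delta transfer to pay off):

# -z compresses over the wire; only changed blocks are sent
rsync -avz --progress /backups/production_db.sql user@test.example.com:/backups/production_db.sql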