I am administering MediaWiki for my organisation. We use it as our intranet site, and it has accumulated a huge organisational knowledge base. I have to make sure that MediaWiki is always up and running and that the knowledge base is always backed up.
Is there a way to take continuous backups of the MediaWiki files and databases? My MediaWiki is hosted on a LAMPP server running Debian.
I am trying to find a way to automate the backup process.
It depends on what you mean by "continuous". If you want a running copy of the database that is always identical to the main database, you will need to set up replication; see http://dev.mysql.com/doc/refman/5.1/en/replication.html for how to do that.
If you want a database backup that is relatively current, then running mysqldump every hour or so is a pretty good solution.
You'll need to back up the files separately, because they live in your file system, not in the database. Look at running rsync every hour or so.
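For illustration, here is a minimal sketch of such an hourly job as a shell script. The paths, database name, and credentials are assumptions; adjust them to your MediaWiki installation:

    #!/bin/bash
    # Hypothetical locations and credentials; change to match your setup.
    WIKI_DIR=/var/www/mediawiki        # MediaWiki file tree (images, LocalSettings.php, ...)
    BACKUP_DIR=/backup/mediawiki
    DB_NAME=wikidb
    DB_USER=backup
    DB_PASS=secret

    mkdir -p "$BACKUP_DIR"

    # Time-stamped database dump, so older dumps are kept alongside.
    mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" \
        > "$BACKUP_DIR/$DB_NAME-$(date +%Y%m%d-%H%M).sql"

    # Mirror the file tree; rsync copies only what changed since the last run.
    rsync -a --delete "$WIKI_DIR/" "$BACKUP_DIR/files/"

A crontab entry such as "0 * * * * /usr/local/bin/wiki-backup.sh" would then run it at the top of every hour.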
Why do you want a "continuous" backup, and how would you use it? Does either of these approaches answer your question?
I am trying to learn more about WordPress and am puzzled by this question.
My site is hosted on Bluehost and I decided to make a backup. I made a copy of public_html and downloaded it, but support later told me that I also need to export the database. That got me wondering where the actual data is located: if not in the public_html directory, where are the files that contain all the tables?
I asked support but didn't get a good answer. Surely all the data must be stored in files somewhere, so shouldn't I be able to find them and download them, without having to use an interface like phpMyAdmin?
On Bluehost, ask support to give you an SQL export of your database.
Alternatively, go to the management panel and find MySQL (under Database Software); that is where the WordPress database lives.
But it's best to install a plugin such as WordPress Database Backup and then use that plugin to download a backup of the database. That may be easier for you as a non-technical user.
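To answer the "where are the files" part of the question: MySQL keeps its table data in its own data directory (commonly /var/lib/mysql), which sits outside public_html, and shared hosts generally do not let you download it directly. A logical export with mysqldump is the portable way to get the data out. As a minimal sketch, assuming SSH access and the default quoting style in wp-config.php:

    #!/bin/bash
    # Hypothetical WordPress root; adjust to your account layout.
    WP=/home/youruser/public_html

    # Pull the credentials WordPress itself uses out of wp-config.php.
    # (Assumes the default define('DB_NAME', '...'); single-quote style.)
    DB_NAME=$(grep DB_NAME     "$WP/wp-config.php" | cut -d \' -f 4)
    DB_USER=$(grep DB_USER     "$WP/wp-config.php" | cut -d \' -f 4)
    DB_PASS=$(grep DB_PASSWORD "$WP/wp-config.php" | cut -d \' -f 4)

    # Dump the whole WordPress database to a single portable file.
    mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" > wordpress-backup.sql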
I'm quite new to MySQL and to database management in general.
I have to solve this scenario: during development, the website's database lives on the local machine, and some tables hold reference data used by the application. As development proceeds, the records in those tables grow, and when we move to production we want to update the production server with the new data.
Can someone advise the best practice for automating the update process from the local database to the production database?
Thanks in advance.
The road to doing this successfully is to have each database know how far it has been migrated.
You should absolutely use something like Liquibase or Flyway for this. If you have a simple database environment, either of these will work. Both track changes in versioned migration files and record in the database itself which migrations have already been applied (sketched below).
If you need more complexity, such as a sharded environment, you will probably need to roll your own tool for this.
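As a minimal sketch of the Flyway approach (the URLs, credentials, and file names here are placeholders): migrations are plain SQL files applied in version order, and Flyway records in a history table inside each target database which versions have already run, so re-running "migrate" applies only what that database is missing.

    # Migration files live in ./sql and are applied in version order, e.g.:
    #   sql/V1__create_lookup_tables.sql
    #   sql/V2__seed_country_codes.sql

    # Apply pending migrations to the local development database.
    flyway -url=jdbc:mysql://localhost/myapp \
           -user=deploy -password=secret \
           -locations=filesystem:sql \
           migrate

    # Point the same migrations at production; only the new ones run.
    flyway -url=jdbc:mysql://prod.example.com/myapp \
           -user=deploy -password=secret \
           -locations=filesystem:sql \
           migrate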
Alternatively, you could maintain a separate .sql file for each environment, such as:
development.sql,
staging.sql,
production.sql
Then write a shell script that executes the appropriate file during the deployment process.
You also need to maintain one constant that identifies the current environment, as sketched below.
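A minimal sketch of that deployment step, with hypothetical names throughout:

    #!/bin/bash
    # APP_ENV is the "one constant" identifying the current environment.
    APP_ENV=${APP_ENV:-development}

    case "$APP_ENV" in
        development|staging|production) ;;
        *) echo "Unknown environment: $APP_ENV" >&2; exit 1 ;;
    esac

    # Apply the environment-specific data script to the target database.
    mysql -u deploy -p"$DB_PASSWORD" myapp < "$APP_ENV.sql"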
I have a MySQL database that should be versioned with SVN. I don't want the full database, only the structure and selected tables. I searched the net and found some information, but nothing seems to work reliably. Any experience or hints?
Thanks :)
Check out this script, which automates the process and allows specific selection of databases and exclusion of tables (disclaimer: I am the author): http://mysql-svn-backup.redant.com.au/
Use mysqldump to export the data you want into a file and put that file into SVN. Using cron, you can automate this to run at specific times.
The question is: what exactly do you want to version, and why?
I propose you version an SQL file that you can import to create your database. Any tool can be used to create this SQL file (the basic one being mysqldump), and you can then save it into your SVN repository. You will be able to track new tables being created by comparing revisions of the SQL file.
You can automate this process by adding a cron job that dumps and commits the file every 2 hours.
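A sketch of such a job, assuming a checked-out SVN working copy and made-up database and table names:

    #!/bin/bash
    # Hypothetical working copy that already contains the two dump files.
    cd /srv/db-svn || exit 1

    # Structure of every table, no data. --skip-dump-date keeps the file
    # byte-identical when nothing changed, avoiding noise commits.
    mysqldump -u backup -psecret --skip-dump-date --no-data mydb > structure.sql

    # Full dumps of only the tables whose data should be versioned.
    mysqldump -u backup -psecret --skip-dump-date mydb \
        lookup_countries lookup_settings > data.sql

    svn commit -m "Automated dump $(date +%F-%H%M)" structure.sql data.sql

Schedule it with a crontab entry like "0 */2 * * * /usr/local/bin/db-to-svn.sh". When nothing changed, the commit is a no-op, so the repository history only records actual schema or data changes.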
The method I like using, and think is the simplest, is mysqldump for backing up MySQL databases. Right now I'm using phpMyAdmin to back up the tables. Is there any way I can write a script that does this automatically (preferably every day)?
And how exactly do I back up files on my server? I have an images folder that I need to back up, and I'm not sure how to go about that.
Of course: use MySQLDumper. You can automatically back up your databases to another host if you like.
Features
Send dump files via FTP to up to 3 different servers. This also works with the multipart feature.
Automatic file deletion: set your own rules to delete old backups. Specify the number of backups you want to keep and let MySQLDumper automatically delete older ones to save server space.
MySQLDumper can do multipart backups. That means it can automatically split the dump file if it gets bigger than your chosen size. If you restore a backup and pick the wrong part, it doesn't matter: MySQLDumper will notice and find the correct start file automatically.
Security: MySQLDumper can generate a .htaccess file to protect itself and all of your backup files.
A good reading resource for alternatives:
10 Ways to Automatically & Manually Backup MySQL Database
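If you prefer a plain cron-based script over a tool, here is a minimal daily sketch; the paths and credentials are made up, so adapt them to your server:

    #!/bin/bash
    # Hypothetical backup location and credentials.
    BACKUP_DIR=/backup/mysql
    mkdir -p "$BACKUP_DIR"

    # Date-stamped, compressed dump of all databases.
    mysqldump -u backup -psecret --all-databases \
        | gzip > "$BACKUP_DIR/all-$(date +%F).sql.gz"

    # Archive the images folder the same way.
    tar czf "$BACKUP_DIR/images-$(date +%F).tar.gz" /var/www/site/images

    # Keep a week's worth of backups; delete anything older.
    find "$BACKUP_DIR" \( -name '*.sql.gz' -o -name '*.tar.gz' \) -mtime +7 -delete

Run it daily with a crontab entry such as "30 2 * * * /usr/local/bin/daily-backup.sh".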
Since Gary answered your first question, I'll answer your second.
For backing up the server:
I'm assuming you are talking about your web applications and the images contained in folders used by those applications. Source control will work for this. Set up a Subversion server or something like it.
http://subversion.tigris.org/
Hope this helps. Good luck.
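If you go the Subversion route, the nightly job itself can be small. A sketch, assuming the images folder is already a checked-out working copy:

    #!/bin/bash
    cd /var/www/site/images || exit 1

    # Pick up any images added since the last run, then commit.
    svn add --force .
    svn commit -m "Nightly image backup $(date +%F)"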
Do you have any experience with this? I currently have 1900 MySQL databases in a single domain in my Plesk control panel, and I wonder whether my MySQL server will get overloaded or go out of service because of such a high number of databases in the system.
Do you have any suggestions? Each database belongs to one user of my service, by the way.
MySQL itself doesn't place any restrictions on the number of databases you can have, and I doubt Plesk does either; I'm sure it just displays all the databases present on the MySQL server.
However, your host may have a limit (which you'd have to ask them about), or if you start getting a huge number of databases, you may actually run into a filesystem limit. As the MySQL documentation says, each database is stored as a directory, so you could hypothetically hit the filesystem's upper limit for how many subdirectories are allowed.
I've got well over 5000 databases running on a Linux-based Plesk cluster (one DB server, one web server) and it's running fine, though I have had to increase the open-files limits due to the huge number of files. I can't run the MySQL tuning primer any more though; well, I can, but it takes about 4 hours.
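If you want to check where a given server stands, a quick sketch (assumes shell access and a privileged MySQL account):

    # How many databases does the server actually hold?
    mysql -u root -p -N -e 'SHOW DATABASES' | wc -l

    # mysqld's file-handle budget, which is what large database counts
    # tend to exhaust first:
    mysql -u root -p -e "SHOW VARIABLES LIKE 'open_files_limit'"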