Bitbucket Server DIY Backup deleting database directory - mysql

I have a local copy of Bitbucket Server on my machine and I'm running tests before putting it on a server.
I'm trying to use the Bitbucket DIY Backup, but every time I run the backup it completely deletes the directory the database should be backed up into and then tells me it cannot find the directory.
It backs up the home and archive directories as it should with no issues, but it won't work for the database.
Here is the line used for creating the dump that seems to be causing the directory to be deleted:
mysqldump -username=root -password= --databases=bitbucket_server > ../bitbucket-backups-diy/bitbucket-database/bitbucket_server.sql
I have tested the connection settings on the line above with the following line and am getting a list of the tables in the database as I would expect:
mysql -D bitbucket_server -u root -p -e "show tables"
Any help would be greatly appreciated, thanks in advance.
Sam

I have stopped the bash script from deleting the directory, and now it stores the dump in there.
Thanks to @BerndBuffen, I altered the way the dump was accessing my database. Instead of using:
mysqldump -username=root -password= --databases=bitbucket_server > ../bitbucket-backups-diy/bitbucket-database/bitbucket_server.sql
I now use:
mysqldump -uroot bitbucket_server > ../bitbucket-backups-diy/bitbucket-database/bitbucket_server.sql
You also need to add the following line above the mysqldump command so the folder exists first:
mkdir -p ../bitbucket-backups-diy/bitbucket-database
Because my root user doesn't have a password on my local database, I don't need to supply one; that looks to have been the reason it was failing. When I put the backup onto a live server, I will just need to add -p back into the script and it should work.
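Putting the pieces together, the relevant part of the backup script looks like this (a sketch; paths are the ones from the question, and the passwordless local root is an assumption carried over from my setup):

```shell
# Create the target directory first, otherwise the shell redirection
# below has nowhere to write and the dump fails.
mkdir -p ../bitbucket-backups-diy/bitbucket-database

# Note the flag syntax: -u (or --user=) is valid mysqldump syntax; the
# single-dash -username=/-password= from the original line is not.
# Add -p back in once the server's root account has a password.
mysqldump -uroot bitbucket_server > ../bitbucket-backups-diy/bitbucket-database/bitbucket_server.sql \
  || echo "dump failed -- is MySQL running?" >&2
```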
Hopefully this can help anyone else having this problem.
Sam

Related

How can I copy contents out of MySQL folder without giving ownership of the folder to root in Linux?

I am attempting to back up a MySQL database on a Linux server before I install some upgrades to the software (Omeka) which is using the database.
The command supplied by Omeka documentation for that is the following:
mysqldump -h localhost -u username -p omeka_db_name > omeka_db_backup.sql
However, when I run this, I get the ever-so-helpfully vague message "permission denied". It does this even if I run the command with sudo, and no matter what directory I try to save the backup file to. It doesn't prompt me for a MySQL password when I run mysqldump, but it does when I run the mysql command, and it accepts the password I put in, so I know the issue isn't that I'm using the wrong credentials.
I cannot navigate to the MySQL folder directly in the shell, and when I use WinSCP to access the server, the MySQL folder is listed as owned by "MySQL" and not by "root". So I'm assuming that I don't have permission to copy anything from this folder and that is my problem. I don't want to willy-nilly assign ownership of the MySQL folder to root, because I'm afraid it might break MySQL's ability to read and write from this folder.
All I want to do is copy the database files somewhere as backup. Heck, I'll copy the whole MySQL folder someplace if I have to do that. How can I do that without breaking MySQL?
Root has permissions for everything. There may be additional safeguards, depending on the setup (some security software limits even root's permissions).
You can just use:
mysqldump -h localhost -u username -p omeka_db_name > /path/to/some/other/directory/omeka_db_backup.sql
and put the backup in a directory you can normally access. If you use mysqldump, you don't need to write anything inside the MySQL data directory.
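For example (a sketch using the same credentials as the question; ~/backups is just an arbitrary writable location):

```shell
# Pick a directory you own instead of anything under the MySQL data dir.
OUT_DIR="$HOME/backups"
mkdir -p "$OUT_DIR"

# mysqldump only needs *read* access to the database (through the
# server) and *write* access to wherever the redirection points.
mysqldump -h localhost -u username -p omeka_db_name > "$OUT_DIR/omeka_db_backup.sql" \
  || echo "dump failed -- check MySQL credentials" >&2
```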

how to know which user is dumping mysql database

I have a confidential database in MySQL. I want to restrict users from dumping the database, and I want to know which user has dumped it. I haven't found any way to do this; I tried the general log, but I didn't find anything in MySQL.
Please let me know if there is any way to get this information.
If you are on a Linux server, you can block the mysqldump command so other users cannot take dumps. The procedure below shows how to block the command and how to check the mysql/bash history.
- Check the mysql/bash history
Go to the user's home directory, e.g. /home/user. In this directory you will find two files named .mysql_history and .bash_history; all of the user's commands are stored in these files.
- Block a command
1- Find the path of the mysqldump command:
root@localhost:[~]: which mysqldump
/usr/bin/mysqldump
root@localhost:[~]:
As shown above, the mysqldump binary lives at /usr/bin/mysqldump.
2- Remove command access for all users except root:
root@localhost:[~]: cd /usr/bin/
root@localhost:[/usr/bin]: chmod g-rwx mysqldump
root@localhost:[/usr/bin]: chmod o-rwx mysqldump
The commands above remove mysqldump access for the group and all other users, leaving only root able to run it.
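To see the effect without touching the real binary, you can replay the same permission change on a scratch file (the demo file name is made up):

```shell
# Scratch file standing in for /usr/bin/mysqldump:
touch mysqldump-demo
chmod 755 mysqldump-demo       # start from the usual rwxr-xr-x
chmod g-rwx mysqldump-demo     # strip group permissions
chmod o-rwx mysqldump-demo     # strip other-user permissions
ls -l mysqldump-demo           # -rwx------ : only the owner can run it
```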

Magento can't connect MySQL

After someone messed up the server, Magento could not connect MySql DB.
First try, I used mysql -u <username> -h localhost -p and failed to authenticate.
After a lot of struggle, this guy helped me (the solution is in the comments), so I finally succeeded in connecting to the DB using Magento's credentials. But then I couldn't connect remotely. This one didn't help, since --skip-networking disables remote connections, but I finally figured that out as well (now I don't remember what I did; I either changed something in my.cnf or in /etc/hosts).
So now I can connect with Magento username/password (configured in configuration.php) both locally and remotely.
Still, Magento prints errors to the screen saying it can't connect to MySQL.
I checked both local.xml and config.xml (under <Magento root>/app/etc) and both seems to be configured correctly.
I started thinking about installing the whole thing from scratch. The problem is that there isn't any good backup and I'm not sure what data (if any) I'm going to lose by doing that, but if I have to, I'll back up the files + DB and go for it...
Any ideas ?
UPDATE
After endless digging, it turned out there were other XML files in the same directory as local.xml and config.xml. Removing these files (which had been created as backups but were left with the .xml extension) solved the problem.
Conclusion: if you back up XML files, save the backup as file.xml.backup so it won't be treated the same as a file with an .xml extension!
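A sketch of that cleanup (the stray file names below are made up for the example):

```shell
mkdir -p app/etc
touch app/etc/local.old.xml app/etc/config.copy.xml   # hypothetical leftover backups

# Give every backup copy a non-.xml suffix so the config loader, which
# (as described above) picks up every .xml file in app/etc, ignores them.
for f in app/etc/*.old.xml app/etc/*.copy.xml; do
  [ -e "$f" ] && mv "$f" "$f.backup"
done
```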
If you're thinking about reinstalling the whole thing, may I, as a foreword, advise to do that on a different server than the messed-up one - just in order to keep data on the old one in case things turn bad. You may also want to do that on the same server but with a different vhost, home folder and mysql database.
Here is the procedure I use when making Magento project migrations, imports and other stuff related to Magento moves from one server to another.
This requires that you can access mysql + mysqldump from the shell.
This is a procedure I use regularly on Debian based distros with LAMP.
On source server
1. Clean the DB
This is necessary if you consider your DB too heavy to be transferred to your new destination server.
Also, please make sure you really know which tables you are truncating; I cannot say precisely which, as this depends on your Magento version.
Roughly: truncate the index tables plus core_url_rewrite, the log tables, cron_schedule, the flat catalog tables, the dataflow batch tables and profile history, and the reports aggregation tables.
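As a sketch, that truncation step can be collected into one SQL file and fed to the mysql client. The table names below are example picks from the list above, so verify that each one exists in your Magento version before running it:

```shell
# Hypothetical subset of the tables listed above -- adjust per version.
cat > clean_magento.sql <<'EOF'
SET FOREIGN_KEY_CHECKS = 0;
TRUNCATE core_url_rewrite;
TRUNCATE log_visitor;
TRUNCATE log_url;
TRUNCATE cron_schedule;
TRUNCATE report_event;
SET FOREIGN_KEY_CHECKS = 1;
EOF

# Then run it against the source DB before taking the dump:
# mysql -h [host] -u [user] -p'[password]' [dbname] < clean_magento.sql
```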
2. Backup the DB
mysqldump -h [host] -u [user] -p'[password]' [dbname] > magento.sql
3. Clean your Magento filesystem
From your Magento root folder:
rm -rf var/session/* && rm -rf var/cache/* && rm -rf var/log/*
4. Archive your Magento filesystem
From your Magento root folder:
tar -zcvf magento.tar.gz .
On the destination server
Retrieve your magento.sql and magento.tar.gz any way you like (wget, copy/paste from SSH GUI client...) and put them in your new Magento root directory.
5. Import your DB
mysql -h [your_host] -u [user] -p'[password]' [dbname]
That will open the mysql shell on your new DB
mysql> SET FOREIGN_KEY_CHECKS = 0;
mysql> source /full/path/to/magento.sql
...
mysql> SET FOREIGN_KEY_CHECKS = 1;
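If you prefer a non-interactive import, the same three steps can be stitched into a single stream (a sketch; it assumes magento.sql sits in the current directory):

```shell
# Wrap the dump in the two FOREIGN_KEY_CHECKS toggles, exactly as in
# the interactive session above:
{
  echo 'SET FOREIGN_KEY_CHECKS = 0;'
  if [ -f magento.sql ]; then cat magento.sql; fi
  echo 'SET FOREIGN_KEY_CHECKS = 1;'
} > import_with_fk_off.sql

# mysql -h [your_host] -u [user] -p'[password]' [dbname] < import_with_fk_off.sql
```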
6. Extract your magento.tar.gz
From your new Magento root directory
tar -zxvf magento.tar.gz
You should now be able to see your site. Some permissions modifications and a fine tuning of app/etc/local.xml may be needed to make it fit your destination server's MySQL configuration.
Try to flush the cache from the backend, or delete var/cache/*.

Back up MySQL data after losing access to the phpMyAdmin interface

I just want to know how to back up the data from a MySQL database and still be able to use that data again after reinstalling MySQL.
NOTE: I got into this problem after changing my root password. I posted a question about fixing that, but unfortunately didn't get a solution, so I decided the last resort would be to back up my data!
Here is a link to that post:
Can't connect after changing MySQL root password in WAMP
P.S. I have no access to phpMyAdmin.
Go to your WAMP installation drive, then into the wamp folder.
If the drive is C: and the installation folder is wamp, that is:
C:\wamp\mysql\data\
You will see a folder with the name of your database here.
Copy it somewhere else, and paste it back into the same position after you reinstall. (Note that this folder-copy approach is only reliable for MyISAM tables; InnoDB tables also keep data in the shared ibdata1 file, so copy that too.)
Okay, assuming that you know the new/old/current root password, run the following:
mysqldump -u root --all-databases -p
This should ask you for the password and dump the SQL to stdout.
To save it to backup.sql,
mysqldump -u root --all-databases -p > backup.sql
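It is worth sanity-checking the file before you wipe anything: a complete dump ends with mysqldump's "Dump completed" marker line. A sketch, reusing backup.sql from above (the helper function name is made up):

```shell
# Returns success only when the file ends with mysqldump's marker; a
# dump cut off part-way (wrong password, dropped connection) won't have it.
dump_looks_complete() {
  tail -n 1 "$1" 2>/dev/null | grep -q 'Dump completed'
}

# Verify the backup before reinstalling MySQL:
if dump_looks_complete backup.sql; then
  echo 'backup.sql looks complete'
else
  echo 'backup.sql is missing or truncated -- do not reinstall yet' >&2
fi

# After reinstalling, restore everything with:
# mysql -u root -p < backup.sql
```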

Create mysql backup from server, download locally with scp and replace mamp database

I'm creating a snippet to be used in my Mac OS X terminal (bash) which will allow me to do the following in one step:
Log in to my server via ssh
Create a mysqldump backup of my Wordpress database
Download the backup file to my local harddrive
Replace my local Mamp Pro mysql database
The idea is to create a local version of my current online site to do development on. So far I have this:
ssh server 'mysqldump -u root -p'mypassword' --single-transaction wordpress_database > wordpress_database.sql' && scp me@myserver.com:~/wordpress_database.sql /Users/me/Downloads/wordpress_database.sql && /Applications/MAMP/Library/bin/mysql -u root -p'mylocalpassword' wordpress_database < /Users/me/Downloads/wordpress_database.sql
Obviously I'm a little new to this, and I think I've got a lot of unnecessary redundancy in there. However, it does work. Oh, and the ssh command ssh server works because I've created an alias for the server in my local ~/.ssh/config file.
Here's what I'd like help with:
Can this be shortened? Made simpler?
Am I doing this in a good way? Is there a better way?
How could I add gzip compression to this?
I appreciate any guidance on this. Thank you.
You can dump it out of your server and into your local database in one step (with a hint of gzip for compression):
ssh server "mysqldump -u root -p'mypassword' --single-transaction wordpress_database | gzip -c" | gunzip -c | /Applications/MAMP/Library/bin/mysql -u root -p'mylocalpassword' wordpress_database
The double-quotes are key here, since you want gzip to be executed on the server and gunzip to be executed locally.
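If you'd also like to keep the compressed dump on disk while it loads, a tee in the middle of the same pipeline does both at once (a sketch built on the command above, same credentials and paths assumed):

```shell
# tee writes the gzipped dump to a local file while passing it along
# to gunzip + mysql, so you get the import and an archive in one pass.
ssh server "mysqldump -u root -p'mypassword' --single-transaction wordpress_database | gzip -c" \
  | tee wordpress_database.sql.gz \
  | gunzip -c \
  | /Applications/MAMP/Library/bin/mysql -u root -p'mylocalpassword' wordpress_database \
  || echo 'pipeline failed -- check the ssh alias and mysql path' >&2
```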
I also store my mysql passwords in ~/.my.cnf (and chmod 600 that file) so that I don't have to supply them on the command line (where they would be visible to other users on the system):
[mysql]
password=whatever
[mysqldump]
password=whatever
That's the way I would do it too.
To answer your question #3:
Q: How could I add gzip compression to this?
A: You can run gzip wordpress_database.sql right after the mysqldump command and then scp the gzipped file (wordpress_database.sql.gz) instead.
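A sketch of that variant (the stand-in file below replaces the real dump just to make the gzip step concrete):

```shell
# Stand-in for the dump produced on the server by the mysqldump above:
echo '-- dump contents --' > wordpress_database.sql

# Compress in place; gzip replaces the file with wordpress_database.sql.gz.
gzip -f wordpress_database.sql

# Then fetch and unpack the compressed copy instead of the raw .sql:
# scp me@myserver.com:~/wordpress_database.sql.gz /Users/me/Downloads/
# gunzip /Users/me/Downloads/wordpress_database.sql.gz
```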
There is a Python script that downloads the SQL dump file to your local machine. You can take a look at the script and modify it a bit to fit your requirements:
download-remote-mysql-dump-local-using-python-script