Script to get a database from a remote server and deploy locally - mysql

I want a script that will get my database from a remote server and automatically deploy it locally.
Desired Algorithm:
Get the dump from the remote server
Remove the previous local database: drop database localDbName
Create a new one: create database localDbName
Import the dump: use localDbName; source /var/www/html/TEST$currentDate.sql
I tried to implement this in bash but it didn't work. At the last step the script stops and the terminal just sits there waiting for input.
Tell me how to fix it, or suggest a working script in bash or another language.
#!/bin/sh
remoteDbUser="remoteDbUser"
remoteDbName="remoteDbName"
localDbName="localDbName"
localPort="3306"
currentDate=$(date +%Y-%m-%d_%H-%M)
ssh test@123.123.123.123 -p 123 "mysqldump -v --insert-ignore --skip-lock-tables --single-transaction=TRUE -u $remoteDbUser -p $remoteDbName" > /var/www/html/TEST$currentDate.sql
mysql -uroot -proot -h127.0.0.1 --port="$localPort" -e"source /var/www/html/TEST$currentDate.sql" --show-warnings $localDbName;
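For reference, here is a minimal sketch of how the steps above could be wired together. One likely cause of the hang is the space in "-p $remoteDbName": with the space, mysqldump treats $remoteDbName as the database name and prompts interactively for a password. The remoteDbPass variable, host, and port below are placeholders you would need to fill in yourself.

#!/bin/bash
remoteDbUser="remoteDbUser"
remoteDbPass="remoteDbPass"   # placeholder: the remote MySQL password
remoteDbName="remoteDbName"
localDbName="localDbName"
localPort="3306"
currentDate=$(date +%Y-%m-%d_%H-%M)
dumpFile="/var/www/html/TEST$currentDate.sql"

# Dump on the remote host. Note there is no space after -p, otherwise
# mysqldump waits for a password to be typed and the script hangs.
ssh -p 123 test@123.123.123.123 \
  "mysqldump --insert-ignore --skip-lock-tables --single-transaction -u $remoteDbUser -p$remoteDbPass $remoteDbName" > "$dumpFile"

# Recreate the local database, then load the dump into it.
mysql -uroot -proot -h127.0.0.1 --port="$localPort" \
  -e "DROP DATABASE IF EXISTS $localDbName; CREATE DATABASE $localDbName;"
mysql -uroot -proot -h127.0.0.1 --port="$localPort" "$localDbName" < "$dumpFile"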

Related

Run mysql commands in bash script without logging in or adding -u root to every command

I'm writing a bash script to do some db stuff. I'm new to MySQL. I'm on a Mac and have MySQL installed via Homebrew.
I'm using the username "root" right now and there isn't a pw set. I included the pw syntax below just to help out others that may have a pw.
My goal is to have the mysql commands be as "clean" as possible in my bash script.
Not a huge deal, but I would like to do this if possible.
Example
# If I can do it without logging in (*ideal)
mysql CREATE DATABASE dbname;
# Or by logging in with - mysql -u root -pPassword
CREATE DATABASE dbname;
# Instead of
mysql -u root -pPassword -e"CREATE DATABASE dbname";
Tried to simplify it. I have a handful of things I gotta do, so would rather keep my code cleaner if possible. I tried logging in with the bash script, but the script stopped once logged into MySQL and didn't run any commands.
Another option I was considering (but don't really like) would be just to keep the username and pw string in a var and call it for every command like so:
# Set the login string variable
login_details="-u root -p password -e"
# example command
mysql $login_details"CREATE DATABASE dbname";
So any ideas?
Write a new bash script file and run it after putting all your commands into it. Don't forget to give the right username and password in your bash script.
For bash script:
#!/bin/bash
mysql -u root -pSeCrEt << EOF
use mysql;
show tables;
EOF
If you want to run a single mysql command:
mysql -u [user] -p[pass] -e "[mysql commands]"
Example:
mysql -h 192.168.1.10 -u root -pSeCrEt -e "show databases"
To execute multiple mysql commands:
mysql -u $user -p$password -Bse "command1;command2;....;commandn"
Note:
-B is for batch: print results using tab as the column separator, with each row on a new line. With this option, mysql does not use the history file. Batch mode results in nontabular output format and escaping of special characters.
-s is silent mode: produce less output.
-e is to execute the statement and quit.
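One other widely used approach, not covered in the answer above, is to put the credentials in a MySQL option file such as ~/.my.cnf; the mysql client reads it automatically, so the commands in the script stay clean. A sketch of the file (with no password set, as in the question):

# ~/.my.cnf  (keep it private: chmod 600 ~/.my.cnf)
[client]
user=root
password=

With that in place the bash script can simply call:

mysql -e "CREATE DATABASE dbname;"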

Pull Mysql data remotely using Shell Script

I have two servers A and B.
Mysql is installed on Server B. I want to run a shell script from server A which will do following things:
Login to server B.
Run a Mysql query (eg. show databases)
I want the output of above command in a txt file on server A.
Please help me with this; I am new to shell scripting.
Let me know if you need any further clarification.
Without a pem file you can use the code below:
#!/bin/sh
ssh username@remote "mysqlshow -uroot -proot > /folder/databases.txt"
scp username@remote:/folder/databases.txt /folder/
With a pem file you can use the code below:
#!/bin/sh
ssh -i filename.pem username@remote "mysqlshow -uroot -proot > /folder/databases.txt"
scp -i filename.pem username@remote:/folder/databases.txt /folder/
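If all you need is the result of one query in a text file on server A, a single ssh line is enough, since the redirection then happens on server A; the credentials and paths below are placeholders:

#!/bin/sh
ssh username@remote "mysql -uroot -proot -e 'show databases'" > /folder/databases.txt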

mysqldump doesn't work in crontab

I'm trying to add a cronjob in the crontab (Ubuntu server) that backs up the mysql db.
Executing the script in the terminal as root works well, but when it is run from the crontab nothing happens. I've tried running it every minute but no files appear in the folder /var/db_backups.
(Other cronjobs work well)
Here is the cronjob:
* * * * * mysqldump -u root -pHERE THERE IS MY PASSWORD
--all-databases | gzip > /var/db_backups/database_`date +%d%m%y`.sql.gz
What can be the problem?
You need to escape the % character with \:
mysqldump -u 'username' -p'password' DBNAME > /home/eric/db_backup/liveDB_`date +\%Y\%m\%d_\%H\%M`.sql
I was trying the same thing, but found that the dump was created with 0 KB. Then I came across the solution below, which saved me time.
Command:
0 0 * * * mysqldump -u 'USERNAME' -p'PASSWORD' DATABASE > /root/liveDB_`date +\%Y\%m\%d_\%H\%M\%S`.sql
NOTE:
1) You can change the time setting as per your requirement. I have set it to run every day at midnight in the command above.
2) Make sure you enter your USERNAME, PASSWORD, and DATABASE inside single quotes (').
3) Put the above command in the crontab.
I hope this helps someone.
Check the cron logs (they should be in /var/log/syslog). You can use grep to filter them out:
grep CRON /var/log/syslog
You can also check your local mailbox to see if there are any cron mails:
/var/mail/username
You can also set a different recipient address in your crontab file:
MAILTO=your#mail.com
Alternatively, you can create a custom command mycommand, to which you can add more options. You must give it execute permissions.
It is preferable to have a folder where all your backups are stored; in this case a writable folder "backup", created first in your home directory, for example.
My command in "/usr/local/bin/mycommand":
#!/bin/bash
MY_USER="your_user"
MY_PASSWORD="your_pass"
MY_HOME="your_home"
case $1 in
"backupall")
cd $MY_HOME/backup
mysqldump --opt --password=$MY_PASSWORD --user=$MY_USER --all-databases > bckp_all_$(date +%d%m%y).sql
tar -zcvf bckp_all_$(date +%d%m%y).tgz bckp_all_$(date +%d%m%y).sql
rm bckp_all_$(date +%d%m%y).sql;;
*) echo "Others";;
esac
Cron: Runs the 1st day of each month.
0 0 1 * * /usr/local/bin/mycommand backupall
I hope it helps somewhat.
Ok, I had a similar problem and was able to get it fixed.
In your case you could put that mysqldump command into a script,
then source the profile of the user who is executing the mysqldump command,
e.g.:
. /home/bla/.bash_profile
then use the absolute path of the mysqldump command:
/usr/local/mysql/bin/mysqldump -u root -pHERE THERE IS MY PASSWORD --all-databases | gzip > /var/db_backups/database_`date +%d%m%y`.sql.gz
Local Host mysql Backup:
0 1 * * * /usr/local/mysql/bin/mysqldump -uroot -ppassword --opt database > /path/to/directory/filename.sql
(There is no space between -p and the password or -u and the username; replace root with the correct database username.)
It works for me: no space between -p and the password or -u and the username.
Create a new file, put the dump code in it so it dumps to a file location and zips it, then run that script via cron.
I am using Percona Server (a MySQL fork) on Ubuntu. The package (very likely the regular MySQL package as well) comes with a maintenance account called debian-sys-maint. In order for this account to be used, the credentials are created when installing the package; and they are stored in /etc/mysql/debian.cnf.
And now the surprise: A symlink /root/.my.cnf pointing to /etc/mysql/debian.cnf gets installed as well.
This file is an option file read automatically when using mysql or mysqldump. So basically you then have login credentials given twice: in that file and on the command line. This was the problem I had.
So one solution to avoid this condition is to use the --no-defaults option for mysqldump. The option file then won't be read. However, you provide credentials via the command line, so anyone who can issue a ps can actually see the password once the backup runs. So it's best if you create your own option file with user name and password and pass this to mysqldump via --defaults-file.
You can create the option file by using mysql_config_editor or simply in any editor.
Running mysqldump via sudo from the command line as root works, just because sudo usually does not change $HOME, so .my.cnf is not found then. When running as a cronjob, it is.
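As a sketch of that approach (the file name, path to mysqldump, and credentials are placeholders): put the credentials in a small option file and point mysqldump at it. Note that --defaults-file must be the first option on the command line.

# /root/.mysqldump.cnf  (chmod 600)
[mysqldump]
user=root
password=yourpassword

Crontab entry using the option file (with the % characters escaped as described above):

0 1 * * * /usr/bin/mysqldump --defaults-file=/root/.mysqldump.cnf --all-databases | gzip > /var/db_backups/database_`date +\%d\%m\%y`.sql.gz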
You might also need to restart the service to load any of your changes.
service cron restart
or
/etc/init.d/cron restart

execute mysql commands via desktop shortcut?

On a Windows machine, every day I have to log in to mysql via phpMyAdmin, go to a particular table and run the same sql command to do some cleanup.
I want to automate this process without setting up a TRIGGER. Is there a command-prompt solution for doing this, or an automatic process that can simply be run from a desktop shortcut?
Write a script Run.bat and copy the following contents in it:
mysql -h localhost -u root -ppassword -D database_name -e "cleanup command"
Schedule this command in the Windows Task Scheduler.
If you have multiple cleanup commands, add them to a sql file and schedule the following command:
mysql -h localhost -u root -ppassword -D database_name < path_to_sql_file
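For example (the table name, cleanup statement, and file path here are made up, substitute your own), the sql file could contain:

-- cleanup.sql: daily cleanup statements
DELETE FROM sessions WHERE last_activity < NOW() - INTERVAL 30 DAY;

and the scheduled command would then be:

mysql -h localhost -u root -ppassword -D database_name < C:\scripts\cleanup.sql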
For Windows you could use the Task Scheduler. You might not be able to get it to log in to phpMyAdmin though. Perhaps the mysql command line?

bash script command to inject a schema.sql into mysql db

I found this code but do not quite understand what the command is doing.
sudo -u test-user mysql -U test_traffic traffic < ./phoenix/data/sql/lib.model.schema.sql
I know the last part is using lib.model.schema.sql to create the tables and fields.
The first part I don't quite understand: sudo -u test-user mysql -U test_traffic traffic
I know the sudo and mysql commands.
Please explain?
Thanks.
Let's look at it bit by bit. Firstly the format
sudo -u username command
is an instruction to run command (which might be simple or complex) as the user username. So in your example, you are running the mysql command as the user test-user. You should note that this includes all the parameters to the mysql command - that's the entire rest of the line.
The command
mysql -U test_traffic traffic < ./phoenix/data/sql/lib.model.schema.sql
appears corrupt (certainly running it on 5.0.51a fails). It would make sense if the -U were a -u, which would indicate that the command was to be executed as the mysql user test_traffic. If it were a -u you would then have an instruction to import the sql file into the traffic database.
So the combined instruction says: import the lib.model.schema.sql file into the database traffic using the mysql user test_traffic, executing the entire command as if you were logged in as the user test-user.
Try the steps below for mysql:
mysql -h hostname -u username -ppassword
mysql> use databasename;
mysql> source path/to/scriptfile
If you want to inject the schema.sql file into your database with a shell script, simply use:
mysql -h [host] -u [username] -p[password] -D [database] < your_file
If you want to dynamically tell it which file should be loaded, replace your_file with $1 and pass the name of the file as an argument to your script (see the sketch below).
Also take care with the -p option: there is no space between -p and your password.
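A minimal sketch of such a script, with placeholder credentials and database name:

#!/bin/bash
# usage: ./load_schema.sh path/to/schema.sql
mysql -h localhost -u your_user -pyour_password -D your_database < "$1"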