Pull MySQL data remotely using a shell script - mysql

I have two servers, A and B.
MySQL is installed on server B. I want to run a shell script from server A which will do the following:
Login to server B.
Run a MySQL query (e.g. show databases)
I want the output of the above command in a txt file on server A.
Please help me with this. I am new to shell scripting.
Let me know if you need any further clarification.

Without a pem file you can use the code below:
#!/bin/sh
ssh username@remote mysqlshow -uroot -proot > /folder/databases.txt
scp username@remote:/folder/databases.txt /folder/
With a pem file you can use the code below:
#!/bin/sh
ssh -i filename.pem username@remote mysqlshow -uroot -proot > /folder/databases.txt
scp -i filename.pem username@remote:/folder/databases.txt /folder/
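Note that because the redirection in the ssh command is unquoted, it is processed by the local shell on server A, so the output file already lands locally and the scp step is not strictly needed. A minimal one-step sketch (the hostname, credentials, and paths are placeholders; adjust to your setup):

```shell
#!/bin/sh
# The unquoted ">" is handled by the LOCAL shell, so the remote
# command's stdout is written straight to a file on server A.
# username@remote and /folder/databases.txt are placeholders.
ssh username@remote "mysqlshow -uroot -proot" > /folder/databases.txt
```

Quoting the whole remote command but leaving the redirection outside the quotes is what keeps the file on server A.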

Related

How to copy a SQL file from one server to another server's mysql database

I tried to copy a SQL file from one server to another server's mysql database:
ssh -i keylocation user@host 'mysql --user=root --password="pass" --host=ipaddress additional_content Additional_Content' | < databasedump.sql
databasedump.sql is a file on server A. I want to copy data from that database file into the database on server B. I tried to connect via ssh to that server (I need a key file for that) and then copy the data, but when I run this command in the console, nothing happens. Any help?
Are you able to secure-copy the file over to server B first, and then ssh in for the mysql import? Example:
scp databasedump.sql user@server-B:/path/to/databasedump.sql
ssh -i keylocation user@host 'mysql --user=root --password="pass" --database=db_name < /path/to/databasedump.sql'
Edit: typo
I'm not entirely sure how mysql handles stdin, so one thing you can do that should work in one command is:
ssh -i keylocation user@host 'cat - | mysql --user=root --password="pass" --host=ipaddress additional_content Additional_Content' < databasedump.sql
However, it's better to copy the file first with scp and then import it with mysql. See Dan's answer.

Script to get a database from a remote server and deploy locally

I want to use a script that will get my database from a remote server and automatically deploy locally.
Desired Algorithm:
Get base
Remove the previous drop database localDbName
Create a new create database localDbName
Expand my dump: use localDbName; source /var/www/html/TEST$currentDate.sql
I tried to implement this with bash but it didn't work. At the last step the script stops working; the terminal waits for input.
Tell me how to fix it, or suggest a working script in bash or another language.
#!/bin/sh
remoteDbUser="remoteDbUser"
remoteDbName="remoteDbName"
localDbName="localDbName"
localPort="3306"
currentDate=$(date +%Y-%m-%d_%H-%M)
ssh test#123.123.123.123 -p 123 "mysqldump -v --insert-ignore --skip-lock-tables --single-transaction=TRUE -u $remoteDbUser -p $remoteDbName" > /var/www/html/TEST$currentDate.sql
mysql -uroot -proot -h127.0.0.1 --port="$localPort" -e"source /var/www/html/TEST$currentDate.sql" --show-warnings $localDbName;
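One likely culprit in the script above: `-p $remoteDbName` (with a space after `-p`) makes mysqldump prompt for a password, and that prompt hangs inside the non-interactive ssh session, which would explain the terminal waiting for input. A sketch of a fix, assuming the remote password can be supplied in a variable (`remoteDbPass` is a hypothetical placeholder; storing credentials in `~/.my.cnf` on the remote host is the safer option):

```shell
#!/bin/sh
# Sketch: pass the password inline so mysqldump never prompts
# (remoteDbPass is hypothetical), and run the drop/create/source
# steps as a single mysql invocation.
remoteDbUser="remoteDbUser"
remoteDbPass="remoteDbPass"
remoteDbName="remoteDbName"
localDbName="localDbName"
localPort="3306"
currentDate=$(date +%Y-%m-%d_%H-%M)
dump="/var/www/html/TEST$currentDate.sql"

ssh -p 123 test@123.123.123.123 \
    "mysqldump -v --insert-ignore --skip-lock-tables --single-transaction \
     -u$remoteDbUser -p$remoteDbPass $remoteDbName" > "$dump"

mysql -uroot -proot -h127.0.0.1 --port="$localPort" --show-warnings \
    -e "DROP DATABASE IF EXISTS $localDbName;
        CREATE DATABASE $localDbName;
        USE $localDbName;
        SOURCE $dump;"
```

Note there is no space between `-p` and the password; with a space, mysql-family tools treat the next word as the database name and prompt interactively.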

gunzip and mysql import - No such file or directory

I have created a bash script for the automation of my database restore. When I run the following commands, I get /my/sql/file/path.sql: No such file or directory.
ssh "$USER@$cloneMysqlHost" gunzip /path/file.sql.gz && MySQL -u root -p db_name < /path/file.sql
I did an ls -lrot on the host I ssh to, just to make sure the file exists and the permissions are correct, and they are.
Any ideas what I'm doing wrong?
Thanks in advance!
The && is causing the local shell to split the command and run the MySQL command locally.
The < redirection is also being done locally (and the cause of your error).
The gunzip is being performed on the remote host though.
You need to quote the entire argument to ssh if you want it all run on the remote system.
ssh "$USER@$cloneMysqlHost" 'gunzip /path/file.sql.gz && MySQL -u root -p db_name < /path/file.sql'
Are you providing the password properly? Also, I'm not sure what's going on with the &&; you should use a pipe there. MySql is probably not valid, use mysql. See here for more details.
ssh "$USER@$cloneMysqlHost" gunzip -c /path/file.sql.gz | mysql -u root -p[password] db_name

mysqldump from remote host

Is it possible to dump a database from a remote host through an ssh connection and have the backup file on my local computer?
If so, how can this be achieved?
I am assuming it will be some combination of piping output from the ssh to the dump, or vice versa, but I can't figure it out.
This would dump, compress and stream over ssh into your local file
ssh -l user remoteserver "mysqldump -mysqldumpoptions database | gzip -3 -c" > /localpath/localfile.sql.gz
Starting from @MichelFeldheim's solution, I'd use:
$ ssh user#host "mysqldump -u user -p database | gzip -c" | gunzip > db.sql
ssh -f user#server.com -L 3306:server.com:3306 -N
then:
mysqldump -hlocalhost > backup.sql
assuming you also do not have mysql running locally. If you do you can adjust the port to something else.
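Two practical tweaks to the tunnel variant, sketched below: bind a local port other than 3306 if a local mysqld is already running, and connect to 127.0.0.1 rather than localhost, since the mysql client treats plain localhost as "use the unix socket", which would bypass the tunnel entirely (user, server.com, and the database name are placeholders):

```shell
#!/bin/sh
# Forward a free local port to mysqld on the remote host, then
# dump through the tunnel. user@server.com and "database" are
# placeholders; adjust to your setup.
localPort=3307   # anything free; avoids clashing with a local mysqld
ssh -f -N user@server.com -L "$localPort:127.0.0.1:3306"
mysqldump -h 127.0.0.1 -P "$localPort" -u user -p database > backup.sql
```

`-f -N` backgrounds the ssh process without running a remote command, so the tunnel stays up while mysqldump runs.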
I have created a script to make it easier to automate mysqldump commands on remote hosts using the answer provided by Michel Feldheim as a starting point:
mysqldump-remote
The script allows you to fetch a database dump from a remote host with or without SSH and optionally using a .env file containing environment variables.
I plan to use the script for automated database backups. Feel free to create issues / contribute - hope this helps others as well!

Create mysql backup from server, download locally with scp and replace mamp database

I'm creating a snippet to be used in my Mac OS X terminal (bash) which will allow me to do the following in one step:
Log in to my server via ssh
Create a mysqldump backup of my Wordpress database
Download the backup file to my local harddrive
Replace my local MAMP Pro mysql database
The idea is to create a local version of my current online site to do development on. So far I have this:
ssh server 'mysqldump -u root -p'mypassword' --single-transaction wordpress_database > wordpress_database.sql' && scp me@myserver.com:~/wordpress_database.sql /Users/me/Downloads/wordpress_database.sql && /Applications/MAMP/Library/bin/mysql -u root -p'mylocalpassword' wordpress_database < /Users/me/Downloads/wordpress_database.sql
Obviously I'm a little new to this, and I think I've got a lot of unnecessary redundancy in there. However, it does work. Oh, and the ssh command ssh server is working because I've created an alias in a local .ssh file to do that bit.
Here's what I'd like help with:
Can this be shortened? Made simpler?
Am I doing this in a good way? Is there a better way?
How could I add gzip compression to this?
I appreciate any guidance on this. Thank you.
You can dump it out of your server and into your local database in one step (with a hint of gzip for compression):
ssh server "mysqldump -u root -p'mypassword' --single-transaction wordpress_database | gzip -c" | gunzip -c | /Applications/MAMP/Library/bin/mysql -u root -p'mylocalpassword' wordpress_database
The double-quotes are key here, since you want gzip to be executed on the server and gunzip to be executed locally.
I also store my mysql passwords in ~/.my.cnf (and chmod 600 that file) so that I don't have to supply them on the command line (where they would be visible to other users on the system):
[mysql]
password=whatever
[mysqldump]
password=whatever
That's the way I would do it too.
To answer your question #3:
Q: How could I add gzip compression to this?
A: You can run gzip wordpress_database.sql right after the mysqldump command and then scp the gzipped file instead (wordpress_database.sql.gz)
There is a python script that will download the SQL dump file locally. You can take a look at the script and modify it a bit to meet your requirements:
download-remote-mysql-dump-local-using-python-script