What is wrong with this bash script (cron + mysql)?

I'm using a bash script (sync.sh), run by cron, that is supposed to sync a file to a MySQL database. It works by copying a file from an automatic upload location, parsing it by calling an SQL script which in turn calls other stored procedures inside MySQL, and finally e-mailing a report text file as an attachment.
But it seems something is not working, as nothing happens to the MySQL database. All the other commands are executed (the first and last lines: copying the initial file and sending the e-mail).
The MySQL command works perfectly when run separately.
The server is Ubuntu 16.04.
The cron job runs as the root user and the script is part of root's crontab.
Here is the script:
#!/bin/bash
cp -u /home/admin/web/mydomain.com/public_html/dailyxchng/warehouse.txt /var/lib/mysql-files
mysql_pwd=syncit4321
cd /home/admin/web/mydomain.com/sync
mysql -u sync -p$mysql_pwd --database=database_name -e "call sp_sync_report();" > results.txt
echo "<h2>Report date $(date '+%d/%m/%Y %H:%M:%S')</h2><br/><br/> <strong>results.txt</strong> is an attached file which contains sync report." | mutt -e "set content_type=text/html" -s "Report date $(date '+%d/%m/%Y %H:%M:%S')" -a results.txt -- recipient#mydomain.com

cron will execute the script using a very stripped-down environment, so you probably want to add the full path to the mysql command in the script.
You can find the full path by running
which mysql
at the prompt, or you can add an expanded PATH to the cron invocation:
1 2 * * * PATH=/usr/local/bin:$PATH scriptname
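For example, assuming which mysql reports /usr/bin/mysql (a typical location, but verify it on your own system), the mysql line in sync.sh would become:
/usr/bin/mysql -u sync -p$mysql_pwd --database=database_name -e "call sp_sync_report();" > results.txt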

Related

Crontab command not running

I have a database backup command that takes a MySQL dump and then uploads that dump file to AWS S3. When I run the command as a normal user it works perfectly, but when I use the same command in a cron job it fails.
I have checked the syslog and there is no error message indicating a problem with the job; there is only a line saying the job ran, after which it goes on to the next cron job.
The command is as follows, I have removed the sensitive parts:
mysqldump -u {{ db_user }} -p{{ db_password }} {{ db_name }} > /home/db_backup.sql | aws s3 cp /home/db_backup.sql s3://{{ s3_url }}/$(date --iso-8601=seconds)_db.sql --profile backupprofile
When this command is run by a normal user, there is a warning output not to use the MySQL password on the command line, but this is essential for the command to work without interaction. There is also a second line of output from S3 saying the upload worked. Could these outputs be affecting the cron job in some way?
You will need to use full paths in your cron jobs; I see you missed them for mysqldump and also for aws. I would run whereis mysqldump and whereis aws to find the full paths you need.
Try checking the environment variables: cron passes a minimal set of environment variables to your jobs. You can set the PATH easily inside the crontab:
PATH=/usr/local/bin:/usr/sbin
Also, many cron implementations execute commands using sh, and you might be using another shell in your script. You can tell cron to run all commands in bash by setting the shell at the top of your crontab:
SHELL=/bin/bash
cron also tries to interpret the % symbol, so you need to escape it if you have one somewhere in your command.
If the first output after running your command is interactive, i.e. it asks you to hit Enter or something like that, then that is the issue; otherwise there shouldn't be any problem with this.
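Putting those pieces together, the top of the crontab might look like this (paths and credentials are illustrative, not taken from the question):
SHELL=/bin/bash
PATH=/usr/local/bin:/usr/bin:/bin:/usr/sbin
# note the escaped % in the date format: cron treats a bare % as a newline
0 2 * * * /usr/bin/mysqldump -u dbuser -p'secret' dbname > /home/db_backup_$(date +\%F).sql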

How to dump remote mysql database and import it locally using bash script

I'm trying to dump a remote WordPress database using this shell script:
db_name=$(php -r 'include("public_html/local-config.php");echo DB_NAME;');
db_user=$(php -r 'include("public_html/local-config.php");echo DB_USER;');
db_password=$(php -r 'include("public_html/local-config.php");echo DB_PASSWORD;');
db_host=$(php -r 'include("public_html/local-config.php");echo DB_HOST;');
mysqldump --user="$db_user" --password="$db_password" "$db_name";
To have the dump locally I run this in my local terminal (host in .ssh/config):
ssh host 'bash -s' < thatscript.sh > dump.sql
Then I would do mysql localdb < dump.sql to import it locally.
Now I tried to put the ssh and local mysql part into the script as well, so that I can just fire up the script and it does everything.
I tried to put the original script into a function and then call it using
mysql localdb < $(ssh host 'bash -s' < myDumpFunction)
but that assumes a file called myDumpFunction exists, and obviously does not work. How would I correctly call the function in that case?
Any other ideas on how to achieve a one line database sync for a local dev system are appreciated as well.
What about writing another .sh file and starting it over the SSH connection? Do not forget to chmod it: the script should be executable.
Example:
ssh username@remote-ip start.sh
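Alternatively, since the dump already arrives on standard output, the whole sync can be done in one line with no intermediate dump.sql file (a sketch, assuming the host alias and local database name from the question):
ssh host 'bash -s' < thatscript.sh | mysql localdb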

How to schedule a MySQL database backup in a remote Ubuntu server to a Dropbox folder in Windows PC?

I have a MySQL database that I want to daily backup to my Dropbox folder on my Windows PC.
How can I do that automatically from Windows 7?
One of the simplest ways to back up a MySQL database is by creating a dump file, and that is what mysqldump is for. Please read the documentation for mysqldump.
In its simplest syntax, you can create a dump with the following command:
mysqldump [connection parameters] database_name > dump_file.sql
where the [connection parameters] are those you need to connect your local client to the MySQL server where the database resides.
mysqldump will create a dump file: a plain text file which contains the SQL instructions needed to create and populate the tables of a database. The > character redirects the output of mysqldump to a file (in this example, dump_file.sql). You can, of course, compress this file to make it easier to handle.
You can move that file wherever you want.
To restore a dump file:
Create an empty database (let's say restore) in the destination server
Load the dump:
mysql [connection parameters] restore < dump_file.sql
There are, of course, some other "switches" you can use with mysqldump. I frequently use these:
-d: this will tell mysqldump to create an "empty" backup: the tables and views will be exported, but without data (useful if all you want is a database "template")
-R: include the stored routines (procedures and functions) in the dump file
--delayed-insert: uses INSERT DELAYED instead of INSERT for populating tables
--disable-keys: encloses the INSERT statements for each table between ALTER TABLE ... DISABLE KEYS and ALTER TABLE ... ENABLE KEYS; this can make inserts faster
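For example, combining two of these switches to produce a schema-plus-routines "template" dump (the user and database names here are hypothetical):
mysqldump -d -R -u backup_user -p shop_db > shop_template.sql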
You can include the mysqldump command and any other compression and copy / move command in a batch file.
My solution to extract a backup and push it onto Dropbox is as below.
A sample Ubuntu batch file can be downloaded here.
In brief
Prepare a batch script backup.sh
Run backup.sh to create a backup version, e.g. backup.sql
Copy backup.sql to the Dropbox folder
Schedule an Ubuntu/Windows task to run backup.sh, e.g. every day at night
Detail steps
All about backing up and restoring a MySQL database can be found here.
Back up to compressed file
mysqldump -u [uname] -p [dbname] | gzip -9 > [backupfile.sql.gz]
How to connect from Windows to execute the 'backup' command remotely can be found here.
plink.exe -ssh -i "Path\to\private-key\key.ppk" -noagent username@server-ip
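For example, to kick off the backup script on the server from a Windows machine (the script path is illustrative):
plink.exe -ssh -i "Path\to\private-key\key.ppk" -noagent username@server-ip /home/userName/pathTo/backup.sh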
How to bring the file to Dropbox can be found here
Create an app
https://www2.dropbox.com/developers/apps
Add an app and choose Dropbox API App. Note the created app key and app secret
Install Dropbox API in Ubuntu; use app key and app secret above
$ wget https://raw.github.com/andreafabrizi/Dropbox-Uploader/master/dropbox_uploader.sh
$ chmod +x dropbox_uploader.sh
Follow the instruction to authorize access for the app e.g.
http://www2.dropbox.com/1/oauth/authorize?oauth_token=XXXXXXX
Test that the app is working correctly; it should be OK:
$ ./dropbox_uploader.sh info
The app is created and the folder associated with it is YourDropbox\Apps\<app name>
Commands to use
List files
$ ./dropbox_uploader.sh list
Upload file
$ ./dropbox_uploader.sh upload <filename> <dropbox location>
e.g.
$ ./dropbox_uploader.sh upload backup.sql .
This will store file backup.sql to YourDropbox\Apps\<app name>\backup.sql
Done
How to schedule a task on Ubuntu using crontab can be viewed here.
Call command
sudo crontab -e
Insert a line to run the backup.sh script every day, as below:
0 0 * * * /home/userName/pathTo/backup.sh
Explanation:
minute (0-59), hour (0-23, 0 = midnight), day (1-31), month (1-12), weekday (0-6, 0 = Sunday), command
Or we can simply use
@daily /home/userName/pathTo/backup.sh
Note:
To monitor crontab tasks, here is a very good guide.
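Tying the steps together, a minimal sketch of what backup.sh might contain (the database name, credentials, and paths are placeholders, not from the original answer):
#!/bin/bash
# dump and compress the database (hypothetical credentials)
mysqldump -u dbuser -p'dbpassword' my_database | gzip -9 > /home/userName/backup.sql.gz
# push the archive to the Dropbox app folder
/home/userName/dropbox_uploader.sh upload /home/userName/backup.sql.gz .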

Executing MySQL commands in shell script?

I’m looking to create a deploy script that I can run from a terminal and it automatically deploys my site from a repository. The steps I’ve identified are:
Connect to remote server via SSH
Fetch latest version of site from remote repository
Run any SQL patches
Clean up and exit
I’ve placed the SSH connection and git pull commands in my shell file, but what I’m stuck on is MySQL, it being an (interactive?) shell itself. So in my file I have:
#!/bin/bash
# connect to remote server via SSH
ssh $SSH_USER@$SSH_HOST
# update code via Git
git pull origin $GIT_BRANCH
# connect to the database
mysql --user $MYSQL_USER --password=$MYSQL_PASSWORD --database=$MYSQL_DBNAME
# run any database patches
# disconnect from the database
# TODO
exit 0
As you can see, I’m connecting to the database, but not sure how to then execute any MySQL statements.
At the moment, I have a directory containing SQL patches in numerical order: 1.sql, 2.sql, and so on. Then in my database, I have a table that simply records the last patch that was run. So I’d need to do a SELECT statement, read the last patch that was run, and then run any necessary patches.
How do I issue the SELECT statement to the mysql prompt in my shell script?
Then what would be the normal flow? Close the connection and re-open it, passing a patch file as the input? Or to run all required patches in one connection?
I assume I’ll be checking the last patch file, and looping over any patches in between?
Help here would be greatly appreciated.
Assuming you want to do all the business on the remote side:
ssh $SSH_USER@$SSH_HOST << END_SSH
git pull origin $GIT_BRANCH
mysql --user $MYSQL_USER --password=$MYSQL_PASSWORD --database=$MYSQL_DBNAME << END_SQL
<sql statements go here>
END_SQL
END_SSH
You could get the output from mysql using Perl or similar. This could be used to do your control flow.
Put your mysql commands into a file as you would enter them.
Then run it as: mysql -u <user> -p -h <host> < file.sql
You can also put queries on the mysql command line using '-e'. Put your 'select max(patch) from ...' query there and read the output in your script.
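A minimal sketch of that patch loop, assuming a patches/ directory of numbered files and a schema_version table with a patch column (both of those names are hypothetical):
#!/bin/bash
# read the number of the last applied patch (-N -s: no column headers, silent output)
last=$(mysql -N -s --user=$MYSQL_USER --password=$MYSQL_PASSWORD --database=$MYSQL_DBNAME -e "SELECT MAX(patch) FROM schema_version;")
# apply each later patch in numeric order, recording it as we go
n=$((last + 1))
while [ -f "patches/$n.sql" ]; do
    mysql --user=$MYSQL_USER --password=$MYSQL_PASSWORD --database=$MYSQL_DBNAME < "patches/$n.sql"
    mysql --user=$MYSQL_USER --password=$MYSQL_PASSWORD --database=$MYSQL_DBNAME -e "UPDATE schema_version SET patch = $n;"
    n=$((n + 1))
done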
cat *.sql | mysql --user $MYSQL_USER --password=$MYSQL_PASSWORD --database=$MYSQL_DBNAME

How can I run multiple Stored Procedures Files & Triggers (.sql) From MySQL Workbench

I am trying to run a set of SQL files with stored procedures and triggers in my Windows XAMPP environment. Someone suggested using a batch script, but I do not know how to do this in Windows.
Is it possible to run all these .sql files from within MySQL Workbench? How? If not, can anyone tell me how to run a batch file within windows?
Thank you.
It seems Workbench doesn't support the SOURCE command, so the next best thing (at least in Windows) is to run a batch job. Simply create a new .sql file and add the full path to each .sql file, like so:
Create the batch file:
In Windows, the batch file can be a .sql file containing the SQL command SOURCE, which calls the other .sql files, like so:
Create run.sql:
SOURCE C:\xampp\htdocs\mysite\sql\procs\sp_article_delete.sql
SOURCE C:\xampp\htdocs\mysite\sql\procs\sp_article_insert.sql
SOURCE C:\xampp\htdocs\mysite\sql\procs\sp_article_load.sql
Open Command Line and CD to MySQL Folder
Open the command line and cd to the MySQL folder. If you are using XAMPP, the command/location should be something like:
cd C:\xampp\mysql\bin\
Execute the batch file
Last, simply load mysql and run the batch file using the following command:
mysql -u root -h 127.0.0.1 my_database_name -vvv < C:\xampp\htdocs\mysite\sql\procs\run.sql
The execution above means the following:
mysql -u <username> -h <host> <database> -vvv < <batch_path_file_name>
-vvv shows all the queries being executed and the rows affected for debugging.
That's it. All .sql files mentioned in the run.sql file will be executed.