How to queue a shell command if [condition] is not fulfilled? - mysql

I am uploading data to a MySQL database from a shell script (from cron) every 5 minutes.
But if my connection is down, the data does not get inserted into my database.
I would like the script to keep retrying the insert (for example every 30 minutes) until it succeeds.
And if my connection is down for more than 5 minutes, I would like these requests to stand in a queue and proceed once I have a connection again.
How can I do that?
Example code:
#!/bin/bash
cputemp=$(somecommands...)
/usr/bin/mysql -h 127.0.0.1 -u admin -pwpassword -e "USE logs; INSERT INTO cpu (cputemp) VALUES ('$cputemp');"

You'd do it by writing that logic into your code. It should be fairly easy to detect that the insert failed, have the script sleep, then try again.
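A minimal sketch of that logic, reusing the credentials and INSERT from the question: failed statements are spooled to a queue file, and each run replays the queue before attempting the new insert. The queue path and the function names are assumptions, not a tested setup.

```shell
#!/bin/bash
# Sketch: spool failed INSERTs and replay them once the connection is back.
# QUEUE location is an assumption; credentials come from the question.
QUEUE=/var/tmp/cpu_insert_queue.sql

run_sql() {   # wrapper around the mysql client so it can be stubbed/changed
    /usr/bin/mysql -h 127.0.0.1 -u admin -pwpassword logs -e "$1"
}

enqueue() {
    echo "$1" >> "$QUEUE"
}

flush_queue() {
    [ -s "$QUEUE" ] || return 0            # nothing spooled
    while IFS= read -r stmt; do
        run_sql "$stmt" || return 1        # connection still down, keep queue
    done < "$QUEUE"
    : > "$QUEUE"                           # everything replayed, clear the spool
}

main() {
    cputemp=$(somecommands...)             # placeholder from the question
    stmt="INSERT INTO cpu (cputemp) VALUES ('$cputemp');"
    if flush_queue; then
        run_sql "$stmt" || enqueue "$stmt"
    else
        enqueue "$stmt"
    fi
}
```

One caveat of this sketch: if the connection drops mid-replay, statements already replayed stay in the queue and may run twice; deduplicating that is left out for brevity.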

Related

batch on windows not finishing mysql job

I have a .sql file with 25 SQL statements that runs fine when all statements are copy-pasted into the mysql client. It takes about 20 minutes to finish and no warnings are displayed. However, if I call this file through a batch file on Windows (I would like to add this to Task Scheduler), the job does not get finished. I can see which statement is the problem, but no warnings get displayed. I can only see (in SHOW PROCESSLIST) that the mysql process ends and no subsequent commands from the file get executed; this happens somewhere in the middle of my initial SQL file. Again, the statements execute fine when copy-pasted into the mysql console. If anyone has any idea, I would appreciate it a lot.
c:
cd\program files\mysql\mysql server 5.7\bin
mysql --show-warnings -h12.24.56.78 -u user -ppass --port=3306 --default-character-set=utf8 < "C:\SVN\dis\777e\777a_777e\777a_ua_into_777e.sql" >> "C:\Users\user1\db_transfers\777a_ua\777a_ua_into_777e_warnings.log" 2>&1

Get MySQL processlist log every 5 seconds

How can I write a cron job that runs the MySQL "show processlist" command and stores it in a log file every 5 seconds between 5 am and 7 am?
I know the lowest possible resolution in cron is a minute, not a second. If a script is needed, I am looking for a solution in Bash.
I think this cron job works every 5 minutes between 5 am and 7 am.
*/5 5-7 * * * mysql -ufoo --password='' -te "show full processlist" > /home/foo/log/show_processlist.log.`date +\%Y\%m\%d-\%H\%M` 2>&1
You can use mysqladmin, the MySQL CLI command for administering the database:
mysqladmin -u root -p -i 5 processlist
Press CTRL+C to stop the running command at any time, and use "--verbose" to see the full queries.
Set a cron task to run at 05:00 which executes a script that loops until the end time and sleeps for 5 seconds between iterations. It may not be exactly 5 seconds, since sleep is usually the minimum time slept, but it should be close enough.
You can write a shell/Python/PHP/something-else script which should be run by a minutely cron job.
The script should have the following logic (12 iterations at 5-second intervals cover the minute; credentials and log path borrowed from the cron line above):
#!/bin/bash
i=0
while [ "$i" -lt 12 ]; do
    mysql -ufoo -te "show full processlist" >> /home/foo/log/show_processlist.log
    i=$((i+1))
    sleep 5
done
Want a bash one-liner that runs every second to see the FULL queries?
while true; do mysql -u MYSQLUSERNAME -p'MYSQLUSERPASSWORD' --execute='SHOW FULL PROCESSLIST;'; sleep 1; done
Assuming you're on a safe environment to enter raw MYSQL password ;).

crontab behaviour difference for mysql

I did try to search, but nothing came up that really works for me.
So I am starting this thread to see if anyone can help. I hope this is not a stupid question where I am overlooking something simple.
I have a Mac mini running a MySQL server.
There are some day-end jobs, so I put them into a script triggered by a crontab entry (I also tried launchd, as this is Mac OS X, but got the same behavior).
crontab looks like this
15 00 * * * /Users/fgs/Documents/database/process_db.sh > /Users/fgs/Documents/database/output.txt 2>&1
the script looks like this
#!/bin/bash
#some data-patching tasks before everything starts
#This sql takes 3 sec
/usr/local/bin/mysql dbname -u root "-ppassword" < /Users/fgs/Documents/database/loadrawdata.sql
#This sql takes 90 sec
/usr/local/bin/mysql dbname -u root "-ppassword" < /Users/fgs/Documents/database/LongLongsql.sql
#This sql takes 1 sec
/usr/local/bin/mysql dbname -u root "-ppassword" < /Users/fgs/Documents/database/anothersql.sql
Behavior:
A. When I execute the shell script directly in the terminal, all 3 SQLs work
B. When I execute it with crontab, the 90-sec SQL doesn't work (it is an INSERT INTO with a very big join, so there is no output printed; I also tried > output file and adding 2>&1, still no output), but the SQL before and after it works as expected
C. To simulate crontab behavior, I tried to use
env - /bin/sh
and then start the shell script manually.
It appears that the 90-sec LongLongsql.sql ran for only 5 sec and skipped to the next line. No error message was displayed.
I am wondering if there is any kind of timeout for crontab? (I searched but found nothing.)
I checked that ulimit is unlimited (checked within "env - /bin/sh", and also tried putting it into the script).
I believe it is not related to the mysql command, since the same scripts work fine when run manually (I also searched this topic and found nothing interesting).
Just wondering if anyone can shed some light on this; a direction or whatever will help.
Thanks everyone in advance.
Don't forget that cron starts an isolated shell with a minimal environment, where it may not be able to read the file.
I would recommend to put your mysql-stuff inside a script. If you are able to execute the script, cron should also be able to do so.
#!/bin/bash
/usr/local/bin/mysql dbname -u root "-ppassword" < /Users/fgs/Documents/database/LongLongsql.sql
Or:
#!/bin/bash
/usr/local/bin/mysql --user=root --password=xxxxxx -e "source /Users/fgs/Documents/database/LongLongsql.sql"
Then call the script from crontab...
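To see which step dies under cron, it can also help to run each command through a wrapper that records its exit status. This is a hypothetical debugging sketch, not part of the original answer; the log path and wrapper name are assumptions.

```shell
#!/bin/bash
# Sketch: log each step's exit status so a silent failure under cron
# leaves a trace. LOGFILE is an assumption.
LOGFILE=/Users/fgs/Documents/database/cron_steps.log

run_step() {
    "$@"                                    # run the command as given
    rc=$?
    echo "$(date) | $* | exit $rc" >> "$LOGFILE"
    return $rc
}

# usage, mirroring the original script:
# run_step /usr/local/bin/mysql dbname -u root "-ppassword" < /Users/fgs/Documents/database/LongLongsql.sql
```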

MySQL import freezes

I use a MySQL import script like this:
sudo mysql -u root -p < /var/tmp/db.sql
I can see that my data is being imported, but the console is frozen.
root#****:/var/tmp# sudo mysql -u root -p < /var/tmp/db.sql
Enter password:
I have to press CTRL+C to get the console back. At first I thought that the import just needed that much time, but I can wait for hours and the console doesn't come back.
I am on Ubuntu and the MySQL file is around 1 GB.
Do you have any idea why the script freezes the console?
Thanks for helping
Importing 1 GB took hours? That's not normal. You need to find out where that process is spending its time.
Try this:
$ ps -ef|grep [m]ysql
Identify the process id then,
$ strace -cp <pid>
Leave it for 10 seconds or a minute, then press ^C. That will tell you where the process is spending its time, e.g. it could just be waiting for the disk if you see read and write dominate.
I think I found the answer on my own.
Setting this in /etc/mysql/my.cnf
[mysqld]
init_connect='SET autocommit=0'
Now the import takes about 2 minutes.
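The same effect can be had per-import instead of globally in my.cnf: wrap the dump in a single transaction so each row is not committed individually. A sketch, where the helper function name is made up:

```shell
#!/bin/bash
# Sketch: stream the dump wrapped in one transaction instead of setting
# init_connect globally in my.cnf. with_single_txn is a made-up helper.
with_single_txn() {
    echo "SET autocommit=0;"   # stop committing after every row
    cat "$1"                   # the dump itself, unchanged
    echo "COMMIT;"             # one commit at the very end
}

# usage:
# with_single_txn /var/tmp/db.sql | sudo mysql -u root -p
```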

How to monitor and automatically restart mysql?

I'm running mysql on Debian.
Is there a way to monitor mysql and restart it automatically if it locks up? For example, sometimes the server starts to take 100% of the CPU and runs very slowly. If I restart mysql, things clear up and the server works fine again, but I'm not always around to restart it manually.
Is there a way to monitor mysql so that if the CPU is above 95% for more than 10 minutes straight, mysql will automatically be restarted?
You can write a cronjob that uses
show processlist;
show processlist returns the columns Time and Id;
you can add more logic to the check,
for example whether a query has been stuck for more than 600 seconds and is a SELECT;
you can then use the Id value to perform kill $id;
This is safer than blindly restarting your server.
And if you segregate reads and writes (meaning read-only SQL uses a user with read privileges only), this can be even simpler.
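That check can be sketched in bash. This is a hypothetical sketch: the column names follow information_schema.PROCESSLIST, the 600-second threshold comes from the answer, and credentials are omitted.

```shell
#!/bin/bash
# Sketch: kill SELECT queries that have been running longer than 600 s.
THRESHOLD=600

list_stuck_selects() {   # prints one Id per line; split out so it can be stubbed
    mysql -N -B -e "SELECT id FROM information_schema.processlist
                    WHERE command = 'Query'
                      AND time > $THRESHOLD
                      AND info LIKE 'SELECT%';"
}

kill_stuck_selects() {   # reads Ids on stdin, kills each one
    while IFS= read -r id; do
        [ -n "$id" ] && mysql -e "KILL $id;"
    done
}

# usage (e.g. from a minutely cronjob):
# list_stuck_selects | kill_stuck_selects
```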
Use this bash script to check every minute.
#!/bin/bash
# Check whether MySQL is alive or not
if mysqladmin ping | grep "alive"; then
    echo "MySQL is up"
else
    sudo service mysql restart
fi