Iterating over elements of an environment variable in a PuTTY script file - mysql

I created a .bat file that loads a PuTTY session and passes another .bat file with -m to execute all the necessary commands. In the second .bat file I have some repeated mysql commands to execute, so I created a FOR loop and wanted to pass the list of DB names to it as a parameter (given along with the file name to the first .bat file). I am not able to pass the list of DB names along with the file name using the % symbol.
What is the solution for this?
SET file=%1
SET DB=%2
Putty.exe -load xyz -m %file% %DB% -t
This is my first .bat file; I want to pass the DB names as a list along with the file. Can someone please help me achieve this?
I have some MySQL commands that are repeated for different DBs. Can I achieve that using a FOR loop inside a .bat file?
FOR %%q IN (%DB%) DO mysql -h xyzdb -e 'drop database %%q;'
This gives a syntax error; can someone suggest where I went wrong?

The script file that you pass to PuTTY is not a batch file. It's a list of commands for the remote shell. So:
You have to use remote shell constructs and commands, not batch file commands.
You cannot refer to local environment variables.
But you can generate the script file from the batch file:
(FOR %%q IN (%DB%) DO @echo mysql -h xyzdb -e 'drop database %%q;') > script.txt
Putty.exe -load xyz -m script.txt
Also, for automation, you should use plink, not putty.
I'm not sure what you wanted to achieve with %DB% on the PuTTY command line.
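For comparison, the same generation step can be sketched in POSIX shell: expand a space-separated list of DB names into a script file of mysql commands (the DB names here are placeholders; `xyzdb` is the host from the question).

```shell
#!/bin/sh
# Space-separated DB list, as it would arrive in the second parameter
# of the batch file. The names are placeholders for illustration.
DB="db1 db2 db3"

: > script.txt     # truncate any script left over from a previous run
for q in $DB; do
  # One drop command per database, mirroring the batch FOR loop.
  echo "mysql -h xyzdb -e 'drop database $q;'" >> script.txt
done

cat script.txt
```

The single redirection around the batch FOR loop above serves the same purpose as truncating the file before the shell loop: without it, each iteration would overwrite the previous line and only the last database would survive.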

Related

Crontab command not running

I have a database backup command that takes a MySQL dump and then uploads that dump file to AWS S3. When I run the command as a normal user it works perfectly, but when I use the same command in a cron job it fails.
I have checked the syslog and there is no error message indicating a problem with the job; there is only a line saying the job ran, and then it goes on to the next cron job.
The command is as follows, I have removed the sensitive parts:
mysqldump -u {{ db_user }} -p{{ db_password }} {{ db_name }} > /home/db_backup.sql | aws s3 cp /home/db_backup.sql s3://{{ s3_url }}/$(date --iso-8601=seconds)_db.sql --profile backupprofile
When this command is run by a normal user, there is a warning output not to use the MySQL password on the command line, but this is essential for the command to work without interaction. There is also a second line of output from S3 saying the upload worked. Could these outputs be affecting the cron job in some way?
You will need to use full paths in your cron jobs; I see you left them out on mysqldump and also on aws. I would run whereis mysqldump and whereis aws to find which full paths you need.
Try checking the environment variables; cron passes a minimal set of environment variables to your jobs. You can set the PATH easily inside the crontab:
PATH=/usr/local/bin:/usr/sbin
Also, many cron implementations execute commands using sh, and you might be using another shell in your script. You can tell cron to run all commands in bash by setting the shell at the top of your crontab:
SHELL=/bin/bash
Cron tries to interpret the % symbol (an unescaped % is turned into a newline), so you need to escape it as \% if it appears anywhere in your command.
If the first output after running your command is interactive, that is, if it asks you to hit Enter or something like that, then this is the issue; otherwise there shouldn't be any problem with this.
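Putting the advice above together, a crontab might look like the sketch below. The binary paths, credentials, and bucket name are assumptions; verify the paths with `which mysqldump` and `which aws`. Note the `&&` so the upload only starts after the dump has finished writing (the original `>` followed by `|` runs both sides of the pipe concurrently), and that `--iso-8601=seconds` happens to contain no `%`; a format like `date +%F` would have to be written `date +\%F` inside crontab.

```shell
# Crontab sketch (crontab -e); paths and credentials are placeholders.
SHELL=/bin/bash
PATH=/usr/local/bin:/usr/bin:/bin

# Every day at 02:00: dump, then upload only if the dump succeeded.
0 2 * * * /usr/bin/mysqldump -u dbuser -pdbpass dbname > /home/db_backup.sql && /usr/local/bin/aws s3 cp /home/db_backup.sql s3://mybucket/$(date --iso-8601=seconds)_db.sql --profile backupprofile
```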

execute .sql file in bash with params

I want to execute a .sql file from a bash script, but I need to pass parameters to the .sql file from bash. I think there's a simple solution, but I can't figure it out.
This is what I have so far:
.SQL File
SET @columnValue = &1;
UPDATE tblTest SET Description = @columnValue;
Bash File
#!/bin/bash
columnValue=$1
mysql -uroot -ppassword testDB < SQLscript.sql $columnValue
Shell
sh myBashFile.sh "testColumnValue"
Can anyone tell me what I am doing wrong?
You need a combination of setting SQL parameters first and then loading your .sql file.
In your bash script you can use mysql with the -e option to execute an SQL statement. The same invocation must then be used to tell mysql to load the file, instead of reading it via bash redirection.
See this example script.
If you only want to execute a single statement, see this question.
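A common shape for this (a sketch, not necessarily what the linked example does): have bash prepend a SET statement with -e and then SOURCE the file in the same session, with the .sql file using @columnValue instead of the &1 placeholder. The mysql call is commented out here because the credentials are the placeholders from the question.

```shell
#!/bin/bash
# Usage: ./myBashFile.sh "testColumnValue"
columnValue=${1:-testColumnValue}

# Build one statement string: set the user variable, then read the .sql
# file, which would contain: UPDATE tblTest SET Description = @columnValue;
stmt="SET @columnValue = '${columnValue}'; SOURCE SQLscript.sql;"

echo "$stmt"
# mysql -uroot -ppassword testDB -e "$stmt"   # actual call, credentials assumed
```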

How do i import imdb list files into mysql database on windows?

I am using MySQL and imdbpy (to import the data). I have never used imdbpy before and have no idea how the Python script works. Here is the list of text files, a few of which I am going to import into my database:
ftp://ftp.fu-berlin.de/pub/misc/movies/database/
Here is the link to the imdbpy guide that I am trying to follow:
http://imdbpy.sourceforge.net/docs/README.sqldb.txt
I don't quite understand this part:
Create a database named "imdb" (or whatever you like),
using the tool provided by your database; as an example, for MySQL
you will use the 'mysqladmin' command:
# mysqladmin -p create imdb
For PostgreSQL, you have to use the "createdb" command:
# createdb -W imdb
To create the tables and to populate the database, you must run
the imdbpy2sql.py script:
# imdbpy2sql.py -d /dir/with/plainTextDataFiles/ -u 'URI'
Where the 'URI' argument is a string representing the connection
to your database, with the schema:
scheme://[user[:password]@]host[:port]/database[?parameters]
How do I run that imdbpy2sql.py script?
Assuming you have Python installed, you would run "imdbpy2sql.py -d /dir/with/plainTextDataFiles/ -u 'URI'" from the command prompt.
Note: the command prompt will need to be 'sitting' in the directory where imdbpy2sql.py exists, or you will need to prefix imdbpy2sql.py with the full directory path.
The script imdbpy2sql.py can be found on their site: https://bitbucket.org/alberanid/imdbpy/src/ed8b3d354c9f1e2de7056d78b21e49f64ee52591/bin/imdbpy2sql.py?at=default
Since the script doesn't seem to depend on any other files, it should be sufficient to copy the code into your working directory.
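Concretely, the URI the script expects can be assembled like this (all connection details here are placeholders; the final invocation is commented out since it needs the plain-text data files in place):

```shell
#!/bin/sh
# Placeholder connection details for the 'imdb' database.
user=myuser
password=mypass
host=localhost
db=imdb

# Matches the schema: scheme://user:password@host/database
URI="mysql://${user}:${password}@${host}/${db}"
echo "$URI"
# python imdbpy2sql.py -d /dir/with/plainTextDataFiles/ -u "$URI"   # actual run
```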

Executing MySQL commands in shell script?

I’m looking to create a deploy script that I can run from a terminal and it automatically deploys my site from a repository. The steps I’ve identified are:
Connect to remote server via SSH
Fetch latest version of site from remote repository
Run any SQL patches
Clean up and exit
I’ve placed the SSH connection and git pull commands in my shell file, but what I’m stuck with is MySQL with it being an (interactive?) shell itself. So in my file I have:
#!/bin/bash
# connect to remote server via SSH
ssh $SSH_USER@$SSH_HOST
# update code via Git
git pull origin $GIT_BRANCH
# connect to the database
mysql --user $MYSQL_USER --password=$MYSQL_PASSWORD --database=$MYSQL_DBNAME
# run any database patches
# disconnect from the database
# TODO
exit 0
As you can see, I’m connecting to the database, but not sure how to then execute any MySQL statements.
At the moment, I have a directory containing SQL patches in numerical order: 1.sql, 2.sql, and so on. Then in my database, I have a table that simply records the last patch that was run. So I'd need to do a SELECT statement, read the last patch that ran, and then run any necessary patches.
How do I issue the SELECT statement to the mysql prompt in my shell script?
Then what would be the normal flow? Close the connection and re-open it, passing a patch file as the input? Or to run all required patches in one connection?
I assume I’ll be checking the last patch file, and doing a do loop for any patches in between?
Help here would be greatly appreciated.
Assuming you want to do all the business on the remote side:
ssh $SSH_USER@$SSH_HOST << END_SSH
git pull origin $GIT_BRANCH
mysql --user $MYSQL_USER --password=$MYSQL_PASSWORD --database=$MYSQL_DBNAME << END_SQL
<sql statements go here>
END_SQL
END_SSH
You could get the output from mysql using Perl or similar. This could be used to do your control flow.
Put your mysql commands into a file as you would enter them.
Then run it as: mysql -u <user> -p -h <host> < file.sqlcommands
You can also put queries on the mysql command line using -e. Put your 'select max(patch) from .' query there and read the output in your script.
cat *.sql | mysql --user $MYSQL_USER --password=$MYSQL_PASSWORD --database=$MYSQL_DBNAME
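For the patch bookkeeping, one possible sketch (the patches table and its column are assumptions from the question): read the last applied number with mysql -N -e, then apply every numbered file above it in order. The selection logic is shown runnable below, with the mysql calls left as comments.

```shell
#!/bin/bash
# Print the patch numbers greater than $1, in ascending order.
pick_patches() {
  local last=$1; shift
  for n in "$@"; do
    if [ "$n" -gt "$last" ]; then
      echo "$n"
    fi
  done | sort -n
}

# In the real script, 'last' would come from something like:
#   last=$(mysql -N --user="$MYSQL_USER" --password="$MYSQL_PASSWORD" \
#          --database="$MYSQL_DBNAME" -e "SELECT MAX(patch) FROM patches")
last=3
todo=$(pick_patches "$last" 1 2 3 4 5)
echo "$todo"
# for n in $todo; do
#   mysql ... < "$n.sql"                         # apply the patch
#   mysql ... -e "UPDATE patches SET patch=$n"   # record it (table assumed)
# done
```

Running all required patches in one script like this avoids reopening the connection per statement; whether each patch file gets its own mysql invocation or they are concatenated into one session is a judgment call.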

How can I run multiple Stored Procedures Files & Triggers (.sql) From MySQL Workbench

I am trying to run a set of SQL files with stored procedures and triggers in my Windows XAMPP environment. Someone suggested using a batch script, but I do not know how to do this on Windows.
Is it possible to run all these .sql files from within MySQL Workbench? How? If not, can anyone tell me how to run a batch file within windows?
Thank you.
It seems Workbench doesn't support the SOURCE command, so the next best thing (at least on Windows) is to run a batch job. Simply create a new .sql file and add the full path to each .sql file, like so:
Create the batch file:
On Windows, the batch file can be a .sql file using the SQL command SOURCE, which calls the other .sql files, like so:
create run.sql
SOURCE C:\xampp\htdocs\mysite\sql\procs\sp_article_delete.sql
SOURCE C:\xampp\htdocs\mysite\sql\procs\sp_article_insert.sql
SOURCE C:\xampp\htdocs\mysite\sql\procs\sp_article_load.sql
Open Command Line and CD to MySQL Folder
Open the command line and cd to the MySQL folder. If you are using XAMPP, the command/location should be something like:
cd C:\xampp\mysql\bin\
Execute the Batch File by pressing ENTER
Last, simply load mysql and run the batch file using the following command:
mysql -u root -h 127.0.0.1 my_database_name -vvv < C:\xampp\htdocs\mysite\sql\procs\run.sql
The execution above means the following:
mysql -u <username> -h <host> <database> -vvv < <batch_path_file_name>
-vvv shows all the queries being executed and the rows affected for debugging.
That's it. All .sql files mentioned in the run.sql file will be executed.
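If there are many procedure files, run.sql itself can be generated rather than written by hand. A shell sketch of the idea (on Windows this would be a batch FOR loop; the procs directory and file names below are stand-ins created just for the demonstration):

```shell
#!/bin/sh
# Create stand-in .sql files to demonstrate; real ones would already exist.
mkdir -p procs
printf '' > procs/sp_article_delete.sql
printf '' > procs/sp_article_insert.sql

# Emit one SOURCE line per file into run.sql.
for f in procs/*.sql; do
  echo "SOURCE $f"
done > run.sql

cat run.sql
# mysql -u root -h 127.0.0.1 my_database_name -vvv < run.sql   # actual execution
```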