PS script to restore latest .sql backup file into database - mysql

The scenario is that we have a bunch of .sql dump files, and new ones are created before every deployment.
If anything goes wrong with the migration scripts, someone has to manually drop/create the schema and use the command line to restore the dump from the latest backup.
I am writing a PS script to automate this process:
1. find the latest dump in a given path (a sketch of this step follows the list)
2. drop the schema
3. create the schema
4. restore the dump
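For reference, step 1 can be as simple as the following (a minimal sketch; $backupDir is a hypothetical variable holding the dump folder):
# newest .sql file in the backup folder
$path = (Get-ChildItem -Path $backupDir -Filter *.sql |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1).FullName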
I have accomplished the first three steps but have wasted a lot of time on the fourth:
Write-Host "Restoring: " $path
$command = '"mysql.exe -uUsername -pPassword ' + $dbname + ' < ' + $path + '"'
Write-Host $command
cmd /C $command
It says "The system cannot find the file specified."
If I use cmd $command without /C, it starts cmd inside PowerShell but doesn't execute $command.
I have tried different variations of executing the command through cmd, but none of them seem to work, and the reason I have to use cmd is that PowerShell doesn't play well with '<'.
I tried Invoke-Item and Invoke-Expression but can't guarantee I used the correct syntax.
Any suggestions would be greatly appreciated.
Thanks
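One workaround that avoids '<' entirely is to let the mysql client read the dump itself via its source command. A minimal sketch, assuming mysql.exe is on the PATH and reusing $dbname and $path from above:
# mysql reads the file itself, so nothing has to be redirected through PowerShell or cmd
& mysql.exe -uUsername -pPassword $dbname -e "source $path"
The source command takes the rest of the line as the file name, so the dump never passes through PowerShell's pipeline.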

Related

sqlcmd runs at command prompt, not as .bat

I found a few threads related to this issue, but can't seem to find an answer that works.
I am using sqlcmd to create a .csv from a SQL table. I would like this to run as a scheduled task, and have had success in the past using sqlcmd code in a .bat file.
My syntax:
sqlcmd -S SERVERNAME -d DATABASE -U sa -P PASSWORD -q "select * from TABLE" -o "C:\Export\sqlexport.csv" -s","
When I paste this code into the Command Prompt, it runs perfectly.
I then copy the code and save it with Notepad as a .bat file. When I run the batch, either from the Command Prompt or by double-click, it errors:
"Login failed for user 'sa'.
I am not understanding the reason for the different outcomes, nor have a found a solution after many hours of searching. I'm sure its simple, but I'm new to this sqlcmd process. I appreciate any advice!

What is wrong with this bash script (cron + mysql)

I'm using a bash script (sync.sh), run by cron, that is supposed to sync a file to a MySQL database. It works by copying a file from the location it is automatically uploaded to, parsing it by calling an SQL script (which in turn calls other scripts stored inside MySQL), and at the end emailing a report text file as an attachment.
But it seems something is not working, as nothing happens to the MySQL database. All the other commands are executed (the first and last lines: copying the initial file and sending the e-mail).
The MySQL command works perfectly when run separately.
The server is Ubuntu 16.04.
The cron job runs as the root user and the script is part of root's crontab.
Here is the script:
#!/bin/bash
cp -u /home/admin/web/mydomain.com/public_html/dailyxchng/warehouse.txt /var/lib/mysql-files
mysql_pwd=syncit4321
cd /home/admin/web/mydomain.com/sync
mysql -u sync -p$mysql_pwd --database=database_name -e "call sp_sync_report();" > results.txt
echo "<h2>Report date $(date '+%d/%m/%Y %H:%M:%S')</h2><br/><br/> <strong>results.txt</strong> is an attached file which contains sync report." | mutt -e "set content_type=text/html" -s "Report date $(date '+%d/%m/%Y %H:%M:%S')" -a results.txt -- recipient#mydomain.com
Cron executes the script with a very stripped-down environment, so you probably want to use the full path to the mysql command in the script.
You can find the full path by running
which mysql
at the prompt,
or you can add an expanded PATH to the cron invocation:
1 2 * * * PATH=/usr/local/bin:$PATH scriptname

Executing Mass MySQL database restore from batch file using PowerShell

So what I have is about 80-90 MySQL DBs to restore, and using < directly through PowerShell gives the 'reserved for future use' error. The next step I tried was plain & cmd /c, which also produced the same error.
I tried the code below, but I get either Access Denied or 'the system cannot find the file specified', and I'm trying to figure out where it's messing up. Any help is definitely appreciated.
PowerShell
$list= gci "C:\Scripts\Migration\mysql\" -name
$mysqlbinpath="C:\Program Files\MySQL\MySQL Server 5.6\bin\mysql"
$mysqluser="admin"
$mysqlpass=getpass-mysql
pushd "C:\Program Files\MySQL\MySQL Server 5.6\bin"
foreach($file in $list){
    $dbname=$file.Substring(0, $file.LastIndexOf('.'))
    $location='C:\Scripts\Migration\mysql\'+$file
    & $bt $mysqluser, $mysqlpass, $dbname, $location
    write-host "$file restored"
}
The batch file called by $bt
Batch
"cmd /c "mysql" -u %1 -p%2 --binary-mode=1 %3 < %4"
getpass-mysql is a function I have that pulls the MySQL root pass (our root user is 'admin') as a string.
I hope this isn't a duplicate; I checked other posts first and tried those solutions (perhaps applied incorrectly), but they didn't seem to help.
I have updated the code block with the changes I have made. I am currently receiving the error about using the right syntax near ' ■-'.
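For what it's worth, the loop is often written without the intermediate batch file, handing one whole command string to cmd.exe so that '<' is interpreted by cmd rather than PowerShell. A sketch under a couple of assumptions (paths taken from the question, dump paths containing no spaces, dumps saved as plain ASCII/UTF-8 — a syntax error near ' ■-' is often a sign a dump was written as UTF-16, which mysql will not parse):
pushd "C:\Program Files\MySQL\MySQL Server 5.6\bin"
foreach ($file in $list) {
    $dbname   = [IO.Path]::GetFileNameWithoutExtension($file)
    $location = "C:\Scripts\Migration\mysql\$file"
    # one plain string for cmd.exe, so the redirection happens inside cmd
    cmd /c "mysql -u $mysqluser -p$mysqlpass --binary-mode=1 $dbname < $location"
    write-host "$file restored"
}
popd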

MySQL compressed backup creating a zero-byte empty file

$command = 'mysqldump.exe -u '.$username.' '.$database . ' > '.$location.'\\'.$filename;
system($command);
This creates the .sql file correctly.
Now I'm trying to create a compressed backup.
$filename = "xyz.sql.gz";
$command = 'mysqldump.exe -u '.$username.' '.$database . ' |gzip > '.$location.'\\'.$filename;
system($command);
This creates a .gz file of zero bytes.
Please help; where am I going wrong?
I did not include -p[password] because the user I'm using has no password.
Thanks in advance.
Are you missing the gzip application on Windows? Or is it on the PATH, or in the same directory as the script?

Powershell pipe file contents into application without loading file in memory

With cmd I'd run mysql -uroot database < filename.sql to import a database dump (read from file and pass to MySQL). However, < is "reserved" in PowerShell.
Instead, in PowerShell I use get-content filename.sql | mysql -uroot database. The caveat is that PowerShell reads filename.sql completely into memory before passing it along to MySQL, and with large database dumps it simply runs out of memory.
Obviously, I could execute this via cmd, but I have a handful of PowerShell scripts automating various tasks like this and I don't want to have to rewrite them all as batch files. In this particular case, filename.sql is a variable that's specified via PS parameters when the automation kicks off.
So how do I get around this memory limitation? Is there another way to pipe the file contents into MySQL directly?
You can try:
mysql -uroot -pYourPassword -e "source C:\temp\filename.SQL"
or
mysql --user=root --password=YourPassword --execute="source C:\temp\filename.SQL"
If things start to get complicated, maybe you should write a C# console application that does the complex tasks.
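Applied to the question's setup, where the dump path arrives as a PowerShell variable, that might look like this (a sketch; $filename stands for however the dump path parameter is actually named in the script):
& mysql -uroot -e "source $filename" database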
Not sure if this will work for your application or not (it should process the file in chunks of 1000 records at a time, rather than all at once):
get-content filename.sql -readcount 1000 |% {$_ | mysql -uroot database}
I'd say stay away from the cmdlets for large files. I've been doing something similar with files that are 30+ million lines long and have not had an issue by using the code below. It performs extremely well, both speed-wise and memory-wise.
$reader = [IO.File]::OpenText($filetoread)
while ($reader.Peek() -ge 0) {
    $line = $reader.ReadLine()
    # do your thing here
}
$reader.Close()
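Building on that reader loop, the same pattern can stream the dump straight into mysql's standard input without ever buffering the whole file. A sketch, assuming mysql.exe is on the PATH and $filename holds the dump path:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = 'mysql.exe'
$psi.Arguments = '-uroot database'
$psi.RedirectStandardInput = $true
$psi.UseShellExecute = $false
$proc = [System.Diagnostics.Process]::Start($psi)
$reader = [IO.File]::OpenText($filename)
while ($reader.Peek() -ge 0) {
    # forward one line at a time, so only a single line is held in memory
    $proc.StandardInput.WriteLine($reader.ReadLine())
}
$reader.Close()
$proc.StandardInput.Close()
$proc.WaitForExit()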
Windows cmd (this example is for PostgreSQL; adapt it for your own database):
psql -h 127.0.0.1 -p 5432 -f database.sql -U .... ....