Execute mysql dump using PowerShell - mysql

I am setting up a new Azure MySQL server with a database. I want to import a .sql dump file into this database using PowerShell.
Executing the code below throws an exception on '<', since it is reserved for future use. As a workaround I tried putting the entire statement in quotes, but I'm still facing the same issue.
$path = "C:\tools\mysql\current\bin\mysql.exe"
&"$path" -h $servername -u $username -p $databasename < filename.sql
The expected result is to import filename.sql into the MySQL server database.

Try something like this and see if it helps:
Use Get-Content to read the file and pipe (|) it to your command. Use the call operator & to run the command.
get-content 'c:\folder\backup.sql' | &"C:\Program Files\MySQL\MySQL Server 5.7\bin\mysql.exe" -u user --password=pass dbname
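Applied to the variables from the question, the same pattern might look like this (a minimal sketch; it assumes $path, $servername, $username and $databasename are already defined as in the original post):
Get-Content 'filename.sql' | & $path -h $servername -u $username -p $databasename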

Related

Import SQL dumps through bash script

I'm trying to import gzipped MySQL databases listed in a folder.
The gzipped files are located at .mysqldumps/.
$NAME tries to extract the database name (as files are always named database_name.sql.gz) and pass it to the mysql command line.
Also, as the username and database name are the same, the same argument is passed ($NAME).
As the files are gzipped, we zcat them (so gunzip -c) before piping them to mysql.
The full script is:
#!/bin/bash
FILES='.mysqldumps/*'
PASSWORD='MyPassword'
for f in $FILES
do
    NAME=dbprefix_`basename $f .sql.gz`
    echo "Processing $f"
    set -x
    zcat $f | mysql -u "$NAME" -p$PASSWORD "$NAME"
done
But when I run the script it outputs:
./.mysqlimport
Processing .mysqldumps/first_database.sql.gz
+ mysql -u dbprefix_first_database -pMyPassword dbprefix_first_database
+ zcat .mysqldumps/first_database.sql.gz
ERROR 1044 (42000) at line 22: Access denied for user 'dbprefix_first_database'@'localhost' to database 'first_database'
As you can see, the selected database is 'first_database' instead of 'dbprefix_first_database', and this of course throws an error; I just can't understand why $NAME is not correctly parsed as the database name.
What am I doing wrong?
After some investigation, the problem turned out to come from the dump and not from the script.
The dumps had been created with mysqldump's --databases option, which adds a USE 'dbname'; statement to the dump; when importing, that embedded name was used instead of $NAME.
Problem solved!
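For illustration, a hedged sketch (not from the original post): recreating such a dump without --databases leaves out the USE statement, so the import target is controlled entirely by the database argument passed to mysql:
mysqldump -u dbprefix_first_database -p first_database | gzip > .mysqldumps/first_database.sql.gz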

SQL syntax error near gunzip when restoring a database using .sql.gz file

I am trying to restore a MySQL db using a .sql.gz file. I am using the MySQL console to run a command because the file size is too large for phpMyAdmin. The command I am using is
gunzip C:/Vik/Gya/Source/beed_2013-04-06.sql.gz | mysql -u root -p bd
where root is the user id. There is no password for root. bd is the database into which I am trying to import. MySQL is running on my local machine (Windows 8). I have a WAMP setup.
This is the error I am getting:
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'gunzip C:/Vikalp/Gyankosh/Source/beedictionary_2013-04-06.sql | mysql -u root -p' at line 1.
You need the -c option (output to stdout):
gunzip -c xxx.sql.gz | mysql -u root -p
While Kisoft's answer is the correct one, I just wanted to point out that you don't need the -c; it works just fine as follows.
This command will unzip the database dump and import it into the database at the same time.
gunzip < output.sql.gz | mysql -u <username> -p<password> <database>
If you type gunzip and you get a SQL syntax error that complains about gunzip, you are already logged into the mysql console. The mysql console is not a general-purpose shell!
You are using Windows and I suspect you haven't installed gzip on your computer (it isn't a built-in utility). It's a classic Unix tool, but you can find binaries for Windows. Install it and run your original command with a couple of tweaks:
Make sure you're at the Windows prompt (C:\>)
Redirect the gunzip result to stdout rather than to a file:
gunzip --stdout C:/Vik/Gya/Source/beed_2013-04-06.sql.gz | mysql -u root -p bd
Alternatively, you can run the dump from within the MySQL prompt (mysql>) if you uncompress it first (you don't need the command-line gzip specifically; most GUI archivers such as 7-Zip support this format):
mysql> \. C:/Vikalp/Gyankosh/Source/beedictionary_2013-04-06.sql
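For reference, \. is the short form of the mysql client's source command, so the equivalent spelled out is:
mysql> source C:/Vikalp/Gyankosh/Source/beedictionary_2013-04-06.sql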
You do not need to gunzip it first, just:
zcat myfile.gz | mysql -uuser -ppassword mydatabase
It is faster this way.
Your answer is already here:
phpMyAdmin: Can't import huge database file, any suggestions?
In the php.ini file, normally located in c:\xampp\php or wherever your WAMP stack keeps it, set:
post_max_size=128M
upload_max_filesize=128M
Changing the values there will get you what you want. Good luck.
Don't forget to restart Apache and MySQL.
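If you're unsure which php.ini your stack actually loads, the PHP CLI can print it (assuming php is on your PATH; note the CLI may load a different php.ini than the Apache module, which phpinfo() reports):
php --ini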
Try the following steps to restore the db using the .gz file:
1. Run the command: gunzip C:/Vik/Gya/Source/beed_2013-04-06.sql.gz
This will uncompress the .gz file and just store beed_2013-04-06.sql in the same location.
2. Type the following command to import the sql data file:
mysql -u username -p bd < C:/Vik/Gya/Source/beed_2013-04-06.sql

mysql database backup with mysqldump

I want to back up my database using mysqldump. This is the code I run in the command prompt from the MySQL bin directory:
mysqldump -u root -pabc Db -r C:\Documents and Settings\All Users\Desktop\ttttt.sql
abc is the password, and I am trying to back up to a .sql file on the desktop. I use MySQL 5.5.
But the following error occurred:
mysqldump: Couldn't find table: "and"
There is no table called 'and' in the database and I didn't create such a table, yet the error complains about an 'and' table. How can I back up the MySQL database without this error?
Try instead:
mysqldump -u root -pabc Db -r "C:\Documents and Settings\All Users\Desktop\ttttt.sql"
Your command shell is breaking the pathname apart into multiple arguments: mysqldump receives C:\Documents, and, Settings\All, etc. separately, and interprets and as a table name, hence the Couldn't find table: "and" error. The quotes tell the shell to pass the whole path as a single argument to the mysqldump program.
I think there is some problem with the syntax of the command you are running. Try something like this:
mysqldump -u root -p dbName > path\nameOfFile.sql
It will automatically ask for your password; you don't need to write it in the command.

using mysqldump to restore database to mysql located on another computer

Any idea how to do this restore? I looked into the help for mysqldump but couldn't see anything about restoring there. If so, can you give me an example?
With mysqldump you generate a script you can use for the restore on a different computer, like this:
$ mysql -u user_name db_name < your_backup.sql
Run it in your favorite shell (Windows command prompt, bash, csh...).
I think you can use CMD to navigate to the mysqldump location and then type this command:
mysqldump database_name -u username > location\to\save\dump.sql
Change database_name to the database you want to back up, username to the username associated with the database, and location\to\save\dump.sql to the location where you want to save the output sql file; for me it was D:\dump.sql.
Then on the other machine you can import the SQL file using phpMyAdmin.
You can just execute the SQL using the mysql command-line client: either redirect the file into it with <, or use the source command (mysql -e "source backup.sql").
It's just plain SQL. Pass the file to mysql (the mysql command line tool) and it will execute it:
mysql < backup.sql
From the shell prompt, using parameters from the mysqldump doc, dump the database using a > redirect to a human-readable .sql file, e.g.:
$ mysqldump --databases src_db > src_db.sql
Transfer the human-readable file to the other machine.
After making sure the destination database has been created, redirect < the .sql file into the destination database:
$ mysql dest_db < src_db.sql
(Note that a dump made with --databases embeds CREATE DATABASE and USE src_db; statements, so it imports into src_db regardless of the dest_db argument; see the --databases discussion above. Drop --databases if you want the target name to be honored.)
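If the two machines can reach each other over the network, a one-step variant (a hedged sketch; dest_host is a placeholder and credentials are omitted for brevity, add -u/-p as in the examples above) pipes the dump straight into the destination server with no intermediate file:
$ mysqldump src_db | mysql -h dest_host dest_db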

PowerShell: pipe file contents into an application without loading the file in memory

With cmd I'd run mysql -uroot database < filename.sql to import a database dump (read from file and pass to MySQL). However, < is "reserved" in PowerShell.
Instead, in PowerShell I use get-content filename.sql | mysql -uroot database. The caveat is that PowerShell reads filename.sql completely into memory before passing it along to MySQL, and with large database dumps it simply runs out of memory.
Obviously, I could execute this via cmd, but I have a handful of PowerShell scripts automating various tasks like this and I don't want to have to rewrite them all in batch. In this particular case, filename.sql is a variable that's specified via PS parameters when the automation kicks off.
So how do I get around this memory limitation? Is there another way to pipe the file contents into MySQL directly?
You can try
mysql -uroot -pYourPassword -e "source C:\temp\filename.SQL"
or
mysql --user=root --password=YourPassword --execute="source C:\temp\filename.SQL"
Since the mysql client opens and reads the file itself, PowerShell never holds its contents in memory.
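Since filename.sql arrives as a PS parameter in your automation, the same approach works with a variable (a sketch; $sqlFile is a hypothetical parameter holding the path):
mysql -uroot database -e "source $sqlFile"
PowerShell expands $sqlFile inside the double-quoted string, so mysql receives a single source command with the full path.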
If things start to get complicated, maybe you should write a C# console application that does the complex tasks.
Not sure if this will work for your application or not (it should process the file in chunks of 1000 lines at a time, feeding a single mysql process, rather than reading it all at once):
get-content filename.sql -readcount 1000 | mysql -uroot database
I'd say stay away from the cmdlets for large files. I've been doing something similar with files that are 30+ million lines long and have not had a issue by using the below code. It performs extremely well both speed-wise and memory consumption-wise.
$reader = [IO.File]::OpenText($filetoread)
while ($reader.Peek() -ge 0) {
    $line = $reader.ReadLine()
    # do your thing here
}
$reader.Close()
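Applied to this question, the same reader can feed mysql's stdin through System.Diagnostics.Process, keeping only one line in memory at a time. A hedged sketch, not from the original answer; the mysql path and arguments are assumptions borrowed from the posts above:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = 'C:\tools\mysql\current\bin\mysql.exe'   # assumed path, as in the first question
$psi.Arguments = '-uroot database'
$psi.RedirectStandardInput = $true
$psi.UseShellExecute = $false
$proc = [System.Diagnostics.Process]::Start($psi)
$reader = [IO.File]::OpenText($filetoread)
while ($reader.Peek() -ge 0) {
    # stream each line straight into mysql's stdin
    $proc.StandardInput.WriteLine($reader.ReadLine())
}
$reader.Close()
$proc.StandardInput.Close()   # close stdin so mysql knows the input has ended
$proc.WaitForExit()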
From the Windows cmd prompt (this example is for PostgreSQL; adapt the client and options for MySQL accordingly):
psql -h 127.0.0.1 -p 5432 -f database.sql -U .... ....