With cmd I'd run mysql -uroot database < filename.sql to import a database dump (read from file and pass to MySQL). However, < is "reserved" in powershell.
Instead, in powershell I use get-content filename.sql | mysql -uroot database. The caveat is that powershell reads filename.sql completely into memory before passing it along to MySQL, and with large database dumps it simply runs out of memory.
Obviously, I could execute this via cmd but I have a handful of powershell scripts automating various tasks like this and I don't want to have to rewrite them all in batch. In this particular case, filename.sql is a variable that's specified via PS parameters when the automation kicks off.
So how do I get around this memory limitation? Is there another way to pipe the file contents into MySQL directly?
You can try:
mysql -uroot -pYourPassword -e "source C:\temp\filename.SQL"
or
mysql --user=root --password=YourPassword --execute="source C:\temp\filename.SQL"
If things start to get complicated maybe you should write a C# Console application that does the complex tasks.
Not sure if this will work for your application or not, but it should process the file in chunks of 1000 lines at a time rather than all at once:
get-content filename.sql -readcount 1000 |% {$_ | mysql -uroot database}
I'd say stay away from the cmdlets for large files. I've been doing something similar with files that are 30+ million lines long and have not had an issue using the code below. It performs extremely well, both speed-wise and memory-consumption-wise.
$reader = [IO.File]::OpenText($filetoread)
while ($reader.Peek() -ge 0) {
    $line = $reader.ReadLine()
    # do your thing here
}
$reader.Close()
Windows cmd (PostgreSQL example; adapt the flags for MySQL):
psql -h 127.0.0.1 -p 5432 -f database.sql -U .... ....
Related
I am setting up a new Azure database using MySQL Server. I want to load a .sql dump file into this database using PowerShell.
Executing the code below throws an exception at '<', since it is reserved for future use. As a workaround I tried putting the entire statement in quotes, but I still face the same issue.
$path = "C:\tools\mysql\current\bin\mysql.exe"
&"$path" -h $servername -u $username -p $databasename < filename.sql
The expected result is to dump the filename.sql into mysql server database.
Try something like this and see if it helps:
Use Get-Content to read the file and pipe (|) it to your command, and use the call operator & to run the executable.
get-content 'c:\folder\backup.sql' | &"C:\Program Files\MySQL\MySQL Server 5.7\bin\mysql.exe" -u user --password=pass dbname
I have a database backup with 400+ .sql files; for each table there is a separate .sql file. Is it possible to import all these files into a database together? If so, could you tell me how to do this?
Also, the backup is a gzipped tar file. Is there a way to restore from the compressed file?
If you are using Linux, concatenate all the .sql files:
cat *.sql > fullBackup.sql
then you can restore the database using this backup file
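If you want to sanity-check what cat produces before restoring, here's a self-contained sketch with throwaway files (the file names and paths are made up for the demo; your real dump files go in their place). Note that the glob expands in lexical order, which is why numbering the files keeps the concatenation predictable:

```shell
# Throwaway demo: two fake table dumps stand in for the real backup files
mkdir -p /tmp/sqldemo
cd /tmp/sqldemo
rm -f ./*.sql
printf 'CREATE TABLE a (id INT);\n' > 01_a.sql
printf 'CREATE TABLE b (id INT);\n' > 02_b.sql

# The glob is expanded before the redirection creates fullBackup.sql,
# so the output file does not get swept into its own input on this run
cat ./*.sql > fullBackup.sql
```

One caveat: if you re-run `cat *.sql > fullBackup.sql` in a directory where fullBackup.sql already exists, the glob will match it too, so clean up (or write the output elsewhere) between runs.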
I found the answer to my question here: Import Multiple .sql dump files into mysql database from shell
find . -name '*.sql' | awk '{ print "source",$0 }' | mysql --batch works perfectly. Thanks to #Haim for pointing out the correct post.
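To see what the find | awk stage actually feeds to the client, here's a throwaway sketch with made-up file names; the final pipe into mysql --batch is left out so it runs without a server. Each found file becomes a "source <file>" statement, which the mysql client executes in turn:

```shell
# Demo of the find | awk stage only (file names are made up)
mkdir -p /tmp/dumpdemo
cd /tmp/dumpdemo
touch users.sql orders.sql

# Turn every .sql path into a "source <path>" client statement
find . -name '*.sql' | awk '{ print "source",$0 }' > statements.txt
```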
Nowadays processors have many cores. To use all of them:
for s in *.sql.gz ; do gunzip -c "$s" | mysql -u sql_user -p'password' database_name & done
This command opens a background process for each SQL dump file.
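A self-contained sketch of the same background-job pattern, with `cat`-style redirection standing in for the mysql client so it runs without a server (all names and paths here are made up). The `wait` at the end is worth adding to the real loop too, so the script doesn't exit while imports are still running:

```shell
# Throwaway demo of parallel background jobs; redirection to .out files
# stands in for piping each decompressed dump into mysql
mkdir -p /tmp/pardemo
cd /tmp/pardemo
printf 'SELECT 1;\n' | gzip > t1.sql.gz
printf 'SELECT 2;\n' | gzip > t2.sql.gz

# One background job per compressed dump, same shape as the mysql loop
for s in *.sql.gz ; do gunzip -c "$s" > "${s%.sql.gz}.out" & done
wait   # block until every background job has finished
```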
Or, with pv installed, you can also see the progress:
pv -p *.sql | mysql database
A mysqldump command like the following:
mysqldump -u<username> -p<password> -h<remote_db_host> -T<target_directory> <db_name> --fields-terminated-by=,
will write out two files for each table (one is the schema, the other is CSV table data). To get CSV output you must specify a target directory (with -T). When -T is passed to mysqldump, it writes the data to the filesystem of the server where mysqld is running - NOT the system where the command is issued.
Is there an easy way to dump CSV files from a remote system ?
Note: I am familiar with using a simple mysqldump and handling the STDOUT output, but I don't know of a way to get CSV table data that way without doing some substantial parsing. In this case I will use the -X option and dump xml.
mysql -h remote_host -e "SELECT * FROM my_schema.my_table" --batch --silent > my_file.csv
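One thing to watch: with --batch --silent the mysql client emits tab-separated values, so the file above is really TSV despite the .csv name. A rough fix-up is to translate tabs to commas, assuming none of the fields contain embedded tabs or commas (for anything messier you'd want a proper CSV writer). Here printf fakes the client's output so the sketch runs standalone:

```shell
# printf stands in for `mysql --batch --silent` output (tab-separated rows);
# tr turns the tabs into commas -- a rough conversion only, since it breaks
# on fields that themselves contain tabs or commas
printf '1\tAlice\n2\tBob\n' | tr '\t' ',' > /tmp/my_file.csv
```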
I want to add to codeman's answer. It worked but needed about 30 minutes of tweaking for my needs.
My webserver uses centos 6/cpanel and the flags and sequence which codeman used above did not work for me and I had to rearrange and use different flags, etc.
Also, I used this for a local file dump; it's not just useful for remote DBs. I had too many issues with SELinux and MySQL user permissions for SELECT INTO OUTFILE commands, etc.
What worked on my Centos+Cpanel Server
mysql -B -s -uUSERNAME -pPASSWORD < query.sql > /path/to/myfile.txt
Caveats
No Column Names
I can't get column names to appear at the top. I tried adding the flag:
--column-names
but it made no difference. I am still stuck on this one. I currently add it to the file after processing.
Selecting a Database
For some reason, I couldn't include the database name on the command line. I tried
-D databasename
on the command line, but I kept getting permission errors, so I ended up putting the following at the top of my query.sql:
USE database_name;
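For reference, a minimal sketch of how such a query.sql can be generated with the database selection baked into the file itself, so no -D flag is needed (database_name, my_table, and the columns are placeholders, not names from the original post):

```shell
# Write a query.sql whose first statement selects the database,
# sidestepping the -D flag on the command line entirely
cat > /tmp/query.sql <<'EOF'
USE database_name;
SELECT id, name FROM my_table;
EOF
```

It would then be fed to the client the same way as above: mysql -B -s -uUSERNAME -pPASSWORD < /tmp/query.sql > output.txt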
On many systems, MySQL runs as a distinct user (such as user "mysql") and your mysqldump will fail if the MySQL user does not have write permissions in the dump directory - it doesn't matter what your own write permissions are in that directory. Changing your directory (at least temporarily) to world-writable (777) will often fix your export problem.
I would like to import a 350MB MySQL .sql file on a Windows 7 machine. I usually do this by using
mysql -uuser -p -e "source c:/path/to/file.sql" database
since < doesn't work in Powershell.
My .sql file has an error in it this time. I'd prefer to just skip the bad row and continue the import. How can I force the import to continue on Windows?
On a unix/linux based system, I could use
mysql --force ... < file.sql
but --force doesn't seem to work with the -e "source ..." command (understandably so).
Thanks,
Mike
You're probably going to have to have PowerShell execute this in the standard console in order to use < properly. Technically you could use get-content and pipe the output to mysql, but I've always found that to be slow, and it somehow still keeps the file contents in the PowerShell session's memory.
This is how I would execute it from the Powershell prompt (changed file path to include spaces to demonstrate inner quotes, just in case):
cmd /C 'mysql -uuser -p --force < "C:\path\with spaces\to\file.sql"'
[GC]::Collect() would apparently clear up the memory, but you can't do that until after it's done anyway. When it comes to mysql and mysqldump, I don't bother with PowerShell. The default encoding used by > is Unicode, making dump files twice as big out of PowerShell as out of cmd, unless you remember to write | out-file dump.sql -enc ascii instead of > dump.sql.
I'd also suggest having a look at this SO answer, which takes advantage of the source SQL command:
I want to execute a query inside a db.sql file using MySQL 5.1 in the Windows environment. Can any one help me with this? I want to do this by running a .bat file.
Col Shrapnel's answer is the way to do it. The batch file would look like this:
mysql < %1
...which really just shows that it'd just be easier to use that directly instead of a .bat.
runsql myfile.sql    (assuming you name your batch file "runsql.bat")
versus
mysql < myfile.sql
Reading your comment on the other answer, it seems like you might actually get some use out of the batch file as long as you don't mind storing your password in plain text on your computer.
Make the batch file like this:
mysql -u myusername -pmypassword < %1
Note that there is no space between -p and mypassword. If you use that above, every script would have to specify which database to use, which might be a hassle. The other way you could do it is like this:
mysql -u myusername -pmypassword -D %1 < %2
And the usage would be:
runsql database_name input.sql
mysql.exe < db.sql