Import large MySQL .sql file on Windows with Force - mysql

I would like to import a 350MB MySQL .sql file on a Windows 7 machine. I usually do this by using
mysql -uuser -p -e "source c:/path/to/file.sql" database
since < doesn't work in PowerShell.
My .sql file has an error in it this time. I'd prefer to just skip the bad row and continue the import. How can I force the import to continue on Windows?
On a unix/linux based system, I could use
mysql --force ... < file.sql
but --force doesn't seem to work with the -e "source ..." command (understandably so).
Thanks,
Mike

You're probably going to have to have PowerShell execute this in the standard console (cmd.exe) in order to use < properly. Technically you could use Get-Content and pipe the output to mysql, but I've always found that to be slow, and it somehow still keeps the file contents in the memory of the PowerShell session.
This is how I would execute it from the PowerShell prompt (I changed the file path to include spaces to demonstrate the inner quotes, just in case):
cmd /C 'mysql -uuser -p --force < "C:\path\with spaces\to\file.sql"'
[GC]::Collect() would apparently clear up the memory, but you can't do that until after it's done anyway. When it comes to mysql and mysqldump, I don't bother with PowerShell. The default encoding used by > is Unicode (UTF-16), making dump files twice as big out of PowerShell as out of cmd unless you remember to write | out-file dump.sql -enc ascii instead of > dump.sql.
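A minimal sketch of the difference (the database name and credentials are placeholders, not taken from the answer):
# Default PowerShell redirection writes UTF-16 ("Unicode") and roughly doubles the file size:
mysqldump -uuser -p mydatabase > dump.sql
# Piping through Out-File with ASCII encoding keeps the dump the size cmd would produce
# (only safe when the data itself contains nothing outside plain ASCII):
mysqldump -uuser -p mydatabase | Out-File dump.sql -Encoding ascii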

I'd also suggest having a look at this SO answer, which takes advantage of the SOURCE SQL command:

Related

Unable to restore a MySQL backup to a new database

I successfully created a mysqldump file myDump.sql of a myDb1 database using guidelines from this thread. I also created a second database, myDb2, navigated to the directory containing myDump.sql and tried to restore it into the new database, but it failed. The two methods I tried:
> mysql -u root -p myDb2 < myDump.sql;
> -- entered password
and:
> mysql -u root -p
mysql> -- entered password
mysql> USE myDb2;
mysql> SOURCE myDump.sql;
Both have the same error message:
ERROR:
ASCII '\0' appeared in the statement, but this is not allowed unless option --binary-mode is enabled and mysql is run in
non-interactive mode. Set --binary-mode to 1 if ASCII '\0' is expected. Query: ' ■-'.
I'd also like to know if I need to use the same database name as the old db for the new one. I tried with both a different name and the same name, but got the same error.
This is probably caused by the file's character encoding.
My dump file was generated using redirection (">") in PowerShell and I encountered the same problem. The output redirection produced a file encoded as UTF-16 Little Endian.
However, this can be solved by converting the dump file to UTF-8. This can be done in Emacs with:
M-x set-buffer-file-coding-system
Then save the file and import again.
The encoding of a file can be detected using the GNU "file" utility, which is also available on Windows and can be found here: http://gnuwin32.sourceforge.net/packages/file.htm
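If you don't have Emacs handy, here is a minimal PowerShell sketch of the same conversion (the file paths are placeholders, and it reads the whole dump into memory, so it is only practical when the file fits in RAM):
# ReadAllText detects the UTF-16 BOM automatically and decodes the file correctly
$text = [IO.File]::ReadAllText('C:\path\to\myDump.sql')
# Write it back out as UTF-8 without a BOM, which mysql's SOURCE can read
[IO.File]::WriteAllText('C:\path\to\myDump-utf8.sql', $text, (New-Object System.Text.UTF8Encoding($false)))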
For future dumps, a better command is something like:
mysqldump <dbname> -r <filename>
Check the myDump.sql file; there may be an encoding error in it. Garbage characters like the ones below cause this problem. Delete the garbage characters to solve the issue.
Opening it with Sequel Pro shows this:
`í}k¯]ÇÝçðWÜ?øy«««_%c�sè;¶`Ìô hãEE¤"8Áü÷ô>ûqzW¯:êmX0`²¸yyëÔºµë¹êGw?û+þ{ð£»g¯ÿçw¯¿ºû/ß¾¹{ö/ï^}÷§oªô__ûöË7_ß'éÁªà¿¿{÷ÍÇ}ôý÷ßOo/ãoßL_¼ùÓG×?ûâÍ«×Óß¼ùãW¯/òÍGË?`
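A quick way to check for this from PowerShell (my own addition, not part of the answer; the file name is a placeholder): a dump produced by PowerShell's > redirection starts with the UTF-16 LE byte-order mark FF FE, which you can test for directly:
# Windows PowerShell syntax; PowerShell 7 uses -AsByteStream instead of -Encoding Byte
$bytes = Get-Content 'myDump.sql' -Encoding Byte -TotalCount 2
if ($bytes[0] -eq 0xFF -and $bytes[1] -eq 0xFE) { 'UTF-16 LE BOM found' } else { 'no UTF-16 LE BOM' }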

How do I get a tab-delimited MySQL dump from a remote host?

A mysqldump command like the following:
mysqldump -u<username> -p<password> -h<remote_db_host> -T<target_directory> <db_name> --fields-terminated-by=,
will write out two files for each table (one is the schema, the other is CSV table data). To get CSV output you must specify a target directory (with -T). When -T is passed to mysqldump, it writes the data to the filesystem of the server where mysqld is running - NOT the system where the command is issued.
Is there an easy way to dump CSV files from a remote system?
Note: I am familiar with using a simple mysqldump and handling the STDOUT output, but I don't know of a way to get CSV table data that way without doing some substantial parsing. In this case I will use the -X option and dump xml.
mysql -h remote_host -e "SELECT * FROM my_schema.my_table" --batch --silent > my_file.csv
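Note that --batch --silent output is tab-separated even if you name the file .csv. If you genuinely need commas, here is a rough PowerShell sketch (not part of the original answer; the host, credentials and table names are placeholders, and a plain tab-to-comma swap is only safe when the data itself contains no tabs, commas or quotes):
mysql -h remote_host -u user -p --batch --silent -e "SELECT * FROM my_schema.my_table" |
    ForEach-Object { $_ -replace "`t", ',' } |
    Out-File my_file.csv -Encoding ascii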
I want to add to codeman's answer. It worked but needed about 30 minutes of tweaking for my needs.
My web server uses CentOS 6/cPanel, and the flags and sequence which codeman used above did not work for me; I had to rearrange them and use different flags, etc.
Also, I used this for a local file dump (it's not just useful for remote DBs), because I had too many issues with SELinux and MySQL user permissions for SELECT INTO OUTFILE commands, etc.
What worked on my CentOS + cPanel server:
mysql -B -s -uUSERNAME -pPASSWORD < query.sql > /path/to/myfile.txt
Caveats
No Column Names
I can't get column names to appear at the top. I tried adding the flag:
--column-names
but it made no difference. I am still stuck on this one. I currently add them to the file after processing.
Selecting a Database
For some reason, I couldn't include the database name in the commandline. I tried with
-D databasename
in the command line, but I kept getting permission errors, so I ended up using the following at the top of my query.sql:
USE database_name;
On many systems, MySQL runs as a distinct user (such as user "mysql") and your mysqldump will fail if the MySQL user does not have write permissions in the dump directory - it doesn't matter what your own write permissions are in that directory. Changing your directory (at least temporarily) to world-writable (777) will often fix your export problem.

PowerShell pipe file contents into application without loading file in memory

With cmd I'd run mysql -uroot database < filename.sql to import a database dump (read from file and pass to MySQL). However, < is "reserved" in PowerShell.
Instead, in PowerShell I use get-content filename.sql | mysql -uroot database. The caveat is that PowerShell reads filename.sql completely into memory before passing it along to MySQL, and with large database dumps it simply runs out of memory.
Obviously, I could execute this via cmd, but I have a handful of PowerShell scripts automating various tasks like this and I don't want to have to rewrite them all in batch. In this particular case, filename.sql is a variable that's specified via PS parameters when the automation kicks off.
So how do I get around this memory limitation? Is there another way to pipe the file contents into MySQL directly?
You can try
mysql -uroot -pYourPassword -e "source C:\temp\filename.SQL"
or
mysql --user=root --password=YourPassword --execute="source C:\temp\filename.SQL"
If things start to get complicated, maybe you should write a C# console application that does the complex tasks.
Not sure if this will work for your application or not (it should process the file in chunks of 1000 records at a time, rather than all at once):
get-content filename.sql -readcount 1000 |% {$_ | mysql -uroot database}
I'd say stay away from the cmdlets for large files. I've been doing something similar with files that are 30+ million lines long and have not had an issue by using the code below. It performs extremely well both speed-wise and memory consumption-wise.
$reader = [IO.File]::OpenText($filetoread)
while ($reader.Peek() -ge 0) {
$line = $reader.ReadLine()
#do your thing here
}
$reader.Close()
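For this specific question, a sketch of what the "do your thing here" part could look like: start a single mysql process with a redirected stdin and stream each line into it, so the file is never held in memory all at once. This is my own illustration, not the answerer's code; it assumes mysql.exe is on the PATH, and the credentials, database and file names are placeholders.
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = 'mysql'
$psi.Arguments = '-uroot database'
$psi.RedirectStandardInput = $true
$psi.UseShellExecute = $false
$proc = [System.Diagnostics.Process]::Start($psi)
$reader = [IO.File]::OpenText('C:\path\to\filename.sql')
while ($reader.Peek() -ge 0) {
    # forward each line straight to mysql's stdin instead of buffering the file
    $proc.StandardInput.WriteLine($reader.ReadLine())
}
$reader.Close()
# closing stdin signals end-of-input, so mysql finishes the import and exits
$proc.StandardInput.Close()
$proc.WaitForExit()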
Windows cmd (the example is for PostgreSQL; adapt it to your database):
psql -h 127.0.0.1 -p 5432 -f database.sql -U .... ....

how to write a bat file to execute the .sql file

I want to execute a query inside a db.sql file using MySQL 5.1 in the Windows environment. Can anyone help me with this? I want to do this by running a .bat file.
Col Shrapnel's answer is the way to do it. The batch file would look like this:
mysql < %1
...which really just shows that it'd be easier to use the command directly instead of a .bat:
runsql myfile.sql // (assuming you name your batch file "runsql.bat")
// vs
mysql < myfile.sql
Reading your comment on the other answer, it seems like you might actually get some use out of the batch file as long as you don't mind storing your password in plain text on your computer.
Make the batch file like this:
mysql -u myusername -pmypassword < %1
Note that there is no space between -p and mypassword. If you use the above, every script would have to specify which database to use, which might be a hassle. The other way you could do it is like this:
mysql -u myusername -pmypassword -D %1 < %2
And the usage (again assuming you name the batch file "runsql.bat") would be:
runsql database_name input.sql
mysql.exe < db.sql

mysql import on windows

I have a MySQL file, db.sql. I have tried to import it using:
mysql -uroot -p[password] db < db.sql
All I get is a listing of mysql commands, or I get a syntax error. The weird thing is I used this file last week and, as far as I know, I'm doing it the same way.
I create the database, then at the command line enter the above, but it's not working. I've tried it from inside mysql and just at the command line, and nothing seems to work.
Is there something I should be doing differently in windows or MySQL5? I don't know how the heck I got it to work the first time...
TIA
Try this instead:
mysql -u root -p
(prompts for password)
use db;
source db.sql
I found out that running this command from the Windows command line (cmd.exe) is different from running it in Windows PowerShell.
Using cmd.exe the command works okay, but in PowerShell I get this error:
mysql -uroot exampledb < exampledb.sql
The '<' operator is reserved for future use.
Not sure if your example was a typo or not, but for starters you need to have a space in between your flags and their values, roughly like this:
mysql -u root -p [password] db < db.sql
If you are already logged in, then try this; it can be very useful, but it depends on the MySQL version (it works on MySQL 5.0).
Log in first if you are not already logged in.
mysql>[your password]
Next, use the database into which you want to import the SQL dump file:
mysql>use [your database name]
Then give SOURCE the path to the dump file, as in the command below (if that doesn't work, copy the dump file to the bin folder where MySQL is installed, e.g. "C:/Program Files/MySQL/MySQL Server 5.0/bin"):
mysql> source [dataBasePath+name.sql or dataBaseName.sql]
I've been using a PHP script called "BigDump":
http://www.ozerov.de/bigdump.php
In Windows PowerShell, you can pipe in the contents like so:
Get-Content db.sql | mysql -u root -p [password]