Get MySQL query file execution result saved in a file using Windows CMD - mysql

I need to get the results of a MySQL query file execution saved into a txt file.
The file contains over 600000 lines of insert statements.
The command that I used is:
mysql -h10.100.109.123 -f -uUserName -pPassword
--database=databaseName < AFC.sql > abc.txt
Abc.txt gets created but there is no data in it. From the console, I can see there are many errors. -f is used so that it won't stop on error.
I have gone through and tried out many suggestions from similar questions, but none of them worked.
MySQL version 5.7
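One detail worth noting about the command above (a sketch reusing the same host, file, and database names): the > abc.txt redirection only captures stdout, while the errors printed to the console go to stderr and never reach the file. Redirecting stderr as well keeps both in abc.txt:
mysql -h10.100.109.123 -f -uUserName -pPassword --database=databaseName < AFC.sql > abc.txt 2>&1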

Related

batch on windows not finishing mysql job

I have a .sql file with 25 SQL statements that runs fine when all statements are copy-pasted into the mysql client. It takes approximately 20 minutes to finish and no warnings are displayed. However, if I call this file through a batch file on Windows (I would like to add this to Task Scheduler), the job does not get finished. I can see which statement is the problem, but no warnings get displayed. I can only see (in SHOW PROCESSLIST) that the mysql process ends and no subsequent commands from the file get executed; this happens somewhere in the middle of my initial SQL file. Again, the statement executes fine when copy-pasted into the mysql console. If anyone has any idea, I would appreciate it a lot.
c:
cd\program files\mysql\mysql server 5.7\bin
mysql --show-warnings -h12.24.56.78 -u user -ppass --port=3306 --default-character-set=utf8 < "C:\SVN\dis\777e\777a_777e\777a_ua_into_777e.sql" >> "C:\Users\user1\db_transfers\777a_ua\777a_ua_into_777e_warnings.log" 2>&1
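For reference, a minimal batch sketch that combines --force (the long form of the -f flag from the question above, so execution continues past errors) with the logging already used here; host, credentials, and paths are placeholders:
c:
cd \Program Files\MySQL\MySQL Server 5.7\bin
mysql --force --show-warnings -hHOSTNAME -uUSER -pPASSWORD --port=3306 --default-character-set=utf8 < "C:\path\to\script.sql" >> "C:\path\to\script.log" 2>&1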

What is the error in my syntax preventing me from loading my data file into mysql

I am attempting to upload a .txt file into my sql database I just created.
I was able to load several lines of data into the table using INSERT INTO, but when I tried to utilize LOAD DATA LOCAL INFILE '/pathto/file.txt' INTO TABLE mytable, it first gave me the error that the command is not allowed in my version of mysql.
So after I read How can I correct MySQL Load Error, I used --local-infile=1 -u mysqlname -p followed by the above command, but I have repeatedly been awarded the syntax error.
I've tried to load the .txt file with all sorts of different combinations of the above, and still get one of the two errors.
Below is a screen shot.
This is with ubuntu 15.10 and mysql version 5.6.28-0ubuntu0.15.10.1.
Screen shot of terminal in question
--local-infile is a server and client parameter. It is not valid syntax within a statement such as LOAD DATA or INSERT.
You would specify server variables and options either in the appropriate sections of the my.cnf file, or as command line parameters to the MySQL program being executed.
For example, at the OS prompt...
# mysql -h myserverhost -u mysqlname -p --local-infile=1
That option has to be specified for the MySQL server.
If you are connecting as user@localhost, you don't need LOCAL. You can give the MySQL user (whichever OS user the mysql server is running under) read privilege on the file you want to load... chmod ugo+r /mypath/myfile (and read/execute on the directories in the path).
You only need LOCAL if the MySQL user isn't @'localhost'.
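Putting it together, a minimal sketch using the table and path from the question (mydatabase is a placeholder; the SET GLOBAL step needs a privileged account and is only required if the server has local_infile disabled):
mysql --local-infile=1 -h myserverhost -u mysqlname -p
mysql> SET GLOBAL local_infile = 1;
mysql> USE mydatabase;
mysql> LOAD DATA LOCAL INFILE '/pathto/file.txt' INTO TABLE mytable;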

How do I get a tab-delimited MySQL dump from a remote host?

A mysqldump command like the following:
mysqldump -u<username> -p<password> -h<remote_db_host> -T<target_directory> <db_name> --fields-terminated-by=,
will write out two files for each table (one is the schema, the other is CSV table data). To get CSV output you must specify a target directory (with -T). When -T is passed to mysqldump, it writes the data to the filesystem of the server where mysqld is running - NOT the system where the command is issued.
Is there an easy way to dump CSV files from a remote system?
Note: I am familiar with using a simple mysqldump and handling the STDOUT output, but I don't know of a way to get CSV table data that way without doing some substantial parsing. In this case I will use the -X option and dump XML.
mysql -h remote_host -e "SELECT * FROM my_schema.my_table" --batch --silent > my_file.csv
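Note that --batch --silent output is tab-delimited rather than comma-separated. If real commas are needed, one option (a sketch assuming GNU sed and data that itself contains no tabs or commas) is to translate the tabs on the way to the file:
mysql -h remote_host -e "SELECT * FROM my_schema.my_table" --batch --silent | sed 's/\t/,/g' > my_file.csv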
I want to add to codeman's answer. It worked but needed about 30 minutes of tweaking for my needs.
My webserver uses CentOS 6/cPanel, and the flags and sequence which codeman used above did not work for me; I had to rearrange and use different flags, etc.
Also, I used this for a local file dump; it's not just useful for remote DBs, because I had too many issues with SELinux and MySQL user permissions for SELECT INTO OUTFILE commands, etc.
What worked on my Centos+Cpanel Server
mysql -B -s -uUSERNAME -pPASSWORD < query.sql > /path/to/myfile.txt
Caveats
No Column Names
I can't get column names to appear at the top. I tried adding the flag:
--column-names
but it made no difference. I am still stuck on this one. I currently add it to the file after processing.
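One way to do that "add it to the file after processing" step in the same script (a sketch; the column names are made-up placeholders, and the header is tab-separated to match the -B -s output) is to write a header line first and append the query output:
echo -e "id\tname\temail" > /path/to/myfile.txt
mysql -B -s -uUSERNAME -pPASSWORD < query.sql >> /path/to/myfile.txt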
Selecting a Database
For some reason, I couldn't include the database name on the command line. I tried with
-D databasename
on the command line, but I kept getting permission errors, so I ended up using the following at the top of my query.sql:
USE database_name;
On many systems, MySQL runs as a distinct user (such as user "mysql") and your mysqldump will fail if the MySQL user does not have write permissions in the dump directory - it doesn't matter what your own write permissions are in that directory. Changing your directory (at least temporarily) to world-writable (777) will often fix your export problem.
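As a concrete illustration of that permissions point for a local server (a sketch; the directory name is a placeholder, and the world-writable mode should probably be reverted afterwards):
mkdir /tmp/dump_dir
chmod 777 /tmp/dump_dir
mysqldump -u<username> -p<password> -T/tmp/dump_dir <db_name> --fields-terminated-by=,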

mysql won't import database dump file on Windows XP

I created a database using MySQL. I used mysqldump to create a database backup file in text format (MySQL 5.5 on Windows XP). The database is local on my machine (localhost).
I am having trouble using the MySQL command to load the dump file to restore the database. I have done the following:
Researched Stack Overflow for how to do it. From a post, I noticed there's a bug when using the MySQL command to restore the data. Before I run the command, I DROP the database and CREATE the database using MySQL Workbench.
I type the following command in the DOS prompt to restore the database:
mysql -u root -p -h localhost -D matlab_data -o < backup.sql
backup.sql is the backup file in text format created by mysqldump.
I am then asked for the password which I enter. I get the DOS prompt right away with no error message. I've waited several hours for the command to run and the database is still empty.
I have tried various command formats over the last few days. If I enter incorrect data in the command line (nonexistent file, database, etc.), I get an error message.
I feel I would not see the DOS prompt until the database is restored. If I don't DROP and CREATE the database, I get an error message. Otherwise, not.
Does anybody have any idea what the issue is? I realize that I could be making a stupid mistake.
Thank you for your help.
Shell into the mysql console and run the SQL file like this.
If you are already running mysql, you can execute an SQL script file using the source command or . command:
mysql> source file_name
mysql> \. file_name
Note that file_name must be an absolute path.
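For the Windows case in the question, that would look something like this (a sketch; the path is a placeholder, and forward slashes avoid backslash-escaping surprises in the client):
mysql -u root -p
mysql> USE matlab_data;
mysql> source C:/backups/backup.sql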

write results of sql query to a file in mysql

I'm trying to write the results of a query to a file using mysql. I've seen some information on the outfile construct in a few places but it seems that this only writes the file to the machine that MySQL is running on (in this case a remote machine, i.e. the database is not on my local machine).
Alternatively, I've also tried to run the query and grab (copy/paste) the results from the MySQL Workbench results window. This worked for some of the smaller datasets, but the largest dataset seems to be too big and causes an out-of-memory exception/crash.
Any help on this matter would be greatly appreciated.
You could try executing the query from your local CLI and redirecting the output to a local file destination:
mysql -u user -ppass -e "select cols from table where cols is not null" > /tmp/output
This is dependent on the SQL client you're using to interact with the database. For example, you could use the mysql command line interface in conjunction with the "tee" operator to output to a local file:
http://dev.mysql.com/doc/refman/5.1/en/mysql-commands.html
tee [file_name], \T [file_name]
Execute the command above before executing the SQL and the result of the query will be output to the file.
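A minimal interactive sketch of that (the file path and query are placeholders; notee turns the logging back off):
mysql> tee /tmp/query_output.txt
mysql> SELECT id, name, email FROM customers;
mysql> notee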
Specifically for MySQL Workbench, here's an article on Execute Query to Text Output. Although I don't see any documentation, there are indications that there should also be an "Export" option under Query, though that is almost certainly version dependent.
You could try this if you want to write a MySQL query result to a file.
This example writes the MySQL query result into a CSV file in comma-separated format:
SELECT id,name,email FROM customers
INTO OUTFILE '/tmp/customers.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
If you are running MySQL queries on the command line, and you have the list of queries in a text file and want the output in another text file, you can use this. [ test_2 is the database name ]
COMMAND 1
mysql -vv -u root -p test_2 < query.txt > /root/results.txt 2>&1
Where -vv is for the verbose output.
If you use the above statement as
COMMAND 2
mysql -vv -u root -p test_2 < query.txt 2>&1 > /root/results.txt
It will redirect STDERR to its normal location (i.e. the terminal) and STDOUT to the output file, which in my case is results.txt.
The first command executes query.txt until it faces an error and stops there.
That's how the redirection works. You can try
# ls key.pem asdf > /tmp/output_1 2> /tmp/output_2
Here the key.pem file exists and asdf doesn't exist. So when you cat the files you get the following:
# cat /tmp/output_1
key.pem
#cat /tmp/output_2
ls: cannot access asdf: No such file or directory
But if you modify the previous statement to this:
ls key.pem asdf > /tmp/output_1 > /tmp/output_2 2>&1
Then you get both the error and the output in output_2:
cat /tmp/output_2
ls: cannot access asdf: No such file or directory
key.pem
mysql -v -c -u root -p < /media/sf_Share/Solution2.sql 2>&1 > /media/sf_Share/results.txt
This worked for me. Since I wanted the comments in my script to also be reflected in the report, I added the -c flag.