Creating a cron job to execute a daily MySQL dump

I want to create a cron job to execute the following MySQL dump on a daily basis and save a CSV file to a designated directory. However, when I try to execute the command from my command line, it throws an error, and I can't figure out what I am doing wrong.
mysql -u dom_hemma -ppasswordhere -D dom_hemma -e "SELECT * FROM `wp7f_posts` WHERE `post_type` LIKE 'flamingo_inbound' INTO OUTFILE '/home/domhemma/public_html/access/access.csv' FIELDS TERMINATED BY `,` LINES TERMINATED BY `\n`;"
I expect a CSV file to be produced.
Instead I get: syntax error near unexpected token `('

The error comes from the quoting: inside a double-quoted shell string, the backticks around wp7f_posts, post_type, the comma, and \n are treated by the shell as command substitution, which is what produces the bash error "syntax error near unexpected token `('". The identifiers don't need backticks here, and the FIELDS TERMINATED BY / LINES TERMINATED BY delimiters must be single-quoted SQL string literals, not backticked.
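For reference, a minimal sketch of the corrected command, keeping the credentials and paths from the question: the identifiers lose their backticks and the delimiters go in single quotes, so nothing inside the double-quoted string is special to the shell.

mysql -u dom_hemma -ppasswordhere -D dom_hemma -e "SELECT * FROM wp7f_posts WHERE post_type LIKE 'flamingo_inbound' INTO OUTFILE '/home/domhemma/public_html/access/access.csv' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';"

For the daily run, the same command can go into a crontab entry (the 02:00 schedule below is just an example). Note that INTO OUTFILE refuses to overwrite an existing file, so either delete the previous CSV first or put a date in the file name, as in the next question:

0 2 * * * mysql -u dom_hemma -ppasswordhere -D dom_hemma -e "SELECT * FROM wp7f_posts WHERE post_type LIKE 'flamingo_inbound' INTO OUTFILE '/home/domhemma/public_html/access/access.csv' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';"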

Related

SQL query that creates a CSV file with a timestamp in the name

I created the shell script below, which fetches data from MySQL and writes the output to a CSV file,
but I want the CSV file name to carry a timestamp,
so the filename should look like result_ddmmyy.csv.
Is there any way to do this from the script or from the SQL query?
#!/usr/bin/bash
#script to connect with db
master_db_user='root'
master_db_passwd='123'
master_db_port='3306'
master_db_host='localhost'
master_db_name='sagar_tb'
#Preparing script
#MySql Command to connect to a database
mysql -u$master_db_user -p$master_db_passwd -D$master_db_name -e 'select * into outfile "/created_files/result931.csv" fields terminated by "," lines terminated by "\n" from sagar_tb.test_table;'
echo "End of the Script"
Thanks and Regards,
Sagar Mandal
Add the command below to your script and update here if it works:
now=$(date +"%y%m%d")
Then use it when building the file name:
"/created_files/result_${now}.csv"

SQL Query not working inside a shell script

I really need help on this one...
so I'm writing this shell script which basically connects to a MySQL database, fetches the data, and writes the output to a CSV file.
I'm able to connect to the database and to get data from a simple query like "select * from test_table;",
but when I add the part of the query that should write the output to a CSV file, the script gives a syntax error.
QUERY> "select * into outfile '/Path/.cvs' fields terminated by ',' lines terminated by '\n' from test_table;"
This query does not work inside the script, but it works in the MySQL CLI.
I really need help on this; if there is any other way to produce the output as a CSV file, do tell me, otherwise please help me out with this one.
The error message I get is "ERROR 1064 (42000)". I know it's a syntax error, but it only occurs inside the script; I don't understand why, since the same query works in mysql.
#!/usr/bin/bash
#script to connect with db
master_db_user='root'
master_db_passwd='123'
master_db_port='3306'
master_db_host='localhost'
master_db_name='sagar_tb'
#Preparing script
SQL_Query='select * INTO OUTFILE '/created_files/RESULT3.CSV' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' from test_table;'
#MySql Command to connect to a database
mysql -u$master_db_user -p$master_db_passwd -P$master_db_port -h$master_db_host -D$master_db_name <<EOF
$SQL_Query
EOF
echo "End of the Script"
Really need help here guys
thanks and regards,
Sagar Mandal
The script tries to build the command in SQL_Query. However, it does not take the shell's quoting rules into account: a single-quoted string cannot contain further single quotes, so the quotes around the path and around the delimiters (the ',' and '\n') end the string early and the assignment does not contain what you expect:
SQL_Query='select * INTO OUTFILE '/created_files/RESULT3.CSV' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' from test_table;'
For a simple solution, just inline the SQL into the mysql command (formatted for readability), using a here document (<<EOF ... EOF):
mysql ... <<EOF
select * INTO OUTFILE '/created_files/RESULT3.CSV' ...
EOF
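Put together, a sketch of the corrected script, reusing the connection variables from the question:

#!/usr/bin/bash
#script to connect with db
master_db_user='root'
master_db_passwd='123'
master_db_port='3306'
master_db_host='localhost'
master_db_name='sagar_tb'
#MySQL command to connect to the database; the SQL lives in the here document,
#so its single-quoted literals need no extra shell escaping
mysql -u"$master_db_user" -p"$master_db_passwd" -P"$master_db_port" -h"$master_db_host" -D"$master_db_name" <<EOF
select * INTO OUTFILE '/created_files/RESULT3.CSV'
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
from test_table;
EOF
echo "End of the Script"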
You can run any query from a script using the style below:
mysql -uroot -ppassword -hlocalhost -e 'SELECT * FROM database.table_name;'
I have edited my reply according to your script
#!/usr/bin/bash
#script to connect with db
master_db_user='root'
master_db_passwd='123'
master_db_port='3306'
master_db_host='localhost'
master_db_name='sagar_tb'
#Preparing script
SQL_Query="select * INTO OUTFILE '/created_files/RESULT3.CSV' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' from test_table;"
#MySql Command to connect to a database
mysql -u$master_db_user -p$master_db_passwd -P$master_db_port -h$master_db_host -D$master_db_name -e "$SQL_Query"
echo "End of the Script"

LOAD DATA LOCAL INFILE in a script not populating data (mysql)

I am trying to build an automated CSV-to-MySQL dump which occurs every time the CSV file in a certain directory is updated.
Once the file is updated, the following line of code is executed in a bash script:
mysql -u root -p$MASTER_DB_PASSW < /usr/local/scripts/order.sql
The contents of order.sql are as follows:
use test;
truncate test.ORDER;
load data infile '/home/test/ORDER.csv' into table test.ORDER fields terminated by ','
enclosed by '"'
lines terminated by '\r\n'
IGNORE 1 LINES
(order_date);
For some reason, when I run order.sql manually in Workbench, it works perfectly... but when I run it on the server, the contents of the csv file do not get dumped. Any tips/advice? Thanks in advance.
I solved it.
I had an error in my bash script. I found this by running:
bash -x scriptname
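As a side note on the title: LOAD DATA INFILE reads the file on the database server, while LOAD DATA LOCAL INFILE reads it on the client. If the CSV actually lives on the machine running the script rather than on the server, a sketch like the following (reusing the paths from the question, and assuming local_infile is enabled on the server) may be what is needed:

mysql --local-infile=1 -u root -p$MASTER_DB_PASSW test <<'EOF'
truncate test.ORDER;
load data local infile '/home/test/ORDER.csv' into table test.ORDER
fields terminated by ',' enclosed by '"'
lines terminated by '\r\n'
ignore 1 lines
(order_date);
EOF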

MySQL OUTFILE query complains that "file already exists" but the file does not exist at all

I am writing a very simple shell script to dump a table into CSV file. Here is part of it:
day=`/bin/date +'%Y-%m-%d'`
file="/tmp/table-$day.csv"
rm $file
query="SELECT * FROM table INTO OUTFILE '$file' FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\\n'"
echo "$query" | mysql <connection_parameters>
I put in rm $file to make sure that the file does not exist prior to the query's execution.
However, when I execute the script, I get conflicting messages:
rm: cannot remove `/tmp/table-2013-02-08.csv': No such file or directory
ERROR 1086 (HY000) at line 1: File '/tmp/table-2013-02-08.csv' already exists
I cannot find the OUTFILE anywhere on the machine.
So what is wrong?
Thank you.
I have found the answer.
OUTFILE creates the file on the MySQL server, rather than on my MySQL client's machine.
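To confirm where the server will write, and whether it restricts INTO OUTFILE to a particular directory, a quick check in the same style as the script (assuming the connection parameters allow these queries):

echo "SELECT @@hostname, @@secure_file_priv;" | mysql <connection_parameters>

If the file is needed on the client machine instead, piping a plain SELECT through the client and redirecting it to a file (mysql -e "..." > file.csv), as in the next question, avoids server-side OUTFILE entirely.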
Check if you have this in /etc/passwd
mysql:x:27:27:MariaDB Server:/var/lib/mysql:/sbin/nologin
change the shell to /bin/bash instead

How do I store a MySQL query result into a local CSV file?

How do I store a MySQL query result into a local CSV file? I don't have access to the remote machine.
You're going to have to use the command line execution '-e' flag.
$> /usr/local/mysql/bin/mysql -u user -p -h remote.example.com -e "select t1.a,t1.b from db_schema.table1 t1 limit 10;" > hello.txt
This will generate a local hello.txt file in your current working directory with the output from the query (columns separated by tabs).
Use MySQL's CONCAT() function:
mysql -u <username> -p -h <hostname> -e "select concat(userid,',',surname,',',firstname,',',midname) as user from dbname.tablename;" > user.csv
You can delete the first line which contains the column name "user".
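Alternatively, the header line can be suppressed at the source with the client's -N (--skip-column-names) option, so there is nothing to delete afterwards; the same command, sketched with that flag:

mysql -N -u <username> -p -h <hostname> -e "select concat(userid,',',surname,',',firstname,',',midname) from dbname.tablename;" > user.csv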
We can use the command line execution '-e' flag and a simple Python script to generate the result in CSV format.
Create a Python file (let's name it tab2csv) and write the following code in it:
#!/usr/bin/env python
import csv
import sys
tab_in = csv.reader(sys.stdin, dialect=csv.excel_tab)
comma_out = csv.writer(sys.stdout, dialect=csv.excel)
for row in tab_in:
    comma_out.writerow(row)
Run the following command after updating the MySQL credentials:
mysql -u orch -p -h database_ip -e "select * from database_name.table_name limit 10;" | python tab2csv > outfile.csv
Result will be stored in outfile.csv.
I haven't had a chance to test it against content with difficult characters yet, but the fantastic mycli may be a solution for many.
Command line
mycli --csv -e "select * from table;" mysql://user@host:port/db > file.csv
Interactive mode:
\n \T csv ; \o ~/file.csv ; select * from table1; \P
\n disables pager that requires pressing space to display each page
\T csv ; - sets the output format to csv
\o <filename> ; - appends next output to a file
<query> ;
\P turns the pager back on
I have been facing this problem and reading around for a solution for some time: importing into Excel, importing into Access, saving as a text file...
I think the best solution for Windows is the following:
use the command insert...select to create a "result" table. The ideal scenario would be to automatically create the fields of this result table, but this is not possible in MySQL
create an ODBC connection to the database
use access or excel to extract the data and then save or process in the way you want
For Unix/Linux, I think the best solution might be to use the -e option tmarthal mentioned above and process the output through a tool like awk to get a proper format (CSV, XML, whatever).
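A minimal sketch of that idea: the client's batch output is tab-separated, so awk can rewrite it as comma-separated (note this does not quote fields that themselves contain commas or tabs; the table and host are the ones used earlier and stand in for your own):

mysql -u user -p -h remote.example.com -e "select t1.a,t1.b from db_schema.table1 t1;" | awk 'BEGIN { FS="\t"; OFS="," } { $1=$1; print }' > result.csv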
Run the MySQL query below from your app to generate the CSV:
SELECT order_id,product_name,qty FROM orders INTO OUTFILE '/tmp/orders.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
It will create the CSV file in the /tmp folder on the machine where the MySQL server runs.
Then you can add logic to send the file through headers.
Make sure the database user has permissions to write into the remote file-system.
You can try doing:
SELECT a,b,c
FROM table_name
INTO OUTFILE '/tmp/file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
We use the INTO OUTFILE clause here to store the output in a CSV file. We enclose the fields in double quotes to handle field values that contain commas, separate the fields with commas, and separate individual lines with newlines.