cron job - how to export a data table to a CSV/XLS file? - MySQL

I'm trying to configure a cron job to automatically export a data table from a MySQL DB to CSV/XLS every day,
and I'm stuck on the command field.
Which command should I use?
Thank you.
mor

This is an old topic, but for posterity, what I've found would work in this case is to simply write the SQL in a separate file, then source the file into mysql. That way, you don't have to worry about quotes and escapes etc.
So, for example, I would have a SQL file like so, called exportinventory.sql
SELECT *
FROM inventory INTO OUTFILE '/tmp/inventory.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
And then, from the command line:
mysql -u user -p database < exportinventory.sql
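To schedule this with cron, a crontab entry could look like the sketch below (the time and path are assumptions). Note that cron cannot answer the -p password prompt, so the credentials have to come from a config file such as ~/.my.cnf instead:
# run the export every day at 02:30; user and password are read from ~/.my.cnf
30 2 * * * mysql database < /path/to/exportinventory.sql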

There are three steps to accomplish this:
write a program that exports the data the way you want it (PHP, Java)
write a bash script which properly sets the environment and calls this program
install this script in a user's crontab
You might find information on all three over here; a sketch of steps 2 and 3 follows below.
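As a minimal sketch of steps 2 and 3 (every name, path, and credential below is an assumption, and the "program" here is simply the mysql client itself):
#!/bin/bash
# export_table.sh - set up the environment and run the export
PATH=/usr/local/bin:/usr/bin:/bin   # cron starts with a very minimal PATH
OUTFILE="/var/backups/inventory-$(date +%F).csv"
# credentials are assumed to live in ~/.my.cnf, since cron cannot answer a password prompt
mysql --batch -e "SELECT * FROM inventory" inventory_db > "$OUTFILE"
The matching crontab entry (edit with crontab -e) could then be:
0 1 * * * /usr/local/bin/export_table.sh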

You can try something like:
mysql -h $DB_HOST -u $DB_USER --password=$DB_PASSWORD $DB_NAME -e "$YOUR_QUERY INTO OUTFILE '$PATH_TO_OUTPUT_FILE' FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'"
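For instance, with the placeholder variables filled in (all of the values below are assumptions):
#!/bin/bash
DB_HOST=localhost
DB_USER=report
DB_PASSWORD=secret
DB_NAME=shop
mysql -h "$DB_HOST" -u "$DB_USER" --password="$DB_PASSWORD" "$DB_NAME" \
  -e "SELECT * FROM orders INTO OUTFILE '/tmp/orders.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'"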

Related

Creating an SQL File From an SQL Query

I have a huge amount of data in MariaDB and I need to create dump files from queries. So far I have something like this:
SELECT DISTINCT uretici INTO OUTFILE '/home/admin/web/example.com/public_html/public/manufacturers.sql' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' FROM data;
This gives me a file which can't be imported as SQL. I need an importable SQL file. What can I do about it?
You can use the following command:
mysqldump -u [database_user] -p [database_name] | gzip > [filename_to_compress.sql.gz]
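Since the dump is plain SQL, it can be restored later by decompressing it straight back into mysql, e.g. (a sketch, using the same placeholders as above):
gunzip -c [filename_to_compress.sql.gz] | mysql -u [database_user] -p [database_name]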

SQL Query not working inside a shell script

I really need help on this one...
I'm writing a shell script which connects to a MySQL database, fetches data, and outputs it as a CSV file.
I'm able to connect to the database and get data with a simple query: "select * from test_table;"
But when I put the query that should produce the CSV file into the script, it gives a syntax error.
QUERY> "select * into outfile '/Path/.cvs' fields terminated by ',' lines terminated by '\n' from test_table;"
This query is not working inside the script, but it works in the MySQL CLI.
If there is any other way of producing the output as a CSV file, do tell me; otherwise please help me out with this.
The error message I get is "ERROR 1064 (42000)". I know it's a syntax error, but it only fails inside the script; the same query works fine in mysql.
#!/usr/bin/bash
#script to connect with db
master_db_user='root'
master_db_passwd='123'
master_db_port='3306'
master_db_host='localhost'
master_db_name='sagar_tb'
#Preparing script
SQL_Query='select * INTO OUTFILE '/created_files/RESULT3.CSV' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' from test_table;'
#MySql Command to connect to a database
mysql -u$master_db_user -p$master_db_passwd -P$master_db_port -h$master_db_host -D$master_db_name <<EOF
$SQL_Query
EOF
echo "End of the Script"
Really need help here guys
thanks and regards,
Sagar Mandal
The script tries to build the command in SQL_Query. However, the assignment does not take the shell's quoting rules into account: the inner single quotes (around the path, ',' and '\n') terminate the outer single-quoted string instead of becoming part of the value, so they would have to be escaped.
SQL_Query='select * INTO OUTFILE '/created_files/RESULT3.CSV' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' from test_table;'
For a simple solution, just inline the SQL into the mysql command (formatted for readability), using a here document ('<<EOF'):
mysql ... <<EOF
select * INTO OUTFILE '/created_files/RESULT3.CSV' ...
EOF
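Put together with the variables from the question, the whole call could look like this (a sketch; the /created_files directory must be writable by the MySQL server, and the output file must not already exist):
mysql -u"$master_db_user" -p"$master_db_passwd" -P"$master_db_port" -h"$master_db_host" -D"$master_db_name" <<EOF
select * INTO OUTFILE '/created_files/RESULT3.CSV' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' from test_table;
EOF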
You can run any query from a script using the style below:
mysql -uroot -ppassword -hlocalhost -e 'SELECT * FROM database.table_name;'
I have edited my reply according to your script. The query string is built with double quotes so the single quotes inside it survive, and it is passed to -e in double quotes so the variable actually expands:
#!/usr/bin/bash
#script to connect with db
master_db_user='root'
master_db_passwd='123'
master_db_port='3306'
master_db_host='localhost'
master_db_name='sagar_tb'
#Preparing the query; double quotes keep the inner single quotes intact
SQL_Query="select * INTO OUTFILE '/created_files/RESULT3.CSV' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' from test_table;"
#MySql command to connect to the database and run the query
mysql -u"$master_db_user" -p"$master_db_passwd" -P"$master_db_port" -h"$master_db_host" -D"$master_db_name" -e "$SQL_Query"
echo "End of the Script"

How to export a MySQL table from the terminal

I'm trying to bypass the --secure-file-priv option that MySQL has enabled in order to export a table into a .csv file.
I've been running SELECT * FROM final_table INTO OUTFILE 'test.csv' FIELDS ENCLOSED BY '"' TERMINATED BY ';' ESCAPED BY '"' LINES TERMINATED BY '\r\n'; in the terminal, but it keeps returning: The MySQL server is running with the --secure-file-priv option so it cannot execute this statement.
I previously bypassed the same error when importing files by running LOAD DATA LOCAL INFILE instead of LOAD DATA INFILE, and was wondering if there is a way to do the equivalent when exporting.
Thanks!
Answering my own question since I found a solution that might help out other people.
mysql -u root -p [database] -e '[SELECT * FROM TABLE_NAME]' > data_export.csv
got me exactly what I needed without disabling secure-file-priv!
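One caveat: output redirected like this is tab-separated, not truly comma-separated. A hedged sketch for turning it into a plain CSV, assuming GNU sed and data that contains no embedded tabs or commas:
mysql -u root -p [database] -B -e '[SELECT * FROM TABLE_NAME]' | sed 's/\t/,/g' > data_export.csv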

Exporting comma-delimited data from a very large table

I am trying to get all the data from a very large table (around 13 million rows) on a remote host into a text file. I have tried the following command, but after some time the process gets killed and the console shows the message "Killed."
mysql --user=username --password -h host -e "select * from db.table_name" >> output_file.txt
My primary goal is to copy data from MySQL to Redshift, which I am doing by getting all the data comma-delimited into a text file, uploading it to S3, and executing a COPY query on Redshift.
P.S. For small tables the above command works properly, but not for large tables.
You could try mysqldump instead. It can be parameterized to output CSV, but the field and line options only take effect together with --tab, which makes the server write one .sql and one .txt file per table into the given directory (so that directory must be writable by the MySQL server). I haven't tried this myself, so you might want to check the docs, but this should work:
mysqldump --user=username --password -h host \
--tab=/tmp --fields-terminated-by="," \
--fields-enclosed-by="\"" --lines-terminated-by="\n" \
dbname tablename
This writes /tmp/tablename.txt (the CSV data) on the server host.
If that does not work, you could try SELECT INTO OUTFILE. You will need to do that directly on the MySQL host like this:
SELECT * INTO OUTFILE '/tmp/data.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\' LINES TERMINATED BY '\n'
FROM db.table_name;
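If the export has to run from the remote client after all, the "Killed." message usually means the mysql client buffered the whole result set in memory until the operating system killed the process. The client's --quick option makes it stream rows one at a time instead of buffering; the output is still tab-separated:
mysql --quick --user=username --password -h host -e "select * from db.table_name" >> output_file.txt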

How to add quoting options to a MySQL query designed to output a CSV

I am trying to output the result of an SQL query into a CSV file, but I can't add the two options needed to set it up. Here is the part of the command that works well:
mysql --host=localhost --user=root --password=pass --quick -e 'SELECT * FROM DB.TABLE' > '/stupidpath withaspace/stuff/myrep/export.csv'
I would like to add these two options to the command, but there is something about the quoting that I don't get:
FIELDS TERMINATED BY ','
and
ENCLOSED BY '"'
How can I integrate this?
Probably the easiest way is to put your exporting SQL in a separate file and then feed that into mysql. The SQL file, exporter.sql, would look like this:
SELECT * INTO OUTFILE '/stupidpath withaspace/stuff/myrep/export.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
FROM DB.TABLE;
And then run it with:
mysql --host=localhost --user=root --password=pass --quick < exporter.sql
Putting the SQL in a separate file avoids the usual escaping and quoting problems of trying to send quotes into something from the shell.
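If you would rather keep everything inline, wrapping the whole query in double quotes and escaping the inner double quote also works (a sketch; note that INTO OUTFILE writes the file on the database server, so the path must be writable there):
mysql --host=localhost --user=root --password=pass --quick -e "SELECT * INTO OUTFILE '/stupidpath withaspace/stuff/myrep/export.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '\"' FROM DB.TABLE"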