Creating Sql File From Sql Query - mysql

I have a huge amount of data in MariaDB. I need to create dump files from queries. So far I have something like this:
SELECT DISTINCT uretici INTO OUTFILE '/home/admin/web/example.com/public_html/public/manufacturers.sql' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' FROM data;
This gives me a file that can't be imported, since INTO OUTFILE writes delimited data rather than SQL statements. I need an importable SQL file. What can I do about it?

You can use the following command:
mysqldump -u [database_user] -p [database_name] | gzip > [filename_to_compress.sql.gz]
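If you only need an importable file for part of a table rather than the whole database, mysqldump works at the table level but accepts a --where filter and emits INSERT statements. A rough sketch using the data table and uretici column from the question (the filter condition is only an illustration):
mysqldump -u [database_user] -p --where="uretici IS NOT NULL" [database_name] data > /home/admin/web/example.com/public_html/public/manufacturers.sql
And the compressed dump from the command above can be imported again with:
gunzip < [filename_to_compress.sql.gz] | mysql -u [database_user] -p [database_name]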

Related

MySQL - LOAD DATA LOCAL INFILE shell script not working

Can someone please assist? I am new to MySQL, and I have noticed that LOAD DATA LOCAL INFILE does not work in the MySQL event scheduler to update my databases from a normal .csv file.
So I'm trying to set up a cron job in Linux to run a shell script that does the LOAD DATA INFILE into my databases, but I'm getting errors with the following shell script. Please help me correct it; my script layout is below...
#!/bin/bash
mysql -u root -p xxxxxxx testdb --local_infile=1 -e"LOAD DATA LOCAL INFILE '/mnt/mysqldb/mysqldb-new/mysql/CK-BATCH-FTP/Acelity/activity.csv'
INTO TABLE acelity_activity
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Thank you for helping me
You must escape the double quote in ENCLOSED BY, and I think you are missing a closing double quote at the end of your original code:
#!/bin/bash
mysql -u root -p xxxxxxx testdb --local_infile=1 -e"LOAD DATA LOCAL INFILE '/mnt/mysqldb/mysqldb-new/mysql/CK-BATCH-FTP/Acelity/activity.csv'
INTO TABLE acelity_activity
FIELDS TERMINATED BY ';'
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;"
And you should test the command in Workbench or phpMyAdmin.
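Another option that sidesteps the escaping problem entirely is to feed the statement to mysql through a quoted here-document, so the inner double quote passes through untouched. A sketch reusing the same paths and table (the password is joined to -p here so the script can run unattended):
#!/bin/bash
# the quoted 'SQL' delimiter keeps the body literal, so no \" escaping is needed
mysql -u root -pxxxxxxx testdb --local_infile=1 <<'SQL'
LOAD DATA LOCAL INFILE '/mnt/mysqldb/mysqldb-new/mysql/CK-BATCH-FTP/Acelity/activity.csv'
INTO TABLE acelity_activity
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
SQL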
Thanks for the help @nbk.
So I modified the script above a little, and the following worked for me (I removed the DB name from the mysql command and added it in the query):
mysql -u root -pxxxxxxx -e"LOAD DATA LOCAL INFILE '/mnt/mysqldb/mysqldb-new/mysql/CK-BATCH-FTP/File/XXXX.csv'
INTO TABLE <DB NAME>.<TABLE NAME>
FIELDS TERMINATED BY ';'
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;"
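To finish the cron part of the original question, a minimal sketch would be to save the working command above as a script and point a crontab entry at it; the script path and the 02:00 schedule are assumptions:
# save the mysql command above as /usr/local/bin/load_activity.sh, then:
chmod +x /usr/local/bin/load_activity.sh
# crontab -e; run the import every day at 02:00 and keep a log
0 2 * * * /usr/local/bin/load_activity.sh >> /var/log/load_activity.log 2>&1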
If you are using MySQL 8, then this will help you.
mysql -udbuser -pXXXXX -h host-pc test1 --local_infile=1 -e "LOAD DATA LOCAL INFILE '/home/abc/data.csv' INTO TABLE tempTable FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\n' (col1, col2);";
Make sure there is no space between -u and the username or between -p and the password, i.e. they should be written exactly as in the command above.
Also, ENCLOSED BY needs to be written exactly in the format shown.
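One more MySQL 8 detail: the server-side local_infile variable is also disabled by default, so the client option alone may not be enough. Checking and enabling it (with a sufficiently privileged account) looks like this:
-- check whether the server allows LOAD DATA LOCAL
SHOW GLOBAL VARIABLES LIKE 'local_infile';
-- enable it for all clients
SET GLOBAL local_infile = 1;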

How to export a MySQL table from the terminal

I'm trying to bypass the --secure-file-priv option that MySQL has enabled, in order to export a table into a .csv file.
I've been running SELECT * FROM final_table INTO OUTFILE 'test.csv' FIELDS ENCLOSED BY '"' TERMINATED BY ';' ESCAPED BY '"' LINES TERMINATED BY '\r\n'; in the terminal, but it keeps returning "The MySQL server is running with the --secure-file-priv option so it cannot execute this statement".
Previously, I got around the same error when importing files by running LOAD DATA LOCAL INFILE instead of LOAD DATA INFILE, and I was wondering whether there is a way to do this for exporting as well.
Thanks!
Answering my own question since I found a solution that might help out other people.
mysql -u root -p [database] -e '[SELECT * FROM TABLE_NAME]' > data_export.csv
got me exactly what I needed without disabling secure-file-priv!
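One caveat: mysql -e writes tab-separated columns with a header row, not real CSV. If actual commas are needed, a quick-and-dirty sketch is to pipe the output through sed (this assumes GNU sed and data that contains no tabs, commas, or quotes of its own):
mysql -u root -p [database] -e 'SELECT * FROM TABLE_NAME' | sed 's/\t/,/g' > data_export.csv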

How do I choose which DB to use?

I want to insert data from a CSV file. I want to do this by having this line in a bash script:
mysql -uusername -ppassword "LOAD DATA LOCAL INFILE 'CSVname.csv' INTO TABLE table_name FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"
But I'm not sure where in this string I can select which DB to use, only which table.
I get it to work if I enter it manually with "use dbname" and then LOAD DATA, etc.
Can anyone please help?
Thanks in advance!
A simple look at the manual would have helped you here; it did me.
All you need to do is add the database name in front of the table name like this:
mysql -uxxx -pyyy -e "LOAD DATA LOCAL INFILE 'CSVname.csv' INTO TABLE
databasename.table_name FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"
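Alternatively, you can keep the table name bare and pass the database as the usual positional argument to mysql; a sketch with the same placeholder credentials:
mysql -uxxx -pyyy databasename -e "LOAD DATA LOCAL INFILE 'CSVname.csv' INTO TABLE table_name FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"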

cron job - how to export a data table to a CSV/XLS file?

I'm trying to configure a cron job to automatically export a data table from a MySQL DB to a CSV/XLS file every day,
and I'm stuck on the command field.
Which command should I use?
Thank you.
mor
This is an old topic, but for posterity: what I've found works in this case is to simply write the SQL in a separate file, then source that file into mysql. That way, you don't have to worry about quotes, escapes, etc.
So, for example, I would have a SQL file called exportinventory.sql, like so:
SELECT *
FROM inventory INTO OUTFILE '/tmp/inventory.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
And then, from the command line:
mysql -u user -p database < exportinventory.sql
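Since the whole point here is cron, note that -p with an interactive prompt will not work from a scheduled job. One sketch is to keep the credentials in a ~/.my.cnf (chmod 600) and schedule the plain command; the file contents, path, and 06:00 schedule below are assumptions:
# ~/.my.cnf
[client]
user=user
password=yourpassword

# crontab -e; export every day at 06:00
0 6 * * * mysql database < /path/to/exportinventory.sql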
There are three steps to accomplish this:
1. Write a small program that exports the data as you want (PHP, Java, ...).
2. Write a bash script that properly sets the environment and calls this program.
3. Install this script in a user's crontab.
You might find information on all three over here.
You can try something like:
mysql -h $DB_HOST -u $DB_USER --password=$DB_PASSWORD $DB_NAME -e "$YOUR_QUERY INTO OUTFILE '$PATH_TO_OUTPUT_FILE' FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'"

How to add options containing quotes to a MySQL query designed to output a CSV

I am trying to export the result of an SQL query to a CSV file, but I can't manage to add the two options I need. Here is the part of the command that works well:
mysql --host=localhost --user=root --password=pass --quick -e 'SELECT * FROM DB.TABLE' > '/stupidpath withaspace/stuff/myrep/export.csv'
I would like to add these two options to the query, but there is something about the quoting that I don't get:
FIELDS TERMINATED BY ','
and
ENCLOSED BY '"'
How can I integrate this?
Probably the easiest way is to put your exporting SQL in a separate file and then feed that into mysql. The SQL file, exporter.sql, would look like this:
SELECT * INTO OUTFILE '/stupidpath withaspace/stuff/myrep/export.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
FROM DB.TABLE;
And then run it with:
mysql --host=localhost --user=root --password=pass --quick < exporter.sql
Putting the SQL in a separate file avoids the usual escaping and quoting problems of trying to pass quotes through the shell.