Simple mysql into text file: How to avoid the first line being written? - mysql

This is a simple .bat file which currently 'works'; I'm looking to avoid having the field name as the first line in the text file.
C:\xampp\mysql\bin\mysql -u sample -pnotpass --database test -e "SELECT url FROM single WHERE fold = 'bb' AND dir= 'test_folder' ; " > test123.txt
Obviously, this isn't a Windows-related question; is there a way to ask MySQL to print only the results and skip the field name?
Thanks

There's a "column-names" paramater which defaults to true. Just set it to false.
C:\xampp\mysql\bin\mysql -u sample -pnotpass --database test --column-names=false -e "SELECT url FROM single WHERE fold = 'bb' AND dir= 'test_folder' ; " > test123.txt

It depends on the precise format you require, but I'm guessing that using SELECT ... INTO OUTFILE would be a step in the right direction. It would also remove the need to redirect the output to a file, although you'll need to delete that file once you've finished with it (or at the start of the batch file); otherwise MySQL will spit tacks the next time it tries to create it and discovers it already exists.
For example, to create a CSV style file you could use:
SELECT url INTO OUTFILE "\\wherever\\test123.txt"
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY "\n"
FROM single WHERE fold = 'bb' AND dir= 'test_folder' ;
Incidentally, the user in question must have the FILE privilege for INTO OUTFILE to work.
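For instance, a rough .bat sketch along those lines (untested here; the C:/wherever/ output path is just a placeholder that the MySQL server account must be able to write to) deletes the old file before re-running the query:
@echo off
rem INTO OUTFILE refuses to overwrite, so remove any previous output first.
if exist "C:\wherever\test123.txt" del "C:\wherever\test123.txt"
rem Forward slashes in the OUTFILE path sidestep backslash-escape surprises in the SQL string.
C:\xampp\mysql\bin\mysql -u sample -pnotpass --database test -e "SELECT url INTO OUTFILE 'C:/wherever/test123.txt' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' FROM single WHERE fold = 'bb' AND dir = 'test_folder'"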


Export a table as csv in mysql from shell script [duplicate]

I am trying to export a result set into a CSV file and load it back into MySQL.
mysql -e "select * from temp" > '/usr/apps/{path}/some.csv'
The output file is not importable. It has the query, the headers and a bunch of unwanted lines. All I want is just the comma-delimited values in the file, so that I can import it back.
What have I tried so far?
Added | sed 's/\t/,/g' - Did not help
Tried OUTFILE but it did not work.
Tried SHOW VARIABLES LIKE "secure_file_priv" which gave null.
OUTFILE will not work for me because I get the error "The MySQL server is running with the --secure-file-priv option so it cannot execute this statement". I cannot edit the secure-file-priv variable, and it is currently NULL.
The file output I get looked like the attached screenshot (not reproduced here). I used the alias mysql2csv='sed '\''s/\t/","/g;s/^/"/;s/$/"/;s/\n//g'\'''
This page shows you how to export to a CSV using the command line:
https://coderwall.com/p/medjwq/mysql-output-as-csv-on-command-line
From that page:
# add alias to .bashrc
alias mysql2csv='sed '\''s/\t/","/g;s/^/"/;s/$/"/;s/\n//g'\'''
$ mysql <usual args here> -e "SELECT * FROM foo" | mysql2csv > foo.csv
Since you're trying things, why not try something like the example given in the MySQL Reference Manual?
https://dev.mysql.com/doc/refman/8.0/en/select-into.html
Excerpt:
Here is an example that produces a file in the comma-separated values (CSV) format used by many programs:
SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
From a shell script, if we are going to have literal SQL in the script, we may need to take care with the single-quote and backslash characters. For example, the \n (backslash n) needs to be sent to the MySQL server as part of the SQL text, so we have to be careful that the backslash doesn't get swallowed by the shell.
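One way to do that, sketched here with placeholder connection details: quote the heredoc delimiter (<<'SQL') so the shell performs no expansion on the body and the backslash in \n reaches MySQL untouched.
#!/bin/bash
# Quoted heredoc delimiter: no parameter expansion, no backslash processing,
# so the SQL below arrives at the server exactly as written.
mysql -u someuser -p somedb <<'SQL'
SELECT a, b, a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
SQL
The trade-off is that a quoted delimiter also stops shell variables from expanding inside the SQL.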

Loading multiple csv files to MariaDB/MySQL through bash script

After trying for a full day, I'm hoping someone here can help me make the script below work. I've combined information from multiple threads (example) and websites, but can't get it to work.
What I'm trying to do:
I'm trying to get a MariaDB 10 database called 'stock_db' on my Synology NAS to load all *.csv files from a specific folder (where I save downloaded historical stock prices) and add them to a table called 'prices'. The files are all named following the pattern "price_history_<isin>.csv".
The SQL statement below works when I run it individually from HeidiSQL on my Windows machine:
Working SQL
LOAD DATA LOW_PRIORITY LOCAL INFILE 'D:\\Downloads\\price_history_NL0010366407.csv'
IGNORE INTO TABLE `stock_db`.`prices`
CHARACTER SET utf8
FIELDS TERMINATED BY ';'
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 2 LINES
(@vdate, @vprice)
SET
isin = 'NL0010366407',
date = STR_TO_DATE(@vdate, '%d-%m-%Y'),
price = @vprice
;
The issue
Unfortunately, when I try to batch-load all CSVs from a folder on my NAS through the script below, I keep getting the same error.
#!/bin/bash
for filename in ./price_history/*.csv; do
echo $filename
isin=${filename:30:12}
echo $isin
/volume1/@appstore/MariaDB10/usr/local/mariadb10/bin/mysql -u root -p \
"LOAD DATA LOW_PRIORITY LOCAL INFILE '$filename'\
IGNORE INTO TABLE 'stock_db.prices'\
CHARACTER SET utf8\
FIELDS TERMINATED BY ';'\
OPTIONALLY ENCLOSED BY '"'"'"'\
ESCAPED BY '"'"'"'\
LINES TERMINATED BY '\r\n'\
IGNORE 2 LINES (@vdate, @vprice)\
SET\
isin = '$isin',\
date = STR_TO_DATE(@vdate, '%d-%m-%Y'),\
price = @vprice;"
done
ERROR 1102 (42000): Incorrect database name
What I've tried
Took the database name out of stock_db.prices and mentioned it separately as [database] outside of the quoted SQL statement - Doesn't work
Changed quotes around 'stock_db.prices' in many different ways - Doesn't work
Moved the SQL into a separate file and referenced it with '< stmt.sql' - Complicates things even further and I couldn't get it to work at all (although probably preferred)
Considered (or even preferred) using a PREPARE statement, but seems I can't use this in combination with LOAD DATA (reference)
Bonus Question
If someone can help me do this without having to re-enter the user's password or put the password in the script, that would be a really nice bonus!
Update
Got the 'Incorrect Database Error' resolved by adding the '-e' option.
Now I have a new error on the csv files:
ERROR 13 "Permission Denied"
This is despite the folder and files having full access for everyone.
Does anyone have any thoughts on this?
Thanks a lot!
Try setting the database with the -D option: change the first line to
/volume1/@appstore/MariaDB10/usr/local/mariadb10/bin/mysql -D stock_db -u root -p \ ...
You may have an error in this line IGNORE INTO TABLE 'stock_db.prices'\ - try to remove the single quotes.
Create a file called .my.cnf in your user's home directory and put the following information into it:
[client]
password="my password"
Info about option files.
'stock_db.prices'
Incorrect quoting. This will work, since neither name is a keyword:
stock_db.prices
This will also work:
`stock_db`.`prices`
Note that the db name and the table name are quoted separately, using backticks.
I can't predict what will happen with this nightmare:
'"'"'"'

Running mysql command in a while loop

I have a list of MySQL databases, and I wish to export a couple of tables from the databases whose names end in "_abc" using a Linux script. However, I keep getting the error "Unexpected end of file".
This is my script, where DB_FILE is a text file containing the database names of those that end in "_abc".
while read db_name;
do
OUT_FILENAME="${db_name}_table.csv"
mysql -u$MYSQLUSER -p$MYSQLPASS -h $MYSQLHOST $db_name << EOFMYSQL
SELECT * FROM table_abc INTO OUTFILE '$OUT_FILENAME' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
EOFMYSQL
done < $DB_FILE
rm -f $DB_FILE
If I remove the three mysql lines, the script runs without error.
Another question: if I have two SELECT queries, do I have to run each query separately, i.e. each enclosed by its own EOFMYSQL markers, or can I have both queries within one EOFMYSQL block?
I think the closing EOFMYSQL must have no spaces before it: don't indent that one line. The corrected script below differs from the one above only in that the EOFMYSQL line starts in the first column.
while read db_name;
do
OUT_FILENAME="${db_name}_table.csv"
mysql -u$MYSQLUSER -p$MYSQLPASS -h $MYSQLHOST $db_name << EOFMYSQL
SELECT * FROM table_abc INTO OUTFILE '$OUT_FILENAME' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
EOFMYSQL
done < $DB_FILE
rm -f $DB_FILE
In a suitable text editor this shows up easily; here it was Notepad++ (the screenshots of the right and wrong indentation are not reproduced here).
$OUT_FILENAME will not expand inside single quotes.
I guess you could put the query inside an array like below:
MYQUERY=( SELECT \* FROM table_abc INTO OUTFILE "$OUT_FILENAME" FIELDS TERMINATED BY \",\" LINES TERMINATED BY \"\\n\"\; )
and then change the mysql command to:
mysql -u"$MYSQLUSER" -p"$MYSQLPASS" -h "$MYSQLHOST" -e "${MYQUERY[@]}" "$db_name"
Sidenotes
I have double quoted all the variables to avoid word splitting.
Note that we want to recreate the actual query here, so you need to escape the characters that have special meaning to the shell, like \*, \" and so on.

Load multiple CSV's into MySQL using shell script

Inside a directory /mylog/ I have a bunch of CSV files. Each CSV file only has one line of data. I need this data to be inputted into a MySQL Database.
An example of a line would be:
2015-08-14 00:00:00,HOSTNAME,10271kB,17182kB,92874kB,10%,/dev/disk1,/
I need to remove the 'kB' from each file-size field and the '%' from the percentage field. I also need to make sure that the datetime and hostname combination is always unique, so that no duplicate entries are ever inserted.
This is what I have written so far, but I'm obviously still missing the database name to use and the removal of the kB and %. If there's anything else wrong or missing, let me know. There's also the fact that mysql is called once per file; is there a way to do multiple LOAD DATA statements in one go?
Shell script:
#!/bin/bash
for f in /var/log/mylog/*.csv
do
mysql -e "load data local infile '"$f"' into table myTable fields TERMINATED BY ',' LINES TERMINATED BY '\n'" -u myUser --password=myPassword --local-infile
done
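No answer was recorded for this one, but here is a rough sketch of one way to go about it (the table layout below is invented to match the sample line, so adjust names and types to your schema): put a unique key over the datetime and hostname columns, read the size and percentage fields into user variables, and strip the suffixes with REPLACE(); the IGNORE keyword then silently skips any row that would duplicate the key.
-- Hypothetical table; the UNIQUE KEY is what blocks duplicate (datetime, hostname) rows.
CREATE TABLE myTable (
  logged_at DATETIME NOT NULL,
  hostname  VARCHAR(64) NOT NULL,
  size1_kb  INT UNSIGNED,
  size2_kb  INT UNSIGNED,
  size3_kb  INT UNSIGNED,
  used_pct  TINYINT UNSIGNED,
  device    VARCHAR(64),
  mount     VARCHAR(255),
  UNIQUE KEY uniq_time_host (logged_at, hostname)
);
#!/bin/bash
# Sketch: -D names the database, REPLACE() strips the kB / % suffixes,
# IGNORE skips rows that collide with the unique key.
for f in /var/log/mylog/*.csv
do
  mysql -D myDatabase -u myUser --password=myPassword --local-infile -e "
    LOAD DATA LOCAL INFILE '$f'
    IGNORE INTO TABLE myTable
    FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
    (logged_at, hostname, @s1, @s2, @s3, @pct, device, mount)
    SET size1_kb = REPLACE(@s1, 'kB', ''),
        size2_kb = REPLACE(@s2, 'kB', ''),
        size3_kb = REPLACE(@s3, 'kB', ''),
        used_pct = REPLACE(@pct, '%', '');"
done
To avoid starting mysql once per file, you could also append the LOAD DATA statements to one temporary .sql file inside the loop and pipe it to a single mysql invocation at the end.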

MySQL Results to a File

How do I write the results from a mysql query to file? I just need something quick. Output can be CSV, XML, HTML, etc.
SELECT a,b,a+b
FROM test_table
INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(the docs show INTO OUTFILE immediately after the SELECT list, which may work as well, but I've never tried it that way)
http://dev.mysql.com/doc/refman/5.0/en/select.html
INTO OUTFILE creates a file on the server; if you are on a client and want it there, do:
mysql -u you -p -e "SELECT ..." > file_name
If you have phpMyAdmin installed, it is a no-brainer: run the query (I haven't got a copy loaded, so I can't tell you the details, but it really is easy) and check near the bottom for export options. CSV will be listed, but I think you can also have SQL if you like :)
phpMyAdmin will give CSV in Excel's dialect, which is probably what you want...
You can use MySQL Query Browser to run the query and then just go to File -> Export Resultset and choose the output format. The options are CSV, HTML, XML, Excel and PLIST.