I connect to mysql from my Linux shell and use something like this:
SELECT * FROM students INTO OUTFILE '/tmp/students'.
Why do I see \N at the end of some lines? I want each record on its own row, but why is the \N printed explicitly?
How can I print all column headers in the first row?
SELECT ... INTO OUTFILE exports the result in a rather MySQL-specific delimited format. \N means a NULL value, not end-of-line.
Run, for example, from the command line:
echo 'select * from students' | mysql mydb >/tmp/students
The documentation for SELECT shows what options you have when using INTO OUTFILE, but you can't export the headers directly that way. See the comments in that documentation for a hacky way of adding header columns, though.
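The usual trick from those comments is to UNION a row of string literals ahead of the data. A hedged sketch, assuming hypothetical columns `id` and `name` on `students`:

```sql
-- Header labels as string literals, then the data rows. Note that the
-- headers must be listed explicitly; SELECT * cannot supply them.
SELECT 'id', 'name'
UNION ALL
SELECT id, name FROM students
INTO OUTFILE '/tmp/students.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```

In practice the literal row comes out first, though UNION output order is not formally guaranteed; add a synthetic sort key if you need to be strict about it.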
Related
Is it possible to query PostgreSQL in order to get a correct CSV line? For instance, select concat(a, ',', b) from t, but with commas and quotes correctly escaped.
A couple of options.
Using psql
select * from some_table \g (format=csv) output.csv
This will create a CSV file named output.csv.
\copy cell_per to 'output.csv' WITH(format csv, header, delimiter '|');
The above lets you use the COPY options, as explained in the COPY documentation, to do things like change the delimiter, quoting, etc.
You can also use COPY directly as a query, though in that case it is important to note that COPY runs as the server user and can only write files to directories the server user has permissions on. The workaround is to direct the output to STDOUT and capture it; for instance, the Python driver psycopg2 provides cursor copy methods (e.g. copy_expert) for this.
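To see why plain concatenation with ',' is not enough, here is a minimal Python sketch (rows fabricated for illustration) using the standard csv module, which applies quoting rules compatible with common CSV consumers:

```python
import csv
import io

# Fabricated rows: the first field contains a comma, the second a quote.
# A naive concat(a, ',', b) would produce an ambiguous line for these.
rows = [("Smith, John", 'said "hi"'), ("plain", "values")]

buf = io.StringIO()
csv.writer(buf).writerows(rows)
print(buf.getvalue())
# Fields containing delimiters or quotes are enclosed in double quotes,
# and embedded double quotes are doubled.
```

This is the same escaping behavior you get for free from COPY ... WITH (format csv), which is why generating CSV server-side is usually preferable to string concatenation.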
I am trying to export a result set into a CSV file and load it back into MySQL.
mysql -e "select * from temp" > '/usr/apps/{path}/some.csv'
The output file is not importable: it contains the query, the headers, and a bunch of unwanted lines. All I want is the comma-delimited VALUES in the file, so that I can import it back.
What have I tried so far?
Added | sed 's/\t/,/g' - Did not help
Tried OUTFILE but it did not work.
Tried SHOW VARIABLES LIKE "secure_file_priv" which gave null.
OUTFILE will not work for me because I get the error "The MySQL server is running with the --secure-file-priv option so it cannot execute this statement". I cannot change the secure-file-priv variable, and it currently has a null value.
I get the file output as shown in the image below. I used the alias mysql2csv='sed '\''s/\t/","/g;s/^/"/;s/$/"/;s/\n//g'\'''
This page shows you how to export to a CSV using the command line:
https://coderwall.com/p/medjwq/mysql-output-as-csv-on-command-line
From that page:
# add alias to .bashrc
alias mysql2csv='sed '\''s/\t/","/g;s/^/"/;s/$/"/;s/\n//g'\'''
$ mysql <usual args here> -e "SELECT * FROM foo" | mysql2csv > foo.csv
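You can check what the alias does without a live server by feeding it a fabricated tab-separated sample; the printf below stands in for mysql's batch output:

```shell
# Two tab-separated rows standing in for mysql's output. The sed turns
# each tab into "," and wraps the whole line in double quotes; the final
# s/\n//g is a no-op since sed's pattern space has no trailing newline.
printf 'id\tname\n1\tAlice\n' \
  | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g'
# "id","name"
# "1","Alice"
```

Note this simple substitution does not escape double quotes that appear inside field values, so it is only safe for data you know contains none.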
Since you're trying things, why not try something like the example given in the MySQL Reference Manual?
https://dev.mysql.com/doc/refman/8.0/en/select-into.html
Excerpt:
Here is an example that produces a file in the comma-separated values (CSV) format used by many programs:
SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
From a shell script, if we embed literal SQL, we need to take care with the single-quote and backslash characters. For example, the \n (backslash-n) must be sent to the MySQL server as part of the SQL text, so we have to make sure the backslash doesn't get swallowed by the shell.
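A small sketch of the quoting issue. With the whole statement in single quotes, the backslash reaches the mysql client untouched; each embedded single quote is written as '\'' (close quote, escaped quote, reopen quote). Nothing here talks to a server; printf just shows exactly what the shell would hand to mysql:

```shell
# Build the statement inside single quotes so \n survives as two
# characters (backslash, n) for the MySQL server to interpret.
stmt='SELECT a,b,a+b INTO OUTFILE '\''/tmp/result.txt'\'' FIELDS TERMINATED BY '\'','\'' LINES TERMINATED BY '\''\n'\'' FROM test_table'
printf '%s\n' "$stmt"
# SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' FROM test_table
```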
I am trying to query Sybase using the iSQL client and export the query results to a text or CSV file with column names, but the column headings are not exported to the file. The script I tried shows an error message; below are the working script (without column headings) and the failing one. I'd appreciate any advice.
working sql:
select * from siebel.S_ORG_EXT;
OUTPUT TO 'C:\\Siebel SQLs\\Account.CSV' FORMAT TEXT
DELIMITED BY ';' QUOTE ''
Not working sql:
select * from siebel.S_ORG_EXT;
OUTPUT TO 'C:\\Siebel SQLs\\Account.CSV' FORMAT TEXT
DELIMITED BY ';' QUOTE '' WITH COLUMN NAMES;
If you are using Sybase iAnywhere, the WITH COLUMN NAMES option is not recognized by that Sybase product. Just thought I'd mention this for those like myself who have struggled with a similar issue.
HTH
You can try the following query:
SELECT * FROM siebel.S_ORG_EXT; OUTPUT TO 'C:\\Siebel SQLs\\Account.CSV' FORMAT ASCII DELIMITED BY ';' QUOTE '' WITH COLUMN NAMES;
Alternatively, you could use a different SQL client, for example Squirrel SQL, which supports JDBC connections. In other SQL clients you will need to import jconn2.jar, which is part of your local web client installation.
First off, because my MySQL user does not have FILE rights on the server, I have to use the line below to pipe my SELECT statement's output to a file from the shell, instead of doing it directly in MySQL with INTO OUTFILE and FIELDS TERMINATED BY '|', which I'm guessing would solve all my problems.
So I have the following line to grab my fields:
echo "select id, UNIX_TIMESTAMP(time), company from database.table_name" | mysql -h database.mysql.host.com -u username -ppassword user > /root/sql/output.txt
This outputs the following 3 columns:
63 1414574321 person one
50 1225271921 Another person
8 1225271921 Company with many names
10 1414574567 Person with Company
I then use that data in other scripts to do some tasks.
My issue is that some columns, of which the third here, 'company', is an example, have spaces in their data, which throws off my WHILE loops later.
I would like to add a delimiter to my output so it looks like this instead:
63|1414574321|person one
50|1225271921|Another person
8|1225271921|Company with many names
10|1414574567|Person with Company
and that way I could hopefully manipulate the data in blocks using awk -F'|' and IFS='|' later.
There are many, many more columns with variable lengths and numbers of words per column to be added once I get this working, so I cannot use a method that relies on position to add the delimiter.
I feel the delimiter needs to be set when the data is dumped in the first place.
I've tried things like:
echo "select (id, + '|' + UNIX_TIMESTAMP(time), + '|' + company) from database.table_name" | mysql -h database.mysql.host.com -u username -ppassword user > /root/sql/output.txt
without any luck; it just adds the characters to the header of the output file.
Does anyone out there see a solution to what I could do?
In case anyone wonders, I'm dumping data from 2 databases, comparing timestamps and writing back the latest data to both databases.
You could use the concat_ws function to get one concatenated string per row:
select concat_ws( '|', id, UNIX_TIMESTAMP(time) , company ) from database.table_name
Edit: Missing comma added, sorry!
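Once the dump is pipe-delimited, the downstream processing the question describes works, provided the separator is quoted so the shell does not treat | as a pipe. A small sketch with fabricated rows standing in for the concat_ws output:

```shell
# Fabricated pipe-delimited rows; -F'|' must be quoted, otherwise the
# shell would parse the bare | as a pipeline.
printf '63|1414574321|person one\n50|1225271921|Another person\n' \
  | awk -F'|' '{ print $3 }'
# person one
# Another person
```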
I'm looking to change to formatting of the output produced by the mysqldump command in the following way:
(data_val1,data_val2,data_val3,...)
to
(data_val1|data_val2|data_val3|...)
The change here being a different delimiter. This would then allow me to (in python) parse the data lines using a line.split("|") command and end up with the values correctly split (as opposed to doing line.split(",") and have values that contain commas be split into multiple values).
I've tried using the --fields-terminated-by flag, but this requires the --tab flag to be used as well. I don't want use the --tab flag as it splits the dump into several files. Does anyone know how to alter the delimiter that mysqldump uses?
This is not a good idea. Instead of using string.split() in Python, use the csv module to properly parse CSV data, which may be enclosed in quotes and may contain internal commas that aren't delimiters.
import csv
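A minimal sketch (data fabricated) of the difference between split(",") and csv.reader:

```python
import csv
import io

# A fabricated dump line whose first field contains a comma and whose
# third contains an escaped double quote.
line = '"Acme, Inc.",42,"said ""ok"""'

naive = line.split(",")                      # splits inside the quoted field
proper = next(csv.reader(io.StringIO(line)))  # honors quoting rules

print(naive)   # 4 pieces; the first field is broken in two
print(proper)  # ['Acme, Inc.', '42', 'said "ok"']
```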
MySQL dump files are intended to be used as input back into MySQL. If you really want pipe-delimited output, use the SELECT INTO OUTFILE syntax instead with the FIELDS TERMINATED BY '|' option.