I want to output some fields into a file using this query:
SELECT
CONCAT('[',
GROUP_CONCAT(
CONCAT(CHAR(13), CHAR(9), '{"name":"', name, '",'),
CONCAT('"id":', CAST(rid AS UNSIGNED), '}')
),
CHAR(13), ']')
AS json FROM `role`
INTO OUTFILE '/tmp/roles.json'
In the output file I'm getting something like this:
[
\ {"name":"anonymous user","rid":1},
\ {"name":"authenticated user","rid":2},
\ {"name":"admin","rid":3},
\ {"name":"moderator","rid":4}
]
As you can see, the newlines (CHAR(13)) have no backslashes, but the tab characters (CHAR(9)) do. How can I get rid of them?
UPDATE
Sundar G gave me a hint, so I modified the query to this:
SELECT
CONCAT('"name":', name),
CONCAT('"rid":', rid)
INTO outfile '/tmp/roles.json'
FIELDS TERMINATED BY ','
LINES STARTING BY '\t{' TERMINATED BY '},\n'
FROM `role`
I don't know why, but this syntax strips the backslashes from the output file:
{"name":"anonymous user","rid":1},
{"name":"authenticated user","rid":2},
{"name":"admin","rid":3},
{"name":"moderator","rid":4}
This is already pretty nice output, but I would also like to add opening and closing square brackets at the beginning and end of the file. Can I do this by means of MySQL syntax, or do I have to do that manually?
As described in SELECT ... INTO Syntax:
The syntax for the export_options part of the statement consists of the same FIELDS and LINES clauses that are used with the LOAD DATA INFILE statement. See Section 13.2.6, “LOAD DATA INFILE Syntax”, for information about the FIELDS and LINES clauses, including their default values and permissible values.
That referenced page says:
If you specify no FIELDS or LINES clause, the defaults are the same as if you had written this:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY ''
and later explains:
For output, if the FIELDS ESCAPED BY character is not empty, it is used to prefix the following characters on output:
The FIELDS ESCAPED BY character
The FIELDS [OPTIONALLY] ENCLOSED BY character
The first character of the FIELDS TERMINATED BY and LINES TERMINATED BY values
ASCII 0 (what is actually written following the escape character is ASCII “0”, not a zero-valued byte)
If the FIELDS ESCAPED BY character is empty, no characters are escaped and NULL is output as NULL, not \N. It is probably not a good idea to specify an empty escape character, particularly if field values in your data contain any of the characters in the list just given.
Therefore, since you have not explicitly specified a FIELDS clause, any occurrence of the default TERMINATED BY character (i.e. tab) within a field will be escaped by the default ESCAPED BY character (i.e. backslash): that is why the tab character you are generating gets escaped. To avoid that, explicitly specify either a different field termination character or an empty escape character.
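For example, a minimal sketch that keeps the GROUP_CONCAT query from the question unchanged and only adds an empty escape clause (same table, columns and path as above):
SELECT
CONCAT('[',
GROUP_CONCAT(
CONCAT(CHAR(13), CHAR(9), '{"name":"', name, '",'),
CONCAT('"id":', CAST(rid AS UNSIGNED), '}')
),
CHAR(13), ']')
AS json FROM `role`
INTO OUTFILE '/tmp/roles.json'
FIELDS ESCAPED BY ''   -- empty escape character: the embedded tabs are written as-is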
However, you should also note that the size of your results will be limited by group_concat_max_len. Perhaps a better option would be:
SELECT json FROM (
SELECT 1 AS sort_col, '[' AS json
UNION ALL
SELECT 2, CONCAT('\t{"name":', QUOTE(name), ',"id":', CAST(rid AS UNSIGNED), '}')
FROM role
UNION ALL
SELECT 3, ']'
) t
ORDER BY sort_col
INTO OUTFILE '/tmp/roles.json' FIELDS ESCAPED BY ''
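If you would rather keep the GROUP_CONCAT approach, the limit can be raised for the current session before running the query (1 MB below is just an arbitrary example value):
-- raise the per-session cap on GROUP_CONCAT results, then verify it
SET SESSION group_concat_max_len = 1024 * 1024;
SHOW VARIABLES LIKE 'group_concat_max_len';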
Try a query like this:
SELECT your_fields
INTO outfile '/path/file' fields enclosed by '"' terminated by ',' lines terminated by '\n'
FROM table;
Hope this works.
There is a manual option to export the SQL data into CSV, but I am looking for a SQL command that can save the data directly into CSV.
SELECT ... INTO OUTFILE 'file_name'
For example:
SELECT customer_id, firstname, surname INTO OUTFILE '/exportdata/customers.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM customers;
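Note that on many installations the server will only write OUTFILE results under the directory set by secure_file_priv (and the target file must not already exist); you can check the current setting with:
SHOW VARIABLES LIKE 'secure_file_priv';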
MYSQL TO CSV COMMAND
SELECT fields
FROM table
WHERE conditions
INTO OUTFILE 'PATH AND FILE'
FIELDS ENCLOSED BY '"'
TERMINATED BY ';'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n';
The CSV file contains one line per row of the result set. Each line is terminated by a carriage return followed by a line feed, as specified by the LINES TERMINATED BY '\r\n' clause. Each line contains the values of each column of the row.
Each value is enclosed in double quotation marks, as indicated by the FIELDS ENCLOSED BY '"' clause. This prevents a value that contains a comma (,) from being interpreted as a field separator: when values are enclosed in double quotation marks, commas inside a value are not recognized as field separators.
More options (headers, timestamp columns, etc.):
https://www.mysqltutorial.org/mysql-export-table-to-csv/
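A sketch of the header-row pattern described at that link, reusing the customers example from above (the output path here is made up; adjust the names to your own table):
-- emit the column names as a literal first row, then the data
SELECT 'customer_id', 'firstname', 'surname'
UNION ALL
SELECT customer_id, firstname, surname
FROM customers
INTO OUTFILE '/exportdata/customers-with-header.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';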
Another option is the accepted answer to this Stack Overflow question:
stackoverflow.com/questions/3760631/mysql-delimiter-question
DB2 to CSV
db2 CALL SYSPROC.ADMIN_CMD( 'EXPORT TO "C:\UTILS\export.csv" OF DEL MESSAGES ON SERVER SELECT * FROM FASTNET.WLOOKUPTABLEENTRIES' )
I have a huge CSV file with 149 columns and 25K+ rows. To upload this file into a MySQL table I am using the MySQL LOAD DATA query.
My query is:
LOAD DATA local INFILE '/Dir/file.csv' INTO TABLE my_table FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\n' IGNORE 1 LINES
My query works fine; the problem comes when the file contains any backslash (\) characters: the column values get disturbed and the cell values are not inserted into the correct columns. Is there any fix for this issue?
Thanks
LOAD DATA local INFILE '/Dir/file.csv' INTO TABLE my_table FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\b' LINES TERMINATED BY '\n' IGNORE 1 LINES
Use '\b' as the escape character instead of '\\' in your query, so the backslashes in your data are no longer interpreted as escape characters.
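Alternatively (a sketch): specifying an empty escape character disables escaping entirely, so the backslashes in the data are loaded literally:
LOAD DATA LOCAL INFILE '/Dir/file.csv' INTO TABLE my_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY ''   -- no escape character at all
LINES TERMINATED BY '\n'
IGNORE 1 LINES;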
I am trying to import a CSV file into a MySQL table, but I need to remove the first two characters of a particular column before importing it into MySQL.
This is my statement:
string strLoadData = "LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets FIELDS terminated by ',' ENCLOSED BY '\"' lines terminated by '\n' IGNORE 1 LINES (SiteId,DateTime,Serial,DeviceId,AgentAID,VehicleRegistration,CarPark,SpaceNumber,GpsAddress,VehicleType,VehicleMake,VehicleModel,VehicleColour,IssueReasonCode,IssueReason,NoticeLocation,Points,Notes)";
The column IssueReasonCode has data like 'LU12', but I need to remove the first 2 characters so that it holds only integers and nothing alphanumeric.
I need to remove 'LU' from that column.
Is it possible to write something like left(IssueReasonCode +' '2)? This column is varchar(45) and can't be changed now because of the large amount of data in it.
Thanks
LOAD DATA INFILE has the ability to perform a function on the data for each column as you read it in (q.v. here). In your case, if you wanted to remove the first two characters from the IssueReasonCode column, you could use:
RIGHT(IssueReasonCode, CHAR_LENGTH(IssueReasonCode) - 2)
to remove the first two characters. You specify such column mappings at the end of the LOAD DATA statement using SET. Your statement should look something like the following:
LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets
FIELDS terminated by ','
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(SiteId, DateTime, Serial, DeviceId, AgentAID, VehicleRegistration, CarPark, SpaceNumber,
GpsAddress, VehicleType, VehicleMake, VehicleModel, VehicleColour, IssueReasonCode,
IssueReason, NoticeLocation, Points, Notes)
SET IssueReasonCode = RIGHT(IssueReasonCode, CHAR_LENGTH(IssueReasonCode) - 2)
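You can sanity-check the expression on a sample value first:
SELECT RIGHT('LU12', CHAR_LENGTH('LU12') - 2);   -- returns '12'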
Referencing this and quoting this example, you can try the below to see if it works:
User variables in the SET clause can be used in several ways. The following example uses the first input column directly for the value of t1.column1, and assigns the second input column to a user variable that is subjected to a division operation before being used for the value of t1.column2:
LOAD DATA INFILE 'file.txt' INTO TABLE t1 (column1, @var1)
SET column2 = @var1/100;
Applied to your statement:
string strLoadData = "LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets FIELDS terminated by ',' ENCLOSED BY '\"' lines terminated by '\n' IGNORE 1 LINES (SiteId,DateTime,Serial,DeviceId,AgentAID,VehicleRegistration,CarPark,SpaceNumber,GpsAddress,VehicleType,VehicleMake,VehicleModel,VehicleColour,@IRC,IssueReason,NoticeLocation,Points,Notes) SET IssueReasonCode = SUBSTR(@IRC, 3);";
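For readability, the same statement outside the C# string literal would look roughly like this (the @IRC user variable captures the raw value, and SUBSTR(@IRC, 3) drops the first two characters):
LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(SiteId, DateTime, Serial, DeviceId, AgentAID, VehicleRegistration, CarPark, SpaceNumber,
GpsAddress, VehicleType, VehicleMake, VehicleModel, VehicleColour, @IRC,
IssueReason, NoticeLocation, Points, Notes)
SET IssueReasonCode = SUBSTR(@IRC, 3);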
I have a table which has int and string data types in it. I need to export the data and retain the data types. The method I am using to export the data right now puts quotation marks around all of the data:
SELECT * FROM passwd INTO OUTFILE '/tmp/tutorials.txt'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
How can I get the double quotes only around the varchar columns?
How about
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
If you specify OPTIONALLY, the ENCLOSED BY character is used only to enclose values from columns that have a string data type (such as CHAR, BINARY, TEXT, or ENUM)
Ref https://dev.mysql.com/doc/refman/5.7/en/load-data.html
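Applied to the export from the question, that would look something like this (same table and path as above):
SELECT * FROM passwd INTO OUTFILE '/tmp/tutorials.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'   -- only string columns get quoted
LINES TERMINATED BY '\r\n';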
I am trying to export some query results to a CSV or TAB-delimited file. One of the fields is a text blob which includes special characters, possibly including single and double quotation marks (", '), newlines (\n) and tabs (\t).
+------+------+--------------------------------------------------------+
| col1 | col2 | text |
+------+------+--------------------------------------------------------+
| 1 | foo | Oh hey why not "this" or t'is |
or a newline while we are at it |
+------+------+--------------------------------------------------------+
This is the query I am using, with \t instead of , for TAB delimited files.
SELECT col1, col2, text
FROM mytable
INTO OUTFILE '/tmp/foo.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
This works with neither CSV nor TAB-delimited files, as any commas, quotes or tabs in the text field end up splitting the text blob across multiple lines and/or columns in the exported file.
Actual question:
Is there any way to escape special characters like ", ', \t, \n in the text field and write to a CSV or TAB file, or do these need to be replaced before attempting to write to a file?
If these should be replaced, I would try to start with the code in this question using the REPLACE function, but I'd prefer something that preserves the original text.
Thanks.
Well, I didn't figure out how to escape all special characters, so here's a solution using REPLACE on the offending characters:
SELECT col1, col2,
replace(replace(replace(text, '\n', 'n'), ',', '\,'), '\"', '\'') text
FROM mytable
INTO OUTFILE '/tmp/foo.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Just delete all of the special (non-printable) characters using a regular expression; REGEXP_REPLACE is available in MySQL 8.0 and later:
SELECT col1, col2,
REGEXP_REPLACE(text, '[^\\x20-\\x7E]', '') text
FROM mytable
INTO OUTFILE '/tmp/foo.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
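To see what the pattern does, here is a quick check on a made-up sample value (tabs and newlines fall outside the printable \x20-\x7E range, so they are stripped):
SELECT REGEXP_REPLACE('Oh hey\twhy not "this"\nor t''is', '[^\\x20-\\x7E]', '') AS cleaned;
-- returns: Oh heywhy not "this"or t'is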