I have a .txt file in UTF-8 format with information I am trying to import into an existing table that already has rows in it.
The information in the .txt file is structured like this: (the quotes are included in the .txt file)
"Bob,Smith,25,California,,,,Single,"
"John,Doe,72,Nevada,,2,1,Married,"
"Will,Smith,22,Texas,1000005,2,1,Married,"
The query I'm using is:
LOAD DATA LOCAL INFILE 'myfile.txt' INTO TABLE mytable FIELDS ENCLOSED BY '"' TERMINATED BY ',' LINES TERMINATED BY '\n'
What happens is that all of these records get inserted, but they end up like this:
Bob,null,null,null,null,null,null,null
John,null,null,null,null,null,null,null
Will,null,null,null,null,null,null,null
It's like the " is not being caught at the end, or something weird. Am I doing something wrong here?
According to the example data provided, your fields are not enclosed in quotes; rather, the whole record is.
You can use the STARTING BY option to ignore the initial quote, and the trailing one should be ignored automatically.
I think what you need is this:
LOAD DATA LOCAL INFILE 'your_file.txt' INTO TABLE your_table FIELDS TERMINATED BY ',' LINES STARTING BY '"' TERMINATED BY '\n';
The format of your text file does not make any sense. (Strangely enough, the way the importer handles it does not make any sense either. But we have no control over the importer, so let us ignore that.)
The double quotes must surround each field, not each line. The ENCLOSED BY '"' clause refers to the fields, not to the lines. So, you are telling the importer that your fields are enclosed in quotes, but you are actually enclosing your lines in quotes. As a result, the importer considers each of your lines to be a single field.
(Then the importer proceeds to further chop up your lines at the commas, which makes no sense either, since the commas are within the quotes and should therefore be ignored, so the importer is kind of brain-damaged too.)
With the FIELDS ENCLOSED BY '"' clause, EACH field value in the input file should be enclosed in a "; therefore the input file should be as follows:
"Bob","Smith","25","California","","","","Single",
"John","Doe","72","Nevada","","2","1","Married",
"Will","Smith","22","Texas","1000005","2","1","Married",
That should load the data into the correct fields.
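With the file in that shape, a statement along these lines (a sketch reusing the file and table names from the question) should load every column:
LOAD DATA LOCAL INFILE 'myfile.txt'
INTO TABLE mytable
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Note that the trailing comma on each line is read as one extra, empty field; MySQL should still load the rows but will warn that they contained more data than there were input columns, unless the trailing comma is removed or an explicit column list is given.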
Hi there, I am new to web development.
I am trying to import a CSV file into MySQL Workbench using the 'Table Data Import Wizard'. However, I have read that my file needs to be a CSV (MS-DOS), or I get the following error: Can't analyze file. Please try to change encoding type. If that doesn't help, maybe the file is not: csv, or the file is empty.
I cannot use a CSV (MS-DOS) because my data contains a lot of different special characters, including those from Nordic Europe. When I convert my CSV (comma delimited) to CSV (MS-DOS), the special characters are no longer the same.
Is there a way to import a comma-delimited CSV file into MySQL Workbench? Or is there a better way to get my data into the table, such as somehow keeping the special characters the same in the MS-DOS file?
You can import regular CSVs without an issue; just make sure the encoding matches.
Something like
LOAD DATA
INFILE 'yourfile.csv'
INTO TABLE tablename
FIELDS
TERMINATED BY ','
ENCLOSED BY '"'
LINES
TERMINATED BY '\n'
IGNORE 1 LINES;
should work. If your CSV doesn't have headers, remove the IGNORE 1 LINES line. If your formatting is different, change the enclosing and terminating characters accordingly.
You can look up the exact syntax in the manual.
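If the Nordic characters still come out wrong after a plain import, the file's encoding can also be stated explicitly in the statement itself. A sketch, assuming the CSV is saved as UTF-8 and keeping the placeholder names from above:
LOAD DATA
INFILE 'yourfile.csv'
INTO TABLE tablename
CHARACTER SET utf8mb4
FIELDS
TERMINATED BY ','
ENCLOSED BY '"'
LINES
TERMINATED BY '\n'
IGNORE 1 LINES;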
Your CSV should work fine. You just need to use LOAD DATA INFILE.
You will likely need to define these settings, though:
LOAD DATA INFILE 'c:/tmp/discounts.csv'
INTO TABLE discounts
-- comma separated? maybe pipes '|'?
FIELDS TERMINATED BY ','
-- what surrounds each value, and is it optional? if so, add OPTIONALLY before ENCLOSED
ENCLOSED BY '"'
-- what is at the end of each line?
LINES TERMINATED BY '\n'
-- how many header rows are there, if any?
IGNORE 1 ROWS;
I'm having a seemingly basic problem with loading data to MySQL. I'm using:
LOAD DATA LOCAL INFILE 'C:/.../Contrato.txt'
INTO TABLE schema.contrato
FIELDS TERMINATED BY ';' ENCLOSED BY '|' LINES TERMINATED BY '/r/n' IGNORE 0 LINES;
Each line on the file looks like this, generated by another program:
|abc|;|cde|;|123|;|456|;|name|\r\n
When executing the load, everything seems to load properly, except the very last field. When I look at the table, the last field actually shows the '|' characters around the name. It's not the end of the world, but it's strange that it would do that. As of now, I'm fixing it by adding a ';' right before the \r\n characters.
Is this the way it's supposed to be done? Why would I need to add the field terminator before the line terminator in order to strip the field enclosure characters?
I can duplicate the effect on a file with a single line in it; with multiple lines I only get a single entry, whose final column is "name| |abc".
I changed '/r/n' to '\r\n' and the load worked correctly for files with both single and multiple entries, and the surrounding | characters were correctly removed:
LOAD DATA LOCAL INFILE 'C:/.../Contrato.txt'
INTO TABLE schema.contrato
FIELDS TERMINATED BY ';' ENCLOSED BY '|' LINES TERMINATED BY '\r\n' IGNORE 0 LINES;
The correct MySQL escape character is the backslash, not the forward slash (which is not treated as a special character), so your original code was looking for the literal four-character sequence "/r/n" to terminate the lines.
So I have data in which all the fields are enclosed by quotes and delimited by pipes. Some fields have HTML text in them, so there are newline characters as part of the field. I want these newline characters to remain part of the text field. The data looks something like this:
"abcd"|"1"|""|" abcdegf
"|"abcd"
Also, the HTML data is a huge amount of text (the sample shows far less data) and I get the error 'multibyte enclose string not supported'. I am on Infobright. I am okay even with removing those fields from the CSV file; they are not needed for analysis. What should the correct LOAD DATA LOCAL INFILE syntax be for this?
I am new to this field; help is greatly appreciated.
load data infile '<file>' into table <table>
fields enclosed by '"' terminated by '|';
LOAD DATA INFILE 'filename' "STR '\r\n'"
APPEND INTO TABLE tablename FIELDS TERMINATED BY "|"
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
Please add the code below to the CTL file:
LOAD DATA INFILE 'filename'
APPEND CONTINUEIF LAST != "|"
INTO TABLE IDP.M_ACTION FIELDS TERMINATED BY "|"
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS (..
The previous answer is not supported on all SQL*Loader versions; you can try this solution instead.
I have a .csv file with data like this:
"UPRR 38 PAN AM "M"","1"
and I loaded the data into a table with two columns (a and b) using the command below.
LOAD DATA LOCAL INFILE 'E:\monthly_data.csv'
INTO TABLE test_data_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
But when I select from the table, I get the unexpected results shown below.
a contains:
UPRR 38 PAN AM "M","1
... and b is NULL.
Thanks
You can replace all the instances of the doubled quote ("") in your file:
either A. open the files and find-and-replace them,
or B. make a script to open the files and replace the extra quote that is messing it up.
You have this:
ENCLOSED BY '"'
Thus " is not a regular character any more. It's a special character that has a special meaning: it highlights the start and end of a column value. If you want to type a " that does not behave that way you need to escape it. The RFC 4180 - Common Format and MIME Type for Comma-Separated Values (CSV) Files document explains how to do that:
If double-quotes are used to enclose fields, then a double-quote
appearing inside a field must be escaped by preceding it with
another double quote
a;b
"UPRR 38 PAN AM ""M""";1
As they say, garbage in, garbage out ;-)
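For MySQL's LOAD DATA specifically, a doubled ENCLOSED BY character inside an enclosed field is read back as a single literal character, so once the file is fixed up that way, the original statement from the question should work unchanged. A sketch, with 'monthly_data_fixed.csv' as a hypothetical name for the corrected file:
LOAD DATA LOCAL INFILE 'E:/monthly_data_fixed.csv'
INTO TABLE test_data_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
-- input line: "UPRR 38 PAN AM ""M""","1"
-- loads as:   a = UPRR 38 PAN AM "M"   and   b = 1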
I need to export data from a mysql table to a csv file. Two fields in particular are giving me trouble. Their data types are varchar and TEXT.
Data in these columns contains all sorts of gross characters, particularly newlines (\n) and tabs (\t). I need to export these into a .csv file for a migration.
Unfortunately, I cannot enclose the fields with '"' because the destination database doesn't support that formatting.
So, my query looks like the following:
SELECT `varchar_col`,`text_col` FROM `db`.`tbl` INTO OUTFILE '/path/to/my/file.csv' FIELDS TERMINATED BY '\t' ESCAPED BY "\\" LINES TERMINATED BY '\n';
When I look at my output file (using gedit and nano), I see that each newline or tab in the file is simply preceded by a backslash (see the example below). I would like each newline or tab to instead be written as the literal characters '\n' or '\t' rather than as actual newlines and tabs.
Example of problem with new line-
field value of:
value one
value two
exported to .csv gives me:
value one\
value two\
instead of:
value one\nvalue two
Can anyone tell me what I'm doing wrong?
Thanks
I'm not quite sure, but is this what you want? (Output the literal characters '\n' or '\t' instead of actual newlines and tabs.)
SELECT * FROM block INTO OUTFILE '/var/tmp/d.csv' FIELDS TERMINATED BY '\\t' ESCAPED BY "\\" LINES TERMINATED BY '\\n';
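If the goal really is to see the two-character sequences \n and \t in the file rather than real newlines and tabs, another option is to rewrite them in the SELECT itself. A sketch reusing the column and table names from the question:
SELECT REPLACE(REPLACE(`varchar_col`, '\t', '\\t'), '\n', '\\n'),
       REPLACE(REPLACE(`text_col`, '\t', '\\t'), '\n', '\\n')
FROM `db`.`tbl`
INTO OUTFILE '/path/to/my/file.csv'
-- ESCAPED BY '' keeps the backslashes added above from being doubled on output
FIELDS TERMINATED BY '\t' ESCAPED BY ''
LINES TERMINATED BY '\n';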