I have tried to load a .csv file into mysql in many different ways. I have most recently tried
load data local infile 'c:\Users\...csv' into table suppression_list;
which works, but I'm told 0 rows affected.
I've tried dropping LOCAL, but then it says the file is not found. I've tried adding LINES TERMINATED BY ... and FIELDS TERMINATED BY ..., but that doesn't seem to make any difference. I've also tried saving the file as .txt, with no luck. suppression_list has just one column: email (text).
Here is a snapshot of my data:
——————————
——————————
#
# Contacts
*****#une.net.co
.australia#gmail.com
.d.yncly.d.ay#gmail.com
.dy.n.cly.d.a.y#gmail.com
.je.n.i.s.di.ken.s.o.n.1#gmail.com
.jell.yhe.a.d1.2.12#gmail.com
You can see it's just one column of email addresses, some of which look a little funky, I'll admit.
Do a
LOAD DATA INFILE '...csv'
INTO TABLE myTable
LINES TERMINATED BY '\n'
IGNORE 4 LINES
to ignore the four header lines. If you have created mytable like
CREATE TABLE mytable (
`email` VARCHAR(255) NULL
)
This results in
6 row(s) affected Records: 6 Deleted: 0 Skipped: 0 Warnings: 0 0.000 sec
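Applied to the original suppression_list table, a minimal sketch could look like the following (the file path here is hypothetical; LOCAL is kept since dropping it gave "file not found", and the four ignored lines correspond to the two separator lines, the '#' line, and the '# Contacts' line from the snapshot):
LOAD DATA LOCAL INFILE 'C:/path/to/suppression.csv'
INTO TABLE suppression_list
LINES TERMINATED BY '\n'
IGNORE 4 LINES
(email);
Forward slashes in the path avoid the backslash-escaping issue in single-quoted strings, and if the file was produced on Windows you may need LINES TERMINATED BY '\r\n' instead.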
Related
I want to load a tab-delimited file into a single-column text table while keeping, not deleting, the tab characters. I can only manage to strip the tabs (I used the bar character as terminator because it does not appear in my data):
LOAD DATA LOCAL
INFILE 'c:\\test\\test.txt'
INTO TABLE text_rows
CHARACTER SET latin1
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
This loads
"PROGRAM_NAME PROGRAM_URL CATALOG_NAME "
as
"PROGRAM_NAMEPROGRAM_URLCATALOG_NAME"
But this:
FIELDS TERMINATED BY ''
Loads only this: "PROGRAM_NAME"
mysql> CREATE TABLE test (txt TEXT);
Query OK, 0 rows affected (0.35 sec)
mysql> LOAD DATA
-> INFILE 'C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\test.txt'
-> INTO TABLE test
-> FIELDS TERMINATED BY '|'
-> LINES TERMINATED BY '\r\n';
Query OK, 2 rows affected (0.09 sec)
Records: 2 Deleted: 0 Skipped: 0 Warnings: 0
mysql> SELECT REPLACE(txt, '\t', '\\t') FROM test;
+----------------------------------------------+
| REPLACE(txt, '\t', '\\t') |
+----------------------------------------------+
| PROGRAM_NAME\tPROGRAM_URL\tCATALOG_NAME\t |
| PROGRAM_NAME2\tPROGRAM_URL2\tCATALOG_NAME2\t |
+----------------------------------------------+
2 rows in set (0.00 sec)
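If you want to double-check that the tab characters were really stored, without rewriting them for display, a hex dump of one row also works (a small sketch against the test table above):
SELECT HEX(txt) FROM test LIMIT 1;
Each tab character shows up as 09 in the resulting hex string.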
Thank you for your help - I just realized my mistake: viewing the table in Navicat, it looked as if the tabs hadn't gone through, and after several unsuccessful configurations I assumed the load had failed without verifying it properly. Never assume!
Looking again, after seeing the suggestion by Akina, I did indeed get the tabs. Putting the bar character in worked just fine. Thanks for getting me straightened out!
mysql> LOAD DATA LOCAL INFILE 'students.csv' INTO TABLE students FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' (studentid,fname,mname,lname,psuid,state,country,major);
Query OK, 997 rows affected, 2276 warnings (0.02 sec)
Records: 997 Deleted: 0 Skipped: 0 Warnings: 2276
[students table screenshot]
After loading my data into the table, the first part of each row is missing: the first name is gone, which is why the major column is empty and everything is shifted over by one.
Please help! Thank you :)
[CSV file screenshot]
I assume the studentid column is defined with the AUTO_INCREMENT attribute. You should remove studentid from the column list in your LOAD DATA INFILE statement and let MySQL generate it automatically.
Try this:
LOAD DATA LOCAL INFILE 'students.csv'
INTO TABLE students
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(fname,mname,lname,psuid,state,country,major);
Your data in the file does not seem to be in the correct order.
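The 2276 warnings should also tell you exactly which rows and columns MySQL complained about. Right after the load, in the same session, you can inspect them with a standard statement:
SHOW WARNINGS LIMIT 10;
That typically shows messages like data-truncation notes, which make a column mismatch obvious.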
I am using the following code to upload a table in mysql:
mysql> create table my_db.woof (`start_g` VARCHAR(10) NULL, `v_0` INT NULL, `v_1` INT NULL);
Query OK, 0 rows affected (0.12 sec)
mysql> LOAD DATA LOCAL INFILE '/Users/edamame/test1_data.csv' INTO TABLE woof FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (start_g,v_0,v_1);
Query OK, 0 rows affected (0.00 sec)
Records: 0 Deleted: 0 Skipped: 0 Warnings: 0
The test1_data.csv looks like:
start_g,v_0,v_1
g1,11,12
g2,13,14
Even though there are no errors, no records are uploaded! Similar commands were working fine when uploading other tables ... did I miss anything in this specific case? Thank you!
OK ... I figured it out ... the reason is that some CSV files have \r\n as the line terminator, while others have \n.
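If you do not want to check each file's line endings beforehand, one workaround (a sketch based on the woof table above; @raw_v1 is just an illustrative user-variable name) is to terminate lines with '\n' and strip any trailing '\r' from the last column:
LOAD DATA LOCAL INFILE '/Users/edamame/test1_data.csv'
INTO TABLE woof
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(start_g, v_0, @raw_v1)
SET v_1 = TRIM(TRAILING '\r' FROM @raw_v1);
This way the same statement handles both \r\n and \n files.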
I need to upload a csv file to my MySQL table. For some reason, when I run the command below in Mac Terminal, just one row is inserted. How can I upload all the rows?
load data local infile 'Research/cube_sample_data.csv'
into table ad_spend
fields terminated by ','
lines terminated by '\n';
I tried '\r\n' and ',\r\n' but no success. I keep getting this message:
Query OK, 1 row affected, 0 warnings (0.00 sec)
Records: 1 Deleted: 0 Skipped: 0 Warnings: 0
my file is like this:
entertainment, games, 14
entertainment, movies, 12
and I have an empty table with 3 columns:
category (varchar), subcat (varchar), adspend (int)
It seems that there is not only a comma between fields but also a space. So try:
fields terminated by ', '
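A slightly more defensive variant, in case the amount of whitespace around the commas varies, is to read each field into a user variable and TRIM it (a sketch assuming the three columns described above; the @-names are arbitrary):
load data local infile 'Research/cube_sample_data.csv'
into table ad_spend
fields terminated by ','
lines terminated by '\n'
(@cat, @sub, @spend)
set category = trim(@cat), subcat = trim(@sub), adspend = trim(@spend);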
I am trying to import one big CSV file encoded in UCS-2 Big Endian into a MySQL table.
This is the MySQL code:
DROP TABLE if exists PAPERS;
CREATE TABLE `PAPERS` (
ID_RESEARCHER VARCHAR(20),
PAPER_ACCESSOR_NUMBER VARCHAR(20),
primary key(ID_RESEARCHER,PAPER_ACCESSOR_NUMBER)
) ENGINE=InnoDB DEFAULT CHARSET=ucs2;
load data local infile '...dump_all_papers_test_2.csv'
into table PAPERS
CHARACTER SET ucs2
fields terminated by '\t' enclosed by '"'
lines terminated by '\n'
(ID_RESEARCHER, PAPER_ACCESSOR_NUMBER);
And the content of the CSV (UCS-2 Big Endian, according to Notepad++):
"100" "A1974U626600001"
"100" "A1974U626600001"
"100" "A1974U626600001"
A copy of the CSV sample:
http://pastebin.com/HMssuxCf
And the error is:
1 row(s) affected, 2 warning(s):
1265 Data truncated for column 'ID_RESEARCHER' at row 1
1261 Row 1 doesn't contain data for all columns
Records: 1 Deleted: 0 Skipped: 0 Warnings: 2
What is happening here?
The load only fills the first field. Does MySQL not support ucs2?
OK, the solution is in the documentation:
"Note that it is currently not possible to load data files that use the ucs2 character set."
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
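If your server version still has that limitation, one possible workaround (a sketch, assuming you first convert the file to UTF-8, e.g. with Notepad++'s "Convert to UTF-8" option, and save it under a new, hypothetical name) is to load the converted file and let MySQL transcode it into the ucs2 table:
load data local infile 'dump_all_papers_test_2_utf8.csv'
into table PAPERS
CHARACTER SET utf8
fields terminated by '\t' enclosed by '"'
lines terminated by '\n'
(ID_RESEARCHER, PAPER_ACCESSOR_NUMBER);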