MySQL "load data local infile" just loads 1 row - mysql

I need to upload a CSV file to my MySQL table. For some reason, when I run the command below in the Mac Terminal, only one row is inserted. How can I upload all the rows?
load data local infile 'Research/cube_sample_data.csv'
into table ad_spend
fields terminated by ','
lines terminated by '\n';
I tried '\r\n' and ',\r\n', but with no success. I keep getting this message:
Query OK, 1 row affected, 0 warnings (0.00 sec)
Records: 1 Deleted: 0 Skipped: 0 Warnings: 0
My file looks like this:
entertainment, games, 14
entertainment, movies, 12
and I have an empty table with 3 columns:
category (varchar), subcat (varchar), adspend (int)
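For reference, a minimal sketch of such a table (the VARCHAR lengths here are assumptions, not from the original post):
create table ad_spend (
  category varchar(100),
  subcat varchar(100),
  adspend int
);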

It seems that the fields are separated not only by a comma but also by a space. So try:
fields terminated by ', '
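If the separator fix alone doesn't help, note that "Query OK, 1 row affected" is also the classic symptom of a line-terminator mismatch: when the terminator in the statement never occurs in the file, MySQL reads the whole file as a single line. Since the file was created on a Mac, it may use old Mac-style '\r' endings; here is a sketch combining both guesses (verify the file's actual line endings before relying on this):
load data local infile 'Research/cube_sample_data.csv'
into table ad_spend
fields terminated by ', '
lines terminated by '\r';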

Related

LOAD DATA and ignore tab characters

I want to load a tab-delimited file into a single-column table while keeping (not deleting) the tab characters. So far I have only managed to strip the tabs (I used the pipe character as the field terminator because it does not appear in my data):
LOAD DATA LOCAL
INFILE 'c:\\test\\test.txt'
INTO TABLE text_rows
CHARACTER SET latin1
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n';
This loads
"PROGRAM_NAME PROGRAM_URL CATALOG_NAME "
as
"PROGRAM_NAMEPROGRAM_URLCATALOG_NAME"
But this:
FIELDS TERMINATED BY ''
loads only this: "PROGRAM_NAME"
mysql> CREATE TABLE test (txt TEXT);
Query OK, 0 rows affected (0.35 sec)
mysql> LOAD DATA
-> INFILE 'C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\test.txt'
-> INTO TABLE test
-> FIELDS TERMINATED BY '|'
-> LINES TERMINATED BY '\r\n';
Query OK, 2 rows affected (0.09 sec)
Records: 2 Deleted: 0 Skipped: 0 Warnings: 0
mysql> SELECT REPLACE(txt, '\t', '\\t') FROM test;
+----------------------------------------------+
| REPLACE(txt, '\t', '\\t')                    |
+----------------------------------------------+
| PROGRAM_NAME\tPROGRAM_URL\tCATALOG_NAME\t    |
| PROGRAM_NAME2\tPROGRAM_URL2\tCATALOG_NAME2\t |
+----------------------------------------------+
2 rows in set (0.00 sec)
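As a cross-check, if a GUI client renders tabs invisibly, you can also confirm they were really stored by inspecting the raw bytes; a tab shows up as 09 in the hex dump:
mysql> SELECT HEX(txt) FROM test LIMIT 1;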
Thank you for your help. I just realized my mistake: viewing the table in Navicat, it looked as though the tabs hadn't gone through, and after several unsuccessful configurations I assumed the load had failed without verifying it properly. Never assume!
Looking again after seeing Akina's suggestion, I did indeed get the tabs; using the pipe character as the terminator worked just fine. Thanks for getting me straightened out!

SQL Loading Data But Missing First Section

mysql> LOAD DATA LOCAL INFILE 'students.csv' INTO TABLE students FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' (studentid,fname,mname,lname,psuid,state,country,major);
Query OK, 997 rows affected, 2276 warnings (0.02 sec)
Records: 997 Deleted: 0 Skipped: 0 Warnings: 2276
[screenshot of the students table]
After loading my data into the table, the first part of each row is missing: the first name is gone, which is why the major column is empty and everything is shifted over.
Please help! Thank you :)
[screenshot of the CSV file]
I assume the studentid column is defined with the AUTO_INCREMENT attribute. You should remove studentid from the column list in your LOAD DATA INFILE statement and let MySQL generate it automatically.
Try this:
LOAD DATA LOCAL INFILE 'students.csv'
INTO TABLE students
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(fname,mname,lname,psuid,state,country,major);
Your data in the file does not seem to be in the correct order.
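Conversely, if the file had contained a leading id that you wanted to discard rather than load, the usual trick is to capture it in a throwaway user variable. A sketch (@dummy is just an illustrative name):
LOAD DATA LOCAL INFILE 'students.csv'
INTO TABLE students
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(@dummy, fname, mname, lname, psuid, state, country, major);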

MySQL : LOAD DATA LOCAL INFILE - commands not working consistently

I am using the following statements to load data into a table in MySQL:
mysql> create table my_db.woof (`start_g` VARCHAR(10) NULL, `v_0` INT NULL, `v_1` INT NULL);
Query OK, 0 rows affected (0.12 sec)
mysql> LOAD DATA LOCAL INFILE '/Users/edamame/test1_data.csv' INTO TABLE woof FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (start_g,v_0,v_1);
Query OK, 0 rows affected (0.00 sec)
Records: 0 Deleted: 0 Skipped: 0 Warnings: 0
The test1_data.csv looks like:
start_g,v_0,v_1
g1,11,12
g2,13,14
Even though there are no errors, no records are uploaded! Similar commands were working fine when uploading other tables ... did I miss anything in this specific case? Thank you!
OK ... I figured it out ... the reason is that some CSV files have \r\n as the line terminator, but some have \n. Here the file's lines end in \n, so with '\r\n' the whole file was read as one line, and IGNORE 1 LINES skipped it.
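For completeness, the statement that matches this file, assuming its lines really do end in '\n':
LOAD DATA LOCAL INFILE '/Users/edamame/test1_data.csv'
INTO TABLE woof
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(start_g, v_0, v_1);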

LOAD FILE INTO TABLE...0 rows affected

I have tried to load a .csv file into MySQL in many different ways. Most recently I tried:
load data local infile 'c:\Users\...csv' into table suppression_list;
which works, but I'm told 0 rows affected.
I've tried dropping LOCAL, but then it says the file is not found. I've tried adding LINES TERMINATED BY ... and FIELDS TERMINATED BY ..., but that doesn't seem to make any difference. I've also tried saving the file as .txt, but with no luck. suppression_list has just one column: email (TEXT).
Here is a snapshot of my data
----------
----------
#
# Contacts
*****#une.net.co
.australia#gmail.com
.d.yncly.d.ay#gmail.com
.dy.n.cly.d.a.y#gmail.com
.je.n.i.s.di.ken.s.o.n.1#gmail.com
.jell.yhe.a.d1.2.12#gmail.com
You can see it's just one column of email addresses, some of which look a little funky, I'll admit.
Do a
LOAD DATA INFILE '...csv'
INTO TABLE mytable
LINES TERMINATED BY '\n'
IGNORE 4 LINES;
to skip the four header lines. If you have created mytable like:
CREATE TABLE mytable (
  `email` VARCHAR(255) NULL
);
This results in:
6 row(s) affected Records: 6 Deleted: 0 Skipped: 0 Warnings: 0 0.000 sec
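As an aside on the "file is not found" error when LOCAL is dropped: without LOCAL, the server (not the client) opens the file, and many installations restrict server-side reads to the directory named by secure_file_priv. You can check that setting with:
SHOW VARIABLES LIKE 'secure_file_priv';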

Load data infile for huge file (no warning, no error and no rows affected)

I tried the following script:
LOAD DATA LOCAL INFILE 'myfile.csv'
REPLACE INTO TABLE `mydb`.`mytable`
CHARACTER SET latin1 FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES (`field1`, `field2`, `field3`, `field4`, `field5`, `field6`);
When I use a file of 500K records it works, but when I try a CSV file of 4 million records it returns:
Query OK, 0 rows affected (2.79 sec)
Records: 0 Deleted: 0 Skipped: 0 Warnings: 0
And of course nothing was actually loaded; 4 million records could never be processed in 2.79 seconds!
My RAM is 4GB and my input file (the large one) is 370MB.
Can anyone suggest a solution?
It's possible that the line endings in the large file are not '\r\n'. Try changing LINES TERMINATED BY '\r\n' to '\n'.
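If that is the cause, here is a sketch of the adjusted statement (only the terminator changes; the field names are the placeholders from the original post):
LOAD DATA LOCAL INFILE 'myfile.csv'
REPLACE INTO TABLE `mydb`.`mytable`
CHARACTER SET latin1 FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES (`field1`, `field2`, `field3`, `field4`, `field5`, `field6`);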