SQL Loading Data But Missing First Section - mysql

mysql> LOAD DATA LOCAL INFILE 'students.csv' INTO TABLE students FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' (studentid,fname,mname,lname,psuid,state,country,major);
Query OK, 997 rows affected, 2276 warnings (0.02 sec)
Records: 997 Deleted: 0 Skipped: 0 Warnings: 2276
[screenshot: students table]
After loading my data into the table, the first part of each record is missing: the first name is gone, which is why the major column is empty and everything is shifted over.
Please help! Thank you :)
[screenshot: the CSV file]

I assume the studentid column is defined with the AUTO_INCREMENT attribute. Remove studentid from the column list in your LOAD DATA INFILE statement and let MySQL auto-generate it.
Try this:
LOAD DATA LOCAL INFILE 'students.csv'
INTO TABLE students
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(fname,mname,lname,psuid,state,country,major);

The data in your file also does not seem to be in the same order as your column list.
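If the columns in the file really are in a different order than the table, LOAD DATA can also read a CSV column into a user variable and discard it. A minimal sketch, assuming (hypothetically) that the file carries an extra leading column you don't want to keep:

LOAD DATA LOCAL INFILE 'students.csv'
INTO TABLE students
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
-- @dummy absorbs the first CSV column instead of binding it to a table column
(@dummy, fname, mname, lname, psuid, state, country, major);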

Related

LOAD DATA and ignore tab characters

I want to load a tab-delimited file into a single-column text table while keeping, not deleting, the tab characters, but I can only manage to strip the tabs (I used the bar terminator because it does not appear in my data):
LOAD DATA LOCAL
INFILE 'c:\\test\\test.txt'
INTO TABLE text_rows
CHARACTER SET latin1
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
This loads
"PROGRAM_NAME PROGRAM_URL CATALOG_NAME "
as
"PROGRAM_NAMEPROGRAM_URLCATALOG_NAME"
But this:
FIELDS TERMINATED BY ''
loads only "PROGRAM_NAME".
The tabs actually do go through with the bar terminator; you can make them visible to verify:
mysql> CREATE TABLE test (txt TEXT);
Query OK, 0 rows affected (0.35 sec)
mysql> LOAD DATA
-> INFILE 'C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\test.txt'
-> INTO TABLE test
-> FIELDS TERMINATED BY '|'
-> LINES TERMINATED BY '\r\n';
Query OK, 2 rows affected (0.09 sec)
Records: 2 Deleted: 0 Skipped: 0 Warnings: 0
mysql> SELECT REPLACE(txt, '\t', '\\t') FROM test;
+----------------------------------------------+
| REPLACE(txt, '\t', '\\t')                    |
+----------------------------------------------+
| PROGRAM_NAME\tPROGRAM_URL\tCATALOG_NAME\t    |
| PROGRAM_NAME2\tPROGRAM_URL2\tCATALOG_NAME2\t |
+----------------------------------------------+
2 rows in set (0.00 sec)
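If you'd rather not rely on the REPLACE() trick, another way to verify is to inspect the raw bytes; a tab shows up as 09 in the hex dump:

mysql> SELECT HEX(txt) FROM test LIMIT 1;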
Thank you for your help. I just realized my mistake: viewing the table in Navicat, it looked as though the tabs hadn't gone through, and after several unsuccessful configurations I assumed the load had failed without verifying it properly. Never assume!
Looking again, after seeing the suggestion by Akina, I did indeed get the tabs. Putting the bar character in worked just fine. Thanks for getting me straightened out!

Escape and enclose by same symbol doesn't work

I have a table with a nullable column:
CREATE TABLE test (id INT, value INT);
INSERT INTO test VALUES (1, null);
I export it to csv using
SELECT * FROM test
INTO OUTFILE '/tmp/test.csv'
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
The output file looks like this:
1,"N
Then I try to import it:
LOAD DATA LOCAL INFILE '/tmp/test.csv'
INTO TABLE `test`.`test`
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"';
Query OK, 1 row affected, 1 warning (0.01 sec)
Records: 1 Deleted: 0 Skipped: 0 Warnings: 1
And it doesn't work, generating the following warning:
SHOW WARNINGS;
Warning 1366: Incorrect integer value: '"N' for column 'value' at row 1
And it doesn't insert this row. For some reason escaping doesn't work in this case, although it does work if fields are escaped and enclosed by different symbols.
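For comparison, here is a sketch of the working variant with the default backslash escape (the path /tmp/test2.csv is hypothetical); MySQL then writes NULL as \N, which imports cleanly:

SELECT * FROM test
INTO OUTFILE '/tmp/test2.csv'
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\';
-- the output file contains: 1,\N
-- and LOAD DATA with ESCAPED BY '\\' reads \N back as NULL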
But I can't use different symbols, because I can't change the export query: I use Google Cloud SQL, and the CSV export can only be run from the Google web console, with no way to edit the query.
I've seen multiple examples on the internet that double quote in both cases should work. Am I doing something wrong?

LOAD FILE INTO TABLE...0 rows affected

I have tried to load a .csv file into mysql in many different ways. I have most recently tried
load data local infile 'c:\Users\...csv' into table suppression_list;
which works, but I'm told 0 rows affected.
I've tried dropping LOCAL, but then it says the file is not found. I've tried adding LINES TERMINATED BY ... and FIELDS TERMINATED BY ..., but that doesn't seem to make any difference. I've also tried saving the file as .txt, with no luck. suppression_list is just one column: email (text).
Here is a snapshot of my data
----------
----------
#
# Contacts
*****#une.net.co
.australia#gmail.com
.d.yncly.d.ay#gmail.com
.dy.n.cly.d.a.y#gmail.com
.je.n.i.s.di.ken.s.o.n.1#gmail.com
.jell.yhe.a.d1.2.12#gmail.com
You can see it's just one column of email addresses, some of which look a little funky, I'll admit.
Do a
LOAD DATA INFILE '...csv'
INTO TABLE mytable
LINES TERMINATED BY '\n'
IGNORE 4 LINES
to ignore the four header lines. If you have created mytable like
CREATE TABLE mytable (
`email` VARCHAR(255) NULL
)
This results in
6 row(s) affected Records: 6 Deleted: 0 Skipped: 0 Warnings: 0 0.000 sec
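Put together for the asker's one-column suppression_list (the path is elided as in the question; forward slashes sidestep backslash escaping in Windows pathnames), a sketch:

LOAD DATA LOCAL INFILE 'c:/Users/...csv'
INTO TABLE suppression_list
LINES TERMINATED BY '\n'
IGNORE 4 LINES
-- the four ignored lines are the two separators, '#', and '# Contacts'
(email);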

MySQL "load data local infile" just loads 1 row

I need to upload a csv file to my MySQL table. For some reason, when I run the command below in Mac Terminal, just one row is inserted. How can I upload all the rows?
load data local infile 'Research/cube_sample_data.csv'
into table ad_spend
fields terminated by ','
lines terminated by '\n';
I tried '\r\n' and ',\r\n', but with no success. I keep getting this message:
Query OK, 1 row affected, 0 warnings (0.00 sec)
Records: 1 Deleted: 0 Skipped: 0 Warnings: 0
my file is like this:
entertainment, games, 14
entertainment, movies, 12
and I have an empty table with 3 columns:
category (varchar), subcat (varchar), adspend (int)
It seems that there's not only a comma between fields but also a whitespace, so try using:
fields terminated by ', '
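Put together, the asker's statement with that delimiter would look like this (a sketch, unchanged otherwise):

load data local infile 'Research/cube_sample_data.csv'
into table ad_spend
-- comma followed by the space visible in the sample rows
fields terminated by ', '
lines terminated by '\n';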

Load data infile for huge file (no warning, no error and no rows affected)

I tried the following script:
LOAD DATA LOCAL INFILE 'myfile.csv'
REPLACE INTO TABLE `mydb`.`mytable`
CHARACTER SET latin1 FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES (`field1`, `field1`, `field1`, `field1`, `field1`, `field1`);
When I use a file of 500K records it works, but when I try a CSV file of 4 million records it returns:
Query OK, 0 rows affected (2.79 sec)
Records: 0 Deleted: 0 Skipped: 0 Warnings: 0
And of course nothing was added in those 2.79 seconds!
My RAM is 4GB and my input file (the large one) is 370MB.
Can anyone suggest a solution?
It's possible that the line endings in the large file are not '\r\n'. Try changing LINES TERMINATED BY '\r\n' to '\n'.
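That is, a sketch of the same statement with only the line terminator changed (the column list is omitted here, since it was anonymized in the question):

LOAD DATA LOCAL INFILE 'myfile.csv'
REPLACE INTO TABLE `mydb`.`mytable`
CHARACTER SET latin1 FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
-- was '\r\n'; Unix-style files end lines with '\n' only
LINES TERMINATED BY '\n'
IGNORE 1 LINES;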