MySQL skips some rows when importing a .csv file

I wanted to import a .csv file with ~14k rows to MySQL. However, I only got a table in MySQL with about 13k rows. I checked and found that MySQL skips some rows in the middle of my .csv file.
I used LOAD DATA INFILE and I really cannot understand why MySQL skips those rows. I'd really appreciate it if someone could help me with this.
Here is my query:
LOAD DATA LOCAL INFILE 'd:/Data/Tanbinh - gui Tuan/hm2.csv'
INTO TABLE q.hmm
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n' IGNORE 1 LINES;
There is no warning message at all.

I assume that the difference in the number of rows is caused by unique-key duplicates.
From the reference:
With LOAD DATA LOCAL INFILE, data-interpretation and duplicate-key errors become warnings and the operation continues
In practice, though, no warnings are produced on duplicates. To check which rows were skipped, you can load the data into another, similar table without unique keys and compare the two tables, as in the sketch below.
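For example, a minimal sketch of that comparison. The staging table name, the index name uq_hmm, and the key column id are all assumptions here, not details from the original question:
-- Same structure, but without the unique key, so nothing gets skipped.
-- Check SHOW CREATE TABLE q.hmm for the real index name.
CREATE TABLE q.hmm_staging LIKE q.hmm;
ALTER TABLE q.hmm_staging DROP INDEX uq_hmm;

LOAD DATA LOCAL INFILE 'd:/Data/Tanbinh - gui Tuan/hm2.csv'
INTO TABLE q.hmm_staging
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n' IGNORE 1 LINES;

-- Any key that loaded more than once is a duplicate the original import skipped.
SELECT id, COUNT(*) AS copies
FROM q.hmm_staging
GROUP BY id
HAVING copies > 1;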

Related

How to upload more than 1000 entries using LOAD DATA INFILE?

I'm trying to import a csv file into my database table; the file came from a previous database with the same structure. My issue is that it imports only 1000 rows instead of the whole 62k+ file. The script I'm using is:
LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/covid19.csv'
INTO TABLE covid19.covid19
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id,date,iso2,iso3,country,continent,cases,deaths,population);
Some clients have an option that limits the number of returned rows to 1000 (a LIMIT 1000 applied on display).
You should check how many rows you actually have with:
SELECT COUNT(*) FROM covid19.covid19;
You should see the actual number of inserted rows, since the command didn't show any warnings or errors.
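You can also verify the load itself from the same session: immediately after the LOAD DATA statement, ROW_COUNT() reports how many rows it actually inserted, independent of any display limit the client puts on SELECT results:
SELECT ROW_COUNT();  -- rows inserted by the immediately preceding statement
SHOW WARNINGS;       -- any per-row notes from that load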

How to stop mysql from exiting query on error?

I'm loading a lot of data from long files into my database with the LOAD DATA INFILE command. However, whenever there's a line that doesn't match the column count, uses incorrect string values, or is too long for the column, the entire query simply exits.
Is there a way to force the query to just skip the erroring line whenever there's an error? I tried the --force argument, but that doesn't seem to do anything to solve it.
My query:
LOAD DATA CONCURRENT INFILE 'file_name'
INTO TABLE table_name
FIELDS TERMINATED BY ':DELIMITER:'
LINES TERMINATED BY '\n'
(@col1, @col2)
SET column_1 = @col1, column_2 = @col2;
I don't mind dropping the lines with errors in them since they're not useful data anyway, but I do care about all the data after the erroring line.
Adding IGNORE to your command will skip the erroneous lines.
Example:
LOAD DATA CONCURRENT INFILE 'file_name'
IGNORE INTO TABLE table_name
FIELDS TERMINATED BY ':DELIMITER:'
LINES TERMINATED BY '\n'
(@col1, @col2)
SET column_1 = @col1, column_2 = @col2;
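Note that IGNORE turns those per-line errors into warnings, so right after the load you can still see what was dropped and why:
SHOW WARNINGS;           -- one entry per skipped or truncated line
SHOW COUNT(*) WARNINGS;  -- just the total number of problems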

LOAD DATA INFILE not working on my server

I have a csv with 400k records that I am trying to import into an existing database table, but it is not working. I get no errors; it simply says MySQL returned an empty result.
Here is the query I am executing.
LOAD DATA INFILE 'C:\\xampp\\htdocs\\projects\\csvTest\\newDataList.csv'
INTO TABLE peeps
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(first, last,address,city,state,zip,imb,email,startDate,endDate,jobName)
SET id = NULL
The funny thing is, I run this exact code on my localhost machine and it works, but if I run it on my server machine it does not.
Start and End are reserved words, so you either need to enclose those column names in backticks or, better yet, rename the columns, as sketched below.
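For example, a sketch of the backtick fix, assuming the offending columns really are named start and end rather than startDate and endDate (the column names here are an assumption):
LOAD DATA INFILE 'C:\\xampp\\htdocs\\projects\\csvTest\\newDataList.csv'
INTO TABLE peeps
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(first, last, address, city, state, zip, imb, email, `start`, `end`, jobName)
SET id = NULL;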

Under what conditions does MySQL skip rows?

I'm reading a data.csv file into MySQL via the LOAD DATA command. My SQL code goes like this:
LOAD DATA LOCAL INFILE 'C:/DATA FILES/foo.csv'
INTO TABLE foo
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
This works fine, but out of the 366,000-odd lines, it skips 138 of them. There are no duplicate entries or anything; it just skips them out of the blue. When I try to manually insert one of those rows into the table, it works fine. But I can't identify all 138 of the lines, so I'm wondering how exactly to pinpoint the possible reasons MySQL would skip these lines of data.
Any ideas?

Bulk insert into MySQL

I have a GridView on an ASP.NET page with a checkbox column. I need to insert the checked rows of the GridView into a MySQL table.
One of the easiest ways would be to find the selected rows and insert them one by one in a loop.
However, that is time-consuming when there may be 10,000 rows at a time, and such a long-running process risks losing the connection partway through the insertion.
Is there any way to expedite insertion of huge number of rows?
Thanks,
Balaji G
You can first get all the checked records, write them out in tab-delimited or comma-delimited format, and then use LOAD DATA INFILE to do the bulk insertion.
Here's the sample format:
LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
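If writing a file the server can read isn't practical (the web app and the database often live on different machines), a single multi-row INSERT is a reasonable middle ground. A minimal sketch, with the table and column names assumed:
-- One statement, many rows: far fewer round trips than row-by-row inserts.
INSERT INTO tbl_name (col_a, col_b)
VALUES
('row 1, value a', 'row 1, value b'),
('row 2, value a', 'row 2, value b'),
('row 3, value a', 'row 3, value b');
Batching a few hundred rows per statement keeps each statement a reasonable size while still being much faster than one INSERT per row.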