How to upload more than 1000 entries using LOAD DATA INFILE?

I'm trying to import a CSV file into my database table; the file came from a previous database with the same structure. My issue is that it imports only 1000 rows instead of the whole 62k+ row file. The script I'm using is:
LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/covid19.csv'
INTO TABLE covid19.covid19
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id,date,iso2,iso3,country,continent,cases,deaths,population);

Some clients have an option that limits the number of returned rows, effectively adding a LIMIT 1000 to your queries.
You should check how many rows you actually have with:
SELECT COUNT(*) FROM covid19.covid19;
You should see the actual number of inserted rows, since the command didn't show any warnings or errors.
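For example, right after running the LOAD DATA statement you can ask the server what it actually did (a minimal sketch using the table from the question):

-- On success, LOAD DATA reports 'Records: N  Deleted: 0  Skipped: 0  Warnings: 0'.
SHOW WARNINGS;                          -- any truncation or duplicate-key notes
SELECT COUNT(*) FROM covid19.covid19;   -- the true row count, ignoring client limits

If COUNT(*) returns 62k+, the import worked and only the client's result grid is capped at 1000 rows.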

Related

Fastest way to load tablular data from a .csv file?

I have a database in MySQL Workbench, and I want to load data from .csv files. Right now I'm using the 'Table Data Import Wizard' option, but it takes too much time. My .csv files have millions of rows, and each one takes about 12 hours to load. I have about 100 files to load. My MySQL version is 8.0.
Is there any way to load the data files faster? Thanks in advance.
You can try something like this:
LOAD DATA INFILE 'c:/myfile.csv'
INTO TABLE `table`
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Be careful with the values you give to FIELDS TERMINATED BY, ENCLOSED BY and LINES TERMINATED BY. Only use IGNORE 1 ROWS if you have a header line containing the field names, for example.
And if the file lives on your client machine while the server is remote, you can use LOAD DATA LOCAL INFILE instead:
LOAD DATA LOCAL INFILE 'c:/myfile.csv'
INTO TABLE `table`
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
You can find more details in the MySQL documentation for LOAD DATA.
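If a plain LOAD DATA is still too slow for million-row files, one common additional speed-up (a sketch, not part of the original answer; adjust to your schema and only use it on data you trust) is to relax checks for the duration of the load and commit once:

SET unique_checks = 0;        -- defer unique-index checks during the load
SET foreign_key_checks = 0;   -- defer foreign-key validation
SET autocommit = 0;           -- wrap the whole load in one transaction

LOAD DATA LOCAL INFILE 'c:/myfile.csv'
INTO TABLE `table`
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

COMMIT;
SET unique_checks = 1;        -- restore normal behaviour afterwards
SET foreign_key_checks = 1;
SET autocommit = 1;

The skipped checks are not re-applied retroactively, so this is only safe when the file is known to contain no duplicates or orphaned keys.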

Importing CSV infile

I've tried to follow examples of how to import a CSV infile, with no luck. From what I understand, I need to set up a table in Workbench with column names that match the columns of my CSV, which I've done. I'm using this code:
LOAD DATA LOCAL INFILE 'C:/Users/Sam/Desktop/Eviction_Notices_2012to2017_AddressthruZip.csv' INTO TABLE evictions.test
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(Address, City, State, Zip);
I've had varying degrees of success. Right now it only imports the first 99 rows, even though no limit is specified.
Also, is there a way to easily copy your column names from Excel into the ALTER TABLE window, rather than creating each column manually?
Am I doing something wrong here?
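One way to investigate a load that stops early (a diagnostic suggestion, not part of the original question) is to ask the server what it complained about and how far it got:

-- Run immediately after the LOAD DATA statement:
SHOW WARNINGS;                        -- truncation, line-terminator or quoting issues
SELECT COUNT(*) FROM evictions.test;  -- rows that actually arrived

A stray unmatched double quote near the hundredth line of the CSV would make everything after it look like one huge ENCLOSED BY '"' field, which is consistent with this kind of sudden stop.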

exporting data from table to .csv file

I'm trying to load the data of a table into a CSV file. However, the rows in the file are broken in the middle. The data in the table looks fine and there is no \n anywhere, so why am I getting breaks in the rows?
The query I'm executing is:
select * from `abc_csv` into outfile '/home/abc.csv'
fields terminated by ',' lines terminated by '\n';
And the result I'm getting is:
A,B,C,D,
,E,F
All of these entries belong to the same record.
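A likely culprit (an assumption, since the question doesn't show the table data) is an embedded carriage return, CHAR(13): it isn't a \n, so the data looks clean, but it still splits the line in the output file. You can test for it and make the export unambiguous like this (some_column is a placeholder name, and the outfile path is new because INTO OUTFILE refuses to overwrite):

-- count rows whose column hides a carriage return
select count(*) from `abc_csv` where some_column like concat('%', char(13), '%');

-- quoting and escaping fields keeps embedded line breaks inside one CSV record
select * from `abc_csv` into outfile '/home/abc_fixed.csv'
fields terminated by ',' optionally enclosed by '"' escaped by '\\'
lines terminated by '\n';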

MySQL skips some rows when importing a .csv file

I wanted to import a .csv file with ~14k rows into MySQL, but I only got a table with about 13k rows. I checked and found out that MySQL skips some rows in the middle of my .csv file.
I used LOAD DATA INFILE and I really cannot understand why MySQL skips those rows. I'd really appreciate it if someone could help me with this.
Here is my query:
LOAD DATA LOCAL INFILE 'd:/Data/Tanbinh - gui Tuan/hm2.csv'
INTO TABLE q.hmm
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n' IGNORE 1 LINES;
There is no warning message at all.
I assume the difference in row counts is caused by unique-key duplicates.
From the reference:
With LOAD DATA LOCAL INFILE, data-interpretation and duplicate-key
errors become warnings and the operation continues
In practice, though, no warnings are produced for the duplicates. To check which rows were skipped, you can load the data into another, similar table without unique keys and compare the two tables.
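A minimal sketch of that comparison (the id column and the hmm_raw table name are assumptions, not from the question):

-- 1. Make a structurally similar table with no unique keys:
CREATE TABLE q.hmm_raw LIKE q.hmm;
ALTER TABLE q.hmm_raw DROP PRIMARY KEY;  -- assumes the unique key is the primary
                                         -- key and not on an AUTO_INCREMENT column

-- 2. Load the same file into it:
LOAD DATA LOCAL INFILE 'd:/Data/Tanbinh - gui Tuan/hm2.csv'
INTO TABLE q.hmm_raw
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n' IGNORE 1 LINES;

-- 3. List the duplicated key values that the constrained table collapsed:
SELECT id, COUNT(*) AS copies
FROM q.hmm_raw
GROUP BY id
HAVING COUNT(*) > 1;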

How to import large csv into mysql table?

For smaller csv files, the following has always worked for me...
LOAD DATA LOCAL INFILE 'C:/store_emails2.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Now I am trying to do this with a file that has 100K rows, but MySQL only takes about 50K of them. The odd part is that it imports seemingly random rows, not just the first 50K. I have no idea what the problem could be, as the file is formatted correctly and I've imported data this way many times. Any ideas as to what the problem may be?
EDIT: including what part of the csv looks like...
CUST_NUM,LAST_NAME,FIRST_NAME,MIDDLE_INIT,PREFIX,SUFFIX,ADDRESS_1,ADDRESS_2,CITY,STATE,ZIP,HOME_TEL,BUS,MAILING,MOD_DATE,SIGN_UP_DATE,BIRTHDAY_DATE,CASHIER,STR,EMAIL
1234556767,name,name,,,,,,,,,9999999999,,Y,6007,607,0101,341,71,email
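Two things worth checking here (suggestions, not from the original post): the sample above starts with a header row, but the statement has no IGNORE 1 LINES; and the server's own report says how many lines it skipped or flagged:

LOAD DATA LOCAL INFILE 'C:/store_emails2.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;   -- skip the CUST_NUM,... header shown above

-- The statement prints 'Records: N  Deleted: 0  Skipped: M  Warnings: W'.
SHOW WARNINGS;    -- details on truncated fields or duplicate keys

If Records is already ~100K but the table ends up with ~50K rows, duplicate keys (for example a unique index on CUST_NUM) are a more likely cause than the file itself.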