I'm trying to load data into a MySQL table with LOAD DATA LOCAL INFILE using the code below.
MySQL:
LOAD DATA INFILE '/var/www/vhosts/domain.com/httpdocs/test1.csv' INTO TABLE temp_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(recloc, client_acc)
Edit: changed LOAD DATA LOCAL INFILE to LOAD DATA INFILE, removed SET id=null, added IGNORE 1 LINES
I'm getting no errors and no imported records. I believe the issue is related to the column names, but I'm having a hard time fully understanding what those names should be. Should they be the actual column names within the CSV, or the field names in the DB table? I would also like to have an auto-incremented primary key (id).
CSV:
recloc,client_acc
"NLGSX3","CORPORATE"
"7SC3BA","QUALITY ASSURANCE"
"3B9OHF","90717-6710"
Any suggestions as to what I may be doing wrong? Thanks!
Column names in the CSV are not necessary, so you should add an IGNORE 1 LINES clause to skip the header row.
The columns listed in your statement (recloc, client_acc) need to match the column names in the table, not the headers in the CSV.
The first column from the CSV will be inserted into recloc, the second into client_acc.
If you don't specify the AUTO_INCREMENT column in the statement but there is one in the table, it should fill in automatically.
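As a minimal sketch of how this fits together (the table definition here is an assumption, since the question does not show it): leave id out of the column list and MySQL fills it in for every imported row.

-- Assumed table definition; adjust types to match your actual schema.
CREATE TABLE temp_table (
  id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
  recloc VARCHAR(20),
  client_acc VARCHAR(100)
);

-- id is not listed, so it auto-increments; IGNORE 1 LINES skips the CSV header.
LOAD DATA INFILE '/var/www/vhosts/domain.com/httpdocs/test1.csv' INTO TABLE temp_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(recloc, client_acc);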
A short and sweet solution for Excel-to-MySQL data import:
It works well for .txt file formats.
IN DETAIL:
Table name: t1
Fields: name varchar, email varchar
text.txt file <<== the first line of this text file contains the table column names:
name, email
"n1", "e1"
"n2", "e2"
"n3", "e3"
"n4", "e4"
"n5", "e5"
"n6", "e6"
"n7", "e7"
SQL query in WAMP:
LOAD DATA INFILE 'c:/wamp/www/touch/text.txt' INTO TABLE t1
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(name, email)
For this command to run successfully, we have to create the folders separately.
The real one is
C:\wamp\mysql\data\wamp\www\touch\text.txt <<== the physical file path.
But we mention c:/wamp/touch/text.txt.
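If the server-side path handling gets confusing, an alternative sketch (an assumption about your setup: local_infile must be enabled on both the server and the client) is to add LOCAL so the client reads the file from its own filesystem and the path is taken as written:

LOAD DATA LOCAL INFILE 'c:/wamp/www/touch/text.txt' INTO TABLE t1
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(name, email);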
Related
You can see my raw data above. I'm trying to import this data into the table I've created in MySQL. Here's the code I should be using:
LOAD DATA LOCAL INFILE 'mytbl.txt' INTO TABLE mytbl
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
I can't understand which character corresponds to "fields terminated by", which one to "enclosed by", and which one to "lines terminated by".
Can you read your .csv file into Excel or LibreOffice Calc (or any spreadsheet program) correctly? I guess you probably can. That means it is formatted correctly.
.csv files contain one line of text for each row of data in a table. These LOAD DATA INFILE clauses tell MySQL how to find the rows and columns in the .csv file.
FIELDS TERMINATED BY ',' means each column of data ends with a comma. Notice your first line of data:
De Ruijterkade,,123400000001234,,1,105...
The first column is the street name. The second is empty, the third is 123400000001234, the fourth is empty, the fifth is 1, the sixth is 105, et cetera.
ENCLOSED BY '"' means columns of data which themselves contain a comma (a field terminator) must be enclosed in " characters. For example, if your street name had the value De Ruijterkade, Kade your file would contain
"De Ruijterkade, Kade",,123400000001234,,1,105...
Finally LINES TERMINATED BY '\r\n' means each line in your file (row in your table) ends with a Windows-style <return><linefeed> character pair.
Akina correctly pointed out the documentation. https://dev.mysql.com/doc/refman/8.0/en/load-data.html#load-data-field-line-handling
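As a quick sanity check (just a sketch, reusing the table and file names from the question), run the load and then ask MySQL how it parsed the file:

LOAD DATA LOCAL INFILE 'mytbl.txt' INTO TABLE mytbl
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
SHOW WARNINGS;                -- reports truncated or mis-split fields, if any
SELECT * FROM mytbl LIMIT 5;  -- spot-check that values landed in the right columns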
I have a big myfile.csv file that looks like this:
From this file I want to import some columns (and obviously the rows and their relevant data) into a MySQL database via phpMyAdmin.
This file is located on my local computer, and I want to import some of its columns and rows, with their data, into my live database.
Here is what I have tried after searching Google.
I created a table with the following columns:
id name email
Then I tried to run the following query on my live database:
LOAD DATA LOCAL INFILE '/tmp/myfile.csv' INTO TABLE registration_no
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
(@col1, @col2, @col3) SET id = @col1, name = @col2, email = @col3
Note
myfile.csv is located on my computer, in the C: drive.
Am I running the correct query, and is the path /tmp/myfile.csv correct?
The query runs but the data isn't loaded into my live DB. Please help me, I've spent one and a half days figuring this out.
I have read this.
Step 1: (preferred) Try to have only the columns that are to be imported into the DB in your CSV file.
For example, if 3 columns are to be imported, remove the other unnecessary columns from myfile.csv.
Step 2: Since you are using a Windows system, make sure the path is specified properly when loading the file.
Step 3: If your CSV has a header row, use IGNORE 1 LINES to skip it.
So the query would be like the one below.
LOAD DATA LOCAL INFILE 'C:/myfile.csv' INTO TABLE registration_no
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(id, name, email);
Step 4: (optional) If you need to import only specific columns from the CSV.
For example, the CSV contains 5 columns such as id, name, reg_no, dob, email, but you need to import only the 3 columns id, name, and email. Just assign each unwanted column to a user variable that you never use.
Then the query should be like:
LOAD DATA LOCAL INFILE 'C:/myfile.csv' INTO TABLE registration_no
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES # skips the first `n` rows; usually used to skip the header of a file
(id, name, @dummy, @dummy, email);
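If the statement still loads zero rows, one more thing worth checking (this is an assumption about your setup, not something the question confirms) is whether LOAD DATA LOCAL is allowed on both sides:

SHOW GLOBAL VARIABLES LIKE 'local_infile';  -- must be ON for LOCAL loads
SET GLOBAL local_infile = 1;                -- enable it on the server (needs privileges)
-- The client must allow it too, e.g. start the command-line client with:
--   mysql --local-infile=1 -u youruser -p yourdb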
I created a table (tbldemo) with 22 columns using a CREATE TABLE statement in MySQL. When trying to insert the data from a .csv file using a LOAD DATA INFILE statement, it gives me the error below. How do I ignore the rows whose entire content is null or empty?
Error Code: 1261. Row 11513 doesn't contain data for all columns
This is what I used to load the data from the .csv:
LOAD DATA INFILE 'D:/Singapore/rau_sales_order.csv' INTO TABLE tbldemo
FIELDS TERMINATED BY ','
ENCLOSED BY '"' LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
I don't want to open the .csv file and clean the data by filtering it for nulls; instead I want to do it directly via the query. Is there any way to achieve this? Thanks in advance.
You need some preprocessing of the file, for example removing blank lines. This will help if the problem is caused by empty lines:
sed -i '/^[[:space:]]*$/d' file.csv
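If you would rather not touch the file at all, a server-side sketch (assumption: the error comes from strict SQL mode turning the "too few columns" warning into an error) is to relax sql_mode for the session, reload, and review the warnings; rows that were padded with defaults can then be removed with an ordinary DELETE afterwards:

SET SESSION sql_mode = '';   -- non-strict: short/blank rows raise warnings instead of errors
LOAD DATA INFILE 'D:/Singapore/rau_sales_order.csv' INTO TABLE tbldemo
FIELDS TERMINATED BY ','
ENCLOSED BY '"' LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
SHOW WARNINGS;               -- lists the rows that did not contain data for all columns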
So I have a bulk loading query:
# Load data into categories
LOAD DATA LOCAL INFILE 'CategoryData.txt'
REPLACE INTO TABLE db.Category
FIELDS TERMINATED BY ','
LINES STARTING BY '('
      TERMINATED BY ')\n';
My Category table has the following schema:
ID (auto-incrementing primary key) - CategoryName varchar(255)
How can I keep the auto-incrementing there whilst still bulk loading?
Thanks!
Assuming that there is only one column in the text file (for the name), you can load only the category name:
LOAD DATA LOCAL INFILE 'CategoryData.txt'
REPLACE INTO TABLE db.Category
FIELDS TERMINATED BY ','
LINES STARTING BY '('
      TERMINATED BY ')\n'
(CategoryName)
SET ID = NULL;
Even if the input file has multiple columns, you can still use the same idea by just ignoring the other columns.
If you can change the text file, ensure the ID field contains \N (effectively NULL). This should trigger the auto-increment you're after.
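For illustration, a hypothetical CategoryData.txt matching the LINES STARTING BY '(' / TERMINATED BY ')\n' format, with \N in the ID position (the category names are made up), could look like:

(\N,Books)
(\N,Electronics)
(\N,Toys)

Loaded without a column list, the \N maps to NULL in ID and, per the note above, the auto-increment assigns the value:

LOAD DATA LOCAL INFILE 'CategoryData.txt'
REPLACE INTO TABLE db.Category
FIELDS TERMINATED BY ','
LINES STARTING BY '('
      TERMINATED BY ')\n';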
I have some data (in CSV) per country taken from a third-party source, and I am having some issues importing it into MySQL.
For example, one column in my table is Country with the value 'Côte d'Ivoire' - the import in MySQL appears to divide this one row of data into two, with a Country value of 'C'. It is unable to import the text value 'Côte d'Ivoire'.
This is what I used for the import:
TRUNCATE TABLE source_DATA_TABLE;
LOAD DATA INFILE 'H://TESTDATA/2015/source_DATA.csv'
INTO TABLE source_proc_pqr
FIELDS TERMINATED BY '\#'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
I changed the delimiter through my regional settings on my PC to # but the same problem exists.
Anyone have a fix for this problem? I'm using MySQL Workbench (XAMPP/phpMyAdmin).
I tried something and it works. First of all, the file should be in UTF-8, and then you can load it like this:
LOAD DATA INFILE 'H://TESTDATA/2015/source_DATA.csv'
INTO TABLE source_proc_pqr
CHARACTER SET UTF8
FIELDS TERMINATED BY '\#'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
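If rows still come out split or mangled, a couple of quick checks (a sketch; the table and column names are taken from the question) can confirm whether the destination can actually store the accented text:

SHOW CREATE TABLE source_proc_pqr;            -- confirm the table/column character set
SELECT Country FROM source_proc_pqr LIMIT 5;  -- spot-check values such as Côte d'Ivoire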