I've filled my table of instruments using LOAD DATA INFILE. The rows load successfully, but in the output the final column (status) isn't closed off with a vertical line. I didn't think this was an issue until I ran a query to count the rows where status = "commissioning":
SELECT COUNT(*)
FROM instrument
WHERE status = 'commissioning';
All 60 rows contain "commissioning", so it should return 60, but instead it returns 0.
I retried the query with a wildcard search and it returned the right result (you can also see in that output that the table is not enclosed).
Perhaps something went wrong when I imported from the CSV file, because a LENGTH(status) query returns 14 when "commissioning" is only 13 characters. Has anyone encountered this before, or does anyone know what character could be causing this?
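For reference, the wildcard query looked roughly like this; a LENGTH/HEX check along the same lines should show exactly which extra byte is stored:

SELECT COUNT(*)
FROM instrument
WHERE status LIKE 'commissioning%';   -- the prefix match returns the expected 60

SELECT status, LENGTH(status), HEX(status)
FROM instrument
LIMIT 1;                              -- LENGTH is 14; HEX shows the extra byte at the end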
Here's the import code for the CSV file for further clarity - it worked fine with my other tables.
The problem you are having occurs because Windows ends lines with '\r\n' instead of just '\n'. Since you are telling the import statement that lines end with '\n', an extra '\r' character is left at the end of every row - which is also the 14th character that LENGTH is counting. You need to change your import statement to:
LOAD DATA INFILE 'instruments.csv'
INTO TABLE instrument
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
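If you can't be sure every file uses the same line endings, another option (a sketch, not part of the fix above; the column names other than status are placeholders) is to strip the carriage return during the import with a user variable and a SET clause:

LOAD DATA INFILE 'instruments.csv'
INTO TABLE instrument
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(name, type, @status)                            -- placeholder columns; keep your real column order
SET status = TRIM(TRAILING '\r' FROM @status);   -- drop the stray \r if it is there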
I am using this query to import data from a txt file into my table:
LOAD DATA LOCAL INFILE '~/Desktop/data.txt' INTO TABLE codes LINES TERMINATED BY '\n' (code)
This works fine, but when I look at the "code" field, every entry has a line break at its end. Is there a way to get rid of this?
The LOAD DATA INFILE command is not really suitable for data cleansing, but you may get lucky. First of all, determine exactly which characters make up those 'line breaks'.
It is possible that the text file uses Windows-style line breaks (\r\n). In this case, use LINES TERMINATED BY '\r\n'. If the line breaks consist of different characters but are consistent across all lines, then include those in the LINES TERMINATED BY clause.
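For the statement above, that would look roughly like this (assuming the file really does use Windows-style \r\n endings):

LOAD DATA LOCAL INFILE '~/Desktop/data.txt'
INTO TABLE codes
LINES TERMINATED BY '\r\n'
(code);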
If the line break characters are inconsistent, then you may have to create a stored procedure or use an external programming language to cleanse your data.
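That said, if the stray characters turn out to be plain \r or \n, a single UPDATE after the import may be enough (a sketch, assuming nothing else in the column needs cleaning):

-- strip any carriage returns or newlines already stored in the column
UPDATE codes
SET code = REPLACE(REPLACE(code, '\r', ''), '\n', '');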
I have a table in MySQL and I am adding data to it from a CSV file.
My code is:
LOAD DATA LOCAL INFILE 'C:/myaddress/file.csv'
INTO TABLE db.mytable
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(`Currency`,`field2`,`field3`)
This loads fine except for the first row I add.
I'm adding USD, but when I query the table it comes back as ï»¿USD.
This only happens for the first row. Anybody knows why this happens?
Solution:
This is an encoding issue, so to solve it there are two options:
1. Re-encode the file (UTF-8 without BOM)
2. Add a dummy first line and use IGNORE 1 LINES (see the sketch below)
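Option 2 would look roughly like this, reusing the statement above (the assumption is that a throwaway first line has been added to the file, so the BOM is discarded along with it):

LOAD DATA LOCAL INFILE 'C:/myaddress/file.csv'
INTO TABLE db.mytable
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(`Currency`,`field2`,`field3`);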
I think this is something related to your CSV. It has been encoded as UTF-8 with a BOM but is being read as ISO-8859-1 (Spanish?).
If you are using an editor like Notepad++, open your CSV and select from the top menu -> Encoding -> UTF-8 without BOM, then save and try again.
I have created a database and a table with all 46 fields I will be needing, including one that is my ID for the row. The CSV doesn't contain the ID field, nor does it contain headers for the columns. I am new to all of this but have been trying to figure it out. I'm not on here being lazy asking for the answer, but looking for directions.
I'm trying to figure out how to import the CSV so that the data starts going into the second column, since I'm hoping auto_increment will fill in the ID field, which is the first column I created.
I tried these instructions with no luck. Can anyone offer some insight?
The column names of your CSV file must match those of your table
Browse to your required .csv file
Select CSV using LOAD DATA options
Check box 'ON' for Replace table data with file
In Fields terminated by box, type ,
In Fields enclosed by box, type "
In Fields escaped by box, \
In Lines terminated by box, auto
In Column names box, type column name separated by , like column1,column2,column3
Check box ON for Use LOCAL keyword.
Edit:
The CSV file is 32.4 KB
The first row of my CSV is:
Test Advertiser,23906032166,119938,287898,,585639051,287898 - Engager - 300x250,88793551,Running,295046551,301624551,2/1/2010,8/2/2010,Active,,Guaranteed,Publisher test,Maintainer test,example-site.com,,All,All,,Interest: Dental; custom geo zones: City,300x250,-,CPM,$37.49 ,"4,415","3,246",3,0,$165.52 ,$121.69 ,"2,895",805,0,0,$30.18 ,$37.49 ,0,$0.00 ,IMPRESSIONBASED,NA,USD
You can have MySQL set values for certain columns during import. If your id field is set to auto_increment, you can set it to NULL during import and MySQL will then assign incrementing values to it. Try putting something like this in the SQL tab in phpMyAdmin:
LOAD DATA INFILE 'path/to/file.csv' INTO TABLE your_table FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' SET id=null;
Please look at this page and see if it has what you are looking for. Should be all you need since you are dealing with just one table. MYSQL LOAD DATA INFILE
So for example you might do something like this:
LOAD DATA INFILE 'filepath' INTO TABLE tablename FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' (column2, column3, column4);
That should give you an idea. There are of course more options that can be added as seen in the above link.
be sure to use LOAD DATA LOCAL INFILE if the import file is local. :)
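Putting those pieces together, the import could look roughly like this (a sketch: the column names in the list are placeholders, so list all 45 of your real columns in the order they appear in the file; OPTIONALLY ENCLOSED BY '"' is included because some of the numeric fields in your sample row are quoted):

LOAD DATA LOCAL INFILE 'path/to/file.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(advertiser, campaign_id, placement_id)  -- placeholders: continue with the rest of your columns
SET id = NULL;                           -- lets auto_increment fill in the id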
My CSV content looks like this:
1234,123;123;123
5675,123;567;234;565
No space is provided at the end of each row in the CSV, i.e. 1234,123;123;123 (no space here).
I imported this using the following command:
mysql> load data local infile 'E:\sample.csv' into table Test.Table1 fields
terminated by ',' lines terminated by '\n' (Column1,Colunm2);
It executes successfully and I can find all the records in the DB. But the second column ends with a pilcrow.
When I try to edit, the value looks like:
123;123;123
<extra line here>
If i remove the extra line, the pilcrow disappears.
Both columns are of type VARCHAR.
Any clues for the issue?
I believe your problem is caused by the end-of-line terminator. The file is probably using \r\n instead of only \n, hence the "<extra line here>".
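If so, changing the terminator in the original command should clear it up (a sketch; I've also switched the path to a forward slash, since a bare backslash in a MySQL string literal can be swallowed as an escape character):

load data local infile 'E:/sample.csv' into table Test.Table1 fields
terminated by ',' lines terminated by '\r\n' (Column1,Colunm2);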