I have imported a CSV file where a specific column holds decimal numbers.
In the original Excel file (before saving it as a CSV), the first number of the column shows up as 218,790. When I choose the cell, the number shows up as 218790.243077911.
In the CSV file the number shows up as 218790 and when I choose the cell it is 218,790.
When I import the file into MySQL and show the table I created, the number shows up as 218.000000000.
Here is the code I used:
create table Apolo_Test(
Leads decimal (15,9)
);
LOAD DATA LOCAL INFILE 'C:/Users/SCRIPTS/file.csv'
INTO TABLE Apolo_Test
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 7 ROWS
;
I tried updating the format with this:
update Apolo_Test set Leads = format(Leads, 10, 'de_DE');
but it did not work. I have never had a case where files had a comma before; I guess it is the UK formatting of numeric fields.
How can I make this work in MySQL without using any macros in Excel?
UPD:
It works, but I get some warnings although I double-checked the CSV file and the fields:
create table Apolo_Test(
Ad_Group varchar(50),
Impacts int,
Leads decimal (10,3)
);
LOAD DATA LOCAL INFILE 'C:/Users/me/Desktop/SCRIPTS/11/Adalyser.csv'
INTO TABLE Apolo_Test
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 7 ROWS
(Ad_Group, Impacts, @Leads)
SET Leads = REPLACE(@Leads, ',', '');
alter table Apolo_Test ADD IPL decimal (10,6) after Leads;
update Apolo_Test set IPL=Impacts/Leads;
select * from Apolo_Test;
You have to use this syntax:
LOAD DATA LOCAL INFILE 'C:/path/to/mytable.txt' IGNORE
INTO TABLE mytable
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\r\n'
(int_col, @float_col)
SET float_col = REPLACE(@float_col, ',', '.');
For more information, read here.
The thousands separator should not matter when moving data around -- Excel's internal values, CSV files, and MySQL's internal values do not include it. Only "formatted" output includes it, and you should not use formatted output for moving numbers around.
Be careful with locale, such as de_DE.
The German "218.790" is the same as English "218,790".
"218790.243077911" is likely to be what Excel had internally for the number.
"218,790" is likely to be the English representation on the screen; note the English thousands separator.
In the CSV file the number shows up as 218790 and when I choose the cell it is 218,790.
What do you mean? Perhaps that there is no comma or dot in the file itself? And what do you mean by "choose the cell"?
I can't see how to get "218.000000000" without truncation going on somewhere.
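If the comma was still in the field when it was loaded, the truncation is easy to reproduce (a sketch; string-to-DECIMAL conversion stops at the first character that cannot be part of a number):
SELECT CAST('218,790.243077911' AS DECIMAL(15,9));
-- 218.000000000, with a "Truncated incorrect DECIMAL value" warning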
Related
I've filled my table of instruments using LOAD DATA INFILE. It fills the rows successfully, but then doesn't enclose the final column (status) with a vertical line. I didn't think this was an issue until I ran a query to count the rows where status = "commissioning".
SELECT COUNT(*)
FROM instrument
WHERE status = 'commissioning';
All 60 rows contain "commissioning", so it should return 60, but instead it returns 0.
I retried the query with a wildcard search and it returned the right result (you can also see in the output that the final column is not enclosed).
Perhaps something went wrong when I imported from the CSV file, because a LENGTH(status) query returns 14 when "commissioning" is only 13 characters. Has anyone encountered this before, or does anyone know what character could be causing this?
Here's the CSV import code for further clarity -- it worked fine with my other tables:
The problem you are having occurs because Windows ends lines with '\r\n' instead of '\n'. Since you are telling the import statement that lines end with '\n', you get an extra '\r' character at the end of every line. You need to change your import statement to:
LOAD DATA INFILE 'instruments.csv'
INTO TABLE instruments
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
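If the data is already loaded and you don't want to re-import, stripping the stray '\r' in place is another option (a sketch against the instrument table from the question):
UPDATE instrument SET status = TRIM(TRAILING '\r' FROM status);
-- LENGTH(status) should now return 13 for 'commissioning'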
I'm trying to save some ID values from a CSV that are automatically converted to exponent numbers by Excel.
For example, 382383816413 becomes 3.82384E+11. So I'm doing a full import into my MySQL database with:
LOAD DATA LOCAL INFILE
'file.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@`item_id`,
@`colum2`,
@`colum3`)
SET
item_id = @`item_id`;
I've tried using CAST like:
CAST('3.82384E+11' AS UNSIGNED), and it gives me just 3.
CAST('3.82384E+11' AS BIGINT), and it doesn't work (BIGINT is not a valid CAST target).
CAST('3.82384E+11' AS UNSIGNED BIGINT), and it gives me 3 again.
So, what's the better way to convert string exponent numbers to real big integers in MySQL?
Set the column format to text instead of number in Excel. Refer to the link below:
PHPExcel - set cell type before writing a value in it
My option was to convert the column containing 3.82384E+11 back to a number in the Excel file, so it returns to the original value. Then I export to CSV and import it fine with an SQL query.
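If the exponent string is already in the CSV, you can also convert it during the load: adding 0 forces a string-to-double conversion, which understands scientific notation, and the result can then be cast to an integer. A sketch (note that Excel has already rounded the value, so 3.82384E+11 comes back as 382384000000, not the original 382383816413):
SET item_id = CAST(@`item_id` + 0 AS UNSIGNED);
-- or, standalone:
SELECT CAST('3.82384E+11' + 0 AS UNSIGNED);  -- 382384000000
The integer casts above give 3 because string-to-integer conversion stops at the '.'.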
Sorry for the wordy title. I'm running the following query:
LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE tablename
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY ',\r\n'
I have a CSV with data in the following format:
1111, 2222, "32", "1,234.304", 1023.53,
All lines are terminated with an extra comma.
When attempting to load, I get an error when trying to put the "1,234.304"-type field into a NUMERIC(12,4) column. I end up with 1.0000 (conversion stops at the comma) and a warning that the data was truncated; I had expected 1234.304.
Edit: It seems that even a regular INSERT of a value with comma thousands separators upsets MySQL. Is there a way to modify the load command to get the desired behavior?
Break the problem down into two steps:
Get the data into the database
Get the data into the table/format you want it in
If you get all the data into a table with VARCHAR fields, you can easily manipulate it and explicitly convert the values into the types you want them in, rather than relying on whatever implicit conversion you get with LOAD DATA LOCAL INFILE.
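A minimal sketch of that two-step approach, using the question's five-field layout (the staging table and column names are hypothetical):
CREATE TABLE staging (
    c1 VARCHAR(32), c2 VARCHAR(32), c3 VARCHAR(32),
    amount VARCHAR(32), c5 VARCHAR(32)
);

LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE staging
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY ',\r\n';

-- strip the thousands separators during the explicit conversion
INSERT INTO tablename (c1, c2, c3, amount, c5)
SELECT c1, c2, c3,
       CAST(REPLACE(amount, ',', '') AS DECIMAL(12,4)),
       c5
FROM staging;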
I have some per-country data (in CSV) taken from a third-party source, and I am having some issues importing it into MySQL.
For example, one column in my table is Country, with a value of 'Côte d'Ivoire'. The import in MySQL appears to divide this one row of data into two, with a Country value of 'C'; it is unable to import the text value 'Côte d'Ivoire'.
This is what I used for the import:
TRUNCATE TABLE source_DATA_TABLE;
LOAD DATA INFILE 'H://TESTDATA/2015/source_DATA.csv'
INTO TABLE source_proc_pqr
FIELDS TERMINATED BY '\#'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
I changed the delimiter to # through the regional settings on my PC, but the same problem persists.
Does anyone have a fix for this problem? I'm using MySQL Workbench (XAMPP/phpMyAdmin).
I tried something and it works. First of all, the file should be in UTF-8; then you can load it like this:
LOAD DATA INFILE 'H://TESTDATA/2015/source_DATA.csv'
INTO TABLE source_proc_pqr
CHARACTER SET UTF8
FIELDS TERMINATED BY '\#'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
I'm trying to import a text file containing:
http://pastebin.com/qhzrq3M7
into my database using the command:
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE jobs
FIELDS TERMINATED BY '\t';
But I keep getting the error "Row 1-13 doesn't contain data for all columns".
Make sure the last field of each row ends with \t. Alternatively, use LINES TERMINATED BY:
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE jobs
COLUMNS TERMINATED BY '\t'
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r';
\r is the carriage return character, similar to the newline character (i.e., \n).
I faced the same issue. Here is how I fixed it:
I opened the CSV file in Notepad++ (a text editor), saw a blank line at the end of the file, and deleted it. That resolved my issue.
The URL below can also help you resolve the issue:
http://www.thoughtspot.com/blog/5-magic-fixes-most-common-csv-file-problems
If you're on Windows, make sure to use LINES TERMINATED BY '\r\n', as explained in the MariaDB docs.
Sounds like LOAD DATA LOCAL INFILE expects to see a value for each column.
You can edit the file by hand (to delete those rows -- they could be blank lines), or you can create a temp table, insert the rows into a single column, and write a MySQL command to split each row on tabs and insert the values into the target table, as sketched below.
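A sketch of that temp-table approach (assumes three tab-separated columns; the column names are hypothetical):
CREATE TEMPORARY TABLE raw_lines (line TEXT);

LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE raw_lines
FIELDS TERMINATED BY '\b'   -- a byte assumed absent from the file, so each whole line lands in one column
LINES TERMINATED BY '\n'
(line);

INSERT INTO jobs (col1, col2, col3)
SELECT SUBSTRING_INDEX(line, '\t', 1),
       SUBSTRING_INDEX(SUBSTRING_INDEX(line, '\t', 2), '\t', -1),
       SUBSTRING_INDEX(line, '\t', -1)
FROM raw_lines
WHERE line <> '';   -- skip the blank lines that trigger the warning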
Make sure there are no backslashes at the end of any field. In the CSV viewed as text this would look like "\,", which is a problem: the backslash escapes the comma, so the comma is not treated as a field separator and you end up with too few columns.
(This primarily applies when you don't have field enclosures, like quotes, around each field.)
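If stray backslashes are the cause and you can't clean the file, disabling escape processing during the load is one option (a sketch, reusing the path from the question above):
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE jobs
FIELDS TERMINATED BY '\t' ESCAPED BY ''
LINES TERMINATED BY '\r\n';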