I'm trying to load data into a table (obviously?). My table looks like this:
CREATE TABLE IF NOT EXISTS `condensed` (
`id` bigint(20) NOT NULL,
`src` enum('C_X','C_AH','C_AO','C_B','H_X','H_AH','H_AO','H_B') NOT NULL,
`hash` int(11) default NULL,
`ihash` int(11) default NULL,
`last_updated` datetime default NULL,
PRIMARY KEY (`id`,`src`),
UNIQUE KEY `covering` (`id`,`src`,`hash`)
) ENGINE=MyISAM DEFAULT CHARSET=ascii;
I've got data files which look like this:
320115816,'C_X',692983698,854142703,20120216220954
320124536,'C_X',588472049,1059436251,20100527232845
320120196,'C_X',452117509,855369958,20101118105505
...
But when I load it using
LOAD DATA INFILE '/path/to/data.csv'
IGNORE
INTO TABLE `condensed`
(id, src, hash, ihash, last_updated);
it only loads the first two columns (hash, ihash and last_updated are null).
320115816,'C_X',NULL,NULL,NULL
320124536,'C_X',NULL,NULL,NULL
320120196,'C_X',NULL,NULL,NULL
...
I do get a lot of warnings (presumably because MySQL is discarding the last 3 columns from the input and assigning defaults):
Query OK, 20 rows affected, 100 warnings (0.00 sec)
Records: 20 Deleted: 0 Skipped: 0 Warnings: 100
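The individual warnings can be listed right after the load to confirm what is being discarded:
SHOW WARNINGS LIMIT 10;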
(I intend to load several million records, not just 20.)
I get the same problem using mysqlimport.
Omitting an explicit field list from the LOAD DATA statement (same fields and order in database as in files) resulted in the same outcome.
MySQL version is 5.0.22, and there are no non-printable characters in the input file.
Help!
Just to add an improvement to Thilo's answer if you are using Windows:
LOAD DATA INFILE '/path/to/data.csv' IGNORE
INTO TABLE `condensed`
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '\''
LINES TERMINATED BY '\r\n' -- Windows line terminator
(id, src, hash, ihash, last_updated)
It worked for me. It solved all my truncation problems on Windows. Also have a look at this:
http://forums.mysql.com/read.php?79,76131,76871
Never managed to resolve this. I ended up writing a wee PHP script to map the data into the db.
It's worth noting that, if the field MySQL is complaining about happens to be the final one in the table, there's a chance that you need to fix the LINES TERMINATED BY. On Windows I had to tell it \n instead of \r\n.
I was having similar problems with a CSV file created on an IBM mainframe that was moved to a Windows file server before being loaded. I was getting a truncation warning on all rows except the last. The mainframe file looked okay. Adding '\r\n' cleared the problem.
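For reference, a minimal sketch showing both terminator variants against the table and column list from the question above:
LOAD DATA INFILE '/path/to/data.csv' IGNORE
INTO TABLE `condensed`
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\''
LINES TERMINATED BY '\n'  -- use '\r\n' instead for files saved with Windows line endings
(id, src, hash, ihash, last_updated);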
I think the quotes around only the enum field are confusing the import. Try this:
LOAD DATA INFILE '/path/to/data.csv' IGNORE
INTO TABLE `condensed`
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '\''
(id, src, hash, ihash, last_updated)
I'm new to MySQL and am using it to work with several very large CSV files I have (some have over a million rows). I'm on Win7-64 Ultimate and have installed MySQL Workbench v6.3.6 build 511 64-bit. I read a similar question, however I cannot comment since I am new. I am getting a different error anyway.
I have set up a database called crash0715, and created a table called driver_old with five columns. The first column is a report number (set up as INT(20)) that will be keyed to other files. It contains some duplicates depending upon the data in the other columns. The next four columns contain numeric data that is either 1 or 2 digits.
I set up the report_number column as INT(20), primary key, not null.
The other four were set up as INT or INT(2).
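For reference, here is a minimal sketch of the table as described; report_number comes from the description above, while the other four column names are made up for illustration:
CREATE TABLE driver_old (
  report_number INT(20) NOT NULL,   -- keyed to other files
  col1 INT(2),                      -- hypothetical names for the four 1-2 digit columns
  col2 INT(2),
  col3 INT(2),
  col4 INT(2),
  PRIMARY KEY (report_number)
);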
When I tried to import a little over 1 million rows from a 5-column CSV file (named do.csv in my c:\ root) via the GUI, the program hung. I let it run over 12 hours and my task manager showed the program was using 25% CPU.
I next tried the command line. After switching to the database, I used
LOAD DATA LOCAL INFILE 'c:/do.csv' INTO TABLE driver_old FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
I had removed the header row from the CSV before trying both imports.
I got the following message:
Query OK, 111 rows affected, 65535 warnings (3.97 sec)
Records: 1070145 Deleted: 0 Skipped: 1070034 Warnings: 2273755
I read the first few lines of SHOW WARNINGS and they were as follows:
1264 Out of range value for column 'report_number' for row 1.
1261 Row 1 doesn't contain data for all columns
These two repeated for all of the other lines.
There was also a
1062 Duplicate entry '123456789' for key 'primary' (123456789 is a representative value)
It also reoccurred with the other two codes.
The CSV file has no blanks in the first column; however, there are a few in the other ones.
Any idea what I'm doing wrong here?
I solved this by saving and exporting an SQL INSERT statement.
I would use BIGINT instead of INT!
Using IGNORE or REPLACE may help with duplicate primary key values!
LOAD DATA LOCAL INFILE 'c:/do.csv' IGNORE INTO TABLE driver_old FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'; -- or REPLACE instead of IGNORE to overwrite duplicates
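A sketch of the column change suggested above, assuming the report numbers are too large for a signed INT (max 2147483647):
ALTER TABLE driver_old MODIFY report_number BIGINT NOT NULL;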
I cannot comment on this question, but it would be great if you could post a URL to a picture showing a few lines from the CSV file, plus the code you used to create the table and insert the data. That would be very helpful for answering the question!
I have now successfully imported the 1,045,767 records. As suggested by another member here, I imported a small 100-row file that gave the same errors. I then opened the CSV in LibreOffice and saved it. After that, I was able to import it OK.
The problem was the spreadsheet program, GS-Calc. When saving CSV files, it gives three options: UTF-8, UTF-16, and ANSI/OEM/ISO. I had initially saved it as UTF-8 and it returned the error.
I saved it as ANSI/OEM/ISO and it was able to be imported OK. I hope this helps others with large csv files in the future.
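As an alternative to re-saving the file, the file's encoding can also be declared in the LOAD DATA statement itself; a sketch, assuming the export really is UTF-8 (this may or may not match what GS-Calc actually writes):
LOAD DATA LOCAL INFILE 'c:/do.csv' INTO TABLE driver_old
CHARACTER SET utf8
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';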
I solved this by changing the default separator in MySQL to a comma.
I am writing a file to a MySQL table with a BLOB-type column.
There are various reasons why I NEED to do this - I cannot just use the filesystem.
I find that after writing the file into the db and then reading it back from the db, the file is corrupted.
I can replicate this using SQL and nothing else.
The file should fit easily into the LONGBLOB so I do not think the data is being truncated. In fact the data being read back is larger than the data being written.
I have experienced the same issue with BLOB and MEDIUMBLOB too.
Here is the SQL:
CREATE TABLE Video(
Id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
ShopAssistant_Id int NOT NULL,
Data LONGBLOB NOT NULL,
FOREIGN KEY (ShopAssistant_Id) REFERENCES ShopAssistant(id)
);
--insert data from a file
insert into video (shopassistant_id, data) values (1, LOAD_FILE('/Users/tmck/temp/introduction.mp4'));
--read data back into a different file
select data into outfile '/Users/tmck/temp/ret.mp4' from Video;
If we now compare the files, ret.mp4 is bigger than introduction.mp4, the individual bytes have been changed and the file is corrupt and won't play.
The file itself is small, circa 245 KB.
What 'option' in MySQL is causing it to convert my data?
OK, found the answer here:
http://ask.metafilter.com/161415/Export-my-Mysql-PDF-binaryblob-into-a-file
The data was being written correctly, and the SELECT INTO OUTFILE was mangling the data.
The solution is to add some escape hints to MySQL as follows:
SELECT data INTO OUTFILE '/Users/tmck/temp/ret.mp4' FIELDS TERMINATED BY '' ENCLOSED BY '' ESCAPED BY '' LINES TERMINATED BY '' STARTING BY '' FROM Video;
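For a single row, SELECT ... INTO DUMPFILE is another option not mentioned in the linked answer: it writes the value as raw bytes with no field or line processing at all. A sketch (the output filename is just illustrative, and DUMPFILE requires the query to return exactly one row):
SELECT data INTO DUMPFILE '/Users/tmck/temp/ret2.mp4' FROM Video LIMIT 1;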
I am trying to do a LOAD DATA INFILE into a database. The only issue is that I am getting an Error 1261. I was getting an incorrect datetime value error earlier, but I solved that with the SET date_time = ... clause in the LOAD DATA INFILE below. My problem now is that it says I don't have enough data for all columns. I know that you are supposed to list the columns after the table name, but I can't seem to get it to work.
There is one table and it has 15 columns. The first column is the primary key, the other fourteen are regular columns.
Here is the load file statement:
load data infile 'c:/proj/test.csv' into table base (@var1,event,failure,ue,mc,mn,cell,durat,cause,ne,ims,hier,hier3,hier32)
set date_time = STR_TO_DATE(@var1, '%Y%m%d %H%i%s')
;
Additional notes: the PK column is called dataId and is an INT.
It is auto-increment.
Here is the data from the csv file:
2013-03-20 14:55:22,4098,1,21060800,344,930,4,1000,0,11B,344930000000011,4809532081614990000,8226896360947470000,1150444940909480000
Try this
load data infile 'c:/proj/test.csv' into table base
character set latin1
fields terminated by '\t' enclosed by '' escaped by '\\'
lines terminated by '\n' starting by ''
ignore 1 lines
(@var1,event,failure,ue,mc,mn,cell,durat,cause,ne,ims,hier,hier3,hier32)
set date_time = STR_TO_DATE(@var1, '%Y%m%d %H%i%s');
Take a look here.
I also met a similar problem.
The error message was:
load data infile 'L:/new_ncbi' into table ncbi
fields terminated by '\t'
lines terminated by '\r\n'
1973 row(s) affected, 3 warning(s):
1261 Row 1629 doesn't contain data for all columns
1261 Row 1630 doesn't contain data for all columns
1261 Row 1630 doesn't contain data for all columns
Records: 1973 Deleted: 0 Skipped: 0 Warnings: 3 0.281 sec
So I went back to look at the data I was loading.
At lines 1629-1630 in the file, I found this problem:
Sphingomonas phage PAU NC_019521 "Sphingomonas paucimobilis
" species
Yes, as you can see, the two lines should be one line, but they are not.
By the way, my data was originally stored in an Excel file. When I needed to work with it, I transferred the data from the Excel file to a plain text file.
One line of data in the Excel file becomes two lines just because it contains a special character like a CRLF or similar.
So I suggest you copy your data from the CSV to a plain text file and check whether it has similar problems.
Maybe my English is bad, but I still hope this is helpful.
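If the export wraps such values in quotes (as the sample above suggests), another sketch worth trying is to declare the enclosure so that a line terminator inside a quoted value no longer ends the row:
LOAD DATA INFILE 'L:/new_ncbi' INTO TABLE ncbi
FIELDS TERMINATED BY '\t' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
-- With ENCLOSED BY set, a CRLF inside a quoted cell is read as part of the value.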
Error Code: 1406. Data too long for column
CREATE TABLE `TEST`
(
`idTEST` INT NOT NULL ,
`TESTcol` VARCHAR(45) NULL ,
PRIMARY KEY (`idTEST`)
);
Now insert some values:
INSERT INTO TEST
VALUES
(
1,
'Vikas'
)
Select:
SELECT * FROM TEST;
Inserting a record longer than the column length:
INSERT INTO TEST
VALUES
(
2,
'Vikas Kumar Gupta Kratika Shukla Kritika Shukla'
)
If we select the length
SELECT LENGTH('Vikas Kumar Gupta Kratika Shukla Kritika Shukla')
'47'
And it shows the error message:
Error Code: 1406. Data too long for column
But what I expect is to insert at least the first 45 characters into the table.
Please let me know if the question is not clear.
I know the cause of this error: I am trying to insert a value longer than the column's datatype allows.
I want a solution in MySQL, as this is possible in MS SQL, so I hope it is also possible in MySQL.
MySQL will truncate any inserted value that exceeds the specified column width.
To make this happen without an error, try switching your SQL mode to not use STRICT.
MySQL reference manual
EDIT:
To change the mode
This can be done in two ways:
Open your my.ini (Windows) or my.cnf (Unix) file within the MySQL installation directory, and look for the text "sql-mode".
Find:
Code:
# Set the SQL mode to strict
sql-mode="STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
Replace with:
Code:
# Set the SQL mode to strict
sql-mode="NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
Or
You can run an SQL query within your database management tool, such as phpMyAdmin:
Code:
SET @@global.sql_mode = '';
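To check what the server is actually running with, or to change only the current connection without a restart, something like this also works:
SELECT @@GLOBAL.sql_mode, @@SESSION.sql_mode;
SET SESSION sql_mode = '';  -- affects only the current connection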
This happened to me recently.
I had fully migrated to MySQL 5.7, and everything was in the default configuration.
All previous answers are already clear and I just want to add something.
This 1406 error could also happen in a function or procedure, not only because of a table column's length.
In my case, I have a trigger which calls a procedure with an IN parameter of varchar(16), but it received a value of length 32.
I hope this helps someone with a similar problem.
Besides the answer given above, I just want to add that this error can also occur while importing data with an incorrect LINES TERMINATED BY character.
For example, I saved the dump file in CSV format on Windows. Then, while importing:
LOAD DATA INFILE '/path_to_csv_folder/db.csv' INTO TABLE table1
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
Windows saved the end of line as \r\n (i.e. CR LF) whereas I was using \n. I was going crazy wondering why phpMyAdmin was able to import the file while I couldn't. Only when I opened the file in Notepad++ and saw the line endings did I realize that MySQL was unable to find any line terminator symbol (and I guess it considered all the lines as input for the fields, making it complain).
Anyway, after changing \n to \r\n, it worked like a charm.
LOAD DATA INFILE '/path_to_csv_folder/db.csv' INTO TABLE table1
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
I think that switching off STRICT mode is not a good option because the app can start losing data entered by users.
If you receive values for TESTcol from an app, you could add model validation, like in Rails:
validates :TESTcol, length: { maximum: 45 }
If you manipulate values in an SQL script, you could truncate the string with the SUBSTRING function:
INSERT INTO TEST
VALUES
(
1,
SUBSTRING('Vikas Kumar Gupta Kratika Shukla Kritika Shukla', 1, 45)
);
These are the steps I use on Ubuntu. They will allow you to insert more than 45 characters from your input, but MySQL will cut your text to 45 characters before inserting into the database.
Run the command:
sudo nano /etc/mysql/my.cnf
Then paste this code
[mysqld]
sql-mode="NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
restart MySQL
sudo service mysql restart;
Since this question ranks high in search results, I will quickly say that you can pre-truncate your data before saving using substr(), then move on to the more serious issue of saving large data resulting in Error Code: 1406. Data too long for column.
I disagree with all answers and comments advising turning off strict mode. The presumption is that data that needs saving must be saved, not left to the chance of mysteriously disappearing without notice. Good table structure is advised, but if you must save large data, you can change the column's type to one of:
TEXT: 65,535 characters - 64 KB
MEDIUMTEXT: 16,777,215 characters - 16 MB
LONGTEXT: 4,294,967,295 characters - 4 GB
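For the TEST table from the question above, a minimal sketch of that change:
ALTER TABLE TEST MODIFY TESTcol TEXT NULL;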
Try checking the limits of your SQL database. Maybe you're exceeding the field limit for this row.
I got the same error while using the ImageField in Django.
post_picture = models.ImageField(upload_to='home2/khamulat/mydomain.com/static/assets/images/uploads/blog/%Y/%m/%d', height_field=None, default=None, width_field=None, max_length=None)
I just removed the excess path from the code shown above, leaving post_picture = models.ImageField(upload_to='images/uploads/blog/%Y/%m/%d', height_field=None, default=None, width_field=None, max_length=None), and the error was gone.
I got this error after creating my table structure first with a primary key set, then trying to upload a CSV file. My CSV file had information in the primary key column. It was an export from another SQL server. No matter how I tried to export and import, it wouldn't work.
What I did to solve it was to drop my primary key column in my db and my CSV, upload, then add my primary key column back.
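A rough sketch of that workflow; the table and column names here are made up for illustration, and it assumes the key can simply be regenerated as an auto-increment after the load:
ALTER TABLE mytable DROP COLUMN id;  -- drop the primary key column (also removed from the CSV)
LOAD DATA INFILE '/path/to/export.csv' INTO TABLE mytable
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
ALTER TABLE mytable ADD COLUMN id INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;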
Although the answers above suggest updating the my.ini file, I feel it would be better to alter the column type to TEXT or LONGTEXT, so that longer values can be stored.
Go to your models and check, because you might have limited the length for that particular column, e.g. max_length=150.
I have the following table:
CREATE TABLE `tmp_table` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`t` bit(1) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=6 DEFAULT CHARSET=latin1$$
And an xml file called "data.xml" that contains 1 line:
<list><row t="0" /></list>
When I run the following command:
LOAD XML LOCAL INFILE 'c:/temp/data.xml' INTO TABLE `tmp_table`
After running this command I get one row with a value of "1" for column t and a warning:
LOAD XML LOCAL INFILE 'c:/temp/data.xml' INTO TABLE `tmp_table`
1 row(s) affected, 1 warning(s):
1264 Out of range value for column 't' at row 1
Records: 1 Deleted: 0 Skipped: 0 Warnings: 1 0.000 sec
How can I load a 0 into a BIT field from an XML document?
The MySQL documentation suggests the following:
BIT values cannot be loaded using binary notation (for example, b'011010'). To work around this, specify the values as regular integers and use the SET clause to convert them so that MySQL performs a numeric type conversion and loads them into the BIT column properly:
http://dev.mysql.com/doc/refman/5.5/en/load-data.html
I have tried this query:
LOAD XML LOCAL INFILE 'data.xml' INTO TABLE `tmp_table`
ROWS IDENTIFIED BY '<row>'
(@var1)
SET t = CAST(@var1 AS SIGNED);
...and I got a strange warning message: 'Column 't' cannot be null'.
Hope this works for you; otherwise, I think you should file a report at bugs.mysql.com.
LOAD XML INFILE seems to be not that good at importing data from arbitrary XML.
I have a blog post about using LOAD DATA INFILE to import from XML. Since the approach uses a user-defined variable to hold the group, you can add an additional function to cast the value; a rough sketch is below.
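A rough sketch of that kind of statement against the data.xml from the earlier question; the delimiters and the XPath here are my assumptions about the general approach, not a copy of the post:
LOAD DATA LOCAL INFILE 'c:/temp/data.xml' INTO TABLE `tmp_table`
LINES STARTING BY '<row' TERMINATED BY '/>'
(@row)
SET t = CAST(ExtractValue(CONCAT('<row', @row, '/>'), '/row/@t') AS UNSIGNED);
-- @row holds the attribute text between '<row' and '/>';
-- ExtractValue pulls the t attribute out of the rebuilt fragment.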
Alternatively, you can try to export data from MySQL in XML, look at how it represents bit values, and adjust your XML with XSLT before loading.
My post was actually inspired by the question: LOAD XML LOCAL INFILE with Inconsistent Column Names