Something weird is happening when I try to insert data: data will not insert - mysql

I have a problem here. I have about 13K rows of data that I want to put into a MySQL database. What I notice is that if the row is "short" the insert happens, but if it is long, it fails. I have been trying to figure this out all day and I'm about to give up.
Here is the table structure.
CREATE TABLE `concordance` (
`term` text NOT NULL,
`ref` text NOT NULL,
`id` int(11) NOT NULL auto_increment,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
Here is a row that will go in just fine.
"Abelmeholah"*"JDG 7:22 KI1 4:12 19:16"
Here is a large row that fails.
"A"*"GEN 6:16 30:11 EXO 4:2,26 12:45 28:34 30:2 38:26 39:26 LEV 20:27 21:14 25:11 NUM 28:21,29 29:10 DEU 2:21 8:8-9 9:2 11:12,27 22:30 23:2 26:5 28:50 JDG 9:54 13:6 KI2 11:5 CH2 23:4 PSA 22:30 37:16 68:5 74:5 91:7 92:1,6 97:3 101:4 107:34 112:5 PRO 1:5 6:12,17,19,33 9:13 10:1 11:1,13,16 12:2-4,8,10,14,16,23 13:1-2,5,17,22 14:5-6,16,25,30 15:1,4-5,12-13,18,20,23 16:9-11,28-29 17:2,4,8,10,17-18,22-23,25 18:2,6-7,16,19-20,24 19:5,9,13,19,24 20:8,26 21:14,22,28-29 22:1,3 24:5 25:11,18,26 26:3,28 27:3,12,15 28:3,17,20 29:5,11,19,23 30:30-31 ECC 3:2-8 6:2 7:1 10:2,14,19 SOL 1:13 4:12,15 ISA 8:12 21:2,8 27:2 42:3 60:22 62:12 65:3 66:6 JER 2:24 3:21 4:11 5:30 11:9,16 17:12 20:15 25:31,36 31:15,22 48:3 50:22,35-38 51:54 EZE 5:12 17:3 19:2 21:9 36:26 DAN 7:10 JOE 2:2-3 AMO 7:8 8:2 HAB 3:1 ZEP 1:16 MAL 1:6 MAT 5:14 7:18 11:7-9 12:20,35 13:57 16:4 21:28 MAR 6:4 12:1 LUK 2:24,32 6:45 7:24-26 8:5 10:30 13:6 14:16 15:11 19:12 20:9 JOH 1:42 3:27 9:11 13:34 16:16-19,21 19:36 ACT 3:22 7:37 10:2 11:5 CO1 7:15 GAL 5:9 TI1 3:2 TIT 3:10 HEB 8:2,13 JAM 1:8 REV 6:6"
Any help is greatly appreciated.
Here is what I am using to get the data into the database:
LOAD DATA LOCAL INFILE '/data.txt'
INTO TABLE concordance
FIELDS TERMINATED BY '*'
I just tried loading the large dataset above and here is the error:
mysql> LOAD DATA INFILE '/tmp/data.txt' INTO TABLE concordance FIELDS TERMINATED BY '*';
ERROR 1261 (01000): Row 1 doesn't contain data for all columns
HERE IS THE CODE THAT WORKS:
LOAD DATA INFILE '/tmp/f.txt'
INTO TABLE `concordance`
FIELDS TERMINATED BY '*'
ENCLOSED BY '"'
(`term` , `ref`);
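For context, here is a minimal Python sketch (an illustration of the splitting, not MySQL internals) of why the (`term`, `ref`) column list is what fixes the load:

```python
# Splitting a data line on '*' yields only two fields, while the table
# has three columns (term, ref, id). Without an explicit (term, ref)
# list, MySQL expects one value per column and raises ERROR 1261.
line = '"Abelmeholah"*"JDG 7:22 KI1 4:12 19:16"'
fields = [f.strip('"') for f in line.split('*')]  # strip('"') mimics ENCLOSED BY '"'
print(len(fields))   # 2 values for a 3-column table
print(fields[0])     # Abelmeholah
```

With the column list, the auto_increment `id` is simply left for MySQL to fill in.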

Your table creation query is wrong:
#1072 - Key column 'seq' doesn't exist in table
Other than that, provided your query is ".... PRIMARY KEY (id) .... ", I can insert both lines of data without any problem. Maybe you should post your code.
Edit:
Try the following query:
LOAD DATA INFILE '/data.txt' INTO TABLE `concordance`
FIELDS TERMINATED BY '*' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(`term`, `ref`);

this command worked for me:
LOAD DATA LOCAL INFILE 'C:\\xampp\\tmp\\php3BF.tmp' INTO TABLE concordance FIELDS TERMINATED BY '*' ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\r\n'
Maybe you should provide a value for the id column too? Or there is some other error on line 1 of your file...

You might also want to add a star at the end of each line, in order to tell MySQL that you want to skip the id field:
"Abelmeholah"*"JDG 7:22 KI1 4:12 19:16"*
It is strange, though, that the short line goes in whereas the long line does not.

This worked fine for me on a Windows system. Have you tried escaping the commas in your text?

Avoid splitting data on commas (within quotation marks) in SQL?

I am trying to load a .csv file into a MySQL table, but I'm running into the following error message: "ERROR 1262 (01000): Row 304 was truncated; it contained more data than there were input columns."
Here is the code I'm using:
LOAD DATA INFILE 'my_taxpayers.csv'
INTO TABLE taxpayers
FIELDS ENCLOSED BY '"'
TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
I've pasted Row 304, the first row that causes a problem, below. The corresponding column headers for the row are
PARCEL, TAXPAYER, ADDRESS_1, ADDRESS_2, CITY, STATE, ZIP, COUNTRY, ROLE_PERCENTAGE and IN_CARE_OF.
00.000366,"BUNNEY, GARY LEE",40 E TRENT,,SPOKANE,WA,99202,,100.0,"CAROL LUNDY, 1ST AMERICAN"
It appears the data is getting split (which I don't want) when it runs into "CAROL LUNDY, 1ST AMERICAN" and maybe "BUNNEY, GARY LEE". I thought the ENCLOSED BY '"' was supposed to solve that, so I'm stuck.
For more context, here are a few more random rows of the .csv file that DIDN'T cause any issues.
00.000102,ANITA'S DAPPER DOGS,727 E 32ND AVE,,SPOKANE,WA,99203,,100.0,ANITA ANN SYKES
00.000103,"MKY INVESTMENTS, INC",9508 N DIVISION ST,,SPOKANE,WA,99218,,100.0,MARK & KAREN YOO
00.000104,COUNTY RECORD'S PUBLISHING CO,503 E ERMINA AVE,,SPOKANE,WA,99207,,100.0,JILL MARIE BUSWELL
00.000105,DANCAR MECHANICAL INC,311 N HODGES RD,,SPOKANE VALLEY,WA,99016,,100.0,DANIEL F SCHROER
00.000106,ARTHUR ARMS ADULT FAMILY HOME,652 S ARTHUR ST,,SPOKANE,WA,99202,,100.0,ANTHONY R JONES
The problem was I needed to be doing
LINES TERMINATED BY '\r\n'
Adding in the \r fixes everything.
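A small Python sketch of why the \r matters (plain string splitting as a stand-in for the loader): on a Windows-format file, terminating lines on '\n' alone leaves a stray \r glued to the end of the last field, so the closing quote no longer ends the field cleanly and the embedded commas get mis-handled.

```python
# A line with Windows-style \r\n endings, like the problem file.
windows_line = 'WA,99202,,100.0,"CAROL LUNDY, 1ST AMERICAN"\r\n'

# LINES TERMINATED BY '\n': the row text keeps a stray \r at the end,
# so the last field does not end at its closing quote.
row_lf = windows_line.split('\n')[0]
print(repr(row_lf[-2:]))     # '"\r' -- the quote is not the last character

# LINES TERMINATED BY '\r\n': the row ends exactly at the closing quote.
row_crlf = windows_line.split('\r\n')[0]
print(repr(row_crlf[-1]))    # '"'
```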

Importing NULL values from CSV in MySQL

I have a CSV file that contains the following fields and sample data:
year, month, type, port_name, #_people
2010, Feb, air, test, 100
2010, Feb, air, test, \N
2010, Feb, air, test, 100
No matter what I seem to try, the server throws an "incorrect integer value" error when I execute the following query:
LOAD DATA INFILE "../uploads/data3.csv"
INTO TABLE data3
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(`year`, `month`, `type`, port_name, `#_people`);
I thought this might have something to do with lines terminated by '\n' and '\N' being read as NULL on import. I tried adding a test column:
year, month, type, port_name, #_people, test
2010, Feb, air, test, 100, 1
2010, Feb, air, test, \N, 1
2010, Feb, air, test, 100, 1
When I execute the query below, I get no errors and simply need to drop the test column.
LOAD DATA INFILE "../uploads/data3.csv"
INTO TABLE data3
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(`year`, `month`, `type`, port_name, `#_people`);
How do I get the server to properly import a CSV that has '\N' values in the last column before the new line?
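One likely mechanism, consistent with the test-column workaround above and with the \r\n issues elsewhere on this page (this is my assumption, not something stated in the question), is Windows line endings: with LINES TERMINATED BY '\n', the marker in the last column arrives as '\N\r' rather than the bare '\N' that MySQL converts to NULL. A quick Python sketch of the effect:

```python
# With Windows \r\n endings but LINES TERMINATED BY '\n', the last
# column of each row keeps the carriage return, so MySQL sees '\N\r'
# instead of '\N' -- hence the "incorrect integer value" error.
line = '2010, Feb, air, test, \\N\r\n'

last_lf = line.split('\n')[0].split(',')[-1].strip(' ')
print(repr(last_lf))     # '\\N\r' -- not recognised as NULL

last_crlf = line.split('\r\n')[0].split(',')[-1].strip(' ')
print(repr(last_crlf))   # '\\N' -- recognised as NULL
```

That also explains why appending a test column "fixed" it: the \N was no longer the last field on the line, so it no longer absorbed the stray \r.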

MYSQL: LOAD DATA INFILE ... INTO TABLE ...; DATETIME field becomes 0000-00-00 00:00:00

I have a text file S_data.txt with four columns; a typical row looks like:
13 Kate 2.138 8/13/2001 13:24:33 (columns are separated by tabs)
I want to load the data into a table s_table with four fields: S_id MEDIUMINT, S_name VARCHAR(20), S_value DOUBLE, S_dt DATETIME.
mysql> LOAD DATA LOCAL INFILE 'C:\\temp\\S_data.txt' INTO TABLE s_table
LINES TERMINATED BY '\r\n'
(S_id, S_name, S_value, @S_dt)
SET S_dt = STR_TO_DATE(@S_dt, '%m/%d/%y %H:%i:%s');
The values of S_dt all become 0000-00-00 00:00:00.
Can someone help? Thanks.
I found the problem. For my datetime string (e.g. 8/13/2001 13:23:56), I have to use the format '%c/%d/%Y %H:%i:%S'. Thanks everyone.
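The same pitfall exists in Python's strptime, which makes a quick way to sanity-check a format string outside MySQL: %Y matches a four-digit year, while %y only matches two digits and fails on '2001'.

```python
from datetime import datetime

ts = '8/13/2001 13:23:56'

# Four-digit year: %Y parses the string fine (unpadded month is accepted).
ok = datetime.strptime(ts, '%m/%d/%Y %H:%M:%S')
print(ok)   # 2001-08-13 13:23:56

# Two-digit year: %y cannot match '2001', so the parse fails.
try:
    datetime.strptime(ts, '%m/%d/%y %H:%M:%S')
except ValueError:
    print('%y rejects a four-digit year')
```

MySQL's specifiers differ in spelling (%i for minutes, %c for unpadded month), but the %Y/%y distinction is the same, which is why the original '%m/%d/%y' silently produced 0000-00-00 00:00:00.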

SQL*Loader - data not uploaded to the table; all went to .bad

I tried to upload some records into my table ABC. None of the records went through; they all showed up in the .bad log.
I am pretty new to sqlldr and not quite sure where I messed up. Let me show you the steps I took.
First, I created an empty table called ABC.
create table abc
(
location_id varchar2(10),
sold_month date,
item_name varchar2(30),
company_id varchar2(10),
qty_sold number(10),
total_revenue number(14,3),
promotional_code varchar2(10)
);
Here is my flat file abcflat.dat. The columns correspond to the columns in the table above.
"1000","02/01/1957","Washing Machine","200011","10","10000","ABCDE"
"1000","05/02/2013","Computer","200012","5","5000","ABCDE"
"1000","05/01/2013","Bolt","200010","100","500","ABCDE"
"1000","05/03/2013","Coca Cola","200011","1000","1000","ABCDE"
Here is my control file abc.ctl
LOAD DATA
INFILE 'C:\Users\Public\abcflat.dat'
INTO TABLE ABC
FIELDS TERMINATED BY ","
enclosed by '"'
(
Location_ID
, Sold_month
, item_name
, Company_id
, QTY_Sold
, Total_revenue
, Promotional_Code
)
And my last step
sqlldr hr/open#xe control=c:\users\public\abc.ctl
It says
Commit point reached - logical record count 3
Commit point reached - logical record count 4
but none of the records showed up in my ABC table.
Thank You
It's most probably the date format; try this:
LOAD DATA
INFILE 'C:\Users\Public\abcflat.dat'
INTO TABLE ABC
FIELDS TERMINATED BY ","
enclosed by '"'
(
Location_ID
, Sold_month DATE "DD/MM/YYYY"
, item_name
, Company_id
, QTY_Sold
, Total_revenue
, Promotional_Code
)
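The DATE "DD/MM/YYYY" mask tells SQL*Loader how to convert each Sold_month string; without it, the conversion typically falls back to the session's default date format, fails on strings like 02/01/1957, and the whole row is rejected into the .bad file. As a quick sanity check on the mask itself, here is the day-first layout applied to the flat file's dates with Python's strptime standing in for the Oracle mask (note 02/01/1957 is ambiguous between day-first and month-first, so confirm which the source system intended):

```python
from datetime import datetime

dates = ['02/01/1957', '05/02/2013', '05/01/2013', '05/03/2013']

# 'DD/MM/YYYY' in the control file corresponds to '%d/%m/%Y' here.
parsed = [datetime.strptime(d, '%d/%m/%Y') for d in dates]
print(parsed[0].date())   # 1957-01-02 under a day-first reading
```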

BulkInsert into table with Identity column (T-SQL)

1) Can I do a BulkInsert from a CSV file, into a table, such that the table has an identity column that is not in the CSV, and gets automatically assigned?
2) Is there any rule that says the table I'm BulkInsert'ing into has to have the same columns in the same order as the flat file being read?
This is what I'm trying to do. Too many fields to include everything...
BULK INSERT ServicerStageACS
FROM 'C:\POC\DataFiles\ACSDemo1.csv'
WITH (FORMATFILE = 'C:\POC\DataFiles\ACSDemo1.Fmt');
GO
SELECT * FROM ServicerStageACS;
Error:
Msg 4864, Level 16, State 1, Line 3 Bulk load data conversion error
(type mismatch or invalid character for the specified codepage) for
row 1, column 1 (rowID).
I'm pretty sure the error is because I have an identity.
FMT starts like this:
9.0
4
1 SQLCHAR 0 7 "," 1 Month SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 100 "," 2 Client SQL_Latin1_General_CP1_CI_AS
A co-worker recommended that it was easier to do the bulk insert into a view. The view does not contain the identity field or any other field not to be loaded.
truncate table ServicerStageACS
go
BULK INSERT VW_BulkInsert_ServicerStageACS
FROM 'C:\POC\DataFiles\ACSDemo1.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
SELECT * FROM ServicerStageACS;