MySQL: LOAD DATA INFILE ... INTO TABLE ...; DATETIME field becomes 0000-00-00 00:00:00

I have a text file S_data.txt with four columns and a typical row is like:
13 Kate 2.138 8/13/2001 13:24:33 (columns are separated by tab)
I want to load the data into a table s_table with four fields: S_id MEDIUMINT, S_name VARCHAR(20), S_value DOUBLE, S_dt DATETIME.
mysql> LOAD DATA LOCAL INFILE 'C:\\temp\\S_data.txt' INTO TABLE s_table
LINES TERMINATED BY '\r\n' (S_id, S_name, S_value, @S_dt)
SET S_dt = STR_TO_DATE(@S_dt, '%m/%d/%y %H:%i:%s');
The values of S_dt all become 0000-00-00 00:00:00.
Can someone help? Thanks.

I found the problem. For my datetime string (e.g. 8/13/2001 13:23:56), I have to use the format '%c/%d/%Y %H:%i:%s': lowercase %y expects a two-digit year, so the four-digit years in my file failed to parse and the column fell back to the zero date. Thanks everyone.
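For reference, the full working statement would then look like this (a sketch; it assumes the file's columns arrive in the same order as the table definition above, separated by the default tab):

LOAD DATA LOCAL INFILE 'C:\\temp\\S_data.txt' INTO TABLE s_table
LINES TERMINATED BY '\r\n'
(S_id, S_name, S_value, @S_dt)
SET S_dt = STR_TO_DATE(@S_dt, '%c/%d/%Y %H:%i:%s');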

Related

How can I get missing values recorded as NULL when importing from CSV

I have multiple, large, csv files, each of which has missing values in many places. When I import the csv file into SQLite, I would like to have the missing values recorded as NULL for the reason that another application expects missing data to be indicated by NULL. My current method does not produce the desired result.
An example CSV file (test.csv) is:
12|gamma|17|delta
67||19|zeta
96|eta||theta
98|iota|29|
The first line is complete; each of the other lines has (or is meant to show!) a single missing item. When I import using:
.headers on
.mode column
.nullvalue NULL
CREATE TABLE t (
id1 INTEGER PRIMARY KEY,
a1 TEXT,
n1 INTEGER,
a2 TEXT
);
.import test.csv t
SELECT
id1, typeof(id1),
a1, typeof(a1),
n1, typeof(n1),
a2, typeof(a2)
FROM t;
the result is
id1  typeof(id1)  a1     typeof(a1)  n1  typeof(n1)  a2     typeof(a2)
---  -----------  -----  ----------  --  ----------  -----  ----------
12   integer      gamma  text        17  integer     delta  text
67   integer             text        19  integer     zeta   text
96   integer      eta    text            text        theta  text
98   integer      iota   text        29  integer            text
so the missing values have become text. I would appreciate some guidance on how to ensure that all missing values become NULL.
sqlite3 imports values as text and there does not seem to be a way to make it treat empty values as nulls.
However, you can update the tables yourself after import, setting empty strings to nulls, like
UPDATE t SET a1=NULL WHERE a1='';
Repeat for each column.
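Alternatively, NULLIF lets a single statement cover all three columns of the example table at once:

UPDATE t SET a1 = NULLIF(a1, ''), n1 = NULLIF(n1, ''), a2 = NULLIF(a2, '');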
You can also create a trigger for such updates:
CREATE TRIGGER trig_a1 AFTER INSERT ON t WHEN new.a1='' BEGIN
UPDATE t SET a1=NULL WHERE rowid=new.rowid;
END;
For the cases where you cannot update after import, because the import will fail when the empty string (for text columns) or 0 (for integer columns) is inserted instead of NULL, see my answer to this other Stack Overflow question.

SQL*Loader - data not uploaded to the table; all went to .bad

I tried to upload some records into my table ABC. None of the records went through; they all showed up in the .bad log.
I am pretty new to sqlldr and not quite sure where I messed up. Let me show you the steps I took.
First, I created an empty table called ABC.
create table abc
(
location_id varchar2(10),
sold_month date,
item_name varchar2(30),
company_id varchar2(10),
qty_sold number(10),
total_revenue number(14,3),
promotional_code varchar2(10)
);
Here is my flat file abcflat.dat. The columns correspond to the columns in the table above.
"1000","02/01/1957","Washing Machine","200011","10","10000","ABCDE"
"1000","05/02/2013","Computer","200012","5","5000","ABCDE"
"1000","05/01/2013","Bolt","200010","100","500","ABCDE"
"1000","05/03/2013","Coca Cola","200011","1000","1000","ABCDE"
Here is my control file abc.ctl
LOAD DATA
INFILE 'C:\Users\Public\abcflat.dat'
INTO TABLE ABC
FIELDS TERMINATED BY ","
enclosed by '"'
(
Location_ID
, Sold_month
, item_name
, Company_id
, QTY_Sold
, Total_revenue
, Promotional_Code
)
And my last step
sqlldr hr/open@xe control=c:\users\public\abc.ctl
It says
Commit point reached - logical record count 3
Commit point reached - logical record count 4
but none of the records showed up in my ABC table.
Thank You
It's most probably the date format; try this:
LOAD DATA
INFILE 'C:\Users\Public\abcflat.dat'
INTO TABLE ABC
FIELDS TERMINATED BY ","
enclosed by '"'
(
Location_ID
, Sold_month DATE "DD/MM/YYYY"
, item_name
, Company_id
, QTY_Sold
, Total_revenue
, Promotional_Code
)
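Equivalently, SQL*Loader can apply a SQL expression to a field, so the same conversion (with the same assumed DD/MM/YYYY reading of the data) can be written explicitly:

, Sold_month "TO_DATE(:Sold_month, 'DD/MM/YYYY')"

Either way, the point is that the DATE column needs to be told how the incoming string is laid out.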

Which MySQL type should be used to store a constant "YYYYMMDD" format?

The case:
I need to store dates in a constant format that never changes: "YYYYMMDD".
I get the values in a text file, like 20120201, 20110101, etc., and they should go into a table field.
(I want to store them as dates for fast queries.)
I think MySQL directly supports this with the DATE type.
The table:
id | mydate
---+----------
 1 | 20120201
 2 | 20110101
Thanks
To turn it into a MySQL date that can be stored in a DATE type field:
$original = 20120924;
$mysql_format = substr($original, 0, 4) . '-' . substr($original, 4, 2) . '-' . substr($original, 6, 2);
Hope that helps.
(Optionally, you could store it as an INT in timestamp format for fast results; just turn $mysql_format into a timestamp with strtotime().)
Store it as a MySQL DATE type. When you need to query the info without dashes, use the MySQL REPLACE function, like:
REPLACE(date_column, '-', '') as 'const_date'
To convert const_date into the MySQL date format upon insert/update, use MySQL CONCAT with SUBSTR, like:
CONCAT(
SUBSTR(const_date, 1, 4)
, '-'
, SUBSTR(const_date, 5, 2)
, '-'
, SUBSTR(const_date, 7, 2)
) as 'mysql_date'
(Unlike PHP, MySQL does not use a zero index for the first character position; SUBSTR positions start at 1.)
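For what it's worth, MySQL can also do both conversions natively, which avoids the string surgery (a sketch; the table name t is assumed, the columns follow the example above):

-- parse the 8-digit form into a DATE on the way in
INSERT INTO t (id, mydate) VALUES (1, STR_TO_DATE('20120201', '%Y%m%d'));
-- produce the compact form on the way out
SELECT DATE_FORMAT(mydate, '%Y%m%d') FROM t;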

BulkInsert into table with Identity column (T-SQL)

1) Can I do a BulkInsert from a CSV file, into a table, such that the table has an identity column that is not in the CSV, and gets automatically assigned?
2) Is there any rule that says the table I'm BULK INSERTing into has to have the same columns, in the same order, as the flat file being read?
This is what I'm trying to do. Too many fields to include everything...
BULK INSERT ServicerStageACS
FROM 'C:\POC\DataFiles\ACSDemo1.csv'
WITH (FORMATFILE = 'C:\POC\DataFiles\ACSDemo1.Fmt');
GO
SELECT * FROM ServicerStageACS;
Error:
Msg 4864, Level 16, State 1, Line 3 Bulk load data conversion error
(type mismatch or invalid character for the specified codepage) for
row 1, column 1 (rowID).
I'm pretty sure the error is because I have an identity.
The FMT file starts like this:
9.0
4
1 SQLCHAR 0 7 "," 1 Month SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 100 "," 2 Client SQL_Latin1_General_CP1_CI_AS
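That diagnosis fits: the format file maps host field 1 to server column 1, which is the identity rowID, so the first CSV value is being forced into the identity. A format file can skip a table column by simply not mapping anything to it, so renumbering the server columns by one should let the identity assign itself (a sketch based on the two lines shown; the remaining host fields would shift the same way). This also answers (2): the format file is exactly the mechanism for mapping file fields onto table columns in any order.

9.0
4
1 SQLCHAR 0 7 "," 2 Month SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 100 "," 3 Client SQL_Latin1_General_CP1_CI_AS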
A co-worker recommended that it was easier to do the bulk insert into a view. The view does not contain the identity field, or any other field not to be loaded.
truncate table ServicerStageACS
go
BULK INSERT VW_BulkInsert_ServicerStageACS
FROM 'C:\POC\DataFiles\ACSDemo1.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
SELECT * FROM ServicerStageACS;
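For reference, the view referred to above would be defined along these lines (a sketch: beyond Month and Client the column list is unknown here, so the rest is elided; the point is simply that rowID is omitted):

CREATE VIEW VW_BulkInsert_ServicerStageACS AS
SELECT Month, Client /* , ...remaining non-identity columns... */
FROM ServicerStageACS;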

Something weird is happening when I try to insert data. Data will not insert

I have a problem here. I have about 13K rows of data that I want to put into a MySQL database. What I notice is that if the row is "short" the insert will happen, but if it is long, it fails. I have been trying to figure this out all day and I'm about to give up.
Here is the table structure.
CREATE TABLE `concordance` (
`term` text NOT NULL,
`ref` text NOT NULL,
`id` int(11) NOT NULL auto_increment,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
Here is a row that will go in just fine.
"Abelmeholah"*"JDG 7:22 KI1 4:12 19:16"
Here is a large row that fails.
"A"*"GEN 6:16 30:11 EXO 4:2,26 12:45 28:34 30:2 38:26 39:26 LEV 20:27 21:14 25:11 NUM 28:21,29 29:10 DEU 2:21 8:8-9 9:2 11:12,27 22:30 23:2 26:5 28:50 JDG 9:54 13:6 KI2 11:5 CH2 23:4 PSA 22:30 37:16 68:5 74:5 91:7 92:1,6 97:3 101:4 107:34 112:5 PRO 1:5 6:12,17,19,33 9:13 10:1 11:1,13,16 12:2-4,8,10,14,16,23 13:1-2,5,17,22 14:5-6,16,25,30 15:1,4-5,12-13,18,20,23 16:9-11,28-29 17:2,4,8,10,17-18,22-23,25 18:2,6-7,16,19-20,24 19:5,9,13,19,24 20:8,26 21:14,22,28-29 22:1,3 24:5 25:11,18,26 26:3,28 27:3,12,15 28:3,17,20 29:5,11,19,23 30:30-31 ECC 3:2-8 6:2 7:1 10:2,14,19 SOL 1:13 4:12,15 ISA 8:12 21:2,8 27:2 42:3 60:22 62:12 65:3 66:6 JER 2:24 3:21 4:11 5:30 11:9,16 17:12 20:15 25:31,36 31:15,22 48:3 50:22,35-38 51:54 EZE 5:12 17:3 19:2 21:9 36:26 DAN 7:10 JOE 2:2-3 AMO 7:8 8:2 HAB 3:1 ZEP 1:16 MAL 1:6 MAT 5:14 7:18 11:7-9 12:20,35 13:57 16:4 21:28 MAR 6:4 12:1 LUK 2:24,32 6:45 7:24-26 8:5 10:30 13:6 14:16 15:11 19:12 20:9 JOH 1:42 3:27 9:11 13:34 16:16-19,21 19:36 ACT 3:22 7:37 10:2 11:5 CO1 7:15 GAL 5:9 TI1 3:2 TIT 3:10 HEB 8:2,13 JAM 1:8 REV 6:6"
Any help is greatly appreciated.
Here is what I am using to get the data into the db:
LOAD DATA LOCAL INFILE '/data.txt'
INTO TABLE concordance
FIELDS TERMINATED BY '*'
I just tried loading the large dataset above and here is the error:
mysql> LOAD DATA INFILE '/tmp/data.txt' INTO TABLE concordance FIELDS TERMINATED BY '*';
ERROR 1261 (01000): Row 1 doesn't contain data for all columns
Here is the code that works (listing (`term`, `ref`) explicitly means each line no longer has to supply a value for the auto-increment `id`):
LOAD DATA INFILE '/tmp/f.txt'
INTO TABLE `concordance`
FIELDS TERMINATED BY '*'
ENCLOSED BY '"'
(`term` , `ref`);
Your table creation query is wrong:
#1072 - Key column 'seq' doesn't exist in table
Other than that, provided your query is ".... PRIMARY KEY (id) .... ", I can insert both lines of data without any problem. Maybe you should post your code.
Edit:
Try the following query:
LOAD DATA INFILE '/data.txt' INTO TABLE `concordance` FIELDS TERMINATED BY '*'
ENCLOSED BY '"' LINES TERMINATED BY '\r\n'(
`term` , `ref`
)
This command worked for me:
LOAD DATA LOCAL INFILE 'C:\xampp\tmp\php3BF.tmp' INTO TABLE concordance FIELDS TERMINATED BY '*' ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\r\n'
Maybe you should provide a value for the id column too? Or there is some other error on line 1 of your file...
You might also want to add a star at the end of each line, in order to tell MySQL that you want to skip the id field...
"Abelmeholah"*"JDG 7:22 KI1 4:12 19:16"*
It is strange, though, that the short line goes in whereas the long line does not...
This worked fine for me on a Windows system. Have you tried escaping the commas in your text?