currval in sqlldr does not work for multiple rows - sql-loader

I am loading data into multiple tables with SQL*Loader using one control file. The sequence value generated for the first table (seq.nextval) becomes input to the second table (seq.currval). This does not work: the last number generated by the sequence becomes CURRVAL for the remaining tables, and that one number gets inserted into all the rows of the second and subsequent tables, unless I specify rows=1 to commit after every row. Is there an alternative that commits after a certain number of rows while still keeping CURRVAL correct?
Control file:
LOAD DATA INFILE '.\DATA_FILES\USERS.csv'
APPEND
INTO TABLE TABLE1
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
trailing NULLCOLS
( USER_ID "SEQ_USER_ID.NEXTVAL",
USER_NAME,
USER_PASSWORD "(select user_password from USERS where USER_NAME='DEMO')",
EXPIRY_DATE "(select SYSDATE+1000 from dual)",
FAILED_ATTEMPTS CONSTANT "0"
)
INTO TABLE TABLE2
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
trailing NULLCOLS
( USER_GROUP_ID "SEQ_USER_GROUP_ID.NEXTVAL",
USER_ID "SEQ_USER_ID.CURRVAL",
GROUP_ID CONSTANT "0"
)
Thanks,
Anan
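As a reference point: the rows=1 workaround described in the question can also be written into the control file itself with an OPTIONS clause, so it does not depend on remembering a command-line flag. This is only a sketch of the existing workaround (it trades load speed for keeping CURRVAL in step), not the asked-for alternative:

```sql
-- Sketch: commit after every input record so that SEQ_USER_ID.CURRVAL in
-- the second INTO TABLE clause still matches the NEXTVAL drawn for the
-- same record. ROWS=1 is the workaround already named in the question.
OPTIONS (ROWS=1)
LOAD DATA INFILE '.\DATA_FILES\USERS.csv'
APPEND
...
```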

Related

SQL*Loader script to load data into multiple tables not working; records are getting discarded

I need help with the SQL*Loader script below. I created it as a test to load data into the 3 tables based on the location and work_type columns, but the data is not getting inserted correctly into the 3 tables. I am not able to understand how to do multiple comparisons on the data in the WHEN clause. Please help.
LOAD DATA
INFILE *
REPLACE
INTO TABLE xx_dumy_test_emp_table
WHEN (WORK_TYPE='EMP') AND (Work_Location = 'IND')
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(
Work_Type
,Employee_Role
,First_name
,Last_Name
,Work_Location
)
INTO TABLE xx_dumy_test_oversea_employess
WHEN (WORK_TYPE='EMP') AND (Work_Location <> 'IND')
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(
Work_Type POSITION(1:3)
,Employee_Role
,First_name
,Last_Name
,Work_Location
)
INTO TABLE xx_dumy_test_mgr_table
WHEN (WORK_TYPE='MGR') AND (Work_Location = 'IND')
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(
Work_Type POSITION(1:3)
,Employee_Role
,First_name
,Last_Name
,Work_Location
)
--DATA OF THE SCRIPT
EMP;Test Ops1;Gautam;Hoshing;IND;
MGR;Test Ops2;Steve;Tyler;IND;
EMP;Test Ops3;Hyana;Motler;JPY;
Regards,
Gautam.
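No answer is included above, but two things are worth checking against the SQL*Loader documentation (hedged guesses, since the exact error is not shown): with INFILE *, the inline records must follow a BEGINDATA marker rather than a comment line, and when one control file has several INTO TABLE clauses over delimited data, each clause should restart record scanning with POSITION so it does not depend on where a previous clause stopped. A sketch of the first clause with that applied:

```sql
-- Sketch only: same table, WHEN condition and field list as the question.
-- POSITION(1:3) restarts scanning at the start of each record, matching
-- the second and third INTO TABLE clauses, which already do this.
INTO TABLE xx_dumy_test_emp_table
WHEN (WORK_TYPE='EMP') AND (Work_Location = 'IND')
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(
Work_Type POSITION(1:3)
,Employee_Role
,First_name
,Last_Name
,Work_Location
)
```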

MySQL LOAD DATA with updated file

I have a query that loads data from a text file into a table, finds the row with the max value in the "date_time" column, and deletes all rows that are less than that max value. But this file will be updated several times a day, and each time only one row (the one with the max value) remains in the table.
LOAD DATA INFILE 'C:/users/user/desktop/download.txt' IGNORE INTO TABLE download1
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\r\n'
IGNORE 3 LINES;
DELETE FROM download1 WHERE date_time < (SELECT * FROM (SELECT MAX(date_time) AS MaxDatetime FROM download1) AS t)
How can I make the previous max-value row also remain in the table when executing the query with an updated file?
Updated based on question edit and comments.
Since the id field in the table is AUTO_INCREMENT, it provides a continuously increasing value. Get the max value of the id field before loading your new file and use that to limit your delete to newer records only:
SET @OLDMAXID = IFNULL((SELECT MAX(id) FROM download1), 0);
LOAD DATA INFILE 'C:/users/user/desktop/download.txt' IGNORE INTO TABLE download1
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\r\n'
IGNORE 3 LINES;
DELETE FROM download1 WHERE date_time < (SELECT * FROM (SELECT MAX(date_time) AS MaxDatetime FROM download1) AS t) AND id > @OLDMAXID;

ignore first two characters on a column while importing csv to mysql

I am trying to import a CSV file into a MySQL table, but I need to remove the first two characters of a particular column before importing.
This is my statement:
string strLoadData = "LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets FIELDS terminated by ',' ENCLOSED BY '\"' lines terminated by '\n' IGNORE 1 LINES (SiteId,DateTime,Serial,DeviceId,AgentAID,VehicleRegistration,CarPark,SpaceNumber,GpsAddress,VehicleType,VehicleMake,VehicleModel,VehicleColour,IssueReasonCode,IssueReason,NoticeLocation,Points,Notes)";
The column IssueReasonCode has data like 'LU12', but I need to remove the first 2 characters so it holds only integers, not alphanumeric values.
I need to remove 'LU' from that column.
Is it possible to write something like left(IssueReasonCode +' '2)? This column is varchar(45) and can't be changed now because of the large amount of data in it.
Thanks
LOAD DATA INFILE has the ability to apply a function to each column's data as it is read in (see the MySQL documentation for LOAD DATA). In your case, to remove the first two characters from the IssueReasonCode column, you could use:
RIGHT(IssueReasonCode, CHAR_LENGTH(IssueReasonCode) - 2)
to remove the first two characters. You specify such column mappings at the end of the LOAD DATA statement using SET. Your statement should look something like the following:
LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets
FIELDS terminated by ','
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(SiteId, DateTime, Serial, DeviceId, AgentAID, VehicleRegistration, CarPark, SpaceNumber,
GpsAddress, VehicleType, VehicleMake, VehicleModel, VehicleColour, IssueReasonCode,
IssueReason, NoticeLocation, Points, Notes)
SET IssueReasonCode = RIGHT(IssueReasonCode, CHAR_LENGTH(IssueReasonCode) - 2)
Referencing the MySQL LOAD DATA documentation and quoting its user-variable example, you can try the below to see if it works:
User variables in the SET clause can be used in several ways. The
following example uses the first input column directly for the value
of t1.column1, and assigns the second input column to a user variable
that is subjected to a division operation before being used for the
value of t1.column2:
LOAD DATA INFILE 'file.txt' INTO TABLE t1 (column1, @var1)
SET column2 = @var1/100;
string strLoadData = "LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets FIELDS terminated by ',' ENCLOSED BY '\"' lines terminated by '\n' IGNORE 1 LINES (SiteId,DateTime,Serial,DeviceId,AgentAID,VehicleRegistration,CarPark,SpaceNumber,GpsAddress,VehicleType,VehicleMake,VehicleModel,VehicleColour,@IRC,IssueReason,NoticeLocation,Points,Notes) SET IssueReasonCode = SUBSTR(@IRC, 3);";

LOAD DATA INFILE - Two Default Columns

I want to load data from a CSV file into a table in my database. However, the first two columns are an ID (AUTO_INCREMENT, PRIMARY KEY) field and a current-date (CURRENT DATE ON UPDATE) field, so the source file will not contain them.
How do I get the file to load? The code below doesn't work. I have also tried leaving id and action_date out of the column list in the brackets.
LOAD DATA INFILE '/upliftments//UPLIFTMENTS_20160901.csv'
INTO TABLE database.return_movements
CHARACTER SET latin1
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
IGNORE 1 LINES
( id
, action_date
, current_code
, serial_number
, new_code
);
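No answer is included above, but the usual pattern is to name only the columns that are actually present in the file, so the server fills id from AUTO_INCREMENT and action_date from its column default. This is only a sketch (the question says leaving the two columns out was already tried, so if this still fails, the exact error message would be needed to go further):

```sql
-- Sketch: list only the CSV's columns; id and action_date are then
-- populated by the server. Path, table and column names are taken
-- from the question as-is.
LOAD DATA INFILE '/upliftments//UPLIFTMENTS_20160901.csv'
INTO TABLE database.return_movements
CHARACTER SET latin1
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
IGNORE 1 LINES
( current_code
, serial_number
, new_code
);
```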

sqlldr - how to load multiple CSVs into multiple tables

1st question: I am trying to load test1.csv, test2.csv and test3.csv into table1, table2 and table3 respectively using SQLLDR. Please bear with my lack of knowledge in this area. I couldn't get this quite right when defining it in the .ctl file; all I could think of is the code below, but it is not correct. So my question is: how can I make this right, and is it even possible?
OPTIONS (SKIP=1)
LOAD DATA
INFILE 'test1.csv'
INFILE 'test2.csv'
INFILE 'test3.csv'
TRUNCATE
INTO TABLE table1
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
Col1 "TRIM(:Col1)",
Col2 "TRIM(:Col2)"
)
INTO TABLE table2
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
Colx "TRIM(:Colx)",
Coly "TRIM(:Coly)"
)
INTO TABLE table3
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
Colp "TRIM(:Colp)",
Colq "TRIM(:Colq)"
)
2nd question: This is an alternative to the first question. Since I couldn't figure out the first one, I split the load for each table into multiple .ctl files and call all three in a .bat file. This works at least, but my question is: is there a way to process these 3 .ctl files in one session without giving user/password 3 times, as below?
sqlldr userid=user/pass#server control=test1.ctl
sqlldr userid=user/pass#server control=test2.ctl
sqlldr userid=user/pass#server control=test3.ctl
If there is a field in the data that indicates which table each record is intended for, you can do something like this using multiple INTO TABLE ... WHEN statements. Let's say the first field is that indicator and it will not be loaded (define it as a FILLER so sqlldr will ignore it):
...
INTO TABLE table1
WHEN (01) = 'TBL1'
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
rec_skip filler POSITION(1),
Col1 "TRIM(:Col1)",
Col2 "TRIM(:Col2)"
)
INTO TABLE table2
WHEN (01) = 'TBL2'
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
rec_skip filler POSITION(1),
Colx "TRIM(:Colx)",
Coly "TRIM(:Coly)"
)
INTO TABLE table3
WHEN (01) = 'TBL3'
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
rec_skip filler POSITION(1),
Colp "TRIM(:Colp)",
Colq "TRIM(:Colq)"
)
So logically, each INTO TABLE ... WHEN section handles one file's records. It's not that flexible, though, and arguably harder to maintain; for ease of maintenance you may just want a control file per data file. If all files and tables have the same layout, you could also load them all into one staging table (keeping the indicator column) for ease of loading, then split them out programmatically into the separate tables afterwards. The pros of that method are faster, easier loading and more control over the splitting step. Just some other ideas; I have used each method, and it depends on the requirements and what you are able to change.
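For the 2nd question (repeating user/pass three times), one option worth sketching: sqlldr can read its command-line parameters from a parameter file via parfile=, so the credentials live in one place. The file name params.par is just an example, and note that the password then sits in plain text in that file, so protect its permissions:

```shell
# params.par contains a single line:  userid=user/pass@server
sqlldr parfile=params.par control=test1.ctl
sqlldr parfile=params.par control=test2.ctl
sqlldr parfile=params.par control=test3.ctl
```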