I am trying to unload data from an internal Informix table into an external file using the stored procedure below. The test table is created and populated as follows:
create table table_name(
column1 int,
column2 int
);
insert into table_name(column1, column2) values(1, 1);
insert into table_name(column1, column2) values(2, 2);
insert into table_name(column1, column2) values(3, 3);
=====================================================
create procedure spunloaddata(p_unload_filename varchar(128))
returning
int as num_recs ;
DEFINE l_set SMALLINT;
DEFINE l_statusCode int;
DEFINE l_exec_string lvarchar(4000);
DEFINE l_unique_id INT8;
DEFINE l_num_recs smallint;
ON EXCEPTION
SET l_set
IF (l_set = -535) THEN -- already in TRANSACTION
ELIF (l_set = -244) THEN -- row locked
RETURN l_set;
ELIF ((l_set <> -958) AND (l_set <> -310 ))THEN -- temp table already exists
RETURN -1;
END IF
END EXCEPTION WITH RESUME;
TRACE ON;
-- TRACE OFF;
SET LOCK MODE TO WAIT 30;
SET ISOLATION TO DIRTY READ;
LET l_num_recs = 0;
LET l_unique_id = MOD(DBINFO("sessionid"), 100000) * 100000 + DBINFO('UTC_CURRENT');
-- get all EMAIL notifications
LET l_exec_string = 'SELECT column1, column2 '
|| ' FROM table_name '
|| ' INTO EXTERNAL ext_temp_EMAILnotifications' || l_unique_id
|| ' USING (DATAFILES("DISK:'
|| TRIM(p_unload_filename)
|| '"))';
TRACE l_exec_string;
EXECUTE IMMEDIATE l_exec_string;
LET l_statusCode = SQLCODE;
LET l_num_recs = DBINFO('sqlca.sqlerrd2');
IF(l_statusCode <> 0) THEN
LET l_num_recs = l_statusCode;
END IF
BEGIN
ON EXCEPTION IN (-206) END EXCEPTION WITH RESUME;
LET l_exec_string = 'DROP TABLE ext_temp_EMAILnotifications' || l_unique_id;
EXECUTE IMMEDIATE l_exec_string;
END
RETURN l_num_recs ;
END PROCEDURE;
===========================================================================
execute stored proc
===========================================================================
dbaccess "databasename"<<!
execute procedure spunloaddata("/tmp/foo.unl");
!
The external tables are not getting dropped, and my database is filling up with them. The stored procedure hits a -206 error (the specified table is not in the database) at the DROP TABLE statement, even though I can see the table being created.
When I run dbaccess databasename and go into Tables and Info, I see all the ext_temp_<uniqueid> tables listed. When I try to drop these tables manually through dbaccess, I get the same -206 error.
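A catalog query along these lines (a sketch; unquoted Informix identifiers are normally stored in lowercase, and tabtype 'E' should mark external tables) shows how the names were actually recorded:

-- Sketch: list catalog entries for the external temp tables
SELECT tabname, tabid, tabtype
  FROM systables
 WHERE tabname LIKE 'ext_temp_emailnotifications%';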
Any help is appreciated.
I have a stored procedure (let's call it MY_NEW_SP) in which I'm not using BEGIN TRY / BEGIN CATCH, but when I execute this SP (MY_NEW_SP) I get the following error:
Msg 266, Level 16, State 2, Procedure <MY_NEW_SP>, Line 132
Transaction count after EXECUTE indicates a mismatching number of BEGIN and COMMIT statements. Previous count = 0, current count = 1.
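As I understand it, this message is raised whenever a procedure returns with a different @@TRANCOUNT than it entered with; a minimal illustration (a hypothetical procedure, not part of my code):

-- Entering with @@TRANCOUNT = 0 and returning with 1 triggers Msg 266
CREATE PROCEDURE dbo.TranCountMismatchDemo
AS
BEGIN
    BEGIN TRANSACTION;   -- @@TRANCOUNT: 0 -> 1
    RETURN;              -- no COMMIT/ROLLBACK before returning
END;
GO
EXEC dbo.TranCountMismatchDemo;   -- raises Msg 266 and leaves a transaction open
ROLLBACK TRANSACTION;             -- clean up the dangling transaction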
This new stored procedure basically performs one big SELECT; it makes no transactions in the sense of DML operations (INSERT, DELETE, UPDATE) on permanent tables, only on temp tables (i.e. #tmp).
I'm thinking this transaction error is due to my using SET XACT_ABORT ON; in other stored procedures, but I'm not sure.
I followed what is described here: C. Using TRY...CATCH with XACT_STATE.
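The pattern shown there looks roughly like this (my sketch of it, with a placeholder table):

SET XACT_ABORT ON;
BEGIN TRY
    BEGIN TRANSACTION;
    UPDATE SomeTable SET Name = 'SAMPLE' WHERE Id = 1;  -- placeholder work
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- XACT_STATE() = -1: the transaction is doomed and can only be rolled back
    IF XACT_STATE() = -1
        ROLLBACK TRANSACTION;
    -- XACT_STATE() = 1: the transaction is still active and committable
    IF XACT_STATE() = 1
        COMMIT TRANSACTION;
END CATCH;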
The basic structure of the stored procedure that uses SET XACT_ABORT ON; is as follows:
IF NOT EXISTS (SELECT * FROM sysobjects WHERE TYPE = 'P' AND NAME = 'PROCEP_NEW_SP')
BEGIN
    EXEC dbo.sp_executesql @statement = N'CREATE PROCEDURE PROCEP_NEW_SP AS'
END
GO
ALTER PROCEDURE PROCEP_NEW_SP
(
    @ID_TABLE INT
)
AS
BEGIN
    DECLARE @TBL_CONSECUTIVE TABLE ( LOG_CONSECUTIVE INT );
    SET XACT_ABORT ON;
    BEGIN TRANSACTION
    BEGIN TRY
        IF ISNULL(@ID_TABLE, -1) = -1
        BEGIN
            SET @ID_TABLE = 1;
            DELETE FROM @TBL_CONSECUTIVE;
            INSERT INTO T_BH_LOG_TABLE (ASO_NCODE, CHA_NCODE, TSO_NCODE,
                MSO_DACTION_DATE, MSO_CRESULT, MSO_CCAUSE_FAILURE)
            OUTPUT INSERTED.MSO_NCODE INTO @TBL_CONSECUTIVE
            SELECT @ASO_NCODE, ISNULL(@CHA_NCODE, 1), ISNULL(@TSO_NCODE, 1),
                GETDATE() AS MSO_DACTION_DATE, @CST_FAIL_OR_SUC, @CST_GENERIC_MSG;
            IF (XACT_STATE()) = 1
            BEGIN
                COMMIT TRANSACTION;
            END
            SELECT NULL Id_table, 'Failed' Result_process, 'Parameter (ID_TABLE) is required.' Result_process_message;
            RETURN;
        END
        -- Operation:
        UPDATE MY_TABLE
        SET NAME = 'SAMPLE'
        WHERE ID_TABLE = @ID_TABLE;
        IF (XACT_STATE()) = 1
        BEGIN
            COMMIT TRANSACTION;
        END
    END TRY
    BEGIN CATCH
        ROLLBACK TRANSACTION;
        INSERT INTO T_BH_LOG_TABLE (ASO_NCODE, CHA_NCODE, TSO_NCODE,
            MSO_DACTION_DATE, MSO_CRESULT, MSO_CCAUSE_FAILURE)
        OUTPUT INSERTED.MSO_NCODE INTO @TBL_CONSECUTIVE
        SELECT 1 AS ASO_NCODE, 1, 1 AS TSO_NCODE,
            GETDATE() AS MSO_DACTION_DATE, @CST_FAIL_OR_SUC, @CST_GENERIC_MSG;
        SELECT NULL Id_table, 'Failed' Result_process, 'Internal error. See log # (' + CAST(L.LOG_CONSECUTIVE AS NVARCHAR) + ') for more details.' Result_process_message
        FROM @TBL_CONSECUTIVE L;
        RETURN;
    END CATCH
END;
I really don't know whether using SET XACT_ABORT ON; is what's causing this kind of error.
Can anyone point me in the right direction to solve this issue?
I am relatively new to MySQL stored procedures. I have a stored procedure that works fine under certain conditions and not under others, and I'm a bit confused about what causes the error. It is an entity-processing SP: based on certain values and conditions it either creates or updates an entity from data in a staging table.
Conditions in which it works fine:
When I only process a single entity.
Bulk entries, when there is nothing in the entity table.
It doesn't work when:
There are entries in the entity table and bulk processing is done - the check for an already existing entity_id in entity goes wrong, and an older entity_id is assigned where a new one should be created.
(I actually need the above scenario to work more often than the others.)
I have tried to keep the code to a minimum in order to show the flow of the SP. Please consider all variables as declared; the SP might not compile as posted.
CREATE DEFINER=`admin`@`%` PROCEDURE `sp_ent`(test_id int)
BEGIN
-- Move code tables to temp tables
-- Declare all the required variables here
DECLARE counter, len INT;
DECLARE var2 INT;
DECLARE var3 TINYINT(1);
DECLARE var4 DECIMAL(18,4);
DECLARE var5 DATE;
DECLARE var6 VARCHAR(1000);
DECLARE var7 VARCHAR(5000);
DROP TEMPORARY TABLE IF EXISTS `temp_ent`;
IF test_id IS NOT NULL THEN
CREATE TEMPORARY TABLE IF NOT EXISTS `temp_ent` AS (SELECT * FROM `ent_st` WHERE processed = 0 and id=test_id);
ELSE
CREATE TEMPORARY TABLE IF NOT EXISTS `temp_ent` AS (SELECT * FROM `ent_st` WHERE processed = 0);
END IF;
ALTER TABLE `temp_ent` ADD PRIMARY KEY(id);
-- SELECT * FROM `temp_ent`;
DROP TEMPORARY TABLE IF EXISTS `temp_exc`; CREATE TEMPORARY TABLE IF NOT EXISTS `temp_exc` AS (SELECT * FROM `code_exc`);
-- A few more like above
SET counter=1, len=(SELECT COUNT(*) FROM `temp_ent`);
WHILE counter <= len DO
SELECT `id`,var1, var2, var3
INTO v_id,var1, var2, var3
FROM `temp_ent` LIMIT 1;-- WHERE `id` = v_id;
BEGIN
DECLARE insufficient_information CONDITION FOR SQLSTATE '45000';
DECLARE CONTINUE HANDLER FOR insufficient_information SET v_proccessed=1;
SET v_status = CASE
WHEN ... THEN ...
ELSE 'valid'
END;
IF v_status <> 'valid' THEN SIGNAL insufficient_information; END IF;
SELECT `entity_id`,`entity`
INTO v_underlying_entity_id,v_underlying_entity_symbol
FROM `entity` WHERE `entity_id` = v_underlying OR `entity` = v_underlying_entity;
SET v_underlying_entity_symbol = COALESCE(v_underlying_entity_symbol,v_underlying_entity);
SET v_entity = (
CASE
WHEN ... THEN ...
WHEN ... THEN ...
.
.
WHEN ... THEN ...
END
);
SELECT `entity_id`,`entity`
INTO v_entity_id_check,v_entity_check
FROM entity WHERE `entity`=v_entity and `exc`=v_exc;
-- SELECT v_entity_id_check,v_entity_check,v_entity,v_exc;
IF v_entity_check IS NULL THEN
-- SELECT 'Create New Entity and Add';
SET v_entity_id = COALESCE((SELECT MAX(`entity_id`) FROM entity),0) + 1;
SET new_entity = 1;
ELSE
-- SELECT 'Entity Already Present';
SET v_entity_id = v_entity_id_check;
SET new_entity = 0;
END IF;
-- SELECT v_entity_id, v_entity, new_entity;
SET v_name = UPPER(COALESCE(v_name,v_entity));
-- UPDATE entity and underlying/derivatives table
IF new_entity = 1 THEN
-- Insert in respective tables
IF ... THEN
-- SELECT 'Entity Added',v_entity; -- Underlying Entity
INSERT INTO `entity_underlying` (`entity_id`,`entity`,`name`,`exc`,`seg`,`ins`,`isin`,`fo_yn`,`tick`,`lot_size`,`active_yn`)
VALUES (v_entity_id,v_entity,v_name,v_exc,v_seg,v_ins,COALESCE(v_isin,''),COALESCE(v_fo_yn,0),COALESCE(v_tick,0.05),COALESCE(v_lot_size,1),1);
ELSEIF ... THEN
-- SELECT 'Entity Added',v_entity; -- Derivative Entity
INSERT INTO `entity_derivatives` (`entity_id`,`entity`,`name`,`underlying`,`exc`,`seg`,`ins`,`ser`,`isin`,`strike`,`tick`,`lot_size`,`expiry`,`ex_ty`,`active_yn`)
VALUES (v_entity_id,v_entity,v_name,COALESCE(v_underlying_entity_id,-1),v_exc,v_seg,v_ins,COALESCE(v_ser,''),v_isin,COALESCE(v_strike),COALESCE(v_tick,0.05),COALESCE(v_lot_size,-1),v_expiry,v_ex_ty,1);
END IF;
-- Insert in final table
INSERT INTO entity(`entity_id`,`entity`,`exc`,`active_yn`)
VALUES (v_entity_id,v_entity,v_exc,1);
SET v_status='Entity Added';
ELSE
SET v_status='Not a New Entity';
END IF;
END;
UPDATE `ent_st` SET `entity_id`=v_entity_id,`processed`=1,`status`=v_status WHERE id=v_id;
DELETE FROM `temp_ent` WHERE id=v_id;
SET counter = counter +1;
COMMIT;
END WHILE;
END
Any help is highly appreciated. Thanks in advance.
This is driving me bananas. I'm not a MySQL guru by any stretch. My goal is to add a large number of columns to a table. I've tried this several ways and the procedure chokes on the DECLARE @FooA NVARCHAR(MAX); line. No clue as to why.
I appreciate any pointers...
USE mydatabase;
DELIMITER $$
DROP PROCEDURE IF EXISTS RepeatLoopProc$$
CREATE PROCEDURE RepeatLoopProc()
BEGIN
DECLARE x INT;
DECLARE sn VARCHAR(30);
DECLARE dr VARCHAR(48);
DECLARE @FooA NVARCHAR(MAX);
SET x = 0;
WHILE (x <= 150) DO
SET sn = CONCAT('drivesn_', x);
SET dr = CONCAT('driveinf_', x);
SET x = x + 1;
SET @FooA = 'ALTER TABLE DRIVE_MASTER ADD ' + sn + ' VARCHAR(30), ADD ' + dr + ' VARCHAR(48)';
EXEC sp_executesql @FooA;
END WHILE;
END$$
DELIMITER ;
When I do this I get:
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '@FooA NVARCHAR(MAX);
My forehead is getting flat from slamming it into my desk.
The ultimate goal is adding columns drivesn_0, driveinf_0, drivesn_1, driveinf_1, and so on, all the way out to drivesn_150 and driveinf_150, with types VARCHAR(30) and VARCHAR(48) respectively.
@variables are not DECLAREd, and declared variables' identifiers do not start with @.
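A minimal illustration of the difference between the two kinds of variables (delimiter handling omitted for brevity):

CREATE PROCEDURE demo_vars()
BEGIN
    DECLARE local_var INT DEFAULT 0;  -- DECLAREd local variable: no prefix, exists only inside the routine
    SET local_var = local_var + 1;
    SET @user_var = local_var;        -- @ user variable: never DECLAREd, lives for the whole session
END;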
Also, ALTER statements typically recreate a table behind the scenes (equivalent to something like CREATE TABLE newversion... INSERT INTO newversion SELECT * FROM oldversion ... DROP TABLE oldversion ... RENAME newversion). So you'd be much better off building up a single ALTER statement within the loop, and executing it only once.
Example:
...
SET @FooA = 'ALTER TABLE DRIVE_MASTER';
SET x = 0;
WHILE (x <= 150) DO
SET sn = CONCAT('drivesn_', x);
SET dr = CONCAT('driveinf_', x);
SET @FooA = CONCAT(@FooA
, CASE WHEN x != 0 THEN ', ' ELSE '' END
, 'ADD ', sn, ' VARCHAR(30), ADD ', dr, ' VARCHAR(48)'
);
SET x = x + 1;
END WHILE;
PREPARE stmt FROM @FooA;   -- MySQL's prepared-statement syntax (sp_executesql is SQL Server)
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
...
... but what Barmar said in the comments is good advice: you should probably just have another table, something like DRIVE_MASTER_DETAILS(x int, sn VARCHAR(30), dr VARCHAR(48))
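A rough sketch of that suggestion (names are illustrative; DRIVE_MASTER is assumed to have an integer key such as host_id):

-- One row per (host, drive slot) instead of 302 columns on DRIVE_MASTER
CREATE TABLE DRIVE_MASTER_DETAILS (
    host_id INT NOT NULL,      -- key of the owning DRIVE_MASTER row
    slot    INT NOT NULL,      -- 0..150, the old column suffix
    sn      VARCHAR(30),       -- was drivesn_<slot>
    dr      VARCHAR(48),       -- was driveinf_<slot>
    PRIMARY KEY (host_id, slot)
);
-- All drives for one host:
-- SELECT slot, sn, dr FROM DRIVE_MASTER_DETAILS WHERE host_id = 42 ORDER BY slot;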
I already have multiple tables. Basically I am using this to catalog drive serial numbers in hosts; a host can have up to 150 drives. Other tables contain network interface information (MAC addresses, etc.), all tied together by a common index value. For a system with 150 disk drives I cannot see another way other than 150 columns. Either that or I am missing a fundamental concept.
I want to write integration/automated test cases using Selenium against a live website, say testing.example.com. This site is the staging site of example.com.
When I run the test cases, new data is created, updated, and deleted. After the test suite completes, I want to restore the database to the state it was in before the test cases ran.
So, for example, the state of the database before running the test cases --> s1
and the state of the database after running the test cases --> s2
I want the database to go back to state s1.
I am using the Rails framework and a MySQL/PostgreSQL database.
One solution could be taking a dump of the database before running the test cases and then restoring it after the run completes.
What other solutions could there be?
Thanks
I did exactly that solution until the database got too big and the procedure took too long.
So I changed it to a kind of rollback.
It runs as part of the setup in the main class that is called by all test cases:
def run_once_beginning
DEBUG("start","Running start function")
DefaultDatabase.prepare_rollback
DefaultDatabase.resetDB
end
def run_once_end
DEBUG("start","Running end function")
DefaultDatabase.drop_all_triggers
DefaultDatabase.resetDB
end
def setup
if(State.instance.started == false)
State.instance.started = true
run_once_beginning
at_exit do
run_once_end
end
end
end
DefaultDatabase.rb
def prepare_rollback
query = "CREATE TABLE IF NOT EXISTS `changes` (`table_name` varchar(60) NOT NULL, PRIMARY KEY (`table_name`))"
HandleDB.sendQuery(query)
query = "show tables;"
tables = HandleDB.sendQuery(query)
exclude = ["phpsessions", "cameras", "changes", "connections", "sysevents"]
triggers = "";
tables.each{ |name|
if (exclude.index(name) == nil)
triggers += "DROP TRIGGER IF EXISTS ins_#{name};"
triggers += "CREATE TRIGGER ins_#{name} AFTER INSERT ON #{name}
FOR EACH ROW BEGIN
INSERT IGNORE INTO `changes` VALUES ('#{name}');
END;"
triggers += "DROP TRIGGER IF EXISTS up_#{name};"
triggers += "CREATE TRIGGER up_#{name} AFTER UPDATE ON #{name}
FOR EACH ROW BEGIN
INSERT IGNORE INTO `changes` VALUES ('#{name}');
END;"
triggers += "DROP TRIGGER IF EXISTS del_#{name};"
triggers += "CREATE TRIGGER del_#{name} AFTER DELETE ON #{name}
FOR EACH ROW BEGIN
INSERT IGNORE INTO `changes` VALUES ('#{name}');
END;"
end
}
setup_connecion = Mysql.new(Constants::DB["ServerAddress"],"root", Constants::DB["Password"],
Constants::DB["Database"],Constants::DB["ServerPort"], nil, Mysql::CLIENT_MULTI_STATEMENTS)
setup_connecion.query(triggers)
# Clear out all the results.
while setup_connecion.more_results
setup_connecion.next_result
end
query = "DROP PROCEDURE IF EXISTS RestoreProc;"
setup_connecion.query(query)
# This is a MySQL stored procedure. It uses a cursor to extract the names of
# changed tables from the changes table, then truncates those tables and
# repopulates them with data from the default db.
query = "CREATE PROCEDURE RestoreProc()
BEGIN
DECLARE changed_name VARCHAR(60);
DECLARE no_more_changes INT DEFAULT 0;
DECLARE tables_cursor CURSOR FOR SELECT table_name FROM changes;
DECLARE CONTINUE HANDLER FOR NOT FOUND SET no_more_changes = 1;
OPEN tables_cursor;
FETCH tables_cursor INTO changed_name;
IF changed_name IS NOT NULL THEN
REPEAT
SET @sql_text=CONCAT('TRUNCATE ', changed_name);
PREPARE stmt FROM @sql_text;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
SET @sql_text=CONCAT('INSERT INTO ', changed_name, ' SELECT * FROM default_db.', changed_name);
PREPARE stmt FROM @sql_text;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
FETCH tables_cursor INTO changed_name;
UNTIL no_more_changes = 1
END REPEAT;
END IF;
CLOSE tables_cursor;
TRUNCATE changes;
END ;"
setup_connecion.query(query)
setup_connecion.close
end
def drop_all_triggers
query = "show tables;"
tables = HandleDB.sendQuery(query)
triggers = ""
tables.each{ |name|
triggers += "DROP TRIGGER IF EXISTS del_#{name};"
triggers += "DROP TRIGGER IF EXISTS up_#{name};"
triggers += "DROP TRIGGER IF EXISTS ins_#{name};"
}
triggers_connection = Mysql.new(Constants::DB["ServerAddress"],"root", Constants::DB["Password"],
Constants::DB["Database"],Constants::DB["ServerPort"], nil, Mysql::CLIENT_MULTI_STATEMENTS)
triggers_connection.query(triggers)
triggers_connection.close
end
def resetDB
restore_connection = Mysql.new(Constants::DB["ServerAddress"],Constants::DB["User"], Constants::DB["Password"],
Constants::DB["Database"], Constants::DB["ServerPort"], nil, Mysql::CLIENT_MULTI_RESULTS)
restore_connection.query("CALL RestoreProc();")
restore_connection.close
end
For this to work, a copy of the original database in the state you want it to return to is needed, named "default_db".
"Exclude" lists tables that will not be changed, so no triggers are created for them.
This is my Perl code:
my $dbc = DBI->connect('DBI:mysql:test', "entcfg", "entcfg") || die "Could not connect to database: $DBI::errstr";
$dbc->{TraceLevel} = "2"; #debug mode
$dbc->{AutoCommit} = 0; #enable transactions, if possible
$dbc->{RaiseError} = 1; #raise database errors
###sql commands
my $particle_value = $dbc->prepare('CALL particle_test_value(?,?,?,?)');
my $particle_name = $dbc->prepare('CALL particle_test_name(?,?,?,?)');
my $table_test = $dbc->prepare('CALL table_test(?,?,?)');
sub actionMessage {
my ($sh,$msgobj) = @_;
my @result;
my $return_ID;
eval {
$table_test->execute(undef,"value","value"); #new item
$return_ID = $table_test->fetchrow_array(); #get new row id
};
if ($@) {
warn $@; # print the error
}
}
The MySQL stored procedure is as follows:
CREATE DEFINER=`root`@`localhost` PROCEDURE `table_test`(
v_id INT,
v_name VARCHAR(255),
v_value VARCHAR(255)
)
BEGIN
INSERT INTO test (name,value) VALUES (v_name,v_value);
SELECT LAST_INSERT_ID();
END
If I put $dbc->commit; after the execute or the fetchrow_array, I get a Commands out of sync error.
If I remove the AutoCommit line, the code works, but I can't use transactions.
If I try to change AutoCommit during the sub, I get this error: Turning off AutoCommit failed.
Any help would be much appreciated.
You can't extract values from stored procedures like that.
Make table_test a function:
CREATE DEFINER=`root`@`localhost` FUNCTION `table_test`(
v_name VARCHAR(255),
v_value VARCHAR(255)
) RETURNS integer
BEGIN
INSERT INTO test (name,value) VALUES (v_name,v_value);
RETURN LAST_INSERT_ID();
END //
and have $table_test use it like a function:
my $table_test = $dbc->prepare('SELECT table_test(?,?)');
edit: MySQL stored procedures can actually return results - the result of a SELECT statement inside the procedure is sent back to the SQL client. You have found a bug in DBD::mysql. The above works as a workaround.
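A quick sanity check from any SQL client (assuming the test table from the question exists):

-- Inserts a row and returns its AUTO_INCREMENT id
SELECT table_test('some name', 'some value') AS new_id;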