I know this isn't ideal, but is it possible to populate a temp table from dynamic SQL?
A very similar example to what I want to achieve is shown in this answer as either
SELECT into #T1 execute ('execute ' + @SQLString)
or
INSERT into #T1 execute ('execute ' + @SQLString)
I couldn't get either to work. It seems from the edit that the first option was wrong, so for the second option I tried something like:
DECLARE @SQLString VARCHAR(2000) = 'SELECT * FROM INFORMATION_SCHEMA.COLUMNS'
INSERT INTO #MyTempTable EXECUTE (@SQLString)
Any ideas are much appreciated.
Edit:
In an attempt to clarify what I am trying to do without being too localised, I explain as briefly as I can below.
I have data in a staging area of my database that contains tables with dynamic names and a dynamic number of columns. However, a few of the column names are the same for each table. Rather than construct everything in dynamic SQL, I would like to be able to simply extract the known columns into a temp table (or table variable, CTE, derived table or whatever) and act on that.
So, given a table like this:
CREATE TABLE SomeParticularNameThatCantBeKnownToAStoredProc (
[1] VARCHAR(100),
[2] VARCHAR(100),
... -- Could be any number of these columns
[Id] INT,
[KnownCol] VARCHAR(100),
[KnownCol2] VARCHAR(100),
....
[DboId] INT
)
I'd like to be able to perform the necessary operations to allow me to process this data without having to do it all in dynamic SQL. I was hoping to be able to do something like:
DECLARE @TableName AS VARCHAR(1000) = 'SomeParticularNameThatCantBeKnownToAStoredProc'
SELECT [Id], [KnownCol], [KnownCol2], [DboId]
INTO #KnownName
FROM @TableName -- I know this isn't possible, but this is what I'd like to do
This would then allow me to perform SQL statements against a consistent #KnownName. Some of the other operations I need to do are quite lengthy such as using the data to relate to other existing tables, copying data from the staging table(s) to their dbo schema equivalents and matching the DboId against the staging table Id using MERGE with OUTPUT INTO as described here, and so on and so forth.
If you can think of any other way I can limit the amount of dynamic SQL I need to write given the fact that the table name is dynamic then please let me know.
Assuming #MyTempTable already exists:
DECLARE @sql NVARCHAR(MAX);
SET @sql = N'SELECT * FROM INFORMATION_SCHEMA.COLUMNS;';
INSERT #MyTempTable EXEC sp_executesql @sql;
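For completeness, here is a minimal end-to-end sketch of that case; the three-column #MyTempTable definition is just an assumption for illustration, and it has to match whatever the dynamic query actually returns:

CREATE TABLE #MyTempTable
(
    TABLE_NAME  NVARCHAR(128),
    COLUMN_NAME NVARCHAR(128),
    DATA_TYPE   NVARCHAR(128)
);

DECLARE @sql NVARCHAR(MAX);
SET @sql = N'SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE FROM INFORMATION_SCHEMA.COLUMNS;';

-- INSERT ... EXEC streams the result of the dynamic batch into the existing table.
INSERT #MyTempTable EXEC sp_executesql @sql;

SELECT * FROM #MyTempTable;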
Otherwise please clarify what you are trying to do. If the table isn't already created, you can do everything inside of dynamic SQL, e.g.:
DECLARE @sql NVARCHAR(MAX);
SET @sql = N'SELECT * INTO #MyTempTable FROM INFORMATION_SCHEMA.COLUMNS;
SELECT * FROM #MyTempTable;';
EXEC sp_executesql @sql;
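Applied to the scenario in your edit, the same pattern lets everything after the load be static SQL. A sketch, assuming the known columns have the types shown in your example table; only the table name is injected, via QUOTENAME:

DECLARE @TableName SYSNAME = N'SomeParticularNameThatCantBeKnownToAStoredProc';

-- Create the temp table for the known columns outside the dynamic batch,
-- so it remains visible afterwards.
CREATE TABLE #KnownName
(
    [Id]        INT,
    [KnownCol]  VARCHAR(100),
    [KnownCol2] VARCHAR(100),
    [DboId]     INT
);

DECLARE @sql NVARCHAR(MAX) = N'SELECT [Id], [KnownCol], [KnownCol2], [DboId] FROM '
                           + QUOTENAME(@TableName) + N';';

INSERT #KnownName EXEC sp_executesql @sql;

-- From here on, ordinary static SQL against #KnownName (joins, MERGE, etc.).
SELECT * FROM #KnownName;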
Related
I have a procedure that updates a table. However, I need to back up the table each time before updating it, and the only way I can back up the table via a procedure is by creating a table and inserting all the information from one table into another.
So what I need to do is create a table name with a value appended to distinguish the different backup tables. The ideal solution would be "New_Table_Name_TIMESTAMP", i.e. append a timestamp to a string.
My question is how to create a table with a timestamp added to its name, e.g.
New_table_name_201412301044
I have tried the following:
DECLARE new_table varchar(100) DEFAULT '';
SET new_table = CONCAT('WORKFLOW_BU_', client_id_to_update, '_', unix_timestamp() );
CREATE TABLE data_import.new_table LIKE development.inventory_engine;
INSERT INTO data_import.new_table
SELECT * FROM development.inventory_engine;
but it creates a table literally named "new_table" rather than using the value of the variable.
Thanks
You're going to have to use so-called dynamic SQL, also known as "server-side prepared statements", to do this. Ordinary SQL prohibits the use of variables for the names of tables or columns.
See here.
http://dev.mysql.com/doc/refman/5.0/en/sql-syntax-prepared-statements.html
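For example, a minimal sketch using the names from your own attempt (this assumes it runs inside the stored procedure where client_id_to_update is in scope, and that your MySQL version allows CREATE TABLE ... LIKE inside a prepared statement):

SET @new_table = CONCAT('data_import.WORKFLOW_BU_', client_id_to_update, '_', UNIX_TIMESTAMP());

-- Build and run the CREATE TABLE dynamically.
SET @sql = CONCAT('CREATE TABLE ', @new_table, ' LIKE development.inventory_engine');
PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

-- Copy the data the same way.
SET @sql = CONCAT('INSERT INTO ', @new_table, ' SELECT * FROM development.inventory_engine');
PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;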
It sounds like a bad idea, but you could do it with PREPARE statement FROM @sql and EXECUTE statement, like in this sqlFiddle.
Sample table:
CREATE table yourTable(id int auto_increment primary key,value varchar(50));
INSERT into yourtable(value) values ('test1'),('test2'),('test3'),('test4');
Create a table with a timestamp ending (in the example I am only recording down to the hour; you can add _%i_%s to the format string if you want it down to the seconds):
SET @TimeStamp = DATE_FORMAT(NOW(),'%Y_%m_%d_%H');
SET @sql = CONCAT('CREATE table yourTable',@TimeStamp,'(id int auto_increment primary key,value varchar(50))');
PREPARE statement FROM @sql;
EXECUTE statement;
SET @sql = CONCAT('INSERT INTO yourTable',@TimeStamp,'(id,value) SELECT id,value FROM yourTable');
PREPARE statement FROM @sql;
EXECUTE statement;
I have a table MstAttributes and another called TrnIndexAttributes. I want to create as many columns in the TrnIndexAttributes table as there are rows in MstAttributes. That is, after a value is inserted into MstAttributes, a column should be created in the TrnIndexAttributes table, like ID1, ID2, ID3, ...
If you really feel like doing that (despite the rightful warning of Philip Kelley), you'll have to use a dynamic query.
DECLARE @Query NVARCHAR(MAX) = N'ALTER TABLE TableOfInfiniteDoom ADD ' + [your logic for a name] + ' ' + [your logic for a type]
EXEC sp_executesql @Query
In your trigger (don't forget the loop if you handle multi-row DML).
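A hedged sketch of what such a trigger might look like; the trigger name, the INT column type, and the way the next IDn number is derived are all assumptions:

CREATE TRIGGER trg_MstAttributes_AddColumn
ON MstAttributes
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @i INT = 1, @rows INT, @existing INT, @Query NVARCHAR(MAX);

    SELECT @rows = COUNT(*) FROM inserted;

    -- Count the IDn columns TrnIndexAttributes already has, so numbering continues.
    SELECT @existing = COUNT(*)
    FROM sys.columns
    WHERE object_id = OBJECT_ID('dbo.TrnIndexAttributes')
      AND name LIKE 'ID[0-9]%';

    -- One ALTER TABLE per inserted row (the loop matters for multi-row INSERTs).
    WHILE @i <= @rows
    BEGIN
        SET @Query = N'ALTER TABLE dbo.TrnIndexAttributes ADD '
                   + QUOTENAME('ID' + CAST(@existing + @i AS VARCHAR(10)))
                   + N' INT NULL;';
        EXEC sp_executesql @Query;
        SET @i = @i + 1;
    END
END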
I want to know how to create a dynamic table in MySQL. I have used dynamic tables in SQL Server 2008, but I am new to MySQL. Is it possible?
E.g. in SQL Server I have created a dynamic customer table:
DECLARE @tblCustomer as table(
[ ] bit
,Sl# int
,custID int
,CustCode varchar(max)
,Customer nvarchar(max)
,Authorized bit
,RCount int)
SELECT * FROM @tblCustomer
Please Help
SET @sqlstmt = 'whatever sql';
PREPARE st FROM @sqlstmt;
EXECUTE st;
DEALLOCATE PREPARE st;
Place the CREATE TABLE statement in @sqlstmt and you are good to go!
The table is a real one. You would have to drop the table afterwards.
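Putting the pieces together for the customer table from the question; the column list comes from the question, the varchar sizes are assumptions, and note this creates an ordinary table, not a table variable:

SET @sqlstmt = 'CREATE TABLE tblCustomer (
                  custID int,
                  CustCode varchar(255),
                  Customer varchar(255),
                  Authorized bit,
                  RCount int)';
PREPARE st FROM @sqlstmt;
EXECUTE st;
DEALLOCATE PREPARE st;

SELECT * FROM tblCustomer;

-- Drop it once you are done with it.
DROP TABLE tblCustomer;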
Pretty easy to do:
CREATE TABLE tblCustomerCopy AS
SELECT * FROM tblCustomer;
(Here tblCustomerCopy is whatever name you want the new table to have.) It will take the existing field types from the source schema where possible.
How do I drop multiple tables from a single database with one command?
something like,
> use test;
> drop table a,b,c;
where a,b,c are the tables from database test.
We can use the following syntax to drop multiple tables:
DROP TABLE IF EXISTS B,C,A;
This can be placed at the beginning of the script instead of dropping each table individually.
SET foreign_key_checks = 0;
DROP TABLE IF EXISTS a,b,c;
SET foreign_key_checks = 1;
Then you do not have to worry about dropping them in the correct order, nor whether they actually exist.
N.B. this is for MySQL only (as in the question). Other databases likely have different methods for doing this.
A lazy way of doing this if there are a lot of tables to be deleted:
Get the table names using the queries below.
For SQL Server: SELECT CONCAT(name, ',') AS Table_Name FROM sys.tables;
For Oracle: SELECT CONCAT(TABLE_NAME, ',') FROM SYS.ALL_TABLES;
Copy the table names from the result set and paste them after the DROP TABLE command (see the example below).
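For instance, if the result set comes back as a, b, c (each followed by a comma), the pasted list, minus the final comma, becomes:

DROP TABLE a, b, c;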
declare @sql1 nvarchar(max)
SELECT @sql1 =
STUFF(
(
select ' drop table dbo.[' + name + ']'
FROM sys.sysobjects AS sobjects
WHERE (xtype = 'U') AND (name LIKE 'GROUP_BASE_NEW_WORK_%')
for xml path('')
),
1, 1, '')
execute sp_executesql @sql1
I have a Table Type defined in a database. It is used as a table-valued parameter in a stored procedure. I would like to call this procedure from another database, and in order to pass the parameter, I need to reference this defined type.
But when I do DECLARE @table dbOtherDatabase.dbo.TypeName, it tells me that "The type name 'dbOtherDatabase.dbo.TypeName' contains more than the maximum number of prefixes. The maximum is 1."
How could I reference this table type?
Cross-database user-defined types seem to work only for CLR-based types. See this forum and MSDN (plus comments).
You could try using sp_executesql:
DECLARE @mylist integer_list_tbltype,
@sql nvarchar(MAX)
SELECT @sql = N'SELECT p.ProductID, p.ProductName
FROM Northwind..Products p
WHERE p.ProductID IN (SELECT n FROM @prodids)'
INSERT @mylist VALUES(9),(12),(27),(37)
EXEC sp_executesql @sql, N'@prodids integer_list_tbltype READONLY', @mylist
and if that doesn't work you may have to create a wrapper procedure in the remote DB, where you pass in a CSV string and the wrapper procedure splits it and fills the table (now using the local table type) to then pass into the actual procedure. See this answer for an explanation of how to split a CSV string.
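A hedged sketch of such a wrapper, assuming SQL Server 2016+ so STRING_SPLIT is available (the procedure names GetProductsByCsv and GetProducts are made up for illustration; on older versions substitute the splitter from the linked answer):

CREATE PROCEDURE dbo.GetProductsByCsv
    @ProdIdCsv NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;

    -- Rebuild the table-valued parameter locally, where the type is known.
    DECLARE @prodids integer_list_tbltype;

    INSERT @prodids (n)
    SELECT CAST(value AS INT)
    FROM STRING_SPLIT(@ProdIdCsv, ',');

    -- Hand the locally-typed table over to the real procedure.
    EXEC dbo.GetProducts @prodids;
END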
Can you not just define the type in both databases?
Edit
See this article on how to do what you require
It has been a while since this thread was active, but I was trying to do the same thing and was annoyed by the same restriction.
Don't declare a @table variable; instead use a #temp table, which can be referenced from the EXEC string in which the database is switched to the one that has the table type.
e.g.
use dbA
create type ThisTableRecord as table (id int, value varchar(max))
go
create procedure ThisTableSave
@ThisTable ThisTableRecord readonly
AS
begin
select * from @ThisTable
end
go
use dbB
go
create procedure ThatTableSave
as begin
create table #thatTable (id int, value varchar(max))
insert into #thatTable
values (1, 'killing')
, (2, 'joke')
, (3, 'the')
, (4, 'damned')
exec ('
use dbA
declare @thisTable ThisTableRecord
insert into @thisTable select * from #thatTable
exec ThisTableSave @thisTable
')
end
go
exec ThatTableSave