I have a problem using a temp table as an OLE DB Source in SSIS.
I create the temp table in an Execute T-SQL Statement Task, and I have set DelayValidation: True and RetainSameConnection: True, but the problem is not solved.
Background
What's likely happening here is that the temp table does not exist at the moment SSIS looks for it. Temporary tables come in two variants: local and global.
A local temporary table uses a name with a single sharp/pound/hash/octothorpe sign prepended to it, e.g. #TEMP. The only connection that can use that instance of the temporary table is the one that created it. This is why the advice across the internet says you need to set RetainSameConnection to true: to ensure the connection that created the table is reused in the data flow. Otherwise, you're at the mercy of connection pooling; maybe the same connection is used in both places, maybe not, and believe me, that's an unpleasant bit of randomness to try to debug. The reason for DelayValidation on the data flow is that as the package starts, the engine will validate that all the data looks as expected before it does any work. Since the precursor step is what gets the data flow task into the expected state, we need to tell the engine to validate the task only immediately before execution. Validation always happens; it's just a matter of when you pay the price.
A global temporary table is defined with a double sharp sign prepended to it, e.g. ##TEMP. It is accessible by any connection, not just the one that created it. It will live until the creating connection goes away (or explicitly drops it).
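As a minimal illustration of the scoping difference (table and columns are invented for this sketch):

-- Session 1: a local temp table is visible only to the connection that created it
CREATE TABLE #TEMP (id INT, val VARCHAR(50));

-- Session 2, a different connection: this fails, #TEMP is out of scope
-- SELECT * FROM #TEMP;

-- Session 1: a global temp table is visible to every connection
CREATE TABLE ##TEMP (id INT, val VARCHAR(50));

-- Session 2: this works for as long as Session 1 stays open (or until it drops the table)
SELECT * FROM ##TEMP;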
Resolution
Once the package is designed (the metadata is established in the data flow), using a local temporary table is going to work just fine. While developing it, though, it's impossible to use a local temporary table as a source in the data flow: if you execute the precursor step, that connection will open up, create the temporary table, and then the connection goes away, and the temporary table with it.
I would resolve this with the following steps:
1. Copy the query from your Execute SQL Task into a window in SSMS and create your local temporary table as a global temporary table, thus ##TEMP.
2. Create an SSIS variable called SourceQuery of type String with a value of SELECT * FROM ##TEMP;
3. Change the "Data access mode" on the OLE DB Source from "SQL Command" to "SQL Command from Variable" and use the variable User::SourceQuery.
4. Complete the design of the Data Flow.
5. Save the package to ensure the metadata is persisted.
6. Change the query in your variable from referencing ##TEMP to #TEMP.
7. Save again.
8. Drop the ##TEMP table or close the connection.
9. Run the package to ensure everything works as expected.
Steps 2, 3, and 6 in the above allow you to emulate the magician pulling the tablecloth out from underneath all the dishes.
If you were to manually edit the query in the data flow itself from ##TEMP to #TEMP, validation fires, and since there is no #TEMP table available, it will report VS_NEEDSNEWMETADATA and likely won't let you save the package. Using a variable as the query source provides a level of indirection that gets us around the "validate-on-change"/reinitialize-metadata step.
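To make steps 1 and 6 concrete, here is a sketch (the ##TEMP definition is invented for illustration):

-- Step 1: run the Execute SQL Task's query in SSMS, but as a global temp table
CREATE TABLE ##TEMP (id INT, val VARCHAR(50));
INSERT INTO ##TEMP (id, val) VALUES (1, 'A'), (2, 'B');

-- Step 2: the variable User::SourceQuery initially holds
--     SELECT * FROM ##TEMP;
-- Step 6: once the metadata is persisted, flip it to the local version
--     SELECT * FROM #TEMP;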
Related
I am implementing an SSIS package and currently trying to do the following.
1. Truncate the destination table.
2. Fetch the data by executing a stored procedure and insert it into the destination table.
I have created an Execute SQL Task to address step 1 and a data flow with an OLE DB source and OLE DB destination to address the second point. It has been working successfully so far, but it isn't working for one of my stored procedures that uses temp tables.
When I edit the OLE DB source and click the Preview button, I get a "no column returned" error.
I know that SSIS has an issue with generating columns while executing stored procedures that depend on temp tables. I have converted the stored proc to use table variables, and it's now able to return columns in SSIS when I do a preview. The only downside is that the stored procedure takes much longer to execute: 1 hour 15 minutes, compared to 15 minutes when using temp tables.
I did see a suggestion to use SET FMTONLY before executing the stored procedure as an alternative to changing to table variables, but that didn't seem to work, as I got a syntax or permission-denied error.
Could somebody suggest a solution to my problem that does not compromise performance?
Sounds like you've already read all the approaches to using Temp tables in SSIS, including the IF 1=0... trick? If you haven't seen that one yet, google it.
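For reference, one common form of that trick looks roughly like this (the procedure name, columns, and dbo.Regions source are invented for the sketch). Under SET FMTONLY ON, which is how older drivers sniff metadata, every branch of the procedure is processed, so the "dead" branch switches FMTONLY off and the procedure runs for real, temp tables and all:

CREATE PROCEDURE dbo.usp_GetRegionData  -- hypothetical name
AS
BEGIN
    -- Never executes at run time, but is processed during FMTONLY
    -- metadata discovery. Caveat: validation then actually runs the proc.
    IF 1 = 0
    BEGIN
        SET FMTONLY OFF;
    END;

    CREATE TABLE #work (region_id INT, region_name VARCHAR(50));

    INSERT INTO #work (region_id, region_name)
    SELECT region_id, region_name
    FROM dbo.Regions;   -- stand-in for the real heavy lifting

    SELECT region_id, region_name FROM #work;
END;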
You say that using table variables causes your stored procedure to take about five times longer than using temp tables. The most likely reason for that is that you are indexing your temp tables but not your table variables. If you didn't know that table variables can be indexed: they can. You might try that.
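If you haven't tried it, indexes on a table variable come in through its constraints (or, on SQL Server 2014 and later, inline index definitions); the columns here are invented for the sketch:

DECLARE @work TABLE
(
    region_id   INT         NOT NULL PRIMARY KEY,  -- index via the PK constraint
    region_name VARCHAR(50) NOT NULL UNIQUE        -- second index via UNIQUE
    -- On SQL Server 2014+, inline syntax also works:
    -- , INDEX IX_region_name NONCLUSTERED (region_name)
);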
Finally, a solution that you haven't mentioned is that you can replace your temporary table with a real table that gets truncated when you're done using it.
Short comment:
Try EXEC WITH RESULT SETS and specify the metadata yourself for a proc with temp tables; or use the Script Component as a source and specify the Output columns yourself.
Long comment:
Technically speaking, it is the driver/database you are using in SSIS that decides the behavior when working with temp tables.
Metadata is an important factor when using SSIS's pipeline components. By metadata, I mean the names of the columns, their data types, etc., that a pipeline component uses. When designing a data flow, someone or something has to provide this metadata to the components that require it.
In most cases, SSIS automatically retrieves the metadata. Components that do not connect to an external data source, like Conditional Split, get their metadata from the other components they are connected to. For the pipeline components that do connect to an external data source (like OLE DB Source, OLE DB Destination, Lookup, etc.), SSIS provides a mechanism to get this metadata without human involvement. This mechanism involves the driver connecting to the database and retrieving the metadata of the output. If the driver/database is capable of returning the metadata, then that metadata is used. If the driver/database is incapable, then you get the errors you are seeing. The rest of my comments are based on the assumption that you are using a SQL Server database.
When working with a SQL Server database in SSIS, we typically use the native client drivers provided by Microsoft. When trying to get the metadata, these drivers try to get it without actually executing the SQL statement (actual execution can have side effects, and it might also take more than a few seconds/minutes/hours; you don't want side effects and long waits at package design time). To get the metadata, the driver relies on the metadata of the actual objects used in the SQL command. If the command uses a physical table or view, SQL Server already has the metadata available and can supply it to the driver. If it is a temp table, SQL Server does not have the metadata until it creates the temp table. With the FMTONLY option, you can arrange for the temp tables to be created while avoiding any heavy processing or side effects, and thus retrieve metadata without penalties. From SQL Server 2012 onward, the native client drivers rely on newer functionality to retrieve metadata than the pre-2012 drivers did: the driver uses the sp_describe_first_result_set procedure. So whether you can get metadata or not is determined by the abilities of sp_describe_first_result_set.
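You can see that limitation for yourself by calling the procedure directly; the first call succeeds because the metadata comes from the catalog, while the second fails because the shape of a temp table cannot be determined without creating it:

-- Succeeds: sys.objects is a catalog object with known metadata
EXEC sys.sp_describe_first_result_set
     @tsql = N'SELECT name, object_id FROM sys.objects;';

-- Fails: metadata cannot be determined because the statement uses a temp table
EXEC sys.sp_describe_first_result_set
     @tsql = N'CREATE TABLE #t (id INT); SELECT id FROM #t;';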
So while SSIS can automatically get the metadata (because of the driver/database), it does not automatically get the metadata in some cases (again because of the driver/database). In cases involving the second scenario, some other process (typically a human) can help the driver infer metadata or provide the metadata to the component directly.
To help the driver, in the case of SQL Server 2012 and later, you can use the WITH RESULT SETS clause to specify the output metadata. When this clause is present, the driver uses it and doesn't try to query the metadata from system objects, and thus avoids the error you would otherwise get. If you are using the drivers that came with SQL Server 2008, you can use FMTONLY. This option is at the driver/database level.
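A sketch of the clause, using an invented procedure and columns; this is what would go in the source component's SQL command:

EXEC dbo.usp_GetRegionData   -- hypothetical procedure
WITH RESULT SETS
(
    (
        region_id   INT         NOT NULL,
        region_name VARCHAR(50) NOT NULL
    )
);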
Another option is to use a Script Component as the source; in its Output columns, you specify the columns/metadata yourself. SSIS will not try to retrieve metadata from the data source in this case, but will rely on the definitions you provide in the Output section of the Script Component.
As you can see, both options involve a human (or some other process) specifying the metadata instead of SSIS trying to retrieve it in an automated fashion. I would prefer the first option when working with SQL Server and the second when working with databases like MySQL.
I am kinda new to SSIS and apologize in advance if this is a repeat post, or simply a dumb question.
I am trying to create the following process in SSIS:
1- [SQL Execute Task] Create Table in SQL DB
2- [Data Flow Task] Load Data from a source file (.xls) into the Created SQL table
3- [SQL Execute Task] Run Code on Created SQL table
4- [SQL Execute Task] Drop the SQL table that was created
The problem I am running into is that when I set up my OLE DB Destination, it wants a table that is already created. I tried to create the table and then run the process; it works the first time, but the second time it errors saying the table doesn't exist, before it even runs step 1 to create the table.
Any ideas on a work around, or am I missing something very obvious here?
Thanks in advance!
So first, why drop the table every time? Your package is going to require consistent metadata for the table, so why not just truncate it and keep it around for the next load? Dropping and recreating it is really a rather terrible approach to SSIS packages.
The reason it's failing is that SSIS does both design-time and run-time validation of all your components, so all it sees is that the table it expects to be there isn't there.
But if your heart's set on this approach, you need to set the ValidateExternalMetadata property of your destination component to False. As long as the External Columns on the component match the columns actually generated by your CREATE TABLE statement, you'll be good to go.
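A sketch of the truncate-and-reuse pattern (the table name is invented): create the table once outside the package, then step 1 becomes a truncate and step 4 disappears entirely:

-- One-time setup, run outside the package
CREATE TABLE dbo.StagingImport   -- hypothetical staging table
(
    id  INT,
    val VARCHAR(50)
);

-- Step 1 (Execute SQL Task): clear it instead of recreating it
TRUNCATE TABLE dbo.StagingImport;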
I have a package that is essentially trying to copy 26 tables from Oracle to SQL Server.
It's not a complete table copy; we are looking for records that belong to certain 'Regions' of our company.
I pull the data from Oracle.
I started just doing this with elbow grease, but each of the 26 tables required several variables to do the deletes, the fetches, etc.
Long story short, I decided to use variables to represent the table names (source, temp, and target).
This allowed me to copy/paste one sequence and effectively bypass a lot of clicking around in BIDS.
The problem I am running into is that the metadata seems to be very fragile. Sequences all seem to run fine on their own, but when I run the whole package, it breaks, and never in the same place.
Is this approach just a bad idea w/ SSIS?
So just to take this off the board....
Each sequence container had the following ops:
Script Task - set variables
Execute SQL Task - delete from temp
Data Flow SourceToTemp -
OLE DB Source - used a generic SELECT * FROM tbl into temp_tbl
Derived Column - insert a timestamp column
OLE DB Destination - map all the columns into a temp table (THIS IS THE BIG PROBLEM CHILD)
Execute SQL Task - delete from target
Execute SQL Task - insert into target, select from temp
The OLE DB Destination is the piece that kept breaking.
Since it references variables, I had to be very careful at design time to set the variables correctly before opening one of the data flows.
I am pretty sure this is the problem. Since I cannot say with certainty when SSIS refreshes metadata in the design environment, I can't be sure whether sequence X's metadata was refreshed while the variables were set to support sequence Y.
So while it conceptually should work at run time, dev time is a change-control nightmare.
I have changed all the OLE DB Destinations to point to a hard table name. This is really a small concession, since there are four SQL statements that are still driven by variables (saving me a lot of clicking and typing).
This small change has eliminated the 'shifting sands' problem.
Take-away lesson: don't have an OLE DB Destination based on a variable.
Thanks for the comments.
I can only find a solution for using a temp table in an OLE DB source.
But I can't find one for an ADO.NET source. How can I successfully use a temp table in an ADO.NET source in an SSIS package?
I find working with temporary tables in SSIS to be more pain than it's usually worth. I hope your experience is better.
Create an ADO.NET connection. In the properties of the Connection Manager, set the value of RetainSameConnection from False to True. This allows the temporary table to exist for the duration of the package execution by preventing connection pooling from swapping out connections.
My trouble stems from getting the metadata set up correctly. To get around this, I create a variable, SrcQuery, that queries a physical table mirroring what the temporary table will look like: SELECT S.src_id, S.src_value FROM dbo.SRC AS S; This allows the data flow to establish the correct metadata for the downstream components. I manually use this query in the ADO.NET source. Once that's done, I will need to change the query to use the temporary table, ##SRC. Unlike the OLE DB Source component, you cannot set this property inside the Data Flow Task.
Once the data flow work is completed, back in the Control Flow, view the properties of the Data Flow Task. Change DelayValidation from False to True. This prevents the design-time validation from firing, which is needed once we remove the non-temporary-table "scaffolding." Next, find Expressions and click the ellipsis (...). In the drop-down, you should see the name of your ADO.NET source. I had renamed mine, so I saw [ADONET Src].[TableOrViewName] and [ADONET Src].[SqlCommand] in the drop-down list. I selected [ADONET Src].[SqlCommand] and, as my value, used @[User::SrcQuery].
I ran the package to ensure it still worked. It did. I then changed the value of my query to be SELECT S.src_id, S.src_value FROM ##SRC AS S; I reran and this time it correctly pulled data from my temporary table.
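For completeness, a sketch of the scaffolding (the types are invented; the permanent mirror must match the temp table column-for-column):

-- Permanent mirror, used only at design time so the data flow can read metadata
CREATE TABLE dbo.SRC
(
    src_id    INT,
    src_value VARCHAR(50)
);

-- What the earlier Execute SQL Task creates at run time
CREATE TABLE ##SRC
(
    src_id    INT,
    src_value VARCHAR(50)
);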
If you are using SQL Server 2012 as your source, you might be able to make it easier on yourself by using the WITH RESULT SETS option of the EXECUTE statement to explicitly describe your temporary table's metadata.
I am writing a basic file dump from one database to another. I am using SSIS 2008 and creating several packages to transform the data I have from an MSSQL 2010 database to a MySQL 5.1 database.
All the connections are set up and records can be transferred between the two databases, but I would like to use temp tables in the transform processes and use the temp table as the MSSQL source in a data flow task to dump the data into an awaiting MySQL table.
I have been having problems setting this up. I am using an OLEDB connection and have set the RetainSameConnection property as well as the DelayValidation property to true. When setting up the OLE DB source to pull from the MSSQL database, I cannot find the temp table I created in an earlier task in the control flow. I am using the same connection manager for these two tasks.
Anyone have any ideas or experience with this?
As a simple example, one task does:
SELECT *
INTO #TMP
FROM CUSTOMERS
(This is a simplified example, and I realize in this case I could just use the Customers table, so bear with me.)
Is it possible to use this temp table in a dataflow operation as the source table?
As I mentioned in my comment, not much of a solution and more of a workaround. SSIS uses the shape of result sets to bind properties in tasks. As temp tables are not always available in the database this can cause errors in SSIS even if you set DelayValidation to true.
My solution is to create an SSIS schema in whichever database you're connecting to. The reasons for doing so are security and clear separation of objects that are only used within SSIS packages - primarily staging tables.
Instead of throwing tables in your dbo schema (you shouldn't be anyway, shame on you) you'd create them in the SSIS schema. A typical data flow would truncate the table when it begins, load values and perform whatever operations are required, optionally truncating it when complete. As long as the table is always available SSIS can examine the shape of result sets.
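A minimal sketch of that setup (the schema and table names are invented):

-- One-time: a dedicated schema for package-only objects
CREATE SCHEMA SSIS AUTHORIZATION dbo;
GO

-- The staging table exists permanently, so SSIS can always examine its shape
CREATE TABLE SSIS.CustomerStage
(
    customer_id INT,
    loaded_at   DATETIME2(3)
);
GO

-- At the start of each load
TRUNCATE TABLE SSIS.CustomerStage;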
You should not use temp tables as the source, as SSIS will not recognize the columns for the SELECT. Use table variables or CTEs instead.