Is it possible to read object variable values in SSIS script component source?
I have a variable of type Object which contains records from a table populated by a SQL Script Task.
I have used this Script Task and it works perfectly with the code below,
oleDA.Fill(dt, Dts.Variables("vTableRowsObj").Value)
where vTableRowsObj is the Object variable.
I want to read this object in an SSIS script component so that I can send the output from the script component directly to the destination table.
The end goal is that I plan to create more object variables and, simply by reading these objects, send the output to destination tables from the script component.
I had a similar issue.
Here are some links for reference, and my code is below for a simple output of ID and Name.
http://agilebi.com/jwelch/2007/03/22/writing-a-resultset-to-a-flat-file/
http://www.ssistalk.com/2007/04/04/ssis-using-a-script-component-as-a-source/
http://consultingblogs.emc.com/jamiethomson/archive/2006/01/04/SSIS_3A00_-Recordsets-instead-of-raw-files.aspx
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
using System.Xml;
using System.Data.OleDb;
public override void CreateNewOutputRows()
{
    // The object variable must be listed in the script component's ReadOnlyVariables
    // so that it is exposed through the Variables collection.
    DataTable dt = new DataTable();
    OleDbDataAdapter oleda = new OleDbDataAdapter();
    oleda.Fill(dt, this.Variables.ObjectVariable);

    // Emit one output row per record in the DataTable.
    foreach (DataRow row in dt.Rows)
    {
        Output0Buffer.AddRow();
        Output0Buffer.ID = Convert.ToInt32(row.ItemArray[0]);
        Output0Buffer.Name = row.ItemArray[1].ToString();
    }
}
Given that you have a table with records populated by a SQL Script task, why is it necessary to load that data into a variable of type Object? Why not just use that table as a data source in a data flow? The basic steps are...
1) Run your SQL Script task and load your results to a table (sounds like you are already doing this)
2) Skip loading the records to the Object variable
3) Instead add a Data Flow Component as a downstream connection to your SQL Script Task
4) Add a Source component to your Data Flow: use the table you populated with the SQL Script Task as your data source
5) Add a Destination component to your Data Flow: use your destination table as your data destination
However, in the spirit of answering the question you asked directly (if I have in fact understood it correctly), the simple answer is yes: you can use an SSIS script component as a data source in a data flow. This article walks you through the steps.
Since I've stumbled on this problem today let me give you my solution:
First (something that you've done but placed here for clarity):
Create the Execute SQL task with "ResultSet" set to "Full result set" and assign the result set to the Object-type variable:
Then link it to the Script Task and add the variable to either "ReadOnlyVariables" or "ReadWriteVariables".
Now you just need to access this variable - as you suggested, by filling a DataTable from it - and then assign the value to a string variable:
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.Data.OleDb;
public void Main()
{
    // Fill a DataTable from the Full Result Set stored in the object variable.
    DataTable dt = new DataTable();
    var oleDa = new OleDbDataAdapter();
    oleDa.Fill(dt, Dts.Variables["Destination"].Value);

    // Take the first column of the first row and hand it to a string variable.
    string yourValueAsString = dt.Rows[0][0].ToString();
    Dts.Variables["MyStringVariable"].Value = yourValueAsString;
    [...]
    Dts.TaskResult = (int)ScriptResults.Success;
}
Related
I need to use the result of a SQL SELECT, which should return an array of IDs, so that I can iterate through them in my foreach loop.
I'm quite new to SSIS. I created an Execute SQL Task, connected to the DB, and wrote
SELECT ID FROM TABLE
Then I created a Script Task and connected the two components with a precedence constraint. But I don't know how to pass the result from the SQL Task into an object in the Script Task.
The typical pattern you are looking for is this:
1. In execute SQL you need to:
a. Assign the connection
b. Add your SQL Statement
c. Change Result Set to Full Result Set
d. Map the result set to an Object type variable
2. In Foreach
a. Change the enumerator to the Foreach ADO Enumerator
b. Assign your variable from #1
c. Map a variable to the iteration
****EDIT****
3. Change out data flow for script task
a. Pass in iteration variable
b. In the script, create the URL as a string
c. Use WebClient to connect to the web service
// Requires "using System.Net;" for WebClient.
string url = @"https://www.blah.com?ID=" +
    Dts.Variables["variable"].Value.ToString();
WebClient wc = new WebClient();
string result = wc.DownloadString(url);
4. Now you have to do something with that result
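For example (the variable name WebServiceResult below is just an illustration, not something from the question), you could push the response into a read/write string variable so a later task inside the loop can use it:
// "WebServiceResult" is a hypothetical read/write string variable added to the
// Script Task's ReadWriteVariables.
Dts.Variables["WebServiceResult"].Value = result;
Dts.TaskResult = (int)ScriptResults.Success;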
I need some help.
I am importing some data into a .csv file from an OLE DB source. I don't want the headers to appear twice in the destination. If I uncheck the "Column names in the first data row" property, the headers don't get populated on the first execution either.
Output as of now:
Col1,Col2
A,B
Col1,Col2
C,D
How can I make the package run in such a way that if the file is empty, the headers get inserted, and if the execution happens again, the headers are not included, just the data?
There was a similar thread, but I wasn't able to apply the solution, as I don't know how to use expressions to get the number of rows of the destination itself. It was long back, so I created a new one.
Your help is deeply appreciated.
-Akshay
Perhaps I'm missing something, but this works for me. I am not having the read-only trouble with ColumnNamesInFirstDataRow.
I created a package-level variable named AddHeader, type Boolean, and set it to True. I added a Flat File Connection Manager, named FFCM, and configured it to use a CSV output of 2 columns: HeadCount (int), AddHeader (boolean). In the properties for the Connection Manager, I added an Expression for the property 'ColumnNamesInFirstDataRow' and assigned it a value of @[User::AddHeader]
I added a script task to test the size of the file. It has read/write access to the Variable AddHeader. I then used this script to determine whether the file was empty. If your definition of "empty" is that it has a header row, then I'd adjust the logic in the if check to match that length.
public void Main()
{
    string path = Dts.Connections["FFCM"].ConnectionString;
    System.IO.FileInfo stats = null;
    try
    {
        stats = new System.IO.FileInfo(path);
        // checking length isn't bulletproof based on how the disk is configured
        // but should be good enough
        // http://stackoverflow.com/questions/3750590/get-size-of-file-on-disk
        if (stats != null && stats.Length != 0)
        {
            this.Dts.Variables["AddHeader"].Value = false;
        }
    }
    catch
    {
        // no harm, no foul
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
I looped through twice to ensure I'd generate the append scenario
I deleted my file and ran the package and only had a header once.
The property that controls whether the column names will be included in the output file or not is ColumnNamesInFirstDataRow. This is a read-only property.
One way to achieve what you are trying to do would be to have two Data Flow Tasks on the control flow surface, preceded by a Script Task. These two Data Flow Tasks will be identical except that they will refer to two different Flat File Connection Managers. Again, the only difference between the two connection managers would be the values of ColumnNamesInFirstDataRow; one true, the other false.
Use this Script Task to decide whether this is the first run or a subsequent run. Persist this information and check it within the script; you can either have a separate table for this information or use some log table to infer it. A rough sketch of such a check is below.
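Purely as an illustration (none of these object names come from the original question), the Script Task could query a control table through an ADO.NET connection manager and set a Boolean variable that the precedence constraints then evaluate. This sketch assumes a connection manager named LogDb, a table dbo.HeaderLog, and a read/write Boolean variable IsFirstRun:
public void Main()
{
    // "LogDb" (ADO.NET connection manager) and dbo.HeaderLog are illustrative names.
    var conn = (System.Data.SqlClient.SqlConnection)
        Dts.Connections["LogDb"].AcquireConnection(Dts.Transaction);

    using (var cmd = new System.Data.SqlClient.SqlCommand(
        "SELECT COUNT(*) FROM dbo.HeaderLog", conn))
    {
        int runs = (int)cmd.ExecuteScalar();
        // No logged runs yet means the header-writing data flow should execute.
        Dts.Variables["IsFirstRun"].Value = (runs == 0);
    }

    Dts.Connections["LogDb"].ReleaseConnection(conn);
    Dts.TaskResult = (int)ScriptResults.Success;
}
Precedence constraints with expressions on @[User::IsFirstRun] would then route execution to the data flow that writes the header or to the one that does not.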
The following solution worked for me; you can also try it.
Create three variables:
IsHeaderRequired
RowCount
TargetFilePath
Get the source row count using an Execute SQL Task and save it in the RowCount variable.
Add a Script Task. Add TargetFilePath and RowCount as read-only variables, and IsHeaderRequired as a read-write variable.
Edit the script and add the following lines of code:
string targetFilePath = Dts.Variables["TargetFilePath"].Value.ToString();
int rowCount = (int)Dts.Variables["RowCount"].Value;
System.IO.FileInfo targetFileInfo = new System.IO.FileInfo(targetFilePath);

if (rowCount > 0)
{
    // Write the header only when the target file is still empty.
    if (targetFileInfo.Length == 0)
    {
        Dts.Variables["IsHeaderRequired"].Value = true;
    }
    else
    {
        Dts.Variables["IsHeaderRequired"].Value = false;
    }
}
Dts.TaskResult = (int)ScriptResults.Success;
Connect your script component to your database
Click the connection manager of the flat file [i.e. your target file] and go to its properties. In the expressions, add the following, as shown in the screenshot.
Map ConnectionString to the variable "TargetFilePath".
Map ColumnNamesInFirstDataRow to "IsHeaderRequired".
Expression for the Flat File Connection Manager [screenshot].
Final package [screenshot].
Hope this helps
A solution ....
First, add an SSIS integer variable in the scope of the Foreach Loop or higher - I'll call this RowCount - and make its default value negative (this is important!). Next, add a Row Count to your Data Flow, and assign the result to the RowCount SSIS variable we just made. Third, select your Connection Manager (don't double-click) and open the Properties window (F4). Find the Expressions property, select it, and hit the ellipsis (...) button. Select the ColumnNamesInFirstDataRow property, and use an expression like this:
@[User::RowCount] < 0
Now, when your package starts, RowCount has a static value of -1 or another negative number. When the data flow starts for the first time in your loop, the ColumnNamesInFirstDataRow property will have a value of TRUE. When the first data flow completes, the row count (even if it's zero) is written to the RowCount variable. On the second iteration of the loop, the Connection Manager is then reconfigured to NOT write column names...
I want to convert data from an old database to a new database with a new structure.
In the old database I have an attachment table that must be converted to the attachment table in the new database.
The old database attachment table structure is below:
Attachment (ID int, Image Image, ...)
and the new database attachment table structure is below:
Attachment (ID int, Image Image, OldID Int, ...)
Each time I execute the conversion package, it should copy only data that does not already exist (new data) from the old database to the new database.
I use the approach below to do it:
A lookup between the old table and the new table (ID --> OldID) to check for existing records.
When I run the SSIS package, SSIS first caches all lookup and source component data in memory and then executes the package. The source data in this package is very large, so the package runs very slowly. I want to get the Image column data from the old database for each new record, after the lookup that checks for existing records. If I use a new lookup component to get the Image column data from the old database, SSIS caches this new lookup's data as well, and the execution time does not improve. What should I do?
Thanks in advance.
Are you sure you're thinking this through correctly? SSIS should not be slow even if the amount of data you are loading is huge.
Your LOOKUP component needs to make sure it's not doing anything it doesn't need to. If you are pointing it at the table in the new database, change it to a SQL query at once. In this query you only need to SELECT OldId FROM tbl and point the incoming ID from the old database to this. Your data flow should contain ID and Image from the old database, which are mapped ID -> OldID and Image -> Image in your OLE DB Destination. Nothing more is needed for an "insert new rows only" operation like you are doing here.
For this job, there is no need for any custom code or dynamic SQL. You -do- want to get the ID and Image from your source system in the data flow (unless you have major network bottlenecks to sort out) - doing a RBAR lookup to get the image data from the old system is a very backwards way of thinking about your ETL.
Select only ID from source table
Do lookup in destination db with no change
For its No Match output, do a lookup in the source table with Cache Mode set to No cache, which will append Image to the flow.
In this case each image will be fetched separately, which may affect performance.
You may also do it in two Data Flows.
In the first:
Select only ID from source table
Do lookup in destination db with no change
Store the new IDs in the string variable IdListToBeFetched as a comma-separated list, using a Script Component as a destination with code similar to:
using System.Text;

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    StringBuilder sb;

    public override void PreExecute()
    {
        base.PreExecute();
        sb = new StringBuilder();
    }

    public override void PostExecute()
    {
        base.PostExecute();
        // Package variables can only be written once the data flow has finished,
        // so the accumulated list is assigned here rather than per row.
        Variables.IdListToBeFetched = sb.ToString().TrimEnd(',');
    }

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        if (!Row.ID_IsNull)
        {
            sb.AppendFormat("{0},", Row.ID);
        }
    }
}
In the second Data Flow, set the SQL command of the source to a dynamically generated query from an expression similar to "select ID, Image from Attachment where ID in (" + @[User::IdListToBeFetched] + ")" and set DelayValidation = True. It will fetch all Images in a single select, which should be faster.
To set the dynamically generated query as the SqlCommand in sources like the ADO NET Source or ODBC Source:
Select the Expressions property of the Data Flow Task containing your source
Find the property [your source name].[SqlCommand] and set the expression there
To set the dynamically generated query as the SQL command in an OLE DB Source (taken from Jamie Thomson's blog):
Create a new variable called SourceSQL
Open up the properties pane for SourceSQL variable (by pressing F4)
Set EvaluateAsExpression=TRUE
Set Expression to "select ID, Image from Attachment where ID in (" + @[User::IdListToBeFetched] + ")"
For your OLE DB Source component, open up the editor
Set Data Access Mode="SQL Command from variable"
Set VariableName = "SourceSQL"
I'm new to .NET and SQL and am working on an SSIS package that is pulling data from flat files and inserting it into a SQL table. The part that I need assistance with is getting the Date Modified of the files and populating a derived column I created in that table with it. I have created the following variables: FileDate of type DateTime, FilePath of type String, and SourceFolder of type String for the path of the files. I was thinking that the DateModified could be populated into the derived column within the Data Flow, using a Script Component? Can someone please advise whether I'm on the right track? I appreciate any help. Thanks.
A Derived Column Transformation can only work with Integration Services expressions. A Script Task would allow you to access the .NET libraries, and you would want to use the method that @wil kindly posted or go with the static methods in System.IO.File.
However, I don't believe you would want to do this in a Data Flow Task. SSIS would have to evaluate that code for every row that flows through from the file. On a semi-related note, you cannot write to a variable until the ... event is fired to signal that the data flow has completed (I think it's OnPostExecute, but don't quote me), so you wouldn't be able to use said variable in a downstream derived column at any rate. You would of course just modify the data pipeline to inject the file modified date at that point; a rough sketch of that follows.
What would be preferable, and perhaps your intent, is to use a Script Task prior to the Data Flow Task to assign the value to your FileDate variable. Inside your Data Flow, then use a Derived Column to add the @[User::FileDate] variable into the pipeline.
// This code is approximate. It should work but it's only been parsed by my brain
//
// Assumption:
// SourceFolder looks like a path x:\foo\bar
// FilePath looks like a file name blee.txt
// SourceFolder [\] FilePath is a file that the account running the package can access
//
// Assign the last mod date to FileDate variable based on file system datetime
// Original code, minor flaws
// Dts.Variables["FileDate"].Value = File.GetLastWriteTime(System.IO.Path.Combine(Dts.Variables["SourceFolder"].Value,Dts.Variables["FilePath"].Value));
Dts.Variables["FileDate"].Value = System.IO.File.GetLastWriteTime(System.IO.Path.Combine(Dts.Variables["SourceFolder"].Value.ToString(), Dts.Variables["FilePath"].Value.ToString()));
Edit
I believe something is amiss with either your code or your variables. Do your values approximately line up with mine for FilePath and SourceFolder? Variables are case sensitive but I don't believe that to be your issue given the error you report.
This is the full script task, and as you can see from the screenshot below, the design-time value for FileDate is 2011-10-05 09:06. The run-time value (Locals) is 2011-09-23 09:26:59, which is the last mod date for the c:\tmp\witadmin.txt file.
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;

namespace ST_f74347eb0ac14a048e9ba69c1b1e7513.csproj
{
    [System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
    public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
    {
        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        };

        public void Main()
        {
            Dts.Variables["FileDate"].Value = System.IO.File.GetLastWriteTime(System.IO.Path.Combine(Dts.Variables["SourceFolder"].Value.ToString(), Dts.Variables["FilePath"].Value.ToString()));
            Dts.TaskResult = (int)ScriptResults.Success;
        }
    }
}
C:\tmp>dir \tmp\witadmin.txt
Volume in drive C is Local Disk
Volume Serial Number is 3F21-8G22
Directory of C:\tmp
09/23/2011 09:26 AM 670,303 witadmin.txt
FileHelpers supports a feature called "RunTime Records" which lets you read a CSV file into a DataTable when you don't know the layout until runtime.
Is it possible to use FileHelpers to create a CSV file at runtime in the same manner?
Based on some user input, the CSV file that must be created will have different fields that can only be known at runtime. I can create the needed Type for the FileHelper engine as described in their reading section, but I can't figure out what format my data needs to be in to be written.
var engine = new FileHelpers.FileHelperEngine(GenerateCsvType());
engine.WriteStream(context.Response.Output, dontKnow);
EDIT
Alternatively, can anyone suggest a good CSV library that can create a CSV file without knowing its fields until runtime? For example, create a CSV file from a DataTable.
In fact, the library currently only allows reading runtime records, but for writing purposes you can use the DataTableToCsv method like this:
CsvEngine.DataTableToCsv(dt, filename);
Let me know if that helps.
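For instance, a minimal sketch of going from a runtime-built DataTable to a CSV file (the column names and output path are made up for the example):
using System.Data;
using FileHelpers;

// Build a DataTable whose layout is only known at runtime.
DataTable dt = new DataTable();
dt.Columns.Add("ID", typeof(int));
dt.Columns.Add("Name", typeof(string));
dt.Rows.Add(1, "Alice");
dt.Rows.Add(2, "Bob");

// Write it out as a CSV file built from the DataTable's columns and rows.
CsvEngine.DataTableToCsv(dt, @"C:\temp\output.csv");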
I know this is an old question, but I ran into the same issue myself and spent some time looking for a solution, so I decided to share my findings.
If you are using FileHelpers RunTime Records to create your definition, you can populate that same definition using reflection.
For example, if you create a definition:
DelimitedClassBuilder cb = new DelimitedClassBuilder("Customers", ",");
cb.AddField("StringField", "string");
Type t = cb.CreateRecordClass();
FileHelperEngine engine = new FileHelperEngine(t);
Now you can use the same type created by FileHelpers to populate your values as follows:
object customClass = Activator.CreateInstance(t);
System.Reflection.FieldInfo field = customClass.GetType().GetField("StringField", System.Reflection.BindingFlags.Public | System.Reflection.BindingFlags.Instance);
if (field != null)
{
    field.SetValue(customClass, "StringValue");
}
And then write it to a file or string:
string line = engine.WriteString(new object[] { customClass });
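And if the target is a file rather than a string, the non-generic engine should (as far as I recall) also let you write directly to disk with the same array of records; the path here is just an example:
// Same engine and customClass as above; only the output path is made up.
engine.WriteFile(@"C:\temp\customers.csv", new object[] { customClass });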