I have a very simple SSIS package and an absurdly simple Script task
var x = Dts.Variables["User::OrderPath"];
That fails with:
The element cannot be found in a collection. This error happens when
you try to retrieve an element from a collection on a container during
execution of the package and the element is not there.
I have a variable, OrderPath, that is scoped to the package. Any time I add the variable to the script's ReadOnlyVariables, it disappears when I execute the task.
This really shouldn't be this difficult so I assume I'm missing something monumentally basic. Any idea what that might be?
When accessing variables through the Dts.Variables collection, you do not prefix the variable name with the namespace; thus
var x = Dts.Variables["User::OrderPath"];
fails, while
var x = Dts.Variables["OrderPath"];
succeeds. (Assuming, of course, that OrderPath is added to either the ReadWriteVariables or ReadOnlyVariables task properties.)
See Using Variables in the Script Task in MSDN for another example.
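For context, a minimal sketch of the working pattern inside the Script Task's Main method, assuming OrderPath is listed in the task's ReadOnlyVariables; the FireInformation call is just one way to surface the value:
public void Main()
{
    // OrderPath is indexed without the User:: namespace prefix and
    // must be listed in the task's ReadOnlyVariables property.
    string orderPath = Dts.Variables["OrderPath"].Value.ToString();

    bool fireAgain = true;
    Dts.Events.FireInformation(0, "Script Task",
        "OrderPath = " + orderPath, string.Empty, 0, ref fireAgain);

    Dts.TaskResult = (int)ScriptResults.Success;
}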
I am implementing a custom auditing framework, logging ETL events such as start, end, error, insertrows etc.
As well as logging at a package level, I'm implementing "session logging" where a sequence of package executions, i.e. a controller package that executes several packages, is a session. In order to keep track of the "session", the stored procedures always return a SessionLogID.
I was hoping I could map this result set to a project parameter; otherwise I will have to save it to a user variable and then pass it around between packages via parameters. That would mean every single package has a Package Parameter and a User Variable called SessionLogID, which I don't want to do if I don't need to.
Open to other suggestions.
Thanks,
Adam
Parameters cannot change at runtime; they are set once, whereas variables can change at any time. You can set the variable once in the parent package and map that variable to each child package's parameter in the Execute Package Task's parameter bindings.
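For illustration, a minimal sketch of how a child package's Script Task might then read the bound parameter; it assumes the parameter is an Int32 and that $Package::SessionLogID is listed in the task's ReadOnlyVariables (parameter names keep their $Package:: prefix here):
public void Main()
{
    // The parent package sets its User::SessionLogID variable once and
    // binds it to this parameter in the Execute Package Task.
    int sessionLogId = (int)Dts.Variables["$Package::SessionLogID"].Value;

    // ... pass sessionLogId to the auditing stored procedures ...

    Dts.TaskResult = (int)ScriptResults.Success;
}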
I have an SSIS package which has a foreach loop. Inside the foreach loop I have a script task. I have put a breakpoint in that script task, which gets hit, but the problem is it only gets hit on the first iteration; if I press F10 or F5, it does not break again on the second iteration.
How can I make it break at the same point on each iteration?
It seems to be an expected behaviour of SSIS, as stated in Books Online:
"If a Script task is part of a Foreach Loop or For Loop container, the debugger ignores breakpoints in the Script task after the first iteration of the loop."
http://technet.microsoft.com/en-us/library/ms137625.aspx
You can try to work around it with the following alternatives; a sketch of the first two follows below:
Interrupt execution and display a modal message by using the MessageBox.Show method in the System.Windows.Forms namespace. (Remove this code after you complete the debugging process.)
Raise events for informational messages, warnings, and errors. The FireInformation, FireWarning, and FireError methods display the event description in the Visual Studio Output window. However, the FireProgress method, the Console.Write method, and Console.WriteLine method do not display any information in the Output window. Messages from the FireProgress event appear on the Progress tab of SSIS Designer. For more information, see Raising Events in the Script Component.
Log events or user-defined messages to enabled logging providers. For more information, see Logging in the Script Component.
http://technet.microsoft.com/en-us/library/ms136033.aspx
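For example, a minimal sketch of the first two alternatives inside the Script Task's Main method; User::CurrentItem is an assumed name for the loop variable:
public void Main()
{
    string item = Dts.Variables["CurrentItem"].Value.ToString();

    // Alternative 1: modal message box; remove after debugging.
    System.Windows.Forms.MessageBox.Show("Iteration value: " + item);

    // Alternative 2: informational event, visible in the Output window.
    bool fireAgain = true;
    Dts.Events.FireInformation(0, "Debug", "Iteration value: " + item,
        string.Empty, 0, ref fireAgain);

    Dts.TaskResult = (int)ScriptResults.Success;
}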
I know this is an old question, but I have an idea I'd like to share.
As Guilherme answered, I can add something that might be useful: if your foreach is based on a SQL query, you can add a ROW_NUMBER() column to it and assign it to a variable. Inside the script task you can compare the value of this variable and break the task on any row you want.
if (Dts.Variables["Your_Variable"].Value.ToString() == "4")
{
    // Set a breakpoint on the next line, or use a message box, since
    // Console.WriteLine displays nothing from inside a Script Task.
    System.Windows.Forms.MessageBox.Show("Break");
}
At least you can stop at any place in the loop, rather than only on the first iteration.
I have data on a WebSphere MQ queue. I've written a script task to read the data, and I can output it to a variable or a text file. But I want to use that as input to a dataflow step and transform the data. The ultimate destination is a flat file.
Is there a way to read the variable as a source into a dataflow step? I could write the MQ data to a text file and read the text file in the dataflow, but that seems like a lot of overhead. Or I could skip the dataflow altogether and write all the transformations in a script (but then why bother with SSIS in the first place?)
Is there a way to write a Raw File out of the script step, to pass into the dataflow component?
Any ideas appreciated!
If you've got the script that consumes the web service, you can skip all the intermediary outputs and simply use it as a source in your dataflow.
Drag a Data Flow Task onto the canvas and then add a Script Component. Instead of selecting Transformation (the last option), select Source.
Double-click the Script Component and choose the Input and Output Properties tab. Under Output 0, select Output Columns and click Add Column for however many columns the web service has. Name them appropriately and be certain to define their metadata correctly.
Once the columns are defined, click back to the Script tab, select your language, and edit the script. Take all of your existing code that consumes the service and we'll use it here.
In the CreateNewOutputRows method, you will need to iterate through the results of the Websphere MQ request. For each row that is returned, you would apply the following pattern.
public override void CreateNewOutputRows()
{
    // TODO: Add code here or in the PreExecute to fill the iterable object, mqcollection
    foreach (var row in mqcollection)
    {
        // Adds a new row into the downstream buffer
        Output0Buffer.AddRow();

        // Assign all the data to the correct locations
        Output0Buffer.Column = row.Column;
        Output0Buffer.Column1 = row.Column1;

        // handle nulls appropriately
        if (string.IsNullOrEmpty(row.Column2))
        {
            Output0Buffer.Column2_IsNull = true;
        }
        else
        {
            Output0Buffer.Column2 = row.Column2;
        }
    }
}
You must handle nulls via the _IsNull attribute or your script will blow up. It's tedious work compared to a normal source, but you'll be far more efficient and faster, and you'll consume fewer resources than dumping to disk or using some other staging mechanism.
Since I ran into some additional "gotchas", I thought I'd post my final solution.
The script I am using does not call a webservice, but directly connects and reads the WebSphere queue. However, in order to do this, I have to add a reference to amqmdnet.dll.
You can add a reference to a Script Task (which sits on the Control Flow canvas), but not to a Script Component (which is part of the Data Flow).
So I have a Script Task, with reference and code to read the contents of the queue. Each line in the queue is just a fixed width record, and each is added to a List. At the end, the List is put into a Read/Write object variable declared at the package level.
The Script feeds into a Data Flow task. The first component of the Data Flow is a Script Component, created as Source, as billinkc describes above. This script casts the object variable back to a list. Then parses each item in the list to fields in the Output Buffer.
Various split and transform tasks take over from there.
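For reference, a hedged sketch of that two-step pattern; the variable name User::MqRows, the column names, and the fixed-width offsets are illustrative, not the actual package's:
// 1) Control Flow Script Task: drain the queue into a List<string> and
// store it in the package-level object variable (User::MqRows, listed
// in the task's ReadWriteVariables).
var rows = new System.Collections.Generic.List<string>();
// ... amqmdnet.dll calls that read each fixed-width record into rows ...
Dts.Variables["MqRows"].Value = rows;

// 2) Data Flow Script Component (Source), with MqRows listed in its
// ReadOnlyVariables: cast the object variable back to a list and parse
// each record into the Output 0 columns.
public override void CreateNewOutputRows()
{
    var list = (System.Collections.Generic.List<string>)Variables.MqRows;
    foreach (string record in list)
    {
        Output0Buffer.AddRow();
        Output0Buffer.OrderId = record.Substring(0, 10).Trim();  // assumed layout
        Output0Buffer.Amount = record.Substring(10, 12).Trim();  // assumed layout
    }
}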
Try using the Q program available in the MA01 MQ SupportPac instead of your script.
I have an Access 2003 database that dynamically loads MDB databases as library references. The reason for this is that this database is a menu front-end for 60+ application databases. Rather than deal with permanently referencing all these databases, the menu front-end dynamically references what is needed when the user makes a selection. I was working on moving this database to Access 2010 and creating a custom ribbon. I started using the technique from here to capture the ribbon object in a global variable when the ribbon loads. I then ran into a problem: I could verify the code was running and the global variable was correctly being assigned the ribbon reference, but after the database ran through its startup routine, that global variable would get reset to Nothing.
To verify what was going on, I created a simple database for testing. In this database, I had a module with a global variable:
Public obj As Object
I then had a function like this:
Public Function SetObj()
    Set obj = Application
    Debug.Print "IsNothing=" & (obj Is Nothing)
    References.AddFromFile "Test.mdb"
    Debug.Print "IsNothing=" & (obj Is Nothing)
End Function
Obviously, in my code, "Test.mdb" refers to an actual file. If I run this code, Debug.Print gives me "IsNothing=False" for both instances, but after the function finishes and if I wait a couple seconds, Debug.Print will give me "IsNothing=True". If I comment out References.AddFromFile, Debug.Print gives me "IsNothing=False" no matter how long I wait.
It makes sense to me that, since Access has to re-compile the VBA code after loading the library, all global variables are reset. I've experimented with moving the global variable into a class, but since I then need a global variable for the class, the class variable gets reset instead. I tried using a local variable in the function to save the value of the global variable, but it looks like Access waits a couple seconds after the code finishes running to do the re-compile, so that doesn't work either. Does anyone have any other ideas to accomplish this?
I don't really know if this will solve the problem for this kind of reference, but in general I don't use public variables for this kind of thing; instead, I use a Static variable inside a function. It would be something like this:
Public Function SetObj() As Object
    Static obj As Object
    If (obj Is Nothing) Then
        Set obj = Application
    End If
    Set SetObj = obj
End Function
Then you'd just use SetObj as an object for using your application. In a production app, you'd need tear-down code, too, but I've omitted that here.
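For instance, a usage sketch; Echo is just an arbitrary Application member chosen for illustration:
' Call the function wherever the cached object is needed, rather than
' touching a public variable directly.
SetObj.Echo True, "Using the cached Application object"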
I doubt this helps, but your code struck me as rather inefficient and incomplete.
I figured out a solution to my problem, and thanks @David-W-Fenton, as your answer gave me the idea. I use your approach in a library database for caching frequently-accessed values that are stored in a table but don't change after the initial startup. Those values aren't lost every time the references change, and that's when the light bulb lit up.
The solution is to put the global variable in a library database. Access appears to only reset global variables in the database that the reference is being loaded into, which makes sense after thinking about it. Since the library database isn't the one being re-compiled, it doesn't get its global (or private or static) variables reset.
What I ended up doing was creating a new module in an existing library database. It has a private variable and two methods - one to set the variable, one to retrieve the variable value. In my menu front-end database, when the ribbon loads and calls my callback function, rather than saving the ribbon object in the front-end database, I pass it to this module for saving. I now no longer lose that ribbon reference whenever new databases are added to the library references on the fly.
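A minimal sketch of such a module; the names m_Ribbon, SetRibbon, and GetRibbon are illustrative, not from the original databases:
' Standard module inside the library database.
Private m_Ribbon As Object

' Called from the front-end's ribbon onLoad callback.
Public Sub SetRibbon(objRibbon As Object)
    Set m_Ribbon = objRibbon
End Sub

' Returns the cached ribbon reference on demand.
Public Function GetRibbon() As Object
    Set GetRibbon = m_Ribbon
End Function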
I am trying to access the package-level variable "FirstRecord". However, I can't do:
dts.variables("FirstRecord").value = Truth
How do I access variables in a DataFlow Script Component?
The syntax is different in a Script Component in a Data Flow Task than it is in a Script Task. You are using the syntax for the Script Task.
You can try this to assign a value to the package variable from a Script Component:
Variables.FirstRecord = Truth
Note that you can only assign values to a package variable in the PostExecute method; the variables locked for read/write access are not available outside of it.
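A minimal sketch of that pattern, assuming FirstRecord is a Boolean listed in the component's ReadWriteVariables property (true here stands in for the question's Truth value):
// Inside the Data Flow Script Component's ScriptMain class.
public override void PostExecute()
{
    base.PostExecute();
    // The strongly typed Variables accessor replaces Dts.Variables here.
    Variables.FirstRecord = true;
}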