I have an SSIS package where I am using a For Loop to check a condition (a web service response). The loop has a condition in its EvalExpression, and I set one of the variables used in that expression inside a Data Flow Task that runs within the loop. I am also logging the expression as well as the variables it uses, and even when the EvalExpression should become false, the loop keeps going. Can someone please explain why that is, or how I can get the EvalExpression to evaluate correctly on each iteration?
This is my EvalExpression:
(@[User::statusMessageLoop] == 1 && @[User::statusMessageLoopCheck] == 1)
The variable User::statusMessageLoopCheck is set at each iteration of the loop and when I want it to break out of the loop, I set it to 0.
I hope the images below help explain my situation, and thank you all for your input.
For Loop properties (screenshot)
Conditions under which I want the loop to keep going (screenshot)
Break out of the loop when these conditions are true (screenshot)
I am assuming you are reading your XML into columns properly and you know how to do that.
Add a script component and select "Transformation"
Add your variable (I am calling mine isTrue) as read-write
Select your column on the Input Columns page (read access is enough)
Enter the script
Add the following code to set your variable
Variables.isTrue = Row.isTrue == true ? 1 : 0;
This is basic if then else logic.
This is how you set a variable using SSIS's built-in tools.
If this isn't working, maybe you should check to make sure you are reading the XML properly.
EDIT:
You need an interim variable to hold the test result, and then you can set the package variable in PostExecute.
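Roughly, the pattern looks like this - a minimal sketch, assuming the input column is named isTrue and the read-write package variable User::isTrue is an Int32 (adjust the names and types to your package):

bool test;   // interim variable, declared at class level, outside of row processing

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Capture the per-row result; read-write package variables
    // cannot be written from here.
    test = Row.isTrue;
}

public override void PostExecute()
{
    base.PostExecute();
    // Package variables may only be written in PostExecute.
    Variables.isTrue = test ? 1 : 0;
}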
Sorry about this. I learned this lesson with you. I typically handle all XML and JSON entirely inside C#.
Note that bool test is declared outside of row processing as well.
This assumes you only have one row of processing as well.
In Jitterbit Dataloader 10.37 I want to create CSV files from Salesforce data, but only if the query returns data.
I checked "Do not create empty files" on the Local File target type, but it is still creating a CSV with just the header and no data. I do not want files created with no data in them. Dropping the header entirely is not an option - I will need it when the query does return data.
Any suggestions? What am I missing?
I've seen this happen in situations where the write operation is after a couple of other operations. In that instance a header is written in the first operation, then another header is written in a second operation. The first row is read as the header, the second row (another header) is read as data, and written out.
I always add in a condition where I check if one of the fields equals its name. Something like this, to just skip those rows.
<trans>
If(Id == "Id",
  false,
  true
);
</trans>
The best way to do this is to send your output to a variable array and then check that variable to see if data is present. So set your target to a global variable, then add a script after that target and do your validation there. To test your script, use DebugBreak(); and look at the variable content - that way you can see what is going into it.
Then make your condition statement.
If(Length($variable) > 1, RunOperation("operation:myexport"), "novalue");
Newbie question. I believe this is one of the most common errors - I found several threads about it on the MSDN forums - but apparently there are many ways of running into it. Please help.
I am trying to move and rename some images from one folder to another (and yes I have seen the blog by Rafael Salas and many others but none of them helps).
Like moving from \\server1\images\123-456.jpg to \\server2\images\123456.jpg
I am using a Foreach Loop.
The source variable is built dynamically. In the first iteration #imagePath = \\server1\images\123-456.jpg (I checked using MessageBox.Show).
I have defined #remoteImagePath = \\server2\images\ (which never changes) and #revisedImageName = 123456.jpg (built dynamically in a Script Task using a string replace - also checked using MessageBox.Show).
In the File System Task, I am using SourceVariable as #imagePath and using Expressions to define the Destination as @[User::remoteImagePath] + "\" + @[User::revisedImageName]
I don't know for what reason, but I am getting this error:
Failed to lock variable "\server2\images\123456.jpg" for read access with error 0xC0010001 "The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created.".
Set the IsSourcePathVariable to True and set the SourceVariable to a SINGLE variable.
If you use the expression editor to set Source using multiple variables, or anything else apart from a SINGLE variable, it will not work. If you want to concatenate hard-coded strings with variables, or combine multiple variables, do this by creating a new variable in the package, setting its EvaluateAsExpression property to True, and building the full path in that variable's Expression - so the task still references a single variable.
I believe the expression editor needs this syntax:
@[User::remoteImagePath] + "\\" + @[User::revisedImageName]
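Put together, a sketch of the single-variable approach (the variable name User::destinationImagePath is just an example, not from the original package):

User::destinationImagePath (String), EvaluateAsExpression = True
    Expression: @[User::remoteImagePath] + "\\" + @[User::revisedImageName]

File System Task:
    IsSourcePathVariable = True, SourceVariable = User::imagePath
    IsDestinationPathVariable = True, DestinationVariable = User::destinationImagePath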
I need some help.
I am exporting some data from an OLE DB source to a .csv file. I don't want the headers to appear twice in the destination. If I uncheck the "Column names in the first data row" property, the headers don't get populated on the first execution either.
Output as of now.
Col1,Col2
A,B
Col1,Col2
C,D
How can I make the package run in such a way that if the file is empty, the headers get inserted, but if the execution happens again, the headers are not included - just the data?
There was a similar thread, but I wasn't able to apply the solution, as it wasn't clear how to use expressions to get the number of rows in the destination itself. It was a long time ago, so I created a new question.
Your help is deeply appreciated.
-Akshay
Perhaps I'm missing something, but this works for me. I am not running into the read-only trouble with ColumnNamesInFirstDataRow.
I created a package-level variable named AddHeader, type Boolean, and set it to True. I added a Flat File Connection Manager, named FFCM, and configured it to use a CSV output with 2 columns: HeadCount (int) and AddHeader (boolean). In the Properties for the Connection Manager, I added an Expression for the ColumnNamesInFirstDataRow property and assigned it a value of @[User::AddHeader]
I added a script task to test the size of the file. It has read/write access to the Variable AddHeader. I then used this script to determine whether the file was empty. If your definition of "empty" is that it has a header row, then I'd adjust the logic in the if check to match that length.
public void Main()
{
    string path = Dts.Connections["FFCM"].ConnectionString;
    System.IO.FileInfo stats = null;
    try
    {
        stats = new System.IO.FileInfo(path);
        // checking length isn't bulletproof based on how the disk is configured
        // but should be good enough
        // http://stackoverflow.com/questions/3750590/get-size-of-file-on-disk
        if (stats != null && stats.Length != 0)
        {
            this.Dts.Variables["AddHeader"].Value = false;
        }
    }
    catch
    {
        // no harm, no foul
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
I looped through twice to ensure I'd generate the append scenario
I deleted my file and ran the package and only had a header once.
The property that controls whether the column names will be included in the output file or not is ColumnNamesInFirstDataRow. This is a readonly property.
One way to achieve what you are trying to do would be to have two Data Flow Tasks on the control flow surface, preceded by a Script Task. These two Data Flow Tasks will be identical except that they refer to two different flat file connection managers. Again, the only difference between those two connection managers would be the value of ColumnNamesInFirstDataRow: one true, the other false.
Use this Script task to decide whether this is the first run or subsequent runs. Persist this information and check it within the script. Either you can have a separate table for this information, or use some log table to infer it.
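As a rough illustration of that Script Task (just a sketch, assuming a read-only string variable User::TargetFilePath pointing at the output file and a read-write Boolean variable User::IsFirstRun, where "first run" simply means the target file does not exist yet or is empty):

public void Main()
{
    // Hypothetical variables: TargetFilePath (read-only), IsFirstRun (read-write).
    string path = Dts.Variables["TargetFilePath"].Value.ToString();
    System.IO.FileInfo info = new System.IO.FileInfo(path);

    // Treat a missing or zero-length file as the first run.
    bool isFirstRun = !info.Exists || info.Length == 0;
    Dts.Variables["IsFirstRun"].Value = isFirstRun;

    Dts.TaskResult = (int)ScriptResults.Success;
}

Precedence constraints after the Script Task can then check @[User::IsFirstRun] to route execution to the data flow whose connection manager writes the header, or to the one that appends without it.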
The following solution worked for me. You can also try it:
Create three variables.
IsHeaderRequired
RowCount
TargetFilePath
Get the source row count using an Execute SQL Task and save it in the RowCount variable.
Add a Script Task. Add TargetFilePath and RowCount as read-only variables, and IsHeaderRequired as a read/write variable.
Edit the script and add the following lines of code:
string targetFilePath = Dts.Variables["TargetFilePath"].Value.ToString();
int rowCount = (int)Dts.Variables["RowCount"].Value;
System.IO.FileInfo targetFileInfo = new System.IO.FileInfo(targetFilePath);

if (rowCount > 0)
{
    // Treat a missing file the same as an empty one, so the header gets written.
    if (!targetFileInfo.Exists || targetFileInfo.Length == 0)
    {
        Dts.Variables["IsHeaderRequired"].Value = true;
    }
    else
    {
        Dts.Variables["IsHeaderRequired"].Value = false;
    }
}

Dts.TaskResult = (int)ScriptResults.Success;
Connect the Script Task to your data flow.
Click the flat file connection manager (i.e. your target file) and go to its Properties. In the Expressions property, add the following, as shown in the screenshot.
Map the connectionString to variable "TargetFilePath".
Map the ColumnNamesInFirstDataRow to "IsHeaderRequired".
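In other words, the property expressions on that connection manager end up looking roughly like this (property on the left, expression on the right):

ConnectionString          : @[User::TargetFilePath]
ColumnNamesInFirstDataRow : @[User::IsHeaderRequired]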
Expressions for the flat file connection manager (screenshot).
Final package (screenshot).
Hope this helps
A solution ....
First, add an SSIS integer variable in the scope of the Foreach Loop or higher - I'll call this RowCount - and make its default value negative (this is important!). Next, add a Row Count to your Data Flow, and assign the result to the RowCount SSIS variable we just made. Third, select your Connection Manager (don't double-click) and open the Properties window (F4). Find the Expressions property, select it, and hit the ellipsis (...) button. Select the ColumnNamesInFirstDataRow property, and use an expression like this:
@[User::RowCount] < 0
Now, when your package starts, RowCount has the static value of -1 or another negative number. When the data flow starts for the first time in your loop, the ColumnNamesInFirstDataRow property will have a value of TRUE. When the first data flow completes, the row count (even if it's zero) is written to the RowCount variable. On the second iteration of the loop, the Connection Manager is then reconfigured to NOT write column names...
Using a Script Task, I have stored data - i.e. a DataSet or DataTable object - in an SSIS variable of data type Object.
I want to pull the data out of this SSIS Object variable and store it in a destination. This is possible within a Script Task itself, I know. But how is this possible using other SSIS tasks? We pull data from sources by connecting to a server and using a SQL command.
But how do we pull data from an SSIS Object variable? I want a solution other than the Foreach Loop Container. Without using the Foreach Loop Container, what is the solution? I would rather not use the Foreach Loop Container, since there are more than 300 records.
AFAIK, without scripting you cannot pull data from an Object variable. You can use a Script Component as a data source and add rows to its output from within the script.
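A minimal sketch of such a Script Component source, assuming the Object variable (here called User::ResultSet and added as a read-only variable on the component) holds a DataTable, and that an output with matching columns has been defined - Col1 and Col2 are placeholder names:

public override void CreateNewOutputRows()
{
    // Cast the Object variable back to the DataTable stored by the Script Task.
    System.Data.DataTable table = (System.Data.DataTable)Variables.ResultSet;

    foreach (System.Data.DataRow row in table.Rows)
    {
        Output0Buffer.AddRow();
        Output0Buffer.Col1 = row["Col1"].ToString();   // placeholder column names
        Output0Buffer.Col2 = row["Col2"].ToString();
    }
}

Note that if the variable had been populated by a Recordset Destination instead of a Script Task, it would hold an ADO recordset, which would first need to be loaded into a DataTable (for example with an OleDbDataAdapter) before it can be read this way.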
300 records is a very small number. If the Foreach loop works, then why not just use it? If you then have performance problems, and if you can trace them to the Foreach loop, then you should look into other options. But I have a hard time imagining that looping over 300 records is a significant problem. Of course you have more information than us, so maybe it really is an issue.
How do you check to see if a file does not exist in SQL Server Integration Services 2005?
Is there a native SSIS component which will just do this for you?
I have checked for file existence using a Script Task and then branched accordingly.
You can do something like
If System.IO.File.Exists("\\Server\Share\Folder\File.Ext") Then
    Dts.TaskResult = Dts.Results.Success
Else
    Dts.TaskResult = Dts.Results.Failure
End If
Although there are no native components for this, there are several third party components for SSIS that you can use for this purpose.
The File System Task in SSIS is basically for move, copy, delete, etc., but does not support file existence checks.
#Raj More gave a good solution. Another way that I have used before is to create a Foreach Loop Container that loops over the file system for a file spec. If you know the name of the file you want, then you can set the name in a variable and set the spec to equal the variable in the expression tab for the Foreach Loop Container. You could also just specify a directory or a partial file name if you don't know the exact name but know the naming convention or know there will be no other files in the folder.
If you want to take a specific action based on whether or not there is a file, then you could create a variable with a default value of 0 and create a script task in the Foreach Loop Container that increments the variable. You could also just put the commands in the Foreach Loop Container that you want to execute if you want to execute it for the existence of each individual file. If you want to take an action based on the absence of the file, then you could restrict your precedence constraint after the Foreach Loop Container so that it is restricted on constraint and expression and make it check if the counter variable is > 0.
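For example, assuming the counter variable is named User::FileCount with a default of 0, the "file found" path after the Foreach Loop Container could use Expression and Constraint with an expression like

@[User::FileCount] > 0

while the "file missing" path could use @[User::FileCount] == 0.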
#Raj's solution could also be used to increment the variable. Instead of using an If Else to raise an error or success result, you could do this:
C#
if (System.IO.File.Exists(@"\\Server\Share\Folder\File.Ext"))
{
    Dts.Variables["my_case_sensitive_variable_name"].Value =
        (int)Dts.Variables["my_case_sensitive_variable_name"].Value + 1;
}
VB.NET
If System.IO.File.Exists("\\Server\Share\Folder\File.Ext") Then
    Dts.Variables("my_case_sensitive_variable_name").Value =
        CInt(Dts.Variables("my_case_sensitive_variable_name").Value) + 1
End If
The advantage of this approach is that the package does not necessarily need to fail in the absence of a file. You could also use a variable for the file name in case it changes, defined either as a package variable or created solely within the script task. The only shortcoming of #Raj's approach is that you have to know the name of the file you want to check.
Another possibility is to execute a File System Task to rename the file to its existing name or copy the file to its existing location. If the file doesn't exist, then you can route the error to an action. I don't recommend this solution, but I remember using it years ago in one instance where it actually made sense. But in that particular instance, I was actually copying it to a real location.
Good luck!