Why is SSIS logging not working in a Script Component / Script Task inside a Foreach container?

I have an SSIS package that has a ForEach container (to loop through multiple files).
Outside the loop I also have a Script Task where I'm using the Dts.Log() method to log some information. It works fine.
Inside the loop I have several tasks, such as another Script Task and a Data Flow Task. I'm using Dts.Log() in that Script Task as well, but it logs nothing. I've checked all the logging settings and they seem to be right.
Inside the data flow I also have one Script Component where I'm trying to use the Log() method, with no success. Once again I've checked my settings and they seem to be correct (according to this question: SSIS: Why is this not logging?).
I'm able to log inside the loop by raising events, though (FireInformation http://technet.microsoft.com/en-us/library/microsoft.sqlserver.dts.runtime.idtscomponentevents.fireinformation(v=sql.105).aspx).
I remember reading somewhere that there are some restrictions on debugging inside a Foreach container. Are there similar restrictions on logging as well, or is this something else?
I can't provide any code samples (at least not right now), so we'll have to keep the discussion at a certain level.
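For reference, a minimal sketch (not the asker's actual code) of the two logging approaches mentioned above, in a C# Script Task and Script Component; the sub-component names and messages are illustrative only:

    // Script Task (ScriptMain.Main) -- illustrative sketch
    public void Main()
    {
        // Written through the configured log providers; requires the
        // ScriptTaskLogEntry event to be selected in the task's logging details.
        Dts.Log("Processing file inside the loop", 0, new byte[0]);

        // Workaround: raise an information event instead (OnInformation is loggable).
        bool fireAgain = true;
        Dts.Events.FireInformation(0, "MyScriptTask",
            "Processing file inside the loop (via event)", string.Empty, 0, ref fireAgain);

        Dts.TaskResult = (int)ScriptResults.Success;
    }

    // Script Component inside the data flow -- illustrative sketch
    public override void PreExecute()
    {
        base.PreExecute();

        // ScriptComponent.Log goes through the Data Flow Task's logging configuration.
        this.Log("Script Component starting", 0, new byte[0]);

        // Equivalent event-based workaround in a Script Component.
        bool fireAgain = true;
        ComponentMetaData.FireInformation(0, "MyScriptComponent",
            "Script Component starting (via event)", string.Empty, 0, ref fireAgain);
    }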

Related

I'm new to SSIS - ForEach Loop with Data Flow Task won't run completely. The Data Flow Task runs fine for one file. What am I missing?

I have created a SQL table to hold records, and I've created an SSIS package with a ForEach Loop to go through flat files, take information from them, and load it into the SQL table. I have created my variables and made sure "DelayValidation" is set to True.
The Data Flow Task runs fine separately with no errors. When I move it into the ForEach Loop, it again runs fine pulling in that one file.
This image shows the variables I have created:
But it will not go on to the next file. I have researched on the internet and watched multiple videos and can't seem to find where the issue lies. I hope I have given enough information. I have tried to debug and that does not show anything.
Progress screen after running the whole package (Start Debugging) with the ForEach Loop and Data Flow Task:
Screenshot of the Data Flow Task that successfully completed outside of the ForEach Loop:
Screenshot of the SQL table showing the one record that came from the task in SSIS:
Screenshot of the txt files to be included in the package and added to the SQL table.
I don't know enough about the coding to view the code and place a breakpoint. I'd be happy to try if someone could direct me.
The last picture is a detailed screenshot of the variables: what they were created for and their expressions.

SSIS ForceExecutionResult not working

I have a package that contains a Script Task. Probably due to C# library issues on some of the servers, this task may succeed on some machines but fail on others (reporting "Cannot load script for execution").
I want to force the task to succeed by setting the ForceExecutionResult = Success option on this task. However, when running the package, I found this doesn't work; the task still fails the same way as before.
I don't want to modify MaxErrorCount for the package because I want errors from other components to still be reported. At the same time, even if this Script Task fails during validation, I want the package to report success. Is there any way to achieve this?
To let your package continue execution, you can set the DelayValidation property to True on the Script Task (so the package will begin executing), and then set the Precedence Constraint that follows this Script Task to Completion instead of Success.

How to run a JavaScript function when all files have been uploaded

I am using a PrimeFaces FileUpload to send multiple files. I would like to know how to run a function on oncomplete when all files have been transferred.
I am opening a p:dialog at the start, and if I use oncomplete, the function already runs after the first file is sent. I would like to run the function only after all of the items have completed.
Does anyone know?
Edit:
Alternatively, if I could pass the entire list of files that were uploaded, I could handle the event from the bean. Does anyone know how I can pass the full list of files included in the upload?
I guess it's not possible. Take a look at this; there's a patch that does what you need: https://code.google.com/p/primefaces/issues/detail?id=3836

Problem inserting a user variable into a connection string expression

I am trying to import 110 Excel files into a SQL Server database in SSIS 2008.
I am at the point where I have dragged in my Foreach Loop container and pointed it to the correct folder. I have made a string variable (with Foreach Loop scope) and set its default value to a file in the source folder of Excel files.
When I try to build a connection string expression, the user variable is not in the list; the only variables in the list are system variables.
Does anyone have any idea where I might be going wrong? I feel that I have set the correct scope by defining the string variable from the Foreach Loop.
(The User::FilePath variable that I made is not visible in the Package Explorer either.)
Thanks.
I find I generally have a better SSIS experience when I keep my variables at the package level. I suspect the connection manager doesn't like the connection string variable only being visible inside the loop, and that may be causing it some heartache for design-time validation. The user variable(s) you've created do exist; they're just not visible at the scope you're looking at. If you've clicked on the canvas/background of SSIS, you'll only see package-level variables. My suspicion is the variables are scoped to the ForEach Loop, or possibly even to the Data Flow or other tasks within the ForEach container.
If you really want to find where you created those variables, look at the oft-unused "Package Explorer" tab. Keep expanding Executables and looking at the Variables item until you find your missing variables.
Finally, if you have variables at the "wrong" level, use BIDS Helper. Even if you have the variables at the right level, grab BIDS Helper. It's free and it really improves the package development experience.
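Once the variable is at a scope the expression builder can see, the connection manager's ConnectionString expression would typically reference it along these lines (a sketch only, using the User::FilePath variable from the question; the provider and Extended Properties depend on your workbook format, e.g. ACE 12.0 / Excel 12.0 for .xlsx):

    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::FilePath] + ";Extended Properties=\"Excel 8.0;HDR=YES\";"

This goes on the Expressions page of the Excel connection manager, against the ConnectionString property.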

How to get a response from a script back to Hudson and to fail/success emails?

I'm starting a Python script from a Hudson job. The script is started through 'Execute Windows batch command' in the build section as 'python my_script.py'.
Now I need to get some data created by the script back to Hudson and add it to the fail/success emails. My current approach is that the Python script writes the data to stderr, which the batch command redirects to a temp file and then reads into an environment variable. I can see the environment variable correctly right after the script execution (using the set command), but in the post-build actions it's not visible any more. The email sending is probably done in a different process, so the variables are no longer visible there. I'm accessing the env vars in the email as ${ENV, varname} (or actually, in debug mode, as $ENV to print them all).
Is there a way to make the environment variable global inside Hudson?
Or can someone provide a better solution for getting data back from the Python script to Hudson?
All the related parts (Hudson, batch and Python script) are under my control and can be modified as needed.
Thanks.
Every build step gets its own shell. This implies that your environment variables are only valid within that build step.
You can just write the data in a nice format to standard output (use a header that is easy to identify), and if the job fails, that output gets attached to the email.
If you insist on including only the data, you can use the following token in the Editable Email Notification post-build action (Email-ext plugin):
${BUILD_LOG_REGEX, regex, linesBefore, linesAfter, maxMatches, showTruncatedLines, substText}
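For example (a sketch only, assuming the script prints its data on lines starting with a marker such as RESULT:, which is not from the original post), the token could be filled in like this to pull just those lines into the email body:

    ${BUILD_LOG_REGEX, regex="^RESULT:.*", maxMatches=0, showTruncatedLines=false}

Here maxMatches=0 includes every matching line and showTruncatedLines=false suppresses the "...truncated..." markers between matches.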