Skip a Data Flow Component on Error - SSIS

I want to skip a component of my data flow task, when this component throws a specific error.
To be precise, I read data from different source files/connections in my dataflow and process them.
The problem is that I can't be sure if all source files/connections will be found.
Instead of checking beforehand whether I can connect to each source, I want the data flow to continue executing by skipping the component that reads data from the missing source.
Is there any way to continue the data flow after the component that originally threw the error, by jumping back from the OnError event handler (of the data flow task) into the next component? Or is there any other way to continue the data flow task execution by skipping the component?

As #praveen observed, out of the box you cannot disable data flow components.
That said, I could see a use case for this: perhaps a secondary source that augments existing data and may or may not be available. If I had that specific need, I'd write a script component that performs the data reading, parsing, casting of data types, etc. when a file is present, and that sends nothing, but keeps the metadata intact, when no source is available.

You can do the following based on what I understand:
1) Create a script component that checks which source is available
2) Based on the source connection, assign the source accordingly
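A minimal sketch of the script-source approach described above: a Script Component (configured as a Source) that emits rows only when the file exists, so a missing file produces an empty but schema-valid output instead of an error. The `SourceFilePath` variable, file layout, and `Output0` columns are assumptions for illustration; this depends on the SSIS runtime and cannot run standalone.

```csharp
// Script Component (Source): send nothing, but keep the output metadata
// intact, when the source file is missing.
public override void CreateNewOutputRows()
{
    // "User::SourceFilePath" is an assumed read-only package variable
    string path = Variables.SourceFilePath;

    if (!System.IO.File.Exists(path))
        return; // no file: emit zero rows, downstream metadata unchanged

    foreach (string line in System.IO.File.ReadLines(path))
    {
        // Illustrative parsing for a two-column CSV
        string[] fields = line.Split(',');
        Output0Buffer.AddRow();
        Output0Buffer.Id = int.Parse(fields[0]);
        Output0Buffer.Name = fields[1];
    }
}
```

Because the component owns its own file access, a missing file is simply an empty read rather than a validation or acquisition failure.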

Related

ESQL to read value from cache property file

I am new to ESQL. In my message flow I have a lookup file which contains some values for forking messages. Now I have a new requirement to read a value from the lookup cache file and search for a string: if it contains a particular string, duplicate the message and fork it to multiple queues; if the string doesn't exist, fork it to a single queue. Can someone help with this?
Thanks,
Vinoth
You should not read the file for every message, but instead cache the contents of the file in SHARED variables.
For this your message flow should have two input queues: one for receiving the messages to route, and a second, technical queue which receives messages that initiate the reload of the file into the cache.
This second part of the flow should look like this: MQ Input -> File Read -> Compute
And put the logic of storing the file contents to SHARED variables into the Compute.
So as you see, you don't read the file in ESQL, you do that by using the File Read node in your flow, and use ESQL only to process the file contents. And you can access the values stored in the SHARED variables in the first part of the flow where you do the routing.
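The reload branch's Compute node might look like the following sketch. The variable name, tree structure, and the XMLNSC parser are assumptions; the File Read node has already placed the parsed file contents into the input tree by the time this node runs. Declaring the SHARED variable at schema level (outside the module) makes it visible to the routing part of the flow as well.

```esql
-- Schema-level SHARED row: keeps its value across messages and is
-- visible to the other nodes in the flow that declare it.
DECLARE LookupCache SHARED ROW;

CREATE COMPUTE MODULE ReloadLookupCache_Compute
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
        -- Serialize access while the cache is being replaced
        BEGIN ATOMIC
            SET LookupCache.Data = NULL;                        -- drop the stale copy
            SET LookupCache.Data = InputRoot.XMLNSC.LookupFile; -- cache the new contents
        END;
        RETURN TRUE;
    END;
END MODULE;
```

The routing Compute node can then test the cached values with ordinary ESQL (for example `POSITION('somestring' IN LookupCache.Data.SomeField) > 0`, field name assumed) and PROPAGATE to one or more output terminals accordingly.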

How to do File System Task in SSIS depending on Result of Data Flow

I'm writing (what I thought to be) a simple SSIS package to import data from a CSV file into a SQL table.
On the Control Flow task I have a Data Flow Task. In that Data Flow Task I have
a Flat File Source "step",
followed by a Data Conversion "step",
followed by an OLE DB destination "step".
What I want to do is to move the source CSV file to a "Completed" folder or to a "Failed" folder based on the results of the Data Flow Task.
I see that I can't add a File System step inside the Data Flow Task, but I have to do it in the Control Flow tab.
My question is: how do I do a simple thing like assign a value to a variable (I saw how to create variables and assign them values in the bottom pane of Data Tools (2012)) depending on whether the "step" succeeds or fails?
Thanks!
(You can tell by my question that I'm an SSIS rookie - and don't assume I can write a C# script, please)
I have used VB or C# scripts to accomplish this myself. Since you do not want to use scripts, I would recommend using different paths for the project to flow: have your success path lead to a File System Task that moves the file to the "Completed" folder, and your failure path lead to one that moves it to the "Failed" folder. This keeps it simple and accomplishes what you are looking for.
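For reference, the VB/C# script variant mentioned above might look like this sketch of a Script Task that moves the file based on a Boolean package variable set earlier in the flow. The variable names and folder paths are assumptions, and the code depends on the SSIS runtime.

```csharp
// Script Task: move the source CSV to Completed or Failed depending on
// an assumed "User::LoadSucceeded" package variable.
public void Main()
{
    string source = (string)Dts.Variables["User::SourceFile"].Value;
    bool loadSucceeded = (bool)Dts.Variables["User::LoadSucceeded"].Value;

    // Folder names are illustrative
    string targetFolder = loadSucceeded ? @"C:\Import\Completed" : @"C:\Import\Failed";
    string target = System.IO.Path.Combine(targetFolder,
                                           System.IO.Path.GetFileName(source));

    System.IO.File.Move(source, target);
    Dts.TaskResult = (int)ScriptResults.Success;
}
```

The precedence-constraint approach in the answer above achieves the same result without any code, which is why it is the recommendation here.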

SSIS Excel File Null Data Microsoft BI

I'm new to SSIS. When I try to load data from an Excel file and there is another data flow task in the same package, the table is just filled with nulls, e.g., dim_Alarm(null, null, null, null). However, when I add a new package where the data flow task is alone, the data loads correctly.
Look at the Connection Manager for the Excel Source for the dataflow that is returning null data. There is probably some difference - maybe a typo error? - between the one that returns null data, and the one that loads the data from the file.
It is unlikely that the presence or absence of the other data flows is causing this problem, unless they are hitting the same Excel file, or they are hitting the same database table dim_Alarm. It is much more likely that there is some small difference between the data flow that loads nulls and the data flow that works (in the empty package).
You can also add a Data Viewer to the data flow that isn't behaving as you expect. The Data Viewer goes on one of the arrows between transformations in the data flow. When you run the package in BIDS, the Data Viewer will show you the data that flows through that point. If the data is missing, you may be able to see where it got lost. Is there data coming out of the Excel Source, but after the next transformation there is no more data? Then that is where the problem is.

SSIS - Continue Package Flow even after inner task in a Foreach Loop Container fails

In the figure below, why is the Foreach Loop Container failing despite the fail path (of the DFT that failed) being handled correctly?
How can I get the loop to continue after handling fail path?
If it helps to know what's going on in the package, here's the gist:
We have a requirement where data from Excel files must be loaded into
a DB. The package we have splits each Excel file into constituent CSV
files (one CSV per sheet), and loads the CSVs into the DB. It is
possible that some of the sheets have issues (missing columns, data
type mismatch etc), and such erroneous CSVs are captured by the fail
path of the DFT. Ideally the package must resume processing the rest
of the CSVs, and the rest of the Excel files, and exit successfully.
Do you have any OnError event handlers defined for that Data Flow Task? If so, you could set the system variable Propagate (type Boolean) in that event handler's scope to False.
Also, please go through Gracefully Handling Task Error in SSIS Package.
There is a property on every SSIS component called MaximumErrorCount, which defines the number of errors that particular component can accept before failing the whole package.
You have to increase this value for each component that you want to keep executing after an error.

SSIS - "switchable" file output for debug?

In an SSIS data-flow task, I'm using a Multicast transform at a key part of the flow which I want to hang a File Output destination off.
This, in itself, is no problem to do. However, I only want output in the file if I enable it; i.e., I'd be using it for debugging the data if the flow fails unexpectedly and it's not immediately obvious from the default log output why this occurred.
My initial thought was to create a File Output whose output file was obtained from a variable, and by default, the variable would contain 'nul' - i.e., the Windows bit-bucket - which I could override through configuration in the event of needing to dig further.
Unfortunately this isn't working: the File Output complains, saying "The filename is a device or contains invalid characters". So it looks like I can't use the bit bucket.
Is anyone aware of a way to make output "switchable"? This would make enabling debug a less risky proposition than editing the package and dropping a File Output in directly.
I suppose I could have a Conditional Split off the Multicast which only sends output if a variable is set to some given value, but this seems overly messy. I'll be poking at other options, but if anyone has any suggestions/solutions, they'd be welcome.
I'd go for the Conditional Split, redirecting rows to the Konesans Trash Destination adapter if your variable isn't set, and otherwise sending them to your file.
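If you do go the Conditional Split route, the condition for the file-bound output can be driven by a package variable you override through configuration. A sketch in SSIS expression syntax, where `User::DebugOutput` is an assumed Boolean package variable (SSIS expressions do not support comments, so the explanation stays here):

```
@[User::DebugOutput] == TRUE
```

Rows matching this condition go to the File Destination; the default output goes to the trash destination, so with the variable left at False the file path is never opened at all.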