SSIS: How to create custom log files?

I would like to create a custom error log. I followed the tutorials on MSDN, but to no avail. I am just trying to log "Hello" to a file called test.txt. I have enabled logging and enabled the script task to handle the log I set up.
Any suggestions/tutorials/advice?
I would certainly appreciate it!
Thanks

General steps are:
Create the log connection manager.
Create whatever task you want to log (or do it for the entire package).
In the Event Handlers tab for that task or the package, place whatever you want to happen on the canvas. To write to a text file, I'd create a Script Task that writes to it. The file location can be a variable passed from the package, so you don't have to open the Script Task to change that configuration.
Info on setting variables in the script task: http://msdn.microsoft.com/en-us/library/ms135941.aspx
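As a rough sketch, the Script Task body for the last step could look like this (C#; the variable name User::LogFilePath is an assumption, use whatever variable your package defines and list it under the task's ReadOnlyVariables):

```csharp
// Inside the Script Task's Main() method (SSIS Script Task, C#).
public void Main()
{
    // Assumed package variable holding the log file path, e.g. C:\Logs\test.txt
    string path = Dts.Variables["User::LogFilePath"].Value.ToString();

    // Append rather than overwrite, so repeated events accumulate entries
    System.IO.File.AppendAllText(path, "Hello" + Environment.NewLine);

    Dts.TaskResult = (int)ScriptResults.Success;
}
```

If nothing is written, check that the variable is actually listed in ReadOnlyVariables and that the event handler the task sits in is firing at all (a breakpoint or a MessageBox in the task is a quick way to confirm).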

Related

KNIME - Execute an EXE program in a Workflow

I have a KNIME workflow; in the middle of it I must execute an external program to create an Excel file.
Is there some node that allows me to achieve this? I don't need any input or output, only to execute the program and wait for it to generate the Excel file (I need to use this Excel file in the next nodes).
There are (at least) two “External Tool” nodes which allow running executables on the command line:
External Tool
External Tool (Labs)
In case that should not be enough, you can always go for a Java Snippet node. The java.lang.Runtime class should be your entry point.
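The java.lang.Runtime route mentioned above can be sketched with ProcessBuilder (the more modern entry point) in a Java Snippet node; the command shown here is a stand-in, replace it with the path to your executable:

```java
import java.io.IOException;

public class RunExternal {

    // Launch an external program and block until it finishes,
    // returning its exit code. Waiting matters here: the Excel file
    // must exist before the downstream nodes try to read it.
    static int runAndWait(String... command)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.inheritIO(); // forward the program's stdout/stderr to the console
        Process p = pb.start();
        return p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in command for the demo; in the workflow this would be
        // something like "C:\\tools\\generate_report.exe" (hypothetical path)
        int exit = runAndWait("echo", "done");
        System.out.println("exit code: " + exit);
    }
}
```

A non-zero exit code usually means the external tool failed, so it is worth checking it before letting the workflow continue.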
You could use the External Tool node. The node requires inputs and outputs, but you can use a Table Creator node for the input:
This creates an empty table.
In the External Tool node you must specify an Input file and an Output file; depending on your use case this configuration may be meaningless, but the node requires it in order to work.
In this case the external app writes a text file with the result of the execution, so that file can be read back to get the information into KNIME.

Totally new to Talend ESB

I'm completely brand new to Talend ESB (I have some experience with Talend for data integration, but none at all with ESB).
That being said, I'm trying to build a simple route that watches a specific file path and gets the filename of any file dropped into it. It should then pass that filename to the child job (cTalendJob), and the child job will do something with the file.
I'm able to watch the directory, get the filename itself and System.out.println the filename, but I can't seem to pass it down to the child job. When it runs, the route goes into an endless loop.
Any help is GREATLY appreciated.
You must add a context parameter to your Talend job, and then pass the filename from the route to the job by assigning it to the parameter.
In my example I added a parameter named "Param" to my job. In the Context Param view of cTalendJob, click the + button and select it from the list of available parameters, and assign a value to it.
You can then do context.Param in your child job to use the filename.
I think you are making this more difficult than you need...
I don't think you need your cProcessor or cSetBody steps.
In your tRouteInput, if you want the filename, map "${header.CamelFileName}" to a field in your schema and you will get the filename. Mapping "${in.body}" would give you the file contents, but if you don't need those you can just map the required header. If your job reads the file as a whole, you could skip that step and just map the message body.
Also, check the default behaviour of the Camel file component: it is intended to put the contents of the file into a message, moving the file to a .camel subdirectory once complete. If your job writes to the directory cFile is monitoring, it will keep running indefinitely, as it keeps finding a "new" file. You would want to write any updated files to a different directory, or use a filename mask that isn't matched by the cFile component.
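For reference, in plain Apache Camel terms (cFile wraps the Camel file component), the default behaviour and the two workarounds correspond to endpoint URI options roughly like these (the paths are hypothetical):

```
file:/data/in                        consumed files are moved to /data/in/.camel (default)
file:/data/in?move=/data/archive     move consumed files elsewhere instead
file:/data/in?include=.*\.csv        only pick up files whose name matches the pattern
```

Either option stops the route from re-consuming the files your job writes back.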

Triggering execution of SSIS package when Files Arrive in a Folder

I have a scenario in SSIS: a package which does a simple data movement from a flat file to a database.
I have a specific folder, and I want to execute that package when a file arrives in the folder.
Step-By-Step using WMI Event Watcher Task
Create a WMI Connection manager. Use Windows credentials when running locally (you must be an admin to access WMI event info), and enter credentials when running remotely (be sure to encrypt your packages!)
Add a new WMI Event Watcher Task. Configure the WQLQuerySource property with a WQL query to watch a specific folder for files.
WQL is SQL-like but slightly different. Here's the example I'm using to watch a folder:
SELECT * FROM __InstanceCreationEvent WITHIN 10
WHERE TargetInstance ISA "CIM_DirectoryContainsFile"
and TargetInstance.GroupComponent= "Win32_Directory.Name=\"c:\\\\WMIFileWatcher\""
Breaking down this query is out of scope, but note the directory name in the filter and the string escaping required to make it work.
Create a For Each Loop and attach it to the WMI Event Watcher Task. Set it to use a Foreach File Enumerator, and set the folder to the folder you're watching.
In the Variable Mappings tab of the For Each Loop editor, assign the file name to a variable.
Use that variable name to perform actions on the file (for example, assign it to the ConnectionString property of a Flat File connection and use that connection in a Data Flow task) and then archive the file off somewhere else.
In the diagram below, this package will run until a file has been added, process it, and then complete.
To make the package run in perpetuity, wrap those two tasks in a For Loop with the EvalExpression set to true == true.
You can also consider registering object events using PowerShell and kicking off your SSIS package when those events are triggered. This requires a little less continuous overhead of having your package constantly running, but it adds an extra dependency.
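A minimal sketch of that PowerShell approach (the folder, filter, and package path are all hypothetical, and dtexec must be on the PATH):

```powershell
# Watch a folder and run the package whenever a file is created
$watcher = New-Object System.IO.FileSystemWatcher 'C:\Drop', '*.txt'
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    # $Event.SourceEventArgs.FullPath holds the new file's path
    & dtexec /File 'C:\Packages\LoadFlatFile.dtsx'
}
```

The extra dependency mentioned above is that this PowerShell session (or a script hosting it) has to stay running for the event subscription to survive.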
The WMI solution is interesting, but the environment/setup requirements are a bit complex for my taste. I prefer to solve this with a ForEach Loop Container and an Execute SQL wait task, both inside a For Loop Container.
I configure the ForEach Loop Container to loop over the files in a directory, pointing it at the expected file name. The only task inside this Container is a Script Task that increments a Files_Found variable - this will only happen when files are found.
Following the ForEach Loop Container is an Execute SQL task to wait between checks, e.g. WAITFOR DELAY '00:05:00' for a 5 minute delay.
Both that ForEach Loop and Execute SQL task are contained in a For Loop, which initializes and tests the Files_Found variable.
This solution is a lot simpler to manage - it doesn't require any particular setup, environment, permissions, stored credentials or knowledge of WMI syntax.

The connection "C:\\<path>\\*.txt" is not found. This error is thrown by Connections collection when the specific conn element is not found

I developed a SSIS package that creates several .txt files. These files are zipped and then the .txt files need to be removed. Using a foreach file enumerator, I loop through all the .txt files for a specific folder. The folder is retrieved from a variable in configuration and looks something like: C:\Folder\
The foreach loop uses: *.txt to gather all .txt files, does not traverse subfolder and uses the full qualified name.
In the Variable Mappings the "FileName" variable gets filled with the 0 index.
Within the foreachloop I use a File system task.
This task removes the .txt files which are generated before, using the FileName variable that is filled in the loop.
On the development machine this runs like a charm. All green, no problem at all. Then I copied the package and the configuration file to the test environment. A basic version without the file removal was running perfectly fine there; I replaced the package. Nothing big.
Now I run the SQL Server Agent job and it starts running. I can see all the text files appearing, and disappearing after it has created the zip files. However, once all files are removed the package finishes with errors, namely the error shown in the title.
I tried looking for a connection manager that might have been removed, and looked for connection managers named in the config that don't exist in the package.
No such thing was found. The annoying part is that the package is fully functional, but it still ends with the error.
EDIT: I noticed that if I run the package using the execute package utility with the dev. config it gives the same errors.
Hopefully someone is able to help me out.
Thanks in advance!
I managed to "fix" the issue: remove the File System Task responsible for deleting the files, then add it again and configure it from scratch.
I think this happens if you accidentally change the General parameters before changing the Operation parameter. The task keeps metadata for parameters that are no longer relevant, and upon execution it effectively says: "Wait, you defined this parameter but I don't need it; I'm checking for it anyway, and it's not there!"
It's a bug for sure

SQLServer 2008 : Name of backup file

I have SQL Server 2008 and I would like to change the name of the backup file.
I use an SSIS package to perform my backups.
The file's name looks like
[DATABASE_NAME]_backup_YYYY_MM_DD_XXXXXX_XXXXXX
This is automatically generated by SQL Server, and I want to remove the "_" characters.
How can I modify this?
Thank you in advance,
Andy.
I faced a similar situation today and used the following workaround.
Use an "Execute Process Task" to rename the backup. I created a batch file with the following command and executed it after the Database Backup task.
ren DBNAME.bak DBNAME_%date:~-4,4%%date:~4,2%%date:~7,2%.bak
The above command renames DBNAME.bak to DBNAME_yyyymmdd.bak. (The %date% substring offsets assume a US-style "ddd MM/DD/YYYY" date format; adjust the offsets for your locale.)
Keep the file in the same folder where you keep the backup file. In the Execute Process Task Editor, specify batch file name in the Executable property and the location of batch file in the WorkingDirectory property.
Hope it helps.
I believe that you can use the DestinationManualList for this, although I've never used it myself and I can't seem to find documentation or examples of it anywhere. It appears in the Properties list for the Backup Database Task, but not in the dialog for it.
I don't believe you can manually edit the DestinationManualList property. Right-click on the task and select Edit. In the dialog that opens, click "Back up databases across one or more files", then click the Add button. In the Select Backup Destination dialog, click File name and enter the path, including the name and extension of the file. What you've entered will then show up in the DestinationManualList collection.