Getting a custom logback appender to work inside a Dataflow job

I have a custom log appender defined in my logback.xml, which pushes data to an Elasticsearch endpoint. Along with this, there are console and file appenders.
This XML, along with the code, is part of a Dataflow job and gets uploaded to Dataflow. I can see the log statements in the Dataflow logs in Stackdriver just fine, but the ES appender never kicks in. It looks like Google overrides all custom appenders with one that just writes to the Stackdriver logs.
Is there any way to use a custom log appender (e.g. with logback) inside Dataflow?
Related thread, not so helpful - https://mail-archives.apache.org/mod_mbox/beam-user/201705.mbox/%3CCABnhQv_J0Qb5xnUdOhVdaDf4d8OE+f-AkZ2MrAruQgAAZ7XYGQ#mail.gmail.com%3E
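For reference, a minimal sketch (assuming the worker's SLF4J binding really is logback-classic; the class and method names here are just placeholders) that dumps whichever appenders are actually attached to the root logger at runtime, e.g. called from a DoFn's @Setup method, to confirm what ends up active on a worker:

```java
import java.util.Iterator;

import org.slf4j.LoggerFactory;

import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.Appender;

public class AppenderDump {
    public static void dumpRootAppenders() {
        // This cast only succeeds if logback-classic is the active SLF4J binding;
        // if Dataflow has swapped in its own binding, it throws a ClassCastException.
        LoggerContext ctx = (LoggerContext) LoggerFactory.getILoggerFactory();
        Logger root = ctx.getLogger(Logger.ROOT_LOGGER_NAME);
        for (Iterator<Appender<ILoggingEvent>> it = root.iteratorForAppenders(); it.hasNext(); ) {
            Appender<ILoggingEvent> appender = it.next();
            System.out.println("active appender: " + appender.getName()
                    + " (" + appender.getClass().getName() + ")");
        }
    }
}
```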

Related

How to log processed md files with JUnit?

I work on a project using Concordion, Maven and JUnit. There are a lot of md files which are processed, but the log output does not contain the name of the processed md file, so it is hard to link log lines to executed tests.
How is it possible to log the name of the processed md file?
Assuming you want the details logged in the output specification, take a look at the echo command (or the embed extension if you want to alter the HTML formatting).
If you just want it logged to a log file, you'll need to look at a logging framework (e.g. slf4j/logback). The logback extension can make the log accessible from the Concordion output specification.
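If you go the logging-framework route, one generic slf4j/logback idea (this is not Concordion-specific API, and the key name specName is just a placeholder) is to push an identifier for the spec under test into the MDC from the fixture, then include %X{specName} in the logback pattern so every log line written while that spec runs carries it:

```java
import org.junit.After;
import org.junit.Before;
import org.slf4j.MDC;

public class SpecLoggingFixture {

    @Before
    public void tagLogsWithSpec() {
        // Use the fixture class name as a stand-in for the processed .md file;
        // Concordion maps each specification to a fixture class.
        MDC.put("specName", getClass().getSimpleName());
    }

    @After
    public void clearTag() {
        MDC.remove("specName");
    }
}
```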

Totally new to Talend ESB

I'm completely brand new to Talend ESB (I've used Talend for data integration, but the ESB side is totally new to me).
That being said, I'm trying to build a simple route that watches a specific file path and gets the filename of any file dropped into it. The route should then pass that filename to the child job (cTalendJob), and the child job will do something with the file.
I'm able to watch the directory, get the filename, and System.out.println the filename, but I can't seem to 'pass' it down to the child job. When it runs, the route goes into an endless loop.
Any help is GREATLY appreciated.
You must add a context parameter to your Talend job, and then pass the filename from the route to the job by assigning it to the parameter.
In my example I added a parameter named "Param" to my job. In the Context Param view of cTalendJob, click the + button and select it from the list of available parameters, and assign a value to it.
You can then do context.Param in your child job to use the filename.
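For illustration, the child-job side then looks something like this inside a tJava component (assuming the context variable is named Param, as above, and is of type String):

```java
// Inside a tJava component of the child job; Talend generates the context object,
// and "Param" becomes a field on it once it is defined as a context variable.
String incomingFile = context.Param;
System.out.println("Child job received file: " + incomingFile);
```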
I think you are making this more difficult than you need...
I don't think you need your cProcessor or cSetBody steps.
In your tRouteInput, if you want the filename, map "${header.CamelFileName}" to a field in your schema and you will get the filename. Mapping "${in.body}" would give you the file contents, but if you don't need that you can just map the required header. If your job should read the file as a whole, you could skip that step and just map the message body.
Also, check the default behaviour of the Camel file component - it is intended to put the contents of the file into a message, moving the file to a .camel subdirectory once complete. If your job writes to the directory cFile is monitoring, the route will keep running indefinitely, as it keeps finding a "new" file - you would want to write any updated files to a different directory, or use a filename mask that the cFile component isn't monitoring.
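To make that behaviour concrete, here is roughly the equivalent route in plain Apache Camel Java DSL (outside the Talend designer; the directory and the direct:processFile endpoint are placeholders for whatever your cTalendJob step does):

```java
import org.apache.camel.builder.RouteBuilder;

public class WatchDirectoryRoute extends RouteBuilder {
    @Override
    public void configure() {
        // By default the file component moves each consumed file into a .camel
        // subdirectory; "move" relocates it elsewhere so the route doesn't keep
        // re-consuming files written back into the watched folder.
        from("file:/data/incoming?move=../processed")
            // the filename travels in the CamelFileName header; the body holds the contents
            .log("picked up ${header.CamelFileName}")
            .to("direct:processFile");
    }
}
```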

log4j appender to save logs in JSON format with additional data

I am trying to figure out the best way to write each of our log entries as a single line of JSON using log4j2. Can anyone suggest an appender or layout to achieve this? Any help would be appreciated. Currently I convert the data to JSON myself and log it at particular levels, but I want this to happen automatically.
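For what it's worth, one hedged sketch of this kind of setup: log4j2's built-in JsonLayout writes one JSON object per event when compact and eventEol are enabled, and nested KeyValuePair entries add static extra fields (JsonLayout needs jackson-databind on the classpath; the names JsonConsole and my-service below are placeholders). Configured programmatically it looks roughly like this:

```java
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.core.config.Configurator;
import org.apache.logging.log4j.core.config.builder.api.AppenderComponentBuilder;
import org.apache.logging.log4j.core.config.builder.api.ConfigurationBuilder;
import org.apache.logging.log4j.core.config.builder.api.ConfigurationBuilderFactory;
import org.apache.logging.log4j.core.config.builder.impl.BuiltConfiguration;

public class JsonLoggingSetup {
    public static void main(String[] args) {
        ConfigurationBuilder<BuiltConfiguration> builder =
                ConfigurationBuilderFactory.newConfigurationBuilder();

        AppenderComponentBuilder console = builder.newAppender("JsonConsole", "Console")
                .add(builder.newLayout("JsonLayout")
                        .addAttribute("compact", true)    // no pretty-printing
                        .addAttribute("eventEol", true)   // newline after each event => one line per entry
                        .addAttribute("properties", true) // include the ThreadContext map
                        .addComponent(builder.newKeyValuePair("app", "my-service")));

        builder.add(console);
        builder.add(builder.newRootLogger(Level.INFO).add(builder.newAppenderRef("JsonConsole")));
        Configurator.initialize(builder.build());

        Logger log = LogManager.getLogger(JsonLoggingSetup.class);
        log.info("logged as a single JSON line");
    }
}
```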

Skip a Data Flow Component on Error

I want to skip a component of my data flow task when that component throws a specific error.
To be precise, I read data from different source files/connections in my dataflow and process them.
The problem is that I can't be sure if all source files/connections will be found.
Instead of checking beforehand whether each source can be connected to, I want to continue the execution of the data flow by skipping the component that reads data from that source.
Is there any way to continue the data flow after the component that originally threw the error, by jumping back from the OnError event handler (of the data flow task) into the next component? Or is there any other way to continue the data flow task execution by skipping the failing component?
As @praveen observed, out of the box you cannot disable data flow components.
That said, I could see a use case for this, perhaps a secondary source that augments existing data and may or may not be available. If I had that specific need, I'd write a script component (sketched below) that performs the data reading, parsing, casting of data types, etc. when a file is present, and that sends nothing but keeps the metadata intact when no source is available.
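A rough C# sketch of that script-component idea, just to show the shape of it (the component would be configured as a source in the data flow; SourcePath is a read-only package variable and Value is a hypothetical output column on Output 0):

```csharp
using System.IO;

// Body of an SSIS Script Component configured as a Source (generated boilerplate trimmed).
public class ScriptMain : UserComponent
{
    public override void CreateNewOutputRows()
    {
        string path = Variables.SourcePath;

        // If the optional source file is missing, emit no rows; the component's
        // output metadata stays intact so the rest of the data flow still runs.
        if (!File.Exists(path))
            return;

        foreach (string line in File.ReadAllLines(path))
        {
            // Parsing / casting of data types for the line would go here.
            Output0Buffer.AddRow();
            Output0Buffer.Value = line;
        }
    }
}
```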
Based on what I understand, you can do the following:
1) Create a script component that checks which source to use
2) Based on the source connection, you can assign the source

SSIS: How to create custom log files?

I would like to create a custom error log. I followed the tutorials on MSDN but to no avail. I am just trying to log "Hello" to a file called test.txt. I have enabled logging and enabled the script task to handle the log I set up.
Any suggestions/tutorials/advice?
I would certainly appreciate it!
Thanks
The general steps are:
Create the log connection manager
Create whatever task you are wanting to log (or you can do it for the entire package)
In the Event Handlers tab for that task (or for the whole package), place whatever you want to happen on the canvas. If you want to write to a text file, I'd create a script task that writes to your text file (see the sketch below). The file location can be a variable passed from the package so you don't have to go into the script task to change that configuration.
Info on setting variables in the script task: http://msdn.microsoft.com/en-us/library/ms135941.aspx
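For the Script Task body itself, a hedged C# sketch (this is just the Main method that goes inside the generated ScriptMain class; User::LogFilePath is a hypothetical package variable pointing at test.txt and must be listed in the task's ReadOnlyVariables):

```csharp
// Main method of the generated ScriptMain class in an SSIS Script Task (C#).
public void Main()
{
    // Read the target path from a package variable rather than hard-coding it.
    string logPath = Dts.Variables["User::LogFilePath"].Value.ToString();

    // Append rather than overwrite so repeated events keep adding entries.
    System.IO.File.AppendAllText(logPath, "Hello" + System.Environment.NewLine);

    Dts.TaskResult = (int)ScriptResults.Success;
}
```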