Logging Exception.Data using Log4Net - exception

We're just getting started with log4net (and wishing we'd done it earlier). While we can see inner exceptions, etc., the one thing that seems to be missing from the output when logging an exception is the key/value information held inside Exception.Data. Is there any way we can do this out of the box? If not, as we really are only just starting out, where should we be looking to find a way to implement this functionality?
As an example, please see the very basic pseudo code below. We don't want to pollute the exception message with context information, just a description of what the problem was (we'd probably have put more information in Data that would help in investigating the actual problem). But right now all we see in our logs is the type of exception, the message and the stack trace, but no exception Data. This means our logs lose the customer id, etc. How can we easily get this information into our logs without having to code it by hand in every catch block?
try
{
    var ex = new ApplicationException("Unable to update customer");
    ex.Data.Add("id", customer.Id);
    throw ex;
}
catch (ApplicationException ex)
{
    logger.Error("An error occurred whilst doing something", ex);
    throw;
}

Following Stefan's lead:
using System;
using System.IO;
using log4net.Core;
using log4net.Layout.Pattern;

namespace YourNamespace
{
    public sealed class ExceptionDataPatternConverter : PatternLayoutConverter
    {
        protected override void Convert(TextWriter writer, LoggingEvent loggingEvent)
        {
            // Not every logging event carries an exception, so guard against a null ExceptionObject.
            var data = loggingEvent.ExceptionObject != null ? loggingEvent.ExceptionObject.Data : null;
            if (data != null)
            {
                foreach (var key in data.Keys)
                {
                    writer.Write("Data[{0}]={1}" + Environment.NewLine, key, data[key]);
                }
            }
        }
    }
}
And in your configuration, add %ex_data to the conversion pattern and register the converter:
<appender ...>
  ...
  <layout type="log4net.Layout.PatternLayout,log4net">
    <conversionPattern value="%date %d{HH:mm:ss.fff} [%t] %-5p %c %l - %m%n %ex_data" />
    <converter>
      <name value="ex_data" />
      <type value="YourNamespace.ExceptionDataPatternConverter" />
    </converter>
  </layout>
</appender>

If you have multiple appenders defined you can use a custom renderer rather than defining the converter for every layout.
In your web.config or app.config:
<log4net>
  ...
  <renderer renderingClass="YourNamespace.ExceptionObjectLogger, YourAssembly" renderedClass="System.Exception" />
  ...
</log4net>
ExceptionObjectLogger
using System;
using System.Collections;
using System.IO;
using log4net.ObjectRenderer;

public class ExceptionObjectLogger : IObjectRenderer
{
    public void RenderObject(RendererMap rendererMap, object obj, TextWriter writer)
    {
        var ex = obj as Exception;
        if (ex == null)
        {
            // Shouldn't happen if the renderer is only configured for the System.Exception type.
            rendererMap.DefaultRenderer.RenderObject(rendererMap, obj, writer);
        }
        else
        {
            // Render the exception as usual, then append the Data entries of the
            // exception and its inner exceptions (up to MAX_DEPTH levels).
            rendererMap.DefaultRenderer.RenderObject(rendererMap, obj, writer);

            const int MAX_DEPTH = 10;
            int currentDepth = 0;
            while (ex != null && currentDepth <= MAX_DEPTH)
            {
                this.RenderExceptionData(rendererMap, ex, writer, currentDepth);
                ex = ex.InnerException;
                currentDepth++;
            }
        }
    }

    private void RenderExceptionData(RendererMap rendererMap, Exception ex, TextWriter writer, int depthLevel)
    {
        var dataCount = ex.Data.Count;
        if (dataCount == 0)
        {
            return;
        }

        writer.WriteLine();
        writer.WriteLine($"Exception data on level {depthLevel} ({dataCount} items):");

        var currentElement = 0;
        foreach (DictionaryEntry entry in ex.Data)
        {
            currentElement++;
            writer.Write("[");
            ExceptionObjectLogger.RenderValue(rendererMap, writer, entry.Key);
            writer.Write("]: ");
            ExceptionObjectLogger.RenderValue(rendererMap, writer, entry.Value);
            if (currentElement < dataCount)
            {
                writer.WriteLine();
            }
        }
    }

    private static void RenderValue(RendererMap rendererMap, TextWriter writer, object value)
    {
        if (value == null)
        {
            writer.Write("<null>");
        }
        else if (value is string)
        {
            writer.Write(value);
        }
        else
        {
            // Delegate non-string keys/values to whatever renderer is registered for their type.
            IObjectRenderer valueRenderer = rendererMap.Get(value.GetType());
            valueRenderer.RenderObject(rendererMap, value, writer);
        }
    }
}

I think a more log4net way of approaching this problem would be to write a PatternLayoutConverter. An example can be found here.
In the convert method you can access your data like this (and write it the way you like):
protected override void Convert(TextWriter writer, LoggingEvent loggingEvent)
{
    var data = loggingEvent.ExceptionObject.Data;
}

I think Massimiliano has the right idea, but I would modify his solution slightly.
If you plan on putting all of your additional data in the exception's Data dictionary, I would change his extension method to the following (renamed to ErrorWithData so it is not shadowed by ILog's own Error overloads):
using System;
using System.Collections;
using System.Text;
using log4net;

public static class ExLog4Net
{
    // Named ErrorWithData rather than Error: ILog already has an Error(object) overload,
    // which would always win overload resolution over an Error(this ILog, Exception) extension.
    public static void ErrorWithData(this ILog log, Exception ex)
    {
        StringBuilder formattedError = new StringBuilder();
        formattedError.AppendFormat("Exception: {0}\r\n", ex.ToString());
        foreach (DictionaryEntry de in ex.Data)
        {
            formattedError.AppendFormat("{0}: {1}\r\n", de.Key, de.Value);
        }
        log.Error(formattedError.ToString());
    }
}
You would then put this extension method in a shared library that you use in all of your applications; if you don't have one, you would have to add it to every project.
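A quick usage sketch of the extension above (the Example class and customerId value are illustrative, not from the original post):
using System;
using log4net;

public static class Example
{
    private static readonly ILog logger = LogManager.GetLogger(typeof(Example));

    public static void UpdateCustomer(int customerId)
    {
        try
        {
            var ex = new ApplicationException("Unable to update customer");
            ex.Data.Add("id", customerId);
            throw ex;
        }
        catch (ApplicationException ex)
        {
            // Writes the exception text followed by one "key: value" line per Data entry.
            logger.ErrorWithData(ex);
            throw;
        }
    }
}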

You could create an extension method for your logger to log the customer id; you should not add important information to the exception itself.
You can abstract the concept of "additional information to log" and create an interface with a method that returns the additional information you want to log:
public interface IDataLogger
{
    string GetAdditionalInfo();
}

public class UserDataLogger : IDataLogger
{
    public string GetAdditionalInfo()
    {
        return "UserName";
    }
}

public class MoreDataLogger : IDataLogger
{
    public string GetAdditionalInfo()
    {
        return "something";
    }
}
You can create different data loggers and maybe combine them together (see the sketch after the example below).
Then you could create a generic extension method that takes the type of the data logger:
public static class ExLog4Net
{
    public static void Error<T>(this ILog log, Exception ex) where T : IDataLogger, new()
    {
        var dataLogger = new T();
        log.Error(ex.ToString() + " " + dataLogger.GetAdditionalInfo());
    }
}
You will then be able to do the following:
try
{
}
catch (Exception ex)
{
    logger.Error<UserDataLogger>(ex);
    logger.Error<MoreDataLogger>(ex);
    throw;
}
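The "combine them together" idea mentioned above could look something like the following hypothetical composite (CompositeDataLogger is not part of the original answer):
using System.Linq;

// Hypothetical: aggregates several IDataLogger implementations so a single
// logger.Error<CompositeDataLogger>(ex) call logs all of their output.
public class CompositeDataLogger : IDataLogger
{
    private readonly IDataLogger[] loggers = { new UserDataLogger(), new MoreDataLogger() };

    public string GetAdditionalInfo()
    {
        return string.Join(" | ", loggers.Select(l => l.GetAdditionalInfo()));
    }
}
It still satisfies the new() constraint of the extension method, so it can be used exactly like the other data loggers.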

Related

MappingException for Map type of data

While saving Map-type data in Couchbase I am getting an exception:
Caused by: org.springframework.data.mapping.MappingException: Couldn't find PersistentEntity for type java.lang.Object!
I've defined a map in my data model:
@Data
public class test {
    private Map<String, Object> testMap;
}
I found this and overrode the Couchbase configuration to do custom mapping in the case of the Object type:
protected <R> R read(final TypeInformation<R> type, final CouchbaseDocument source, final Object parent) {
    if (Object.class == typeMapper.readType(source, type).getType()) {
        return (R) source.export();
    } else {
        return super.read(type, source, parent);
    }
}
It worked for a request like:
{
  "dummyMap": {
    "key1": "val1",
    "key2": "val2"
  }
}
But failed for
{
  "dummyMap": {
    "key1": "val1",
    "key2": "val2",
    "objects": [
      {
        "key1": "val1",
        "key2": "val2"
      }
    ]
  }
}
with exception
Caused by: java.lang.IllegalArgumentException: Basic type must not be null!
I guess it is because of the array. Please let me know what I am doing wrong.
I am using spring-data-couchbase version 2.0.4.RELEASE.
Hi, please use the code below. It's because type is null and the Couchbase mapping converter can't read the document. It should work:
@Override
@SuppressWarnings("unchecked")
protected <R> R read(final TypeInformation<R> type, final CouchbaseDocument source, final Object parent) {
    if (type == null) {
        return (R) source.export();
    }
    if (Object.class == typeMapper.readType(source, type).getType()) {
        return (R) source.export();
    } else {
        return super.read(type, source, parent);
    }
}

Using setStrictHeaderValidationEnabled method in CsvRoutines not working

I am creating a bean processor and setting setStrictHeaderValidationEnabled to true. My CsvParserSettings consume this bean processor, which in turn is consumed by CsvRoutines. But when iterating through the CsvRoutines result, the bean processor does not validate the headers, and subsequent rows get converted to beans even for files with invalid headers.
Sample code:
final BeanProcessor<TestBean> rowProcessor = new BeanProcessor<TestBean>(TestBean.class) {
    @Override
    public void beanProcessed(TestBean bean, ParsingContext context) {
    }
};
rowProcessor.setStrictHeaderValidationEnabled(true);

final CsvParserSettings parserSettings = new CsvParserSettings();
parserSettings.setProcessor(rowProcessor);
parserSettings.setHeaderExtractionEnabled(true);
parserSettings.getFormat().setDelimiter(',');

CsvRoutines routines = new CsvRoutines(parserSettings);
for (TestBean bean : routines.iterate(TestBean.class, inputFile, StandardCharsets.UTF_8)) {
    try {
        System.out.println(OBJECT_MAPPER.writeValueAsString(bean));
    } catch (JsonProcessingException e) {
        e.printStackTrace();
    }
}
Note: TestBean uses univocity's @Parsed annotation to set column names.
The iterate method returns an IterableResult, which provides the ParsingContext from which you can get the headers parsed from the input.
Try this code:
IterableResult<TestBean, ParsingContext> iterableResult = routines.iterate(TestBean.class, inputFile, StandardCharsets.UTF_8);
ResultIterator<TestBean, ParsingContext> iterator = iterableResult.iterator();

ParsingContext context = iterator.getContext();
String[] headers = context.headers();

// HEADERS HERE:
System.out.println(Arrays.toString(headers));

while (iterator.hasNext()) {
    try {
        System.out.println(OBJECT_MAPPER.writeValueAsString(iterator.next()));
    } catch (JsonProcessingException e) {
        e.printStackTrace();
    }
}
Hope it helps

How to access the StackTrace property from my custom exceptions in .NET Core

I'm trying to implement my own custom exceptions in .NET Core.
This is what I have so far:
public class WSException : Exception
{
    // some custom stuff...
    private readonly string _developerMessage = "";
    public string DeveloperMessage { get { return _developerMessage; } }

    public WSException() { }

    public WSException(string message) : base(message)
    {
        this._developerMessage = message;
    }

    public WSException(string message, Exception inner) : base(message, inner)
    {
        this._developerMessage = message;
    }

    public WSException(Exception ex) : base(ex.Message, ex.InnerException)
    {
        _developerMessage = ex.Message;
        Source = ex.Source;
        //StackTrace = ex.StackTrace; // cannot be assigned to, it's read only
    }

    public WSException(string message, string developerMessage) : base(message)
    {
        this._developerMessage = (String.IsNullOrWhiteSpace(developerMessage) ? message : developerMessage);
    }
}
When I catch a general exception, I try to create one of my own (a WSException) to handle it in a common way, like this:
try
{
    // whatever
}
catch (WSException e)
{
    HandleException(e);
}
catch (Exception e)
{
    HandleException(new WSException(e));
}
When I do it like that, e.Source and e.StackTrace are null, and when I try to assign StackTrace I get: Property or indexer 'Exception.StackTrace' cannot be assigned to -- it is read only.
How should I implement this constructor?
public WSException(Exception ex) : base(ex.Message, ex.InnerException)
{
    _developerMessage = ex.Message;
    Source = ex.Source;
    //StackTrace = ex.StackTrace; // cannot be assigned to, it's read only
}
The workaround I found so far is to handle it when I'm serializing the error to json, something like this:
public class WSExceptionJsonConverter : JsonConverter
{
    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        var ex = value as WSException;
        writer.WriteStartObject();
        // bunch of properties [...]
        string stackTrace = null;
        if (ex.StackTrace != null)
        {
            stackTrace = ex.StackTrace;
        }
        else if (ex.InnerException != null && ex.InnerException.StackTrace != null)
        {
            stackTrace = ex.InnerException.StackTrace;
        }

        writer.WritePropertyName("stacktrace");
        serializer.Serialize(writer, stackTrace == null ? null : stackTrace.Split('\n'));
        writer.WriteEndObject();
    }

    // CanConvert and ReadJson overrides omitted here.
}
But it feels too hacky
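For what it's worth, a minimal sketch of the more conventional pattern, assuming the catch sites can be changed: pass the original exception to the existing (string, Exception) constructor so it becomes the InnerException and keeps its own stack trace, rather than copying its properties onto the wrapper.
try
{
    // whatever
}
catch (WSException e)
{
    HandleException(e);
}
catch (Exception e)
{
    // Wrap rather than copy: e keeps its StackTrace and Source,
    // reachable via the wrapper's InnerException.
    HandleException(new WSException(e.Message, e));
}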

WebApi Custom MediaTypeFormatter get posted parameters

I am calling a POST action on a Web API by passing in a serialised JSON DTO.
I also have a custom media type formatter to encrypt the resulting data. However, in the WriteToStreamAsync method, how can I get the posted parameters?
The custom media type formatter class
public class JsonFormatter : JsonMediaTypeFormatter
{
    public override Task WriteToStreamAsync(Type type, object value, Stream writeStream, HttpContent content, TransportContext transportContext)
    {
        var taskSource = new TaskCompletionSource<object>();
        try
        {
            if (value != null)
            {
                // How to get posted parameters?
            }
        }
        catch (Exception e)
        {
            taskSource.SetException(e);
        }
        return taskSource.Task;
    }
}
I managed to get it via the HttpContext.Current.Request.InputStream
Relying on HttpContext.Current generally won't work in this scenario, since it won't always be available for async calls.
Instead, do something like this:
public class JsonFormatter : JsonMediaTypeFormatter
{
    private readonly HttpRequestMessage request;

    public JsonFormatter() { }

    public JsonFormatter(HttpRequestMessage request)
    {
        this.request = request;
    }

    public override MediaTypeFormatter GetPerRequestFormatterInstance(Type type, HttpRequestMessage request, MediaTypeHeaderValue mediaType)
    {
        return new JsonFormatter(request);
    }

    public override Task WriteToStreamAsync(Type type, object value, Stream writeStream, HttpContent content, TransportContext transportContext)
    {
        // Logic referencing this.request goes here (e.g. inspect its headers or URI),
        // then delegate to the base implementation to write the JSON.
        return base.WriteToStreamAsync(type, value, writeStream, content, transportContext);
    }
}
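If it helps, a rough sketch of how the formatter might be registered (assuming a standard ASP.NET Web API setup; WebApiConfig is the conventional place, but any startup code works):
using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Insert at position 0 so this formatter is preferred over the default JSON formatter.
        config.Formatters.Insert(0, new JsonFormatter());

        // ... route configuration etc.
    }
}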

Storing Apache Hadoop Data Output to MySQL Database

I need to store the output of a map-reduce program in a database; is there any way to do this?
If so, is it possible to store the output into multiple columns and tables based on the requirement?
Please suggest some solutions.
Thank you.
A great example is shown on this blog; I tried it and it works really well. I quote the most important parts of the code below.
First, you must create a class representing the data you would like to store. The class must implement the DBWritable interface:
public class DBOutputWritable implements Writable, DBWritable
{
    private String name;
    private int count;

    public DBOutputWritable(String name, int count) {
        this.name = name;
        this.count = count;
    }

    public void readFields(DataInput in) throws IOException { }

    public void readFields(ResultSet rs) throws SQLException {
        name = rs.getString(1);
        count = rs.getInt(2);
    }

    public void write(DataOutput out) throws IOException { }

    public void write(PreparedStatement ps) throws SQLException {
        ps.setString(1, name);
        ps.setInt(2, count);
    }
}
Create objects of the previously defined class in your Reducer:
public class Reduce extends Reducer<Text, IntWritable, DBOutputWritable, NullWritable> {
    protected void reduce(Text key, Iterable<IntWritable> values, Context ctx) {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
        }
        try {
            ctx.write(new DBOutputWritable(key.toString(), sum), NullWritable.get());
        } catch (IOException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
Finally, you must configure the connection to your DB (do not forget to add your DB connector to the classpath) and register your mapper's and reducer's input/output data types:
public class Main
{
    public static void main(String[] args) throws Exception
    {
        Configuration conf = new Configuration();
        DBConfiguration.configureDB(conf,
            "com.mysql.jdbc.Driver",              // driver class
            "jdbc:mysql://localhost:3306/testDb", // db url
            "user",                               // username
            "password");                          // password

        Job job = new Job(conf);
        job.setJarByClass(Main.class);
        job.setMapperClass(Map.class);            // your mapper - not shown in this example
        job.setReducerClass(Reduce.class);

        job.setMapOutputKeyClass(Text.class);          // mapper's KEYOUT
        job.setMapOutputValueClass(IntWritable.class); // mapper's VALUEOUT
        job.setOutputKeyClass(DBOutputWritable.class); // reducer's KEYOUT
        job.setOutputValueClass(NullWritable.class);   // reducer's VALUEOUT

        job.setInputFormatClass(...);
        job.setOutputFormatClass(DBOutputFormat.class);

        DBInputFormat.setInput(...);
        DBOutputFormat.setOutput(
            job,
            "output",                        // output table name
            new String[] { "name", "count" } // table columns
        );

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}