Mule 3 DataMapper (list-of-maps to list-of-pojos) returns empty result

I have a problem using the Mule 3.3 DataMapper to convert a list of maps into a list of POJOs. The source data was created by reading from a database (a JDBC outbound endpoint with a SQL query), and a logger before the DataMapper shows 2 entries.
The console log shows 2 records being processed by the DataMapper, including 2 POJOs being created by the POJO writer. However, a logger immediately after the DataMapper shows the collection size to be zero.
I have tried converting to XML and CSV formats instead of POJOs, and both of these work fine.
Does anyone have any idea what might be wrong?
This is the config snippet:
<jdbc-ee:outbound-endpoint connector-ref="jdbcConnector" queryKey="selectActuals" exchange-pattern="request-response" queryTimeout="-1" doc:name="Get actuals from DB"></jdbc-ee:outbound-endpoint>
<logger level="INFO" doc:name="logger 1" message="logger 1 - list size: #[message.payload.size()] "/>
<data-mapper:transform config-ref="map_list_to_pojo_list" doc:name="DataMapper"/>
<logger level="INFO" doc:name="logger 2" message="logger 2 - list size: #[message.payload.size()] "/>

Related

How to exclude null values while converting object to json

I am working with MuleSoft Anypoint Studio, and I need to convert a JSON payload, ultimately, into XML. During this conversion every field that is null needs to be excluded. Some values are not sent in the POST request, so I expect not to see them in the end result (the XML file), but they are there. For example, the Value field is not sent in the JSON POST request and so becomes null in Mule; it should not appear in the XML file, but it is still written there as <Value/>. I am mainly having problems with the Object to JSON transformer.
I have tried configuring a custom mapper:
<spring:beans>
    <spring:bean id="Bean" name="NonNullMapper" class="org.codehaus.jackson.map.ObjectMapper">
        <spring:property name="SerializationInclusion">
            <spring:value type="org.codehaus.jackson.map.annotate.JsonSerialize.Inclusion">NON_NULL</spring:value>
        </spring:property>
    </spring:bean>
</spring:beans>
But that didn't really work. I also tried
<spring:beans>
    <spring:bean id="jacksonObjectMapper" class="org.codehaus.jackson.map.ObjectMapper" />
    <spring:bean class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
        <spring:property name="targetObject" ref="jacksonObjectMapper" />
        <spring:property name="targetMethod" value="configure" />
        <spring:property name="arguments">
            <spring:list>
                <spring:value>WRITE_NULL_MAP_VALUES</spring:value>
                <spring:value>false</spring:value>
            </spring:list>
        </spring:property>
    </spring:bean>
</spring:beans>
That didn't work either, as I get an error which I couldn't manage to fix:
More than one object of type class org.codehaus.jackson.map.ObjectMapper registered but only one expected
I am working with:
Mule 3.9.0
Anypoint Studio 6.4
com.fasterxml.jackson, and in some places org.codehaus.jackson
I would really appreciate any help or hints.
Given that this is Mule, you can use DataWeave instead to transform the payload. Setting the XML writer property skipNullOn could give the desired result: https://docs.mulesoft.com/mule-user-guide/v/3.9/dataweave-formats#skip-null-on
Example (skipNullOn accepts "arrays", "objects", or "everywhere"):
%output application/xml skipNullOn="everywhere"
---
payload
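In Mule 3.9 that script can sit in a Transform Message component in the flow; a minimal sketch, where the flow name is illustrative and the inbound payload is assumed to be the JSON body of the POST request:

```xml
<flow name="jsonToXmlFlow">
    <!-- assumes the inbound payload is the JSON body of the POST request -->
    <dw:transform-message doc:name="JSON to XML, skipping nulls">
        <dw:set-payload><![CDATA[%dw 1.0
%output application/xml skipNullOn="everywhere"
---
payload]]></dw:set-payload>
    </dw:transform-message>
</flow>
```

This sidesteps the Jackson ObjectMapper configuration entirely, since DataWeave handles the null-skipping at write time.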

How to include multiple JSON fields when using JSON logging with SLF4J?

I'm working with Dropwizard 1.3.2, which does logging using SLF4J over Logback. I am writing logs for ingestion into ElasticSearch, so I thought I'd use JSON logging and make some Kibana dashboards. But I really want more than one JSON item per log message - if I am recording a status update with ten fields, I would ideally like to log the object and have the JSON fields show up as top level entries in the JSON log. I did get MDC working but that is very clumsy and doesn't flatten objects.
That's turned out to be difficult! How can I do that? I have it logging in JSON, but I can't nicely log multiple JSON fields!
Things I've done:
My Dropwizard configuration has this appender:
appenders:
  - type: console
    target: stdout
    layout:
      type: json
      timestampFormat: "ISO_INSTANT"
      prettyPrint: false
      appendLineSeparator: true
      additionalFields:
        keyOne: "value one"
        keyTwo: "value two"
      flattenMdc: true
The additional fields show up, but those values appear to be fixed in the configuration file and don't change. There is a "customFieldNames" option, but no documentation on how to use it, and no matter what I put in there I get a "no String-argument constructor/factory method to deserialize from String value" error. (The docs have an example value of "#timestamp" but no explanation, and even that generates the error. They also have examples like "(requestTime:request_time, userAgent:user_agent)", but again these are undocumented, and everything similar I've tried generates the same error.)
I did get MDC to work, but it seems silly to plug each item into MDC and then clear it.
And I can serialize an object and log it as nested JSON, but that also seems weird.
All the answers I've seen on this are old - does anyone have any advice on how to do this nicely inside Dropwizard?
You can use Logback explicitly in Dropwizard via a custom logger factory, set it up with logstash-logback-encoder, and configure it to write out to a JSON appender.
The JSON encoder may look like this:
<included>
  <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
    <providers>
      <pattern>
        <pattern>
          {
            "id": "%uniqueId",
            "relative_ns": "#asLong{%nanoTime}",
            "tse_ms": "#asLong{%tse}",
            "start_ms": "#asLong{%startTime}",
            "cpu": "%cpu",
            "mem": "%mem",
            "load": "%loadavg"
          }
        </pattern>
      </pattern>
      <timestamp>
        <!-- UTC is the best server consistent timezone -->
        <timeZone>${encoders.json.timeZone}</timeZone>
        <pattern>${encoders.json.timestampPattern}</pattern>
      </timestamp>
      <version/>
      <message/>
      <loggerName/>
      <threadName/>
      <logLevel/>
      <logLevelValue/><!-- numeric value is useful for filtering >= -->
      <stackHash/>
      <mdc/>
      <logstashMarkers/>
      <arguments/>
      <provider class="com.tersesystems.logback.exceptionmapping.json.ExceptionArgumentsProvider">
        <fieldName>exception</fieldName>
      </provider>
      <stackTrace>
        <!--
          https://github.com/logstash/logstash-logback-encoder#customizing-stack-traces
        -->
        <throwableConverter class="net.logstash.logback.stacktrace.ShortenedThrowableConverter">
          <rootCauseFirst>${encoders.json.shortenedThrowableConverter.rootCauseFirst}</rootCauseFirst>
          <inlineHash>${encoders.json.shortenedThrowableConverter.inlineHash}</inlineHash>
        </throwableConverter>
      </stackTrace>
    </providers>
  </encoder>
</included>
File on Github
and produce output like this:
{"id":"FfwJtsNHYSw6O0Qbm7EAAA","relative_ns":20921024,"tse_ms":1584163814965,"start_ms":null,"#timestamp":"2020-03-14T05:30:14.965Z","#version":"1","message":"Creating Pool for datasource 'logging'","logger_name":"play.api.db.HikariCPConnectionPool","thread_name":"play-dev-mode-akka.actor.default-dispatcher-7","level":"INFO","level_value":20000}
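To use a composite encoder like the one above, it has to be referenced from an appender in the Logback configuration. A minimal sketch (the appender name and the pared-down provider selection are illustrative; the `<arguments/>` provider is what surfaces structured arguments passed to a log call as top-level JSON fields):

```xml
<configuration>
    <appender name="JSON" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp/>
                <message/>
                <logLevel/>
                <mdc/>
                <!-- emits structured arguments from the log call as top-level JSON fields -->
                <arguments/>
            </providers>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="JSON"/>
    </root>
</configuration>
```

With this in place you avoid round-tripping every field through MDC.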

How to iterate a map using Foreach in mule?

I'm giving a "MAP" as input to foreach:
{Id=1, Sum=10, Name=Jon1, Level=1},
{Id=2, Sum=20, Name=Jon2, Level=1},
{Id=3, Sum=30, Name=Jon3, Level=1}...................,
Based on the "Sum" value I need to send each record to two different files. Where I'm stuck is that I don't know how to write these conditional statements in the foreach - the when expressions and logger statements where I put question marks.
<foreach doc:name="For Each" collection="?????????????????">
    <choice doc:name="Choice">
        <when expression="???????????<=30">
        </when>
        <otherwise>
            <data-mapper:transform doc:name="DataMapper"/>
            <logger message="default logger ?????????" level="INFO" doc:name="Logger"/>
        </otherwise>
    </choice>
</foreach>
Please suggest how to do this, and comment if you know how to write the conditional statements when the input is CSV. I'm new to Mule. Thanks.
Mule uses MEL, based on MVEL, as its expression language. It allows you to use dot syntax to navigate Maps, POJOs, etc., or standard Java method invocation:
#[message.payload.get('Sum')]
#[message.payload.Sum]
The foreach will automatically default to the message payload if you do not provide a collection expression. If your payload is a Collection then it should be fine. It looks like your payload is a Collection of maps, so you should be able to use:
<foreach doc:name="For Each" collection="#[message.payload]">
    <choice doc:name="Choice">
        <when expression="#[message.payload.Sum < 30]">
        </when>
        <otherwise>
            <logger message="#[message.payload.Sum]" level="INFO" doc:name="Logger"/>
        </otherwise>
    </choice>
</foreach>
If you want to iterate different entries in a SINGLE map you can use the following:
<foreach collection="#[message.payload.entrySet()]">
    ...
</foreach>
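Inside that scope each payload is a java.util.Map.Entry, so its key and value are reachable via MEL. A sketch (the logger message is illustrative):

```xml
<foreach doc:name="For Each Entry" collection="#[message.payload.entrySet()]">
    <!-- within the scope the payload is a Map.Entry, exposing key and value -->
    <logger message="#[payload.key] = #[payload.value]" level="INFO" doc:name="Logger"/>
</foreach>
```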
I would link a previous answer of mine; the essence is: do not try to code procedural logic into the flow, use components for that kind of stuff.
But if you are trying to learn, then @Ryan's answer is on the mark.

Mule Studio, Transform byte array into MySQL

I've got a Magento connection up and running and want to get all customers.
My subflow looks like this:
<sub-flow name="listCustomers" doc:name="listCustomers">
    <magento:list-customers config-ref="MagentoConnecter" doc:name="Magento"/>
    <byte-array-to-object-transformer doc:name="Byte Array to Object"/>
    <json:object-to-json-transformer doc:name="Object to JSON"/>
</sub-flow>
which results in a string. But I'd like to insert the customer data into a MySQL database.
Do I need to use a foreach component?
And how can I address the variables then?
Thanks,
Chris
Foreach seems like a good way to achieve that.
The steps to do are the following:
1. Transform the JSON representation into a list of maps using the JSON transformer (the returnClass will be java.util.Map)
2. Introduce the foreach scope
3. Within this scope, insert a JDBC outbound endpoint that performs the insert query
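Put together, the sub-flow might be sketched like this (the JDBC connector, query key, and returnClass are illustrative assumptions, not tested config):

```xml
<sub-flow name="listCustomersToMySql">
    <magento:list-customers config-ref="MagentoConnecter" doc:name="Magento"/>
    <byte-array-to-object-transformer doc:name="Byte Array to Object"/>
    <json:object-to-json-transformer doc:name="Object to JSON"/>
    <!-- back from JSON into a list of maps so each record can be addressed -->
    <json:json-to-object-transformer returnClass="java.util.List" doc:name="JSON to List"/>
    <foreach doc:name="For Each Customer">
        <!-- the query behind queryKey would take its parameters from #[payload.fieldName] -->
        <jdbc-ee:outbound-endpoint connector-ref="mysqlConnector" queryKey="insertCustomer" exchange-pattern="request-response" doc:name="Insert customer"/>
    </foreach>
</sub-flow>
```

Inside the foreach, each map's entries are addressable with MEL (e.g. #[payload.email]) when binding the insert query's parameters.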

How to access JSON data in Mule ESB

I want to access JSON data generated by a sync flow in an async flow.
I am getting the JSON data from the sync flow correctly, and I want to fetch a certain attribute value from it. My JSON data is as follows:
{"data" : [{"in_timestamp":"2012-12-04","message":"hello","out_timestamp":null,"from_user":"user2","ID":43,"to_user":"user1"}]}
I want to access the to_user attribute from this JSON format.
I have tried using #[json:to_user], but it simply prints the expression as a string and doesn't return any value.
Please help. Thanks in advance.
The right expression based on your sample JSON is:
#[json:data[0]/to_user]
JSONPath expressions are deprecated now, and you will not even find enough documentation on using them.
So, currently, to extract data from the JSON you need to use one of the following, depending on the shape of the data:
<json:json-to-object-transformer returnClass="java.lang.Object" doc:name="JSON to Object" />
or <json:json-to-object-transformer returnClass="java.util.HashMap" doc:name="JSON to Object" />
or even <json:json-to-object-transformer returnClass="java.util.List" doc:name="JSON to Object" />
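For the sample payload above, a minimal sketch (the logger message is illustrative) would deserialize to a map first, then navigate with MEL:

```xml
<!-- turn the JSON string into Java collections so MEL can navigate it -->
<json:json-to-object-transformer returnClass="java.util.HashMap" doc:name="JSON to Map"/>
<!-- "data" is a list, so index into it before reading the attribute -->
<logger message="to_user is #[message.payload.data[0].to_user]" level="INFO" doc:name="Logger"/>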