Let's say I have a simple JSON file such as the following:
{
  "log": {
    "host": "blah",
    "severity": "INFO",
    "system": "1"
  }
}
I'm using Apache Camel and its Spring XML DSL to process and route the JSON file. My routing code looks something like this:
<route>
  <from uri="file:/TESTFOLDER/input"/>
  <choice>
    <when>
      <jsonpath>$.log?(#.severity == 'WARNING')</jsonpath>
      <to uri="smtp://(smtpinfo...not important)"/>
    </when>
    <otherwise>
      <to uri="file:/TESTFOLDER/output"/>
    </otherwise>
  </choice>
</route>
The part that I'm really confused about is the JSONPath expression. The expression I have above isn't even syntactically correct, because it's hard to find examples for the case where you aren't trying to sort through a list of elements. My goal is to only send an email if the severity of the log is 'WARNING', but I can't come up with the expression.
This worked for me using Camel 2.13.1 (I checked for INFO as your JSON example has this severity; you may change this according to your needs):
<jsonpath>$..log[?(@.severity == 'INFO')]</jsonpath>
Note the .. and the []. However, using a single dot . at the beginning of the search path failed:
<jsonpath>$.log[?(@.severity == 'INFO')]</jsonpath>
The error messages said:
java.lang.IllegalArgumentException: Invalid container object
This may be a bug.
According to the JSONPath docs, .. stands for "recursive descent". This may not meet your requirements. However, as a single dot . didn't work, this was the only workaround I could figure out. Otherwise, you may raise a bug ticket.
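For reference, plugging the working expression back into the route from the question (checking for WARNING, as originally intended) would look roughly like this:

<route>
  <from uri="file:/TESTFOLDER/input"/>
  <choice>
    <when>
      <!-- recursive descent to "log", then filter on its severity field -->
      <jsonpath>$..log[?(@.severity == 'WARNING')]</jsonpath>
      <to uri="smtp://(smtpinfo...not important)"/>
    </when>
    <otherwise>
      <to uri="file:/TESTFOLDER/output"/>
    </otherwise>
  </choice>
</route>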
Related
I've implemented a Mule flow that reads CSV files and inserts the records into Salesforce using a batch job.
To manage the errors I have created a step that only accepts failed records.
I've tried modifying the original values so that the insert fails, to prove that the error handling works properly.
The Salesforce response message is a JSON containing a field called statusCode with the following value: INVALID_TYPE_ON_FIELD_IN_RECORD.
However, Mule does not recognize it as an error and does not fail, so it never enters the step for failed records.
How can I modify this? Should I change it in Salesforce or add the statusCode cases in Error Mapping?
In Mule 4 you can use raise-error to force an error. Then you just need to define the expression that should trigger it:
#[sizeOf((payload.errors default [])) > 0]
or
#[payload.errors[0].statusCode=='INVALID_TYPE_ON_FIELD_IN_RECORD']
etc.
Example using choice router:
<choice doc:name="successful?">
  <when expression="#[sizeOf((payload.errors default [])) > 0]">
    <raise-error type="APP:INVALID_TYPE_ON_FIELD_IN_RECORD" />
  </when>
</choice>
An alternative to controlling the flow with errors is to set an acceptExpression on the batch step, using the same expression:
<batch:step name="step1" acceptExpression="#[sizeOf((payload.errors default [])) > 0]">
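For context, here is a rough sketch of how such a step could sit inside a Mule 4 batch job (the job and step names are illustrative, and the Salesforce call is only indicated by a comment):

<batch:job jobName="salesforceImportJob">
  <batch:process-records>
    <batch:step name="upsertToSalesforce">
      <!-- Salesforce call goes here; afterwards the record payload holds the
           Salesforce response, including any reported errors -->
    </batch:step>
    <!-- this step only accepts records whose Salesforce response reported errors -->
    <batch:step name="step1" acceptExpression="#[sizeOf((payload.errors default [])) > 0]">
      <!-- handle or log the failed records here -->
    </batch:step>
  </batch:process-records>
</batch:job>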
I'm using Camel in a REST context and I have to manipulate a JSON payload received from a request. It's something like:
{
'field1':'abc',
'field2':'def'
}
All I have to do is extract field1 and field2 and put them into two properties, so I tried something like this:
<setProperty propertyName="Field1">
  <jsonpath>$.field1</jsonpath>
</setProperty>
<setProperty propertyName="Field2">
  <jsonpath>$.field2</jsonpath>
</setProperty>
but I get this error:
org.apache.camel.ExpressionEvaluationException:
com.jayway.jsonpath.PathNotFoundException: Expected to find an object with property ['field2'] in path $ but found 'java.lang.String'. This is not a json object according to the JsonProvider: 'com.jayway.jsonpath.spi.json.JsonSmartJsonProvider'.
After some tests I found out that my body was empty after the first use of jsonpath.
The same process applied to XML using XPath doesn't give any error, and I'm wondering if it's possible to do the same with jsonpath instead of creating a mapper object in Java. Thank you in advance.
If the processed Camel message is of type InputStream, this stream can obviously be read only once.
To solve this, either:
enable Camel stream caching (http://camel.apache.org/stream-caching.html), or
insert a step (before the jsonpath queries) into your route to convert the message body to a string so that it can be read multiple times, e.g. <convertBodyTo type="java.lang.String" charset="ISO-8859-1"/>, as sketched below.
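Combined with the snippet from the question, the second option might look like this:

<!-- read the streamed body into a String once, so jsonpath can evaluate it more than once -->
<convertBodyTo type="java.lang.String" charset="ISO-8859-1"/>
<setProperty propertyName="Field1">
  <jsonpath>$.field1</jsonpath>
</setProperty>
<setProperty propertyName="Field2">
  <jsonpath>$.field2</jsonpath>
</setProperty>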
I'm working with Dropwizard 1.3.2, which does logging using SLF4J over Logback. I am writing logs for ingestion into ElasticSearch, so I thought I'd use JSON logging and make some Kibana dashboards. But I really want more than one JSON item per log message - if I am recording a status update with ten fields, I would ideally like to log the object and have the JSON fields show up as top level entries in the JSON log. I did get MDC working but that is very clumsy and doesn't flatten objects.
That's turned out to be difficult! How can I do that? I have it logging in JSON, but I can't nicely log multiple JSON fields!
Things I've done:
My Dropwizard configuration has this appender:
appenders:
  - type: console
    target: stdout
    layout:
      type: json
      timestampFormat: "ISO_INSTANT"
      prettyPrint: false
      appendLineSeparator: true
      additionalFields:
        keyOne: "value one"
        keyTwo: "value two"
      flattenMdc: true
The additional fields show up, but those values seem to be fixed in the configuration file and don't change. There is a "customFieldNames" option but no documentation on how to use it, and no matter what I put in there I get a "no String-argument constructor/factory method to deserialize from String value" error. (The docs have an example value of "#timestamp" but no explanation, and even that generates the error. They also have examples like "(requestTime:request_time, userAgent:user_agent)" but again, undocumented, and I can't make anything similar work; everything I've tried generates the error above.)
I did get MDC to work, but it seems silly to plug each item into MDC and then clear it.
And I can deserialize an object and log it as nested JSON, but that also seems weird.
All the answers I've seen on this are old - does anyone have any advice on how to do this nicely inside Dropwizard?
You can use Logback explicitly in Dropwizard via a custom logger factory, set it up with logstash-logback-encoder, and configure it to write out to a JSON appender.
The JSON encoder may look like this:
<included>
  <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
    <providers>
      <pattern>
        <pattern>
          {
            "id": "%uniqueId",
            "relative_ns": "#asLong{%nanoTime}",
            "tse_ms": "#asLong{%tse}",
            "start_ms": "#asLong{%startTime}",
            "cpu": "%cpu",
            "mem": "%mem",
            "load": "%loadavg"
          }
        </pattern>
      </pattern>
      <timestamp>
        <!-- UTC is the best server consistent timezone -->
        <timeZone>${encoders.json.timeZone}</timeZone>
        <pattern>${encoders.json.timestampPattern}</pattern>
      </timestamp>
      <version/>
      <message/>
      <loggerName/>
      <threadName/>
      <logLevel/>
      <logLevelValue/><!-- numeric value is useful for filtering >= -->
      <stackHash/>
      <mdc/>
      <logstashMarkers/>
      <arguments/>
      <provider class="com.tersesystems.logback.exceptionmapping.json.ExceptionArgumentsProvider">
        <fieldName>exception</fieldName>
      </provider>
      <stackTrace>
        <!--
          https://github.com/logstash/logstash-logback-encoder#customizing-stack-traces
        -->
        <throwableConverter class="net.logstash.logback.stacktrace.ShortenedThrowableConverter">
          <rootCauseFirst>${encoders.json.shortenedThrowableConverter.rootCauseFirst}</rootCauseFirst>
          <inlineHash>${encoders.json.shortenedThrowableConverter.inlineHash}</inlineHash>
        </throwableConverter>
      </stackTrace>
    </providers>
  </encoder>
</included>
This encoder produces output like this:
{"id":"FfwJtsNHYSw6O0Qbm7EAAA","relative_ns":20921024,"tse_ms":1584163814965,"start_ms":null,"#timestamp":"2020-03-14T05:30:14.965Z","#version":"1","message":"Creating Pool for datasource 'logging'","logger_name":"play.api.db.HikariCPConnectionPool","thread_name":"play-dev-mode-akka.actor.default-dispatcher-7","level":"INFO","level_value":20000}
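If you don't need the property substitution used above, a minimal stand-alone logback.xml wiring the same kind of encoder into a console appender might look something like the sketch below (the appender name is made up; on the Java side you would pass extra fields with logstash-logback-encoder's StructuredArguments so that the arguments provider can write them as top-level JSON fields):

<configuration>
  <appender name="CONSOLE_JSON" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
      <providers>
        <timestamp/>
        <logLevel/>
        <loggerName/>
        <threadName/>
        <message/>
        <mdc/>
        <!-- writes StructuredArguments passed to a log call as top-level JSON fields -->
        <arguments/>
        <stackTrace/>
      </providers>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="CONSOLE_JSON"/>
  </root>
</configuration>

In Dropwizard you would still need the custom logger factory mentioned above so that this file is used instead of Dropwizard's own logging configuration.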
I have XML files following an XSD and I need to transform them into JSON.
The files are typically like this
example.xml :
<object name="foo">
  <values>one</values>
  <values>two</values>
  <values>three</values>
  <param attr="2" value="true" />
</object>
Which translates into this JSON:
{
  "name" : "foo",
  "values" : [
    "one",
    "two",
    "three"
  ],
  "param" : {
    "attr" : "2",
    "value" : "true"
  }
}
This is almost fine, except that I would like the data to be typed, so that param becomes:
"param" : {
  "attr" : 2,
  "value" : true
}
The XML files reference an XSD schema that defines the data type for each element or attribute, such as
<xs:attribute name="attr" type="xs:integer" />
The XML to JSON transformation is done using XML::Simple to read the XML into a Perl hash, and the JSON module is used to encode it into JSON.
How could I do the same job but using the definitions from the XSD Schema to load the XML with the right type for each field?
I need to use the XSD because it may happen that text fields consist only of numbers.
Well, the summary answer is - you can't do what you're trying to do, the way you're trying to do it.
XML is a 'deeper' structure than JSON, in that it has style sheets and attributes, so inherently you'll be discarding data in the transformation process. How much is acceptable to discard is always going to be a case by case sort of thing.
More importantly - both XML and JSON are designed for similar use cases - to be machine readable/parsable. Almost everything that can 'read' JSON programmatically can also read XML, because libraries for both are generally available.
And most importantly of all - don't use XML::Simple: it's not "Simple", it's for "Simple" XML. Which yours isn't. Both XML::Twig and XML::LibXML are much better tools for almost any XML parsing job.
So really - what you need to do is backtrack a bit, and explain what you're trying to accomplish and why.
Failing that though - I would probably try a simplistic 'type test' within Perl, using a regex to detect whether something is 'just' numeric or 'just' boolean, and treat everything else as a string.
I want to access JSON data generated from a sync flow in an async flow.
I am getting the JSON data from the sync flow correctly, and I want to fetch a certain attribute value from it. My JSON data is as follows:
{"data" : [{"in_timestamp":"2012-12-04","message":"hello","out_timestamp":null,"from_user":"user2","ID":43,"to_user":"user1"}]}
I want to access the to_user attribute from this JSON.
I have tried using #[json:to_user] but it simply prints it as a string and doesn't return any value.
Please help. Thanks in advance.
The right expression based on your sample JSON is:
#[json:data[0]/to_user]
The JsonPath expressions are deprecated now, and you will not even find much documentation on them.
So currently you need to use either
<json:json-to-object-transformer returnClass="java.lang.Object" doc:name="JSON to Object" />
or
<json:json-to-object-transformer returnClass="java.util.HashMap" doc:name="JSON to Object" />
or even
<json:json-to-object-transformer returnClass="java.util.List" doc:name="JSON to Object" />
to extract data from the JSON, depending on its structure.
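For example, a sketch of the HashMap variant followed by a MEL expression that reads to_user (the variable name is just illustrative):

<json:json-to-object-transformer returnClass="java.util.HashMap" doc:name="JSON to Object" />
<!-- after the transformer, payload.data is a list of maps, so index into it and read the key -->
<set-variable variableName="toUser" value="#[payload.data[0].to_user]" doc:name="Set toUser" />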