Logging in JSON - json

I would like to log messages in JSON format from within Java. I would like the conveniences of a mature logging framework like log4j, such as hierarchical loggers and method names, but I would also like to output additional key-value pairs in the JSON object.
I am looking for output similar to this:
{"time":"123", "level":"debug", "action":"open", "filename":"bla.txt"}
{"time":"432", "level":"info", "action":"calculate", "result":"353"}
If I use log4j and reformat its output, I cannot get the automatic values (the timestamp, for example) into the same object as the logged values.
Is there a logging framework or plugin to solve this?

I started using this project and it works great:
https://github.com/michaeltandy/log4j-json
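To get per-message key-value pairs alongside the automatic fields (timestamp, level, logger), one option is to put them into the MDC before logging. This is only a sketch with made-up class and field names, and it assumes the JSON layout you configure serializes MDC entries into each record (log4j-json does this for MDC entries, as far as I can tell):

import org.apache.log4j.Logger;
import org.apache.log4j.MDC;

public class FileOpener {
    private static final Logger LOG = Logger.getLogger(FileOpener.class);

    public void open(String filename) {
        // Extra key-value pairs for this log statement go into the MDC;
        // a JSON layout that serializes MDC entries will emit them next to
        // the automatic fields (timestamp, level, logger name, ...).
        MDC.put("action", "open");
        MDC.put("filename", filename);
        try {
            LOG.debug("opening file");
        } finally {
            // Remove them so they do not leak into unrelated log statements.
            MDC.remove("action");
            MDC.remove("filename");
        }
    }
}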

Related

Logging Additional Fields in Quarkus with JSON

I am currently attempting to change our log format in Quarkus from plain strings to JSON, with some additional fields that are important for our monitoring and data analysis in Elastic/Kibana.
So far I have added this dependency as specified in the official documentation:
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-logging-json</artifactId>
</dependency>
https://quarkus.io/guides/logging
That changed the log format from a plain string to full JSON.
For example:
{"timestamp":"2022-09-05T13:30:09.314+01:00","sequence":24441,"loggerClassName":"org.jboss.logging.Logger","loggerName":"org.com.Controller","level":"INFO","message":"Test","threadName":"executor-thread-0","threadId":354,"mdc":{},"ndc":"","hostName":"hostname","processName":"test.jar","processId":9552}
My question is: how do I add additional fields to this log output? For instance, I need to add an additional JSON field called 'pattern' with a value extracted from the code each time. The final JSON output should look like this:
{"timestamp":"2022-09-05T13:30:09.314+01:00","sequence":24441,"loggerClassName":"org.jboss.logging.Logger","loggerName":"org.com.Controller","level":"INFO","message":"Test","threadName":"executor-thread-0","threadId":354,"mdc":{},"ndc":"","hostName":"hostname","processName":"test.jar","processId":9552, "pattern" :"test-pattern"}
I tried the following as specified in the documentation:
quarkus.log.file.json.additional-field.pattern.value=test-value
quarkus.log.file.json.additional-field.pattern.type=string
But this didn't show anything, and I'm not sure how to set the value programmatically.
Example configuration:
quarkus.log.console.json.additional-field."EXTRA".value=test-value
quarkus.log.console.json.additional-field."EXTRA".type=string
quarkus.log.file.json.additional-field."EXTRA".value=test-value
quarkus.log.file.json.additional-field."EXTRA".type=string
Note that the field name should be wrapped in double quotes. Example output:
{"timestamp":"2022-09-18T14:37:37.687+01:00","sequence":1548,"loggerClassName":"org.jboss.logging.Logger","loggerName":"org.acme.GreetingResource","level":"INFO","message":"Hello","threadName":"executor-thread-0","threadId":101,"mdc":{},"ndc":"","hostName":"mintozzy-mach-wx9","processName":"code-with-quarkus-dev.jar","processId":133458,"EXTRA":"test-value"}
For a full working example, check:
You might be able to solve your problem with the Quarkiverse logging JSON extension, using its JsonProvider:
https://github.com/quarkiverse/quarkus-logging-json
The Quarkiverse extension is much more flexible and extensible than the standard Quarkus JSON logging package, since you can add fields to the JSON log programmatically instead of in hard-coded configuration.
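A rough sketch of what such a provider could look like, based on the example in the extension's README; the package names, the JsonProvider/JsonGenerator types and the writeTo signature are assumptions to verify against the version you actually use:

import java.io.IOException;
import javax.inject.Singleton;
import org.jboss.logmanager.ExtLogRecord;
import io.quarkiverse.loggingjson.JsonGenerator;
import io.quarkiverse.loggingjson.JsonProvider;

// Registered as a CDI bean; the extension discovers it and calls it for every log record.
@Singleton
public class PatternJsonProvider implements JsonProvider {

    @Override
    public void writeTo(JsonGenerator generator, ExtLogRecord event) throws IOException {
        // Adds a custom top-level field to every JSON log line; the value could
        // just as well come from configuration or from a request-scoped context.
        generator.writeStringField("pattern", "test-pattern");
    }
}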

Convert and Transform JSON HTTP request to XML

I need to create a Logic Apps workflow with three steps:
When HTTP Request is received (JSON)
Convert JSON from request to XML
Save XML file to FTP
What I have done so far:
Add action "When HTTP Request is received"
Add Liquid to convert JSON to XML (but I don't see an option for JSON to XML, only Transform JSON to JSON, JSON to TEXT, XML to JSON, and XML to TEXT)
Add action "FTP - Create file"
I also created an Integration Account and tried to add a map for mapping JSON to XML, but I can't find any examples/templates for this.
Is it possible at all? Maybe there is another way to convert between these two formats?
When you just want to convert a JSON payload to an XML file, without doing any transformation to the data, you can use the built-in xml() function of the Workflow Definition Language.
Detailed info in the docs: Workflow Definition Language reference #xml
I've made a small test Logic App to demo your use case.
I use the xml function on the trigger body, @xml(triggerBody()), as the input for my FTP file content.
Remark: this will only work if your JSON message has a single root node; otherwise the xml conversion will fail with this error:
The provided value cannot be converted to XML: 'JSON root object has multiple properties. The root object must have a single property in order to create a valid XML document. Consider specifying a DeserializeRootElementName.
You can work around that by concatenating a root node onto your JSON payload. The function would then look like: @xml(json(concat('{\"rootnode\":',triggerBody(),'}')))
Good luck testing this out. Let me know if you need more help with this.
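To make the root-node remark concrete, here is a made-up illustration of what the wrapping and the xml() conversion would produce (the values are hypothetical):

Incoming JSON body:       {"name":"bla.txt","size":42}
After the concat wrapper: {"rootnode":{"name":"bla.txt","size":42}}
Resulting XML:            <rootnode><name>bla.txt</name><size>42</size></rootnode>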

JSON to JSON Mapper

We need a JSON mapper from Type-A to Type-B (i.e. JSON string to JSON string). I'm aware of ESB tools that have XML-to-XML mapping, like IBM ESB.
Is there any open source tool or paid application that:
has an editor to map one JSON format to another, with the capability to do some basic operations like formatting, etc.
can expose this transformation as a REST service
and, if need be, can extract the transformation logic into a JAR file that other teams can use?
Thanks.
Manjesh,
I have good news for you. There is indeed an open source program that will accomplish this for you: Talend Open Studio (TOS) for ESB (not to be confused with their TOS for Data Integration). Any ESB tool should do this quite easily. See below:
Image 1 shows SoapUI calling the REST service: the JSON prefix: team1, team: Giants is sent in, and I return prefix: Cowboys are better than, team: Giants. I could have done other manipulations (including changing the JSON structure) but put together a simple example.
The next image shows the REST service implementation within Talend.
Finally, I show the internals of the component (tXMLMap_2), where I manipulate the JSON data.
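For comparison only, the same Type-A to Type-B mapping from the demo can be sketched in plain Java with Jackson. This is not the Talend component, just an illustration of the transformation it performs, with field names taken from the example above:

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class TeamMapper {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Maps {"prefix":"team1","team":"Giants"}
    // to   {"prefix":"Cowboys are better than","team":"Giants"}.
    public static String map(String inputJson) throws Exception {
        ObjectNode in = (ObjectNode) MAPPER.readTree(inputJson);
        ObjectNode out = MAPPER.createObjectNode();
        out.put("prefix", "Cowboys are better than");
        out.set("team", in.get("team"));
        return MAPPER.writeValueAsString(out);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(map("{\"prefix\":\"team1\",\"team\":\"Giants\"}"));
    }
}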

Mustache template for Swagger codegen static documentation - responseMessages

I am trying to use swagger-codegen to generate static docs.
The docs are generated from Mustache templates that are included in the project.
When I run it with the sample JSON from the Wordnik Swagger api-docs, it generates everything perfectly (every API gets its own file, like Pet.html or User.html), but when I run it with similar JSON of mine, it generates only one operations file containing all the methods of my REST API.
The Wordnik JSON response can be found at: wordnik JSON api
My API response looks like this:
{"apiVersion":"1.0","swaggerVersion":"1.2","apis":[{"path":"/default/countries","description":"Operations about countries"},{"path":"/default/gateways","description":"Operations on payment gateways"},{"path":"/default/location","description":"Operations about locations"},{"path":"/default/mccs","description":"Operations about MCCs"},{"path":"/default/merchants","description":"Operations about merchants"},{"path":"/default/partners","description":"Operations about partners"},{"path":"/default/payments","description":"Operations about payments"},{"path":"/default/resources","description":"Operations about resources"},{"path":"/default/terminals","description":"Operations about terminals"},{"path":"/default/terminalsubsetdefaultresourceset","description":"Operations about terminalSubsetDefaultResourceSet"},{"path":"/default/users","description":"Operations about users"}],"info":{"title":"my API","description":"","termsOfServiceUrl":"","contact":"","license":"","licenseUrl":""}}
Also, I would like to extract the responseMessages codes for every operation that has them in the JSON. I tried adding
{{#ResponseMessages}}
<h3 class="responseMessages">{{message}}</h3>
{{/ResponseMessages}}
to operations.model, but it doesn't work (neither with my API nor with Wordnik). I have JSON similar to this: JSON with responseCodes
You should be able to use the following (at least it works for me), based on the current version of Codegen.scala
{{#errorList}}
{{code}} {{reason}} {{responseModel}}
{{/errorList}}

SpringXD JSON parser to Oracle DB

I am trying to use SpringXD to stream some JSON metrics data to an Oracle database.
I am using this example from here: SpringXD Example
HTTP call being made: EarthquakeJsonExample
My shell command:
stream create earthData --definition "trigger|usgs| jdbc --columns='mag,place,time,updated,tz,url,felt,cdi,mni,alert,tsunami,status,sig,net,code,ids,souces,types,nst,dmin,rms,gap,magnitude_type' --driverClassName=driver --username=username --password --url=url --tableName=Test_Table" --deploy
I would like to capture just the properties portion of this JSON response into the given table columns. I got it to the point where it doesn't give me an error on the hashing, but instead it just deposits a bunch of nulls into the columns.
I think my problem is the parsing of the JSON itself, since the properties are really inside the features array. Can SpringXD handle this for me out of the box, or will I need to write a custom processor?
Here is what the database looks like after a successful command.
Any advice? I'm new to parsing JSON in this fashion and I'm not really sure how to find more documentation or examples for SpringXD itself.
Here is a reference to the documentation: SpringXD Doc
The transformer in the JDBC sink expects a simple document that can be converted to a map of keys/values. You would need to add a transformer upstream, perhaps in your usgs processor or even in a separate processor. You could use a #jsonPath expression to extract the properties key and make it the payload.
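For illustration, a transform processor could be inserted before the jdbc sink, assuming the USGS payload follows the GeoJSON shape where each element of features has a properties object. This sketch only pulls the first feature's properties (a splitter on $.features would be needed to handle them all), the column list is abbreviated, and the shell quoting may need adjusting:

stream create earthData --definition "trigger | usgs | transform --expression='#jsonPath(payload, ''$.features[0].properties'')' | jdbc --columns='mag,place,time,updated,tz,url' --driverClassName=driver --username=username --password --url=url --tableName=Test_Table" --deploy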