Filter json objects

Some of my logs contain JSON in their message field. I use the json filter as follows:
json {
  skip_on_invalid_json => true
  source => "message"
  target => "json"
}
This tries to parse the message field and, if it contains valid JSON, adds the parsed result to the json field.
Unfortunately, from time to time I receive logs whose message field contains just a single string like "some random message". In these logs the string from message ends up in the json field and messes up the index mapping.
I tried to filter this out by adding:
prune {
  blacklist_values => { "json" => "/.+/" }
}
But this seems to always remove the json field.
Is there a way to parse the message field, or to keep the json field only when it contains an object and not a single string?

You could do it using a ruby filter that tests the type of the field you are interested in:
ruby {
  code => '
    s = event.get("json")
    # drop [json] when the parsed value is a bare string
    if s and s.instance_of? String
      event.remove("json")
    end
  '
}
That will not remove [json] if it is a hash or an array.
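Putting the question and the answer together, a minimal sketch of the combined filter block (field names exactly as above):
filter {
  # parse "message" into "json" when it holds valid JSON
  json {
    skip_on_invalid_json => true
    source => "message"
    target => "json"
  }
  # a quoted scalar such as "some random message" is itself valid JSON
  # and parses successfully, so drop "json" again when the result is a string
  ruby {
    code => '
      s = event.get("json")
      if s and s.instance_of? String
        event.remove("json")
      end
    '
  }
}
Note that a bare JSON number or boolean would still slip through this check; testing s.is_a?(Hash) || s.is_a?(Array) and removing everything else would be stricter.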

Related

How do I PUT and GET JSON object of ANY format in a Spring Boot REST API using MySQL?

I need to be able to PUT JSON data from Postman, with no fixed format, store it in the database (in MySQL, preferably with datatype JSON) and then send a GET request to get the same value back. The data can be uniquely identified by an id that I'll be sending as a path variable.
I cannot define an entity class as the JSON will have no fixed format. How do I proceed?
@PutMapping("/sample/{sampleNo}")
public ResponseEntity<Object> addSampleData(
        @ApiParam("Sample to PUT") @PathVariable Long sampleNo, @RequestBody String sampleBody) {
    if (sampleBody != null) {
        sampleService.save(new Sample(sampleNo, sampleBody));
        return new ResponseEntity<>(HttpStatus.OK);
    } else {
        return new ResponseEntity<>(HttpStatus.BAD_REQUEST);
    }
}
Not sure how to proceed with the "save" functionality so that the data is stored as a JSON object, because even though I've used datatype JSON in MySQL, GET returns a string with escaped quotes (\"):
{
  "sampleNo": 1,
  "sampleData": "{\"sample\": \"test\", \"sampleNo\": \"20\"}"
}
Example PUT request body:
{
  "field1": "test",
  "field2": 20
}
Expected GET response body:
{
  "field1": "test",
  "field2": 20
}
P.S.: I don't need to process the data, so it doesn't matter how I store it; I just need to be able to GET it back in the same format (JSON) in which it arrived in the PUT request.
sampleData is returned as a JSON string, so it is expected to contain the escape sequence \":
This is a valid JSON String:
String json = "{\"sample\": \"test\", \"sampleNo\": \"20\"}";
This does not compile:
String invalidJson = "{"sample": "test", "sampleNo": "20"}";
Solution:
In order to have the expected response, you have to map your MySQL JSON column to an object.
There are several ways to achieve this; take a look at a solution here. It maps the JSON column into a Map<String, Object>.
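A minimal sketch of that idea with Jackson, to be dropped into the same controller (the getSampleData accessor and the sampleService lookup are assumptions based on the snippet above):
import java.util.Map;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

@GetMapping("/sample/{sampleNo}")
public ResponseEntity<Map<String, Object>> getSampleData(@PathVariable Long sampleNo)
        throws JsonProcessingException {
    // hypothetical lookup of the stored row
    Sample sample = sampleService.findBySampleNo(sampleNo);
    // deserialize the raw JSON string column into a Map so Spring
    // serializes it back as a real JSON object, not an escaped string
    Map<String, Object> body = new ObjectMapper().readValue(
            sample.getSampleData(), new TypeReference<Map<String, Object>>() {});
    return new ResponseEntity<>(body, HttpStatus.OK);
}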

How to validate doctrine type "json" with symfony json constraint in a form correctly?

So, I'd like to enter some JSON into a form for it to be validated by Symfony's Json constraint:
/**
 * @Assert\Json(
 *     message = "variantJson field: invalid Json."
 * )
 * @ORM\Column(type="json", nullable=true)
 */
private $variantJson = [];
The form looks kinda like this:
$builder
    ...
    ->add('variantJson', null, ['attr' => $style])
    ->addEventListener(FormEvents::PRE_SET_DATA, function (FormEvent $event) {
        ...
    })
;
$builder->get('variantJson')
    ->addModelTransformer(new CallbackTransformer(
        function ($jsonToString) {
            // transform the array to a string
            return json_encode($jsonToString);
        },
        function ($stringToJson) {
            // transform the string back to an array
            dump(json_decode($stringToJson, true));
            dump(json_last_error());
            // 1
            return $stringToJson;
            // 2
            return json_decode($stringToJson, true);
        }
    ))
;
The main problem is that when I only return the JSON string in the model transformer (variant 1 above), I get this exception:
Expected argument of type "array or null", "string" given at property path "variantJson".
at the PropertyAccessor.
And when I return it as an array via json_decode (variant 2), I get a different error:
Expected argument of type "string", "array" given
at the JsonValidator.
My suspicion is that the PropertyAccessor and the JsonValidator run in series, and each needs a different type.
I must be missing something. Any ideas? Thanks in advance!
The Doctrine type JSON does indeed expect an array:
https://www.doctrine-project.org/projects/doctrine-dbal/en/2.10/reference/types.html#json
This is useful if you have the need for an Array in your code and want to save that data in the database. The data gets converted to a JSON string before saving and back to an Array when retrieved.
If you don't need the array in your PHP code, and your end user is submitting JSON that merely needs to be validated, you should probably just save it as (LONG)TEXT.
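A minimal sketch of that approach, in the same annotation style as above; with a plain text column $variantJson holds the raw string, so the model transformer is no longer needed:
/**
 * @Assert\Json(
 *     message = "variantJson field: invalid Json."
 * )
 * @ORM\Column(type="text", nullable=true)
 */
private $variantJson;
The PropertyAccessor now receives the string it expects, and the Json constraint validates that same string, so the two no longer demand different types.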

Logstash json filter parsed fields cannot be read within logstash

I am parsing a JSON file with codec => json in the input and json { source => "message" } in the filter.
I have also tried alternating the two.
The parsed fields cannot be read by Logstash using if [comment]. This does not work despite being able to see the fields and their values with stdout { codec => rubydebug } as the output.
I just found out that the fields I am trying to work with are actually subfields. I was trying to access them like top-level fields.
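For reference, nested fields have to be addressed by their full path in conditionals. A sketch, assuming (hypothetically) that the parsed JSON looks like {"log": {"comment": "hello"}}:
filter {
  json {
    source => "message"
  }
  # [log][comment] addresses the nested key; a bare [comment] only
  # matches a top-level field, which is why the conditional never matched
  if [log][comment] {
    mutate { add_tag => [ "has_comment" ] }
  }
}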

Get json data from var

I get the following data into a variable fields:
{ data: [ '{"myObj":"asdfg"}' ] }
How do I get the value of myObj into another variable? I tried fields.myObj.
I am trying to upload a file to a server using MEANjs and node multiparty.
Look at your data.
fields only has one property: data. So fields.myObj isn't going to work.
So, let's start with fields.data.
The value of that is an array. You can see the []. It has only one member, so:
fields.data[0]
This is a string. You seem to want to treat it as an object. It happens to conform to the JSON syntax, so you can parse it:
JSON.parse(fields.data[0])
This parses into an object, so now you can access the myObj property.
JSON.parse(fields.data[0]).myObj
var fields = { data: [ '{"myObj":"asdfg"}' ] };
alert(JSON.parse(fields.data[0]).myObj);

Logstash remove field after JSON-parsing it

Using Logstash 1.4.2 with ElasticSearch 1.3 (I'm aware it's not the latest ES version available) on Ubuntu 14.04 LTS.
We have an event stream which contains JSON inside one of its fields named "message".
We'd like to replace the event fields by the JSON of that field if it's found.
We'd also like to remove the ORIGINAL "message" field (the one which contains the JSON string) if found and parsed.
The problem is that the JSON object inside the field's text could define a new "message" field, which we have to retain.
The following removes the "message" field always after parsing it:
json {
  source => "message"
  remove_field => [ "message" ]
}
Which is wrong, we want to keep it in case there was a "message" field inside the value of the original "message" field.
I tried to do the following trick, but it seems to still remove the "message" field from the result:
mutate {
  rename => [ "message", "___temp_logstash_filter_message___" ]
}
json {
  source => "___temp_logstash_filter_message___"
}
mutate {
  remove_field => [ "___temp_logstash_filter_message___" ]
}
i.e. I try to rename the original "message" field to an arbitrary internal name which I don't expect to appear in the input value, parse the JSON string using that temporary name as a source, then remove the renamed original field.
That way I was hoping to distinguish between the "original" message field and any "message" field which may be contained inside its JSON value. But this doesn't seem to make a difference - the "message" field is still missing from the result.
Is there a way to achieve what I need?
Thanks.
Instead of renaming the field, copy the original field into a new one.
This can be done with:
filter {
  ...
  ruby {
    # Logstash 1.x event API; newer releases use event.get / event.set
    code => "event['new_field'] = event['old_field']"
  }
  ...
}
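Building on that, a sketch of a complete filter for Logstash 1.4.x (json_raw is just an illustrative name, and message is assumed to always hold valid JSON, as in the question):
filter {
  # keep a copy of the raw JSON string
  ruby {
    code => "event['json_raw'] = event['message']"
  }
  # parse in place: an inner "message" key overwrites the top-level field
  json {
    source => "message"
  }
  # drop "message" only when it still holds the raw string, i.e. the
  # parsed JSON did not define its own "message" key
  ruby {
    code => "event.remove('message') if event['message'] == event['json_raw']"
  }
  mutate {
    remove_field => [ "json_raw" ]
  }
}
If invalid JSON can occur, the removal step would also need a guard on the _jsonparsefailure tag so an unparsed message is not thrown away.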