Logstash not parsing all messages - JSON

I'm trying to parse syslog messages with Logstash. I'm using this grok match:
"message", "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{WORD:module}\] \[%{LOGLEVEL:severity}\] (\[tid=%{DATA:tid}(\s)?\]) (\[%{DATA:auth}:%{NUMBER:linenum}:%{DATA:method}\]) %{GREEDYDATA:data}"
and I'm getting this error message:
[logstash.filters.json ] Error parsing json {:source=>"message", :raw=>"[2017-03-27 09:18:03,071] [WS-Server] [INFO] [WebSocketsHandler:81:on_message] [ON_MESSAGE] [140609651632016] [trn_id: 1062fed9-9ae3-4523-8817-657031e83af1] Received update-device-details message from box 38:b8:eb:50:00:a9", :exception=>#
What am I missing?
Thanks

It's likely because this:
[2017-03-27 09:18:03,071] [WS-Server] [INFO] [WebSocketsHandler:81:on_message] [ON_MESSAGE] [140609651632016] [trn_id: 1062fed9-9ae3-4523-8817-657031e83af1] Received update-device-details message from box 38:b8:eb:50:00:a9
Is not valid JSON. Your hint is this in the error:
[logstash.filters.json ] Error parsing json
That tells me a json {} filter is being applied to this event in addition to the grok {} filter you've quoted. I suggest you review your filter { } statements and figure out how the raw syslog message (very likely still in the message field) is ending up parsed by a json {} filter. Grok has nothing to do with this error.
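A minimal sketch of the kind of pipeline that produces this error (the filters shown are assumptions standing in for your real config). Guarding the json filter with a conditional keeps it from choking on plain syslog lines:
filter {
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{WORD:module}\] \[%{LOGLEVEL:severity}\] %{GREEDYDATA:data}" }
  }
  # Only attempt JSON parsing when the message actually looks like JSON;
  # otherwise the json filter logs "Error parsing json" for every syslog line.
  if [message] =~ /\A\{/ {
    json {
      source => "message"
    }
  }
}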

Related

How to get an error message in the return response using JSON in Postman

How do I check that only one value is accepted when the request is submitted? If both values are passed, then send an error message using JSON. Either RegNumber or PAN should be accepted. I am very new to this.
{
  "email_id": "sample@example.com",
  "RegNumber": "AM47",
  "FromDate": 062020,
  "ToDate": 062022,
  "SchemaCode": 560043,
  "PAN": "ABC44X"
}
I think by "get error in JSON response" you mean the app should return JSON when it crashes. To do that, use middleware: when you expect an error in the response, override that error response with your own custom JSON dict containing the error details.
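As a concrete illustration, a minimal sketch assuming an Express-style Node.js API (the framework, route, and port are assumptions, not from the question): reject the request with a JSON error whenever both RegNumber and PAN are supplied, or neither is.
const express = require('express');
const app = express();
app.use(express.json());

// Middleware: accept exactly one of RegNumber / PAN, otherwise answer
// with a descriptive JSON error instead of letting the request through.
function requireExactlyOne(req, res, next) {
  const hasReg = 'RegNumber' in req.body;
  const hasPan = 'PAN' in req.body;
  if (hasReg === hasPan) {  // both present, or both missing
    return res.status(400).json({ error: 'Provide exactly one of RegNumber or PAN' });
  }
  next();
}

app.post('/register', requireExactlyOne, (req, res) => {
  res.json({ status: 'accepted' });
});

app.listen(3000);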

Getting an error while sending a JSON payload in Quartz scheduler

I'm trying to send a JSON payload in Quartz scheduler. I have the header set to Content-Type: application/json, but for some reason my JSON string is throwing an error: Uncaught error, unexpected token in JSON.
The original json that I'm sending to a graphql service looks like this:
{
  GetAllAuthors {
    id
    name
  }
}
But to make it work in quartz, I need to mimic a rest API call, which is why I tried using the following:
{ "query":{{""{\nGetAllAuthors {\nid\nname\n}\n\n}""}} }
The above is also giving me the "Uncaught error, unexpected token in json" error. Is there something that I'm missing or overlooking?
PS: I tried using an online json formatter and when I try to validate the above json, I get the following error:
Error: Parse error on line 2:
{ "query": { { "" {\ nGetA
--------------^
Expecting 'STRING', '}', got '{'
That's not valid JSON. This is how it might look if it were:
{
  "GetAllAuthors": [
    "id",
    "name"
  ]
}
but I suspect you're trying for something like this:
{
  "GetAllAuthors": {
    "id": 123,
    "name": "James Brown"
  }
}
Play here until you get it right: https://jsonlint.com/
Edit: I've not worked with GraphQL, but this page shows how a (non-JSON) GraphQL query might be transferred over HTTP by POSTing JSON or GETting using query strings: https://graphql.org/learn/serving-over-http/
I figured this out by testing through Mozilla's network tab - the correct format is:
{"query":"{\n GetAllAuthors{\n id\n name\n}\n}\n","variables":null, "operationName":null}

PutElasticsearchHttpRecord: Invalid char between encapsulated token and delimiter

I am trying to fetch a specific record from a database using QueryDatabaseTable -> UpdateAttribute -> PutElasticSearchHttpRecord.
The PutElasticSearchHttpRecord processor is throwing the error java.io.IOException: Invalid char between encapsulated token and delimiter.
Please find my config attached. How do I fix this?
I am getting the correct result in the queue after UpdateAttribute but am not able to push it into Elasticsearch. I have set the schema.name property to the appropriate schema.
The following is the correct result I am getting in the queue after the UpdateAttribute processor. How do I fix the "Invalid char between encapsulated token and delimiter" error?
[ {
  "TimeOfDay" : "2018-09-20T18:10:36.941",
  "BMU_Debug_Pack_BlkVolt_Max2" : 4114.0,
  "BMU_Debug_Pack_BlkVolt_Max1" : 4114.0,
  "BMU_Debug_Pack_BlkVolt_Max3" : 4114.0,
  "BMU_Debug_Pack_BlkVolt_Max0" : 4116.0,
  "BMU_Debug_Pack_CTemp_Min" : 21.0,
  "BMU_Debug_Pack_CurrVolt_Curr" : 2.0,
  "BMU_Debug_Pack_Blk_Volt_Delta" : 6.0,
  "total_Difference" : 15.0
} ]
Thank you! Please help: what should I change?
You need to configure an Avro Reader instead of a CSV Reader in PutElasticSearchHttpRecord, as the QueryDatabaseTable processor outputs flowfiles in Avro format.
Use the embedded Avro schema in the Avro Reader controller service.

Logstash JSON serialization fails on valid JSON (mapper_parsing_exception)

Given the following multiline log
{
  "code" : 429
}
And the following pipeline logstash.conf
filter {
  grok {
    match => {
      "message" => [
        "%{GREEDYDATA:json}"
      ]
    }
  }
  json {
    source => "json"
    target => "json"
  }
}
When the log is sent into Logstash through Filebeat
Then Logstash returns
[2018-08-07T10:48:41,067][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-to-logstash", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2bf7b08d>], :response=>{"index"=>{"_index"=>"filebeat-to-logstash", "_type"=>"doc", "_id"=>"trAAFGUBnhQ5nUWmyzVg", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [json]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:3846"}}}}}
This is incorrect behavior, as the JSON is perfectly valid. How should this be solved?
I found out that in Logstash 6.3.0 this problem occurs when one tries to serialize JSON into a field named "json". Changing this field name to anything else solves the issue.
Since the Elastic JSON filter plugin documentation does not mention anything about this behaviour and the error message is inaccurate, it can be assumed this is a bug.
A bug report has been filed: https://github.com/elastic/logstash/issues/9876
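A minimal sketch of the workaround (the field names raw_json and parsed are arbitrary choices; anything other than "json" should do):
filter {
  grok {
    match => {
      "message" => [
        "%{GREEDYDATA:raw_json}"
      ]
    }
  }
  json {
    source => "raw_json"
    # Targeting any field not named "json" avoids the mapper_parsing_exception.
    target => "parsed"
  }
}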

Postman: More descriptive tv4 validation error message

I'm using Postman to validate the schema of JSON data returned from an API.
I have a test that runs through basic HTTP validation, then ends with:
if (tv4.error) {
  console.log("Validation failed: ", tv4.error);
}
The error I get back is difficult to fathom.
Validation failed: 12:22:41.316
Object:{}
message:"Invalid type:
number (expected string)"
name:"ValidationError"
type:"Error"
But I need to know which field the validation failed on. How can I get this info? The npm page for tv4 suggests that the error message should be more descriptive.
According to the tv4 documentation, you can print the path of the error location using console.log(tv4.error.dataPath); I have no idea why this attribute is not logged in the console by default.
The relevant section in the documentation is:
If validation returns false, then an explanation of why validation failed can be found in tv4.error.
The error object will look something like:
{
  "code": 0,
  "message": "Invalid type: string",
  "dataPath": "/intKey",
  "schemaPath": "/properties/intKey/type"
}
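Putting that together in a Postman test, a sketch (the schema object here is a placeholder; substitute the one you are validating against):
// Hypothetical schema -- replace with your own.
var schema = {
  "type": "object",
  "properties": { "intKey": { "type": "string" } }
};
var data = JSON.parse(responseBody);
if (!tv4.validate(data, schema)) {
  // dataPath pinpoints the field that failed, e.g. "/intKey".
  console.log("Validation failed at", tv4.error.dataPath, "-", tv4.error.message);
}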