Logstash: wrap a JSON object

I would like to wrap a JSON object sent to Logstash via the http input plugin so I can forward it to Splunk. I get an object that looks like:
{
  "foo": "text",
  "bar": "text"
}
and I need it to look like:
{
  "event": {
    "foo": "text",
    "bar": "text"
  }
}
I just cannot find how to reference that top-level object.
config:
input {
  http {
    port => 8080
    codec => "json"
  }
}
filter {
  mutate {
    rename => { "WHAT GOES HERE???" => "event" }
  }
}
output {
  stdout { codec => json }
}
Thanks so much!

You can use rename or add_field.
With rename:
mutate {
  rename => {
    "foo" => "[event][foo]"
    "bar" => "[event][bar]"
  }
}
With add_field:
mutate {
  add_field => {
    "[event][foo]" => "%{[foo]}"
    "[event][bar]" => "%{[bar]}"
  }
}
If you use add_field, the original foo and bar fields are kept on the event; you can remove them with remove_field:
mutate {
  remove_field => [ "foo", "bar" ]
}
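Putting it together with the http input from the question, a minimal sketch of the rename approach (the port and codec are taken from the original config):
input {
  http {
    port => 8080
    codec => "json"
  }
}
filter {
  mutate {
    # move the two top-level keys under the "event" wrapper
    rename => {
      "foo" => "[event][foo]"
      "bar" => "[event][bar]"
    }
  }
}
output {
  stdout { codec => json }
}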

Another elegant way is to use the json filter's target option. Note this assumes the raw JSON body still arrives in the message field, i.e. you would drop the json codec on the http input:
filter {
  json {
    source => "message"
    target => "event"
  }
}
https://www.elastic.co/guide/en/logstash/current/plugins-filters-json.html#plugins-filters-json-target
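A minimal end-to-end sketch of that approach, assuming you drop the json codec so the raw body stays in [message]:
input {
  http {
    port => 8080
    # no json codec: the raw request body is kept in [message]
  }
}
filter {
  json {
    source => "message"
    target => "event"              # nest the parsed object under [event]
    remove_field => [ "message" ]  # drop the raw string once parsed
  }
}
output {
  stdout { codec => json }
}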


Logstash json filter: parsing a JSON string inside a JSON message

My input is JSON that contains another JSON document as a string, and I want to parse the [value][value] field into a valid JSON object (not a string):
{
  "partitionId": 3,
  "value": {
    "value": "{\"system\":\"system\",\"route\":\"route\",\"type\":\"Z9\"}"
  },
  "valueType": "VARIABLE"
}
And this filter, which does not work for me:
filter {
  json {
    source => "message"
  }
  if [valueType] == "VARIABLE" {
    json {
      source => "[value][value]"
      target => "values"
    }
  }
  mutate {
    remove_field => [ "timestamp", "kafka" ]
  }
}
How can I parse this?
I think you just missed it :) With target => "message" on the first json filter, the whole parsed document is nested under [message], so the inner string lives at [message][value][value]:
input {
  stdin { }
}
filter {
  json {
    source => "message"
    target => "message"
  }
  if [message][valueType] == "VARIABLE" {  # valueType is nested under [message] as well
    json {
      source => "[message][value][value]"  # you missed it here
      target => "[value]"
    }
  }
}
output { stdout { codec => rubydebug } }
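With the sample document above, [value] should end up holding the decoded object, roughly (an abbreviated sketch of the rubydebug output):
"value" => {
  "system" => "system",
  "route" => "route",
  "type" => "Z9"
}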

Removing a JSON object name using Logstash

I'm trying to parse the following JSON using Logstash:
{
  "payload": "{\"class\":\"OpenLoyalty\\\\Component\\\\Account\\\\Domain\\\\Event\\\\PointsTransferHasBeenExpired\",\"payload\":{\"accountId\":\"1cbd42b8-1b1f-4a13-a28f-eefeb73a64b0\",\"pointsTransferId\":\"e82c96cf-32a3-43bd-9034-4df343e5f111\"}}"
}
using this config:
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/openloyalty"
    jdbc_user => "openloyalty"
    jdbc_password => "openloyalty"
    jdbc_driver_library => "C:\logstash\postgresql-42.2.12.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT payload from events where events.id=130;"
  }
}
filter {
  json {
    source => "payload"
    remove_field => ["class"]
  }
}
output {
  stdout { codec => json_lines }
}
I get this output:
{
  "@version": "1",
  "@timestamp": "2020-05-12T11:27:15.097Z",
  "payload": {
    "accountId": "1cbd42b8-1b1f-4a13-a28f-eefeb73a64b0",
    "pointsTransferId": "e82c96cf-32a3-43bd-9034-4df343e5f111"
  }
}
However, what I'm looking for is like this, with the nested fields at the top level:
{
  "@version": "1",
  "@timestamp": "2020-05-12T11:27:15.097Z",
  "accountId": "1cbd42b8-1b1f-4a13-a28f-eefeb73a64b0",
  "pointsTransferId": "e82c96cf-32a3-43bd-9034-4df343e5f111"
}
How can I remove the "payload" name from my JSON object?
You can use the mutate filter with the following config after your json filter.
mutate {
  rename => {
    "[payload][accountId]" => "accountId"
    "[payload][pointsTransferId]" => "pointsTransferId"
  }
  remove_field => ["[payload]"]
}
This renames your fields and then removes the now-empty payload object.
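If you would rather not list every field by name, a ruby filter can promote whatever keys exist under payload to the top level. This is a generic sketch, not part of the original answer:
ruby {
  code => '
    payload = event.get("payload")
    if payload.is_a?(Hash)
      # copy each nested key to the event root, then drop the wrapper
      payload.each { |k, v| event.set(k, v) }
      event.remove("payload")
    end
  '
}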

Logstash cannot extract json key

I need help with a Logstash filter that extracts a JSON key/value into a new field. The following is my Logstash conf:
input {
  tcp {
    port => 5044
  }
}
filter {
  json {
    source => "message"
    add_field => {
      "data" => "%{[message][data]}"
    }
  }
}
output {
  stdout { codec => rubydebug }
}
I have tried with mutate:
filter {
  json {
    source => "message"
  }
  mutate {
    add_field => {
      "data" => "%{[message][data]}"
    }
  }
}
I have tried with . instead of []:
filter {
  json {
    source => "message"
  }
  mutate {
    add_field => {
      "data" => "%{message.data}"
    }
  }
}
I have tried with an index number:
filter {
  json {
    source => "message"
  }
  mutate {
    add_field => {
      "data" => "%{[message][0]}"
    }
  }
}
All with no luck. :(
The following json is sent to port 5044:
{"data": "blablabla"}
The problem is that the new field cannot extract the value from the JSON key:
"data" => "%{[message][data]}"
The following is my stdout:
{
  "@version" => "1",
  "host" => "localhost",
  "type" => "logstash",
  "data" => "%{[message][data]}",
  "path" => "/path/from/my/app",
  "@timestamp" => 2019-01-11T20:39:10.845Z,
  "message" => "{\"data\": \"blablabla\"}"
}
However if I use "data" => "%{[message]}" instead:
filter {
  json {
    source => "message"
    add_field => {
      "data" => "%{[message]}"
    }
  }
}
I get the whole JSON string in data:
{
  "@version" => "1",
  "host" => "localhost",
  "type" => "logstash",
  "data" => "{\"data\": \"blablabla\"}",
  "path" => "/path/from/my/app",
  "@timestamp" => 2019-01-11T20:39:10.845Z,
  "message" => "{\"data\": \"blablabla\"}"
}
Can anyone please tell me what I did wrong? Thank you in advance.
I use the docker-elk stack, ELK_VERSION=6.5.4.
add_field adds fields when the filter succeeds; many filters have this option, but it only does sprintf substitution, it does not parse anything. After json { source => "message" } runs, the parsed keys land at the top level of the event while [message] remains the raw string, so %{[message][data]} never resolves. If you want the parsed JSON placed under a field, use target:
filter {
  json {
    source => "message"
    target => "data"  # parse into the data field
  }
}
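Conversely, if the goal is simply to have [data] at the event root, as in your sample, the json filter with no target already does that; a minimal sketch:
filter {
  json {
    # parses {"data": "blablabla"} so that [data] == "blablabla"
    # sits at the event root; no extra mutate/add_field is needed
    source => "message"
  }
}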

How to parse nested JSON data with logstash filter

I have a JSON file with data like this:
{
  "foo": "bar",
  "test": {
    "steps": [
      {
        "response_time": "100"
      },
      {
        "response_time": "101",
        "more_nested": [
          { "hello": "world" },
          { "hello2": "world2" }
        ]
      }
    ]
  }
}
I am using this Logstash config, which gives me the wrong result:
input {
  file {
    sincedb_path => ".../sincedb_path.txt"
    path => ".../test.json"
    type => "test"
    start_position => "beginning"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  stdout { codec => rubydebug }
}
How can I accomplish events with fields like steps.response_time: 100 and steps.more_nested.hello: world? I tried with ruby but it is not working. I am using Logstash 5.x.
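There is no accepted answer in this thread, but the usual culprit is that the file input emits one event per line, so a pretty-printed document never reaches the json filter in one piece. One common workaround is a multiline codec whose pattern never matches, so the whole file is collapsed into a single event, followed by a split filter to emit one event per step. A sketch under those assumptions (the NEVERMATCH pattern and the split field are illustrative, not from the original post):
input {
  file {
    sincedb_path => ".../sincedb_path.txt"
    path => ".../test.json"
    start_position => "beginning"
    codec => multiline {
      pattern => "^NEVERMATCH"   # never matches, so every line is
      negate => true             # appended to the previous one
      what => "previous"
      auto_flush_interval => 2   # flush the pending event after 2s
    }
  }
}
filter {
  json {
    source => "message"
  }
  split {
    field => "[test][steps]"     # one event per element of the array
  }
}
output {
  stdout { codec => rubydebug }
}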

Interpret locations from Keen.io JSON file in logstash filter

I'm trying to parse a JSON file from Keen.io with Logstash into Elasticsearch. The location and timestamp are stored in parameters like this:
{
  "result": [
    {
      "keen": {
        "timestamp": "2014-12-02T12:23:51.000Z",
        "created_at": "2014-12-01T23:25:31.396Z",
        "id": "XXXX",
        "location": {
          "coordinates": [-95.8, 36.1]
        }
      }
    }
  ]
}
My config currently looks like this:
input {
  file {
    path => ["test.json"]
    start_position => "beginning"
    type => "json"
  }
}
filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
}
output {
  stdout { codec => rubydebug }
}
How can I parse the "timestamp" and "location" fields so they are used for @timestamp and geoip.coordinates in Elasticsearch?
Update:
I've tried variations of this with no luck. The documentation is very basic - am I misunderstanding how to reference the JSON fields? Is there a way of adding debug output to help? I tried "How to debug the logstash file plugin" and "Print a string to stdout using Logstash 1.4?" but neither works.
filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
  if ("[result][0][keen][created_at]") {
    date {
      add_field => [ "[timestamp]", "[result][0][keen][created_at]" ]
      remove_field => "[result][0][keen][created_at]"
    }
  }
}
Update 2:
Date is working now, still need to get location working.
filter {
  json {
    source => "message"
    remove_field => ["message"]
    add_tag => ["valid_json"]
  }
  if ("valid_json") {
    if ("[result][0][keen][created_at]") {
      date {
        match => [ "[result][0][keen][created_at]", "ISO8601" ]
      }
    }
  }
}
Keen.io's "created_at" field is stored in ISO 8601 format and so can easily be parsed by the date filter. Lat/long co-ordinates can be set by copying Keen.io's existing co-ordinates into Logstash's geoip.coordinates array. Note that the conditionals below test the fields and tags directly; a quoted string literal such as if ("...") is always truthy, which is why the earlier attempts never actually skipped anything.
input {
  file {
    path => ["data.json"]
    start_position => "beginning"
    type => "json"
  }
}
filter {
  json {
    source => "message"
    remove_field => ["message"]
    add_tag => ["valid_json"]
  }
  if "valid_json" in [tags] {
    if [result][0][keen][created_at] {
      date {
        # Set @timestamp to Keen.io's "created_at" field
        match => [ "[result][0][keen][created_at]", "ISO8601" ]
      }
    }
    if [result][0][keen][location][coordinates] {
      mutate {
        # Copy existing co-ordinates into the geoip.coordinates array;
        # adding the same field twice turns it into a two-element array
        add_field => [ "[geoip][coordinates]", "%{[result][0][keen][location][coordinates][0]}" ]
        add_field => [ "[geoip][coordinates]", "%{[result][0][keen][location][coordinates][1]}" ]
        remove_field => "[result][0][keen][location][coordinates]"
      }
    }
  }
}
output {
  stdout { codec => rubydebug }
}
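One possible follow-up, assuming your Elasticsearch index template maps [geoip][coordinates] as a geo_point: the %{...} copies above produce strings, so the values usually need converting to numbers before indexing (a sketch):
mutate {
  # geo_point expects numeric [lon, lat]; convert the copied strings
  convert => [ "[geoip][coordinates]", "float" ]
}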