I'm trying to parse the following JSON using Logstash:
{
"payload": "{\"class\":\"OpenLoyalty\\\\Component\\\\Account\\\\Domain\\\\Event\\\\PointsTransferHasBeenExpired\",\"payload\":{\"accountId\":\"1cbd42b8-1b1f-4a13-a28f-eefeb73a64b0\",\"pointsTransferId\":\"e82c96cf-32a3-43bd-9034-4df343e5f111\"}}"
}
using this configuration:
input {
jdbc {
jdbc_connection_string => "jdbc:postgresql://localhost:5432/openloyalty"
jdbc_user => "openloyalty"
jdbc_password => "openloyalty"
jdbc_driver_library => "C:\logstash\postgresql-42.2.12.jar"
jdbc_driver_class => "org.postgresql.Driver"
statement => "SELECT payload from events where events.id=130;"
}
}
filter {
json {
source => "payload"
remove_field => ["class"]
}
}
output {
stdout { codec => json_lines }
}
I get this output:
{
"#version": "1",
"#timestamp": "2020-05-12T11:27:15.097Z",
"payload": {
"accountId": "1cbd42b8-1b1f-4a13-a28f-eefeb73a64b0",
"pointsTransferId": "e82c96cf-32a3-43bd-9034-4df343e5f111"
}
}
However, what I'm looking for is something like this:
{
"#version": "1",
"#timestamp": "2020-05-12T11:27:15.097Z",
{
"accountId": "1cbd42b8-1b1f-4a13-a28f-eefeb73a64b0",
"pointsTransferId": "e82c96cf-32a3-43bd-9034-4df343e5f111"
}
}
How can I remove the "payload" key from my JSON object?
You can use the mutate filter with the following config after your json filter.
mutate {
rename => {
"[payload][accountId]" => "accountId"
"[payload][pointsTransferId]" => "pointsTransferId"
}
remove_field => ["[payload]"]
}
This will rename your fields and remove the now-empty payload object after the rename.
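For the sample row above, the json_lines output should then look roughly like this (a sketch; the timestamp will differ):
{
  "@version": "1",
  "@timestamp": "2020-05-12T11:27:15.097Z",
  "accountId": "1cbd42b8-1b1f-4a13-a28f-eefeb73a64b0",
  "pointsTransferId": "e82c96cf-32a3-43bd-9034-4df343e5f111"
}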
I have the following JSON as input (with JSON embedded as a string), and I want to parse the [value][value] field into a valid JSON object (not a string):
{
"partitionId": 3,
"value": {
"value": "{\"system\":\"system\",\"route\":\"route\",\"type\":\"Z9\"}",
},
"valueType": "VARIABLE"
}
And the following filter, but it does not work for me:
filter {
json {
source => "message"
}
if [valueType] == "VARIABLE" {
json {
source => "[value][value]"
target => "values"
}
}
mutate {
remove_field => [ "timestamp", "kafka" ]
}
}
How can I parse this?
I think you just missed it :)
input {
stdin { }
}
filter {
json {
source => "message"
target => "message"
}
if [message][valueType] == "VARIABLE" { # valueType is now nested under message
json {
source => "[message][value][value]" ##You missed it here
target => "[value]"
}
}
}
output { stdout { codec => rubydebug } }
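With that fix, and assuming the sample document is piped to stdin as a single line, the rubydebug output should look roughly like this (host and timestamp values are illustrative):
{
  "message" => {
    "partitionId" => 3,
    "value" => {
      "value" => "{\"system\":\"system\",\"route\":\"route\",\"type\":\"Z9\"}"
    },
    "valueType" => "VARIABLE"
  },
  "value" => {
    "system" => "system",
    "route" => "route",
    "type" => "Z9"
  },
  "@version" => "1",
  "@timestamp" => 2024-01-01T00:00:00.000Z,
  "host" => "localhost"
}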
I would like to wrap a json object sent to Logstash via the http listener plugin so I can send it off to Splunk. I get an object that looks like:
{
"foo" : "text",
"bar" : "text"
}
and I need it to look like:
{ "event" :
{
"foo" : "text",
"bar" : "text"
}
}
I just cannot find out how to access that top-level object.
config:
input {
http {
port => 8080
codec => "json"
}
}
filter {
mutate {
rename => { "WHAT GOES HERE???" => "event" }
}
}
output {
stdout { codec => json }
}
Thanks so much!
You can use rename or add_field.
With rename
mutate {
rename => { "foo" => "[event][foo]" }
rename => { "bar" => "[event][bar]" }
}
With add_field
mutate {
add_field => { "[event][foo]" => "%{[foo]}" }
add_field => { "[event][bar]" => "%{[bar]}" }
}
If you use add_field, the original foo and bar fields are kept in the event; you can remove them with remove_field:
mutate {
remove_field => [ "foo", "bar" ]
}
Another elegant way is to use the json filter's target option:
filter {
json {
source => "message"
target => "event"
}
}
https://www.elastic.co/guide/en/logstash/current/plugins-filters-json.html#plugins-filters-json-target
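Note that this json-filter variant assumes the raw request body is still available as a string in the message field; with codec => "json" on the http input the body may already be parsed into top-level fields, in which case the mutate approach above is the safer choice. Either way, for the sample payload the resulting event should look roughly like this (plus the usual @version, @timestamp and http metadata fields):
{
  "event": {
    "foo": "text",
    "bar": "text"
  }
}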
I need help with a Logstash filter to extract a JSON key/value into a new field. The following is my Logstash conf.
input {
tcp {
port => 5044
}
}
filter {
json {
source => "message"
add_field => {
"data" => "%{[message][data]}"
}
}
}
output {
stdout { codec => rubydebug }
}
I have tried with mutate:
filter {
json {
source => "message"
}
mutate {
add_field => {
"data" => "%{[message][data]}"
}
}
}
I have tried with . instead of []:
filter {
json {
source => "message"
}
mutate {
add_field => {
"data" => "%{message.data}"
}
}
}
I have tried with index number:
filter {
json {
source => "message"
}
mutate {
add_field => {
"data" => "%{[message][0]}"
}
}
}
All with no luck. :(
The following json is sent to port 5044:
{"data": "blablabla"}
The problem is that the new field is not able to extract the value from the key of the JSON.
"data" => "%{[message][data]}"
The following is my stdout:
{
"#version" => "1",
"host" => "localhost",
"type" => "logstash",
"data" => "%{[message][data]}",
"path" => "/path/from/my/app",
"#timestamp" => 2019-01-11T20:39:10.845Z,
"message" => "{\"data\": \"blablabla\"}"
}
However if I use "data" => "%{[message]}" instead:
filter {
json {
source => "message"
add_field => {
"data" => "%{[message]}"
}
}
}
I will get the whole json from stdout.
{
"#version" => "1",
"host" => "localhost",
"type" => "logstash",
"data" => "{\"data\": \"blablabla\"}",
"path" => "/path/from/my/app",
"#timestamp" => 2019-01-11T20:39:10.845Z,
"message" => "{\"data\": \"blablabla\"}"
}
Can anyone please tell me what I did wrong?
Thank you in advance.
I use the docker-elk stack, ELK_VERSION=6.5.4.
add_field simply adds a field when the filter succeeds; it does not parse anything, and since message is a plain string there is no [message][data] field for the sprintf reference to resolve. Many filters have this option. If you want to parse JSON into a field, you should use target:
filter {
json {
source => "message"
target => "data" // parse into data field
}
}
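With target => "data", the sample document {"data": "blablabla"} should come out roughly like this in rubydebug (a sketch; the parsed object is nested under data):
{
  "@version" => "1",
  "host" => "localhost",
  "data" => {
    "data" => "blablabla"
  },
  "message" => "{\"data\": \"blablabla\"}"
}
If you want data to hold just the string "blablabla", you can instead omit target entirely; the json filter then merges the parsed keys at the top level, so the document's own "data" key becomes the data field.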
I am trying to import data from MySQL into an Elasticsearch index using the following Logstash script (ELK v 6.22):
input {
jdbc {
jdbc_driver_library => "E:\ELK 6.22\logstash-6.2.2\bin\mysql-connector-java-5.1.45-bin.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
jdbc_connection_string => "jdbc:mysql://localhost:3306/fbk"
jdbc_user => "root"
jdbc_password => ""
statement => "SELECT fbk_repeat._URI AS URI, _SUBMISSION_DATE AS SUBMISSION_DATE, DEVICEID, LOCATION_LAT, LOCATION_LNG, SECTOR, COMMENTS, ACTION_TAKEN, PURPOSE
FROM
fbk_core
INNER JOIN fbk_repeat ON fbk_core._URI = fbk_repeat._PARENT_AURI"
}
}
filter {
# mutate { convert => {"LOCATION_LAT" => "float"} }
# mutate { convert => {"LOCATION_LNG" => "float"} }
# mutate { rename => {"LOCATION_LAT" => "[location][lat]"} }
# mutate { rename => {"LOCATION_LNG" => "[location][lon]"} }
mutate {
# Location and lat/lon should be used as is, this is as per logstash documentation
# Here we are tying to create a two-dimensional array in order to save data as per Logstash documentation
add_field => { "[location][lat]" => [ "%{LOCATION_LAT}" ] }
add_field => { "[location][lon]" => [ "%{LOCATION_LNG}" ] }
convert => [ "[location]", "float" ]
}
# date {
# locale => "eng"
# match => ["_SUBMISSION_DATE", "yyyy-MM-dd HH:mm:ss", "ISO8601"]
# target => "SUBMISSION_DATE"
# }
}
output{
elasticsearch {
hosts => ["localhost:9200"]
index => "feedback"
document_id => "%{URI}"
document_type => "feedbackdata"
manage_template => true
# user => "elastic"
# password => "changeme"
}
stdout { codec => rubydebug { metadata => true } }
# stdout { codec => dots }
}
Once the data is imported, I can't find any geo_point field in Kibana to plot the data on a map. Can anyone advise what might be going wrong?
Thanks!
Elasticsearch can automatically do the mapping, but not for all fields.
You should set your mapping explicitly, for example:
PUT index
{
"mappings": {
"type": {
"properties": {
"location": {
"properties": {
"coordinates": {
"type": "geo_point"
}
}
},
"field": {
"properties": {
"date": {
"format": "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
"type": "date"
}
}
}
}
}
}
}
Adapt this to handle your data.
Don't forget to create the index pattern in Kibana.
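On the Logstash side, a minimal sketch of the matching filter is the rename approach already commented out in the question; note this assumes the mapping declares location itself as type geo_point (an object with lat/lon keys) rather than the coordinates sub-field shown above:
filter {
  mutate {
    convert => { "LOCATION_LAT" => "float" }
    convert => { "LOCATION_LNG" => "float" }
  }
  mutate {
    # geo_point as an object: { "lat": ..., "lon": ... }
    rename => { "LOCATION_LAT" => "[location][lat]" }
    rename => { "LOCATION_LNG" => "[location][lon]" }
  }
}
Also keep in mind that the mapping (or an index template) has to exist before the index receives its first document; once a field has been mapped dynamically as a number or string, it cannot be changed to geo_point without reindexing.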
I'm trying to parse a JSON file from Keen.io with Logstash into Elasticsearch. The location and timestamp are stored in parameters like this:
{
"result":
[
{
"keen":
{
"timestamp": "2014-12-02T12:23:51.000Z",
"created_at": "2014-12-01T23:25:31.396Z",
"id": "XXXX",
"location":
{
"coordinates": [-95.8, 36.1]
}
}
}
]
}
My filter currently looks like this:
input {
file {
path => ["test.json"]
start_position => beginning
type => json
}
}
filter {
json {
source => message
remove_field => message
}
}
output {
stdout { codec => rubydebug }
}
How can I parse the "timestamp" and "location" fields so they are used for the @timestamp and geoip.coordinates fields in Elasticsearch?
Update:
I've tried variations of this with no luck. The documentation is very basic; am I misunderstanding how to reference the JSON fields? Is there a way of adding debug output to help? I tried How to debug the logstash file plugin and Print a string to stdout using Logstash 1.4? but neither helped.
filter {
json {
source => message
remove_field => message
}
if ("[result][0][keen][created_at]") {
date {
add_field => [ "[timestamp]", "[result][0][keen][created_at]" ]
remove_field => "[result][0][keen][created_at]"
}
}
Update 2:
Date is working now, still need to get location working.
filter {
json {
source => message
remove_field => message
add_tag => ["valid_json"]
}
if ("valid_json") {
if ("[result][0][keen][created_at]") {
date {
match => [ "[result][0][keen][created_at]", "ISO8601" ]
}
}
}
}
Keen.io's "created_at" field is stored in ISO 8601 format and so can easily be parsed by the date filter. Lat/long co-ordinates can be set by copying Keen.io's existing co-ordinates into logstash's geoip.coordinates array.
input {
file {
path => ["data.json"]
start_position => beginning
type => json
}
}
filter {
json {
source => message
remove_field => message
add_tag => ["valid_json"]
}
if ("valid_json") {
if ("[result][0][keen][created_at]") {
date {
# Set #timestamp to Keen.io's "created_at" field
match => [ "[result][0][keen][created_at]", "ISO8601" ]
}
}
if ("[result][0][keen][location][coordinates]") {
mutate {
# Copy the existing co-ordinates into the geoip.coordinates array
add_field => [ "[geoip][coordinates]", "%{[result][0][keen][location][coordinates][0]}" ]
add_field => [ "[geoip][coordinates]", "%{[result][0][keen][location][coordinates][1]}" ]
remove_field => "[result][0][keen][location][coordinates]"
}
}
}
}
output {
stdout { codec => rubydebug }
}
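For the sample file, the relevant parts of the rubydebug output should then look roughly like this (other Keen.io fields omitted):
{
  "@timestamp" => 2014-12-01T23:25:31.396Z,
  "geoip" => {
    "coordinates" => [ "-95.8", "36.1" ]
  },
  "tags" => [ "valid_json" ]
}
Note that the sprintf references produce strings; if your index mapping expects numeric geo_point values, add a mutate convert => { "[geoip][coordinates]" => "float" } after the add_field calls.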