Nested JSON objects in Logstash aggregate filter plugin

I'm using the Logstash aggregate filter plugin to insert data into Elasticsearch (ES).
I want to create JSON like
"Countries" : {
"Asia" : {
"name" : "Srilanka"
},
"Africa" : {
"name" : "Kenya"
}
}
when it is uploaded to ES.
I have tried:
map['Countries'] = {
  map['Asia'] = {
    'name' => event.get('name_Asia')
  },
  map['Africa'] = {
    'name' => event.get('name_Africa')
  }
}
But it doesn't work.
Is it possible to create the above JSON?

First of all, to produce nested hashes you should use hash rockets (=>), not assignments inside a hash. You can create this hash in one go:
map = {
  'Countries' => {
    'Asia' => {
      'name' => event.get('name_Asia')
    },
    'Africa' => {
      'name' => event.get('name_Africa')
    }
  }
}
Then you can produce JSON out of it with JSON.dump:
require 'json'
JSON.dump(map)
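If this hash is being built inside the aggregate filter's code block, note that map is provided by the plugin, so you would assign the nested structure into one of its keys rather than reassigning the map variable itself. A minimal sketch, assuming a hypothetical correlation_id field as the task_id (the original question doesn't show one):
filter {
  aggregate {
    # correlation_id is an assumed field; use whatever ties your events together
    task_id => "%{correlation_id}"
    code => "
      # assign into the plugin-provided map; a plain 'map = ...' reassignment would be lost
      map['Countries'] = {
        'Asia'   => { 'name' => event.get('name_Asia') },
        'Africa' => { 'name' => event.get('name_Africa') }
      }
    "
  }
}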

Remove characters from JSON

I'm trying to parse some JSON with Logstash. The file I want to ingest currently has the following structure (simplified):
-4: {"audit":{"key1":"value1","key2":"value2"}}
-4: {"audit":{"key1":"value1","key2":"value2"}}
Therefore I need to remove the -4: prefix in order to parse the file properly as JSON. Unfortunately, I cannot use the json codec on the input plugin, because the input is not proper JSON. My requirements for the pipeline are therefore:
Remove the -4: prefix
Decode the event as JSON
Apply the proper mutations
I have tried the following pipeline, which gives me a parse error:
input {
  tcp {
    port => 5001
    codec => multiline {
      pattern => "^-\d:."
      what => previous
    }
    #codec => json_lines
    type => "raw_input"
  }
}
filter {
  if [type] == "raw_input" {
    mutate {
      gsub => ["message", "^-\d:.", ""]
    }
  }
  json {
    source => "message"
  }
  mutate {
    convert => { "[audit][sequenceNumber]" => "integer" }
    add_field => { "test" => "%{[audit][sequenceNumber]}" }
  }
}
output {
  file {
    path => "/var/log/logstash/debug-output.log"
    codec => line { format => "%{message}" }
  }
}
Is it possible to achieve this with Logstash? Any suggestions on how to do it?
I would use the dissect filter:
if [type] == "raw_input" {
  dissect {
    mapping => {
      "message" => "-%{num}: %{msg}"
    }
  }
}
json {
  source => "msg"
}
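Folded back into the pipeline from the question, the whole filter section might look like this (a sketch based on the question's config, not a tested pipeline; the tcp input and file output stay as they were):
filter {
  if [type] == "raw_input" {
    dissect {
      # "-4: {...}" yields num = "4" and msg = the bare JSON object
      mapping => {
        "message" => "-%{num}: %{msg}"
      }
    }
    json {
      source => "msg"
    }
    mutate {
      convert => { "[audit][sequenceNumber]" => "integer" }
      add_field => { "test" => "%{[audit][sequenceNumber]}" }
    }
  }
}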

Using logstash to strip XSSI prefix from JSON response

I have a fairly simple problem, but it's confusing to me. I'm trying to use Logstash to fetch Gerrit data via its REST API. I'm using http_poller, and I get the right response with my configuration, so I'm almost there.
Now I need to strip the XSSI prefix )]}' from the start of Gerrit's JSON response. The question is: how? How do I strip, split, or mutate it away, or how else should I proceed?
My input configuration:
input {
  http_poller {
    urls => {
      gerrit_projects => {
        method => get
        url => "http://url.to/gerrit/a/projects/"
        headers => { Accept => "application/json" }
        auth => { user => "userid" password => "supresecret" }
      }
    }
    target => "http_poller_data"
    metadata_target => "http_poller_metadata"
    request_timeout => 60
    interval => 60
  }
}
filter {
  if [http_poller_metadata] {
    mutate {
      add_field => {
        "http_poller_host" => "%{http_poller_metadata[host]}"
        "http_poller" => "%{http_poller_metadata[name]}"
      }
    }
  }
  if [http_poller_metadata][runtime_seconds] and [http_poller_metadata][runtime_seconds] > 0.5 {
    mutate { add_tag => "slow_request" }
  }
  if [http_request_failure] or [http_poller_metadata][code] != 200 {
    mutate { add_tag => "bad_request" }
  }
}
output {
  stdout { codec => rubydebug }
}
And parts of the response:
Pipeline main started
JSON parse failure. Falling back to plain-text {:error=>#<LogStash::Json::ParserError: Unexpected character (')' (code 41)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at ... (bunch of lines)...
{
  "http_poller_data" => {
    "message" => ")]}'\n{\"All-Users\":{\"id\":\"All-Users\",....(more valid JSON)...",
    "tags" => [
      [0] "_jsonparsefailure"
    ],
    "@version" => "1",
    "@timestamp" => "2016-12-13T09:48:25.397Z"
  },
  "@version" => "1",
  "@timestamp" => "2016-12-13T09:48:25.397Z",
  "http_poller_metadata" => { ... }
This is my first question on Stack Overflow. Thank you for being kind with your answers!
You can use the mutate filter with the gsub option to remove the )]}' prefix:
mutate {
  gsub => [
    "message", "\)]}'", ""
  ]
}
But gsub replaces all occurrences of a regex, so you have to be sure that the pattern only appears once in the message.
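If the prefix is guaranteed to sit at the very start of the message, as in the Gerrit response above, anchoring the pattern avoids that concern. A minimal variation on the filter above (the \n? also eats the newline that follows the prefix):
mutate {
  gsub => [
    # anchored to the start of the message, so only the XSSI prefix is removed
    "message", "^\)]}'\n?", ""
  ]
}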
I use "sed 1d" to remove the ")]}'" prefix and "jq" to process the JSON output. For example, to get the state of a Gerrit project I execute:
curl -s --header 'Content-Type:application/json' --request GET --netrc https://<GERRIT-SERVER>/a/projects/?r=<GERRIT-PROJECT> | sed 1d | jq --raw-output ".[] | .state"
ACTIVE

Firebase + Aurelia: how to process the key=>value format returned by Firebase?

I'm retrieving the following structure from Firebase:
"bills" : {
"1" : { // the customer id
"orders" : {
"-KVMs10xKfNdh_vLLj_k" : [ { // auto generated
"products" : [ {
"amount" : 3,
"name" : "Cappuccino",
"price" : 2.6
} ],
"time" : "00:15:14"
} ]
}
}
}
I'm looking for a way to process this with Aurelia. I've written a value converter that lets my repeat.for loop over the object keys of orders, sending each order to an order-details component. The problem is that this doesn't pass the key, which I need for deleting a particular order ("-KVMs10xKfNdh_vLLj_k").
Should I loop over each order and add the key as an attribute myself?
Is there a better/faster way?
This answer might be a little late (sorry, OP), but for anyone else looking for a solution: you can convert the snapshot to an array that you can iterate over in your Aurelia views, using repeat.for for example.
This is a function that I use in all of my Aurelia + Firebase applications:
export const snapshotToArray = (snapshot) => {
  const returnArr = [];
  snapshot.forEach((childSnapshot) => {
    const item = childSnapshot.val();
    item.uid = childSnapshot.key;
    returnArr.push(item);
  });
  return returnArr;
};
You would use it like this:
firebase.database().ref(`/bills`)
  .once('value')
  .then((snapshot) => {
    const arr = snapshotToArray(snapshot);
  });

Symfony - ELK - interpret JSON logged via Monolog

I'm logging and analysing my logs with the ELK stack on a Symfony 3 application.
From the Symfony application I want to log JSON objects that might be somewhat deeply nested.
Is there any way to make Kibana interpret my JSON as JSON and not as a string?
Here is an example of the way I'm logging:
$this->logger->notice('My log message', array(
  'foo' => 'bar',
  'myDeepJson1' => $deepJson1,
  'myDeepJson2' => $deepJson2
));
And here is my logstash.conf. I used the Symfony pattern that I found here: https://github.com/eko/docker-symfony
input {
  redis {
    type => "symfony"
    db => 1
    key => monolog
    data_type => ['list']
    host => "redis"
    port => 6379
  }
}
filter {
  if [type] == "symfony" {
    grok {
      patterns_dir => "./patterns"
      match => [ "message", "%{SYMFONY}" ]
    }
    date {
      match => [ "date", "YYYY-MM-dd HH:mm:ss" ]
    }
    if [log_type] == "app" {
      json {
        source => "log_context"
      }
    }
  }
}
output {
  if [type] == "symfony" {
    elasticsearch {
      hosts => ["172.17.0.1:9201"]
      index => "azureva-logstash"
    }
  }
}
Actually, everything I'm logging is in the log_context variable, but Monolog transforms the array into JSON, so my $deepJson variables end up double-encoded; yet there's no way to log a multidimensional array in the context...
Any help would be appreciated. Thanks!
Once you have retrieved the requestJson from log_context, you have to replace all the \" with " in order to be able to parse it with the json plugin.
You can do that with the mutate filter and its gsub option.
You can then parse the resulting JSON with the json plugin.
You can update your configuration with:
if [log_type] == "app" {
  json {
    source => "log_context"
  }
  mutate {
    gsub => ["requestJson", "\\"", "\""]
  }
  json {
    source => "requestJson"
    target => "requestJsonDecode"
  }
}

Using conditions on populate with Waterline and Sails.js

I'm trying to use conditions on populate, but I can't get it to work.
The following code exists and works great; it's adding conditions that makes it crash.
Page.find({ 'id' : id })
  .populate('addons', { 'type' : 'seo' })
  .then(function(page) {
  });
But when I try to add an 'or' condition, it crashes:
Page.find({ 'id' : id })
  .populate('addons', {
    or: [
      {
        'type' : 'seo'
      },
      {
        'type' : 'general'
      },
      {
        'type' : 'local',
        'zone' : zoneId // zoneId holds the correct information
      }
    ]
  })
  .then(function(page) {
  });
Thanks.