Logstash output: tags on stdout

As a newbie in ELK, I'm running many tests to get used to this environment.
I would like to print the tags generated by Logstash to the CLI, but I haven't found how.
Is it possible? I don't want to send the data to Elasticsearch and then look it up with Kibana; I just want to know whether the tags are there, and which ones.
Here is a sample of what I'm trying to do:
I'm using http_poller to get data from a list of URLs, and I'd like to see whether the correct HTTP code is returned.
input {
  http_poller {
    urls => {
      "url1" => "https://www.google.com"
      #"url2" => "https://www.facebook.com"
      #"url3" => "https://www.amazon.com"
      #"url4" => "http://www.google.com"
      #"url5" => "http://www.facebook.com"
      #"url6" => "http://www.amazon.com"
    }
    automatic_retries => 0
    # Check the URLs every 30 seconds
    interval => 30
    # Consider the request timed out after 8 seconds
    request_timeout => 8
    tags => ["website_healthcheck"]
  }
}
filter {
  if [http_poller_metadata][code] == 200 {
    mutate {
      add_tag => "Good request"
    }
  }
}
output {
  # Debug
  if "Good request" in [tags] {
    stdout {
      codec => rubydebug
    }
  }
}
Right now the output is unreadable (Google's whole HTML page); I'd like to see only the HTTP code.
Sorry for the poor explanation, and thanks for your answers :)

A solution would be to replace the message field with the HTTP code field, so that only the code shows up in your output:
mutate {
  replace => { "message" => "%{[http_poller_metadata][code]}" }
}
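Alternatively, you can print only the fields you care about without touching the message field, using a line codec with a custom format string. A minimal sketch (it assumes the http_poller input sets metadata_target => "http_poller_metadata", so the response metadata lands in a regular field):
output {
  stdout {
    # prints only the HTTP code followed by the event's tags, e.g.: 200 website_healthcheck,Good request
    codec => line { format => "%{[http_poller_metadata][code]} %{[tags]}" }
  }
}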

Related

Can I display pictures outside the backend web folder in Yii2?

I have a little problem. I can’t view images outside the backend web folder.
Aliases in common\config\main:
'aliases' => [
    '@upload' => dirname(dirname(__DIR__)) . '/upload',
],
View DataProvider:
[
    'format' => 'raw',
    'label' => 'Immagine',
    'value' => function ($data) {
        return Html::img(Yii::getAlias('@upload') . $data->codice_prodotto . '/' . $data->immagine, ['width' => '70px', 'class' => 'img-thumbnail']);
    },
],
How can I resolve this? Thanks.
If your file is not accessible to the HTTP server, you can't download it directly.
You can:
Move the uploads directory to a directory accessible to the HTTP server
Create an action that reads the file from the private directory and streams it to the browser (you can use the yii\web\Response::sendFile() function for that)
Streaming the file to the browser
Please read this official docs article to understand this in depth: https://www.yiiframework.com/doc/api/2.0/yii-web-response#sendFile()-detail
Action code example for your case: *
public function actionFile($filename)
{
    $storagePath = Yii::getAlias('@upload');
    // Check the filename for allowed chars (do not allow ../ to avoid a security issue: downloading arbitrary files)
    if (!preg_match('/^[a-z0-9]+\.[a-z0-9]+$/i', $filename) || !is_file("$storagePath/$filename")) {
        throw new \yii\web\NotFoundHttpException('The file does not exist.');
    }
    return Yii::$app->response->sendFile("$storagePath/$filename", $filename);
}
And the view DataProvider configuration:*
[
    'format' => 'raw',
    'label' => 'Immagine',
    'value' => function ($data) {
        return Html::img(Url::to(['/path/to-streaming-action/file', 'filename' => $data->codice_prodotto . '/' . $data->immagine]), ['width' => '70px', 'class' => 'img-thumbnail']);
    },
],
* Notice that this code is not ready to copy-paste; please read it carefully and try to understand the principle before implementing it in your code.

Check if a data value already exists in a JSON file with Liquid

I created a form to collect user data.
Then I used an API to get the user data back.
So far I have been successful (I displayed the data on my site using the jekyll-json-get plugin).
The problem is that users can currently register with an email that has already been used...
The JSON file looks like this:
{created" => “date”,
“data” => "{
“first name” => “me”,
“email” => “me#me.com”
},
“folder” => “null”
}
{ “created” => “date”,
data" => "{
“first name” => “me2”,
“email” => “me#me.com”
},
“folder” => “null”
}
… And so on
What I want to do is:
On form submission, if the email entered has already been used by anyone else, stop the form submission and ask for another email.
How do I go about this? (I'm kinda new.)

Search and view Elasticsearch results from an HTML page

I am currently working on a project and, as the title suggests, what I want to do is to be able to search, from an HTML page, a cluster already uploaded to Elasticsearch, and preview the results back in the HTML page.
I thought of using Logstash to send the search input from the HTML page to Elasticsearch, but I can't figure out a way of viewing those results back in the HTML page. In general, I want to be able to work with Elasticsearch the way Kibana does, but from a website.
I'd appreciate any possible help :)
Use the official elasticsearch-php library (https://github.com/elastic/elasticsearch-php).
You can use the following code to get the search result:
use Elasticsearch\ClientBuilder;

$this->client = ClientBuilder::create()->setHosts(["ELASTICSEARCH_HOST:ELASTICSEARCH_PORT"])->build();
$queryBody = ["match_all" => new \stdClass()];
if ($search) {
    $queryBody = [
        "match" => [
            "_all" => $search
        ]
    ];
}
$params = [
    "from" => $page * $this->pageSize, // if you want data for pagination
    "size" => $this->pageSize,         // if you want data for pagination
    "index" => $index,
    "type" => $type,
    "_source_include" => $fields,      // your required field array
    "body" => [
        "query" => $queryBody
    ]
];
// Get the search result
$response = $this->client->search($params);
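The code above only runs the search; to preview the results back in the HTML page, one common approach is to wrap it in a small endpoint that returns JSON and render the hits with JavaScript. A minimal sketch (the endpoint file name is a hypothetical, not part of the library):
// hypothetical search endpoint, e.g. search.php, after running the code above
header('Content-Type: application/json');
// the matching documents live under hits.hits in the response array
echo json_encode($response['hits']['hits']);
The HTML page can then call this endpoint with fetch() or XMLHttpRequest and insert the returned documents into the DOM.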

Selective parsing of a CSV file using Logstash

I am trying to feed data into Elasticsearch from CSV files, through Logstash. These CSV files contain the first row as the column names. Is there any particular way to skip that row while parsing the file? Are there any conditionals/filters I could use so that, in case of an exception, it would skip to the next row?
My config file looks like:
input {
  file {
    path => "/home/sagnik/work/logstash-1.4.2/bin/promosms_dec15.csv"
    type => "promosms_dec15"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    columns => ["Comm_Plan","Queue_Booking","Order_Reference","Generation_Date"]
    separator => ","
  }
  ruby {
    code => "event['Generation_Date'] = Date.parse(event['Generation_Date']);"
  }
}
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "promosms-%{+dd.MM.YYYY}"
    workers => 1
  }
}
The first few rows of my CSV file look like:
"Comm_Plan","Queue_Booking","Order_Reference","Generation_Date"
"","No","FMN1191MVHV","31/03/2014"
"","No","FMN1191N64G","31/03/2014"
"","No","FMN1192OPMY","31/03/2014"
Is there any way I could skip the first line? Also, if my CSV file ends with a new line with nothing on it, I also get an error. How do I skip those new lines if they come at the end of the file, or if there is an empty row between two rows?
A simple way to do it would be to add the following to your filter (after csv, before ruby):
if [Comm_Plan] == "Comm_Plan" {
  drop { }
}
Assuming the field would never normally have the same value as the column heading, it should work as expected. However, you could be more specific by using:
if [Comm_Plan] == "Comm_Plan" and [Queue_Booking] == "Queue_Booking" and [Order_Reference] == "Order_Reference" and [Generation_Date] == "Generation_Date" {
  drop { }
}
All this does is check whether the field values match the column headings and, if they do, drop the event.
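As a side note, newer versions of the csv filter (not the one shipped with Logstash 1.4.2) can drop the header row for you. A sketch, assuming an upgraded logstash-filter-csv that supports the skip_header option:
csv {
  columns => ["Comm_Plan","Queue_Booking","Order_Reference","Generation_Date"]
  separator => ","
  # drops rows whose values match the configured column names
  skip_header => true
}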
Try this:
mutate {
  gsub => ["message","\r\n",""]
}
mutate {
  gsub => ["message","\r",""]
}
mutate {
  gsub => ["message","\n",""]
}
if ![message] {
  drop { }
}

Logstash dynamically split events

Is there a way to split a Logstash (1.4.2) event into multiple other events?
My input looks like this:
{ "parts" => ["one", "two"],
"timestamp" => "2014-09-27T12:29:17.601Z"
"one.key=> "1", "one.value"=>"foo",
"two.key" => "2", "two.value"=>"bar"
}
And I'd like to create two events with the following content:
{ "key" => "1", "value" => "foo", "timestamp" => "2014-09-27T12:29:17.601Z" }
{ "key" => "2", "value" => "bar", "timestamp" => "2014-09-27T12:29:17.601Z" }
The problem is that I can't know the actual "parts" in advance...
Thanks for your help :)
Updating a very old answer because there is a better way to do this in newer versions of logstash without resorting to a custom filter.
You can do this using a ruby filter and a split filter:
filter {
  ruby {
    code => '
      arrayOfEvents = Array.new()
      parts = event.get("parts")
      timestamp = event.get("timestamp")
      parts.each { |part|
        arrayOfEvents.push({
          "key" => event.get("#{part}.key"),
          "value" => event.get("#{part}.value"),
          "timestamp" => timestamp
        })
        event.remove("#{part}.key")
        event.remove("#{part}.value")
      }
      puts arrayOfEvents
      event.remove("parts")
      event.set("event", arrayOfEvents)
    '
  }
  split {
    field => 'event'
  }
  mutate {
    rename => {
      "[event][key]" => "key"
      "[event][value]" => "value"
      "[event][timestamp]" => "timestamp"
    }
    remove_field => ["event"]
  }
}
My original answer was:
You need to resort to a custom filter for this (you can't call yield from a ruby code filter which is what's needed to generate new events).
Something like this (dropped into lib/logstash/filters/custom_split.rb):
# encoding: utf-8
require "logstash/filters/base"
require "logstash/namespace"

# custom code to break up an event into multiple
class LogStash::Filters::CustomSplit < LogStash::Filters::Base
  config_name "custom_split"
  milestone 1

  public
  def register
    # Nothing
  end # def register

  public
  def filter(event)
    return unless filter?(event)
    if event["parts"].is_a?(Array)
      event["parts"].each do |key|
        e = LogStash::Event.new("timestamp" => event["timestamp"],
                                "key" => event["#{key}.key"],
                                "value" => event["#{key}.value"])
        yield e
      end
      event.cancel
    end
  end
end
And then just put filter { custom_split {} } into your config file.
For future reference, and based on #alcanzar's answer, it is now possible to do things like this:
ruby {
  code => "
    # somefield is an array
    array = event.get('somefield')
    # drop the current event (this was my use case, I didn't need the feeding event)
    event.cancel
    # iterate over it to construct new events
    array.each { |a|
      # creates a new logstash event
      generated = LogStash::Event.new({ 'foo' => 'something' })
      # puts the event in the pipeline queue
      new_event_block.call(generated)
    }
  "
}