Perl: Create JSON string in a specific format

I have a table with three fields that I query and put into JSON: it maps a hostname to an application name and application ID. I currently have a Perl script that outputs the following JSON string.
[
    {
        "app_id" : "1234",
        "app_name" : "Managed File Transfer",
        "ci_name" : "hosta7"
    },
    {
        "app_name" : "Patrtol",
        "app_id" : "1235",
        "ci_name" : "hosta7"
    },
    {
        "app_id" : "1236",
        "app_name" : "RELATIONAL DATA WAREHOUSE",
        "ci_name" : "hosta7"
    },
    {
        "ci_name" : "hosta7",
        "app_id" : "1237",
        "app_name" : "Managed File Transfer"
    },
    {
        "app_id" : "1238",
        "app_name" : "Initio Application",
        "ci_name" : "hosta7"
    },
    {
        "app_id" : "1239",
        "app_name" : "Data Warehouse Operations Infrastructure",
        "ci_name" : "hosta7"
    },
    {
        "app_id" : "2345",
        "app_name" : "Tableou",
        "ci_name" : "hostb"
    }
]
I want the resulting JSON string to look like the following: if the ci_name already exists, the new item should be added to that host's existing entry. So essentially, I want this JSON string:
{
    "hosts" : [{
        "hosta" : [
            {
                "app_id": "1234",
                "app_name": "Managed File Transfer"
            },
            {
                "app_id": "1235",
                "app_name": "Patrol"
            },
            {
                "app_id": "1236",
                "app_name": "RELATIONAL DATA WAREHOUSE"
            },
            {
                "app_id": "1237",
                "app_name": "Managed File Transfer"
            },
            {
                "app_id": "1238",
                "app_name": "Initio Application"
            },
            {
                "app_id": "1239",
                "app_name": "Data Warehouse Operations Infrastructure"
            }
        ],
        "hostb" : [
            {
                "app_id": "2345",
                "app_name": "Tableou"
            }
        ]
    }]
}
#!/usr/bin/perl
use strict;
use warnings;
use JSON;

my $hosts = [
    {
        'app_id'   => '1234',
        'app_name' => 'Managed File Transfer',
        'ci_name'  => 'hosta7'
    },
    {
        'app_name' => 'Patrtol',
        'app_id'   => '1235',
        'ci_name'  => 'hosta7'
    },
    {
        'app_id'   => '1236',
        'app_name' => 'RELATIONAL DATA WAREHOUSE',
        'ci_name'  => 'hosta7'
    },
    {
        'ci_name'  => 'hosta7',
        'app_id'   => '1237',
        'app_name' => 'Managed File Transfer'
    },
    {
        'app_id'   => '1238',
        'app_name' => 'Initio Application',
        'ci_name'  => 'hosta7'
    },
    {
        'app_id'   => '1239',
        'app_name' => 'Data Warehouse Operations Infrastructure',
        'ci_name'  => 'hosta7'
    },
    {
        'app_id'   => '2345',
        'app_name' => 'Tableou',
        'ci_name'  => 'hostb'
    }
];

my $output;
# This just copies each row verbatim, so the flat structure is preserved.
foreach my $row (@$hosts) {
    push @$output, $row;
}

my $json = JSON->new;
$json->pretty(1);
print $json->encode($output);

You don't want to push directly; you want to push under a key taken from ci_name, and you only want to copy the app ID and name.
for my $element (@$hosts) {
    push @{ $output->{ $element->{ci_name} } },
        { map { $_ => $element->{$_} } qw( app_id app_name ) };
}

The code will probably look something like the following snippet:
#!/usr/bin/perl
use strict;
use warnings;
use JSON;
use Data::Dumper;

my $debug = 0;
my %data;

while ( <DATA> ) {
    chomp;
    next if /app_id/;    # skip the CSV header line
    my ($app_id, $ci_name, $app_name) = split /,/;
    push @{ $data{hosts}{$ci_name} }, { app_id => $app_id, app_name => $app_name };
}

print Dumper(\%data) if $debug;

my $json = encode_json \%data;
print $json;

__DATA__
app_id,ci_name,app_name
1234,hosta7,Managed File Transfer
1235,hosta7,Patrtol
1236,hosta7,RELATIONAL DATA WAREHOUSE
1237,hosta7,Managed File Transfer
1238,hosta7,Initio Application
1239,hosta7,Data Warehouse Operations Infrastructure
2345,hostb,Tableou
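One caveat: encode_json emits compact JSON, and Perl's hash key order is unspecified, so the output order can vary between runs. If you want pretty, deterministically ordered output like the question shows, the same JSON module's object interface can provide it, e.g.:

print JSON->new->pretty->canonical->encode(\%data);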

Related

Logstash: How to split multiline json object and add_field into kibana?

I have the following JSON data. What I need is to split each object into a separate message and add_field, but with my current configuration this whole JSON ends up in one message. I'm not sure what I'm doing wrong; any help or pointer in the right direction would be really helpful.
[
    {
        "SOURCE": "Source A",
        "Model": "ModelABC",
        "Qty": "3"
    },
    {
        "SOURCE": "Source B",
        "Model": "MoBC",
        "Qty": "31"
    },
    {
        "SOURCE": "Source C",
        "Model": "MoBCSss",
        "Qty": "3qq"
    }
]
logstash.config
input {
    file {
        path => "/usr/share/logstash/sample-log/Test-Log-For-Kibana.json"
        start_position => "beginning"
        codec => multiline {
            pattern => "^}"
            negate => true
            what => previous
            auto_flush_interval => 1
            multiline_tag => ""
        }
    }
}
filter {
    json {
        source => "message"
        target => "someField"
    }
    mutate {
        add_field => {
            "SOURCE" => "%{[someField][SOURCE]}"
            "Model" => "%{[someField][Model]}"
            "Qty" => "%{[someField][Qty]}"
        }
    }
}
output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        hosts => "elasticsearch:9200"
        user => "elastic"
        password => "changeme"
    }
}

file_get_contents parse json insert database

Hi, I've encountered a problem. I want to record the data I receive into the database.
JSON=>
{
    "success": true,
    "timestamp": 1565251506,
    "base": "EUR",
    "date": "2019-08-08",
    "rates": {
        "AED": 4.119657,
        "AFN": 87.689574,
        "ALL": 121.192477,
        "AMD": 533.113395,
        "ANG": 1.998509,
        "AOA": 398.760307,
        "ARS": 51.036305,
        "AUD": 1.654423
    }
}
After json_decode:
array:5 [
    "success" => true
    "timestamp" => 1565205306
    "base" => "EUR"
    "date" => "2019-08-07"
    "rates" => array:168 [
        "AED" => 4.118588
        "AFN" => 87.74397
        "ALL" => 121.002609
        "AMD" => 534.279745
        "ANG" => 2.001014
    ]
]
This is what I have so far, but how do I get the quote and the rate?
$response = file_get_contents("rate.json");
$datas = json_decode($response, true);
foreach ($datas as $data) {
    $rates = new Rate();
    $rates->base = $datas['base'];
    $rates->quote = 'AED';
    $rates->rate = '4.119657';
    $rates->save();
}
You're looping over the wrong thing here; you should be looping over the rates and saving the corresponding key and value:
$response = file_get_contents("rate.json");
$data = json_decode($response, true);
foreach ($data['rates'] as $quote => $rate) {
    $rates = new Rate();
    $rates->base = $data['base'];
    $rates->quote = $quote;
    $rates->rate = $rate;
    $rates->save();
}

How to parse nested JSON data with logstash filter

I have a JSON file containing data like this:
{
    "foo" : "bar",
    "test" : {
        "steps" : [{
            "response_time" : "100"
        }, {
            "response_time" : "101",
            "more_nested" : [{
                "hello" : "world"
            }, {
                "hello2" : "world2"
            }]
        }]
    }
}
I am using a Logstash filter, but it is giving me the wrong result:
input {
    file {
        sincedb_path => ".../sincedb_path.txt"
        path => ".../test.json"
        type => "test"
        start_position => "beginning"
    }
}
filter {
    json {
        source => "message"
    }
}
output {
    stdout { codec => rubydebug }
}
How can I produce events with fields like steps.response_time : 100 and steps.more_nested.hello : world? I tried with Ruby but it's not working. I am using Logstash version 5.x.

Error in logstash configuration file tomcat

I have a problem with my Logstash configuration.
My log lines look like this:
2017-07-26 14:31:03,644 INFO [http-bio-10.60.2.21-10267-exec-92] jsch.DeployManagerFileUSImpl (DeployManagerFileUSImpl.java:132) - passage par ficher temporaire .bindings.20170726-143103.tmp
My current pattern is
match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \(%{DATA:class}\):%{GREEDYDATA:message}" }
Which patterns should I use for [http-bio-10.60.2.21-10267-exec-92] and for jsch.DeployManagerFileUSImpl?
Doesn't seem like the current pattern you've shown would work, as you don't have anything in your sample message that matches \(%{DATA:class}\):%{GREEDYDATA:message} and you're not dealing with the double space after the loglevel.
If you want to match some random stuff in the middle of a line, use %{DATA}, e.g.:
\[%{DATA:myfield}\]
and then you can use %{GREEDYDATA} to get the stuff at the end of the line:
\[%{DATA:myfield1}\] %{GREEDYDATA:myfield2}
If you need to break these items down into fields of their own, then be more specific with the pattern or use a second grok{} block.
In my logstash.conf I have changed my pattern to:
match => [ "message", "%{TIMESTAMP_ISO8601:logdate},%{INT} %{LOGLEVEL:log-level} \[(?<threadname>[^\]]+)\] %{JAVACLASS:package} \(%{JAVAFILE:file}:%{INT:line}\) - %{GREEDYDATA:message}" ]
I built it with the help of the site https://grokdebug.herokuapp.com/.
But I cannot see my static log files, contained in the /home/elasticsearch/static_logs/ directory, in Kibana 5.4.3.
My Logstash configuration file, with the "static" section:
input {
    file {
        type => "access-log"
        path => "/home/elasticsearch/tomcat/logs/*.txt"
    }
    file {
        type => "tomcat"
        path => "/home/elasticsearch/tomcat/logs/*.log"
        exclude => "*.zip"
        codec => multiline {
            negate => true
            pattern => "(^%{MONTH} %{MONTHDAY}, 20%{YEAR} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) (?:AM|PM))"
            what => "previous"
        }
    }
    file {
        type => "static"
        path => "/home/elasticsearch/static_logs/*.log"
        exclude => "*.zip"
    }
}
filter {
    if [type] == "access-log" {
        grok {
            # Access log pattern is %a %{waffle.servlet.NegotiateSecurityFilter.PRINCIPAL}s %t %m %U%q %s %B %T "%{Referer}i" "%{User-Agent}i"
            match => [ "message" , "%{IPV4:clientIP} %{NOTSPACE:user} \[%{DATA:timestamp}\] %{WORD:method} %{NOTSPACE:request} %{NUMBER:status} %{NUMBER:bytesSent} %{NUMBER:duration} \"%{NOTSPACE:referer}\" \"%{DATA:userAgent}\"" ]
            remove_field => [ "message" ]
        }
        grok {
            match => [ "request", "/%{USERNAME:app}/" ]
            tag_on_failure => [ ]
        }
        date {
            match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
            remove_field => [ "timestamp" ]
        }
        geoip {
            source => ["clientIP"]
        }
        dns {
            reverse => [ "clientIP" ]
        }
        mutate {
            lowercase => [ "user" ]
            convert => [ "bytesSent", "integer", "duration", "float" ]
        }
        if [referer] == "-" {
            mutate {
                remove_field => [ "referer" ]
            }
        }
        if [user] == "-" {
            mutate {
                remove_field => [ "user" ]
            }
        }
    }
    if [type] == "tomcat" {
        if [message] !~ /(.+)/ {
            drop { }
        }
        grok {
            patterns_dir => "./patterns"
            overwrite => [ "message" ]
            # oK Catalina normal
            match => [ "message", "%{CATALINA_DATESTAMP:timestamp} %{NOTSPACE:className} %{WORD:methodName}\r\n%{LOGLEVEL: logLevel}: %{GREEDYDATA:message}" ]
        }
        grok {
            match => [ "path", "/%{USERNAME:app}.20%{NOTSPACE}.log" ]
            tag_on_failure => [ ]
        }
        # Aug 25, 2014 11:23:31 AM
        date {
            match => [ "timestamp", "MMM dd, YYYY hh:mm:ss a" ]
            remove_field => [ "timestamp" ]
        }
    }
    if [type] == "static" {
        if [message] !~ /(.+)/ {
            drop { }
        }
        grok {
            patterns_dir => "./patterns"
            overwrite => [ "message" ]
            # 2017-08-03 16:01:11,352 WARN [Thread-552] pcf2.AbstractObjetMQDAO (AbstractObjetMQDAO.java:137) - Descripteur de
            match => [ "message", "%{TIMESTAMP_ISO8601:logdate},%{INT} %{LOGLEVEL:log-level} \[(?<threadname>[^\]]+)\] %{JAVACLASS:package} \(%{JAVAFILE:file}:%{INT:line}\) - %{GREEDYDATA:message}" ]
        }
        # 2017-08-03 16:01:11,352
        date {
            match => [ "timestamp", "YYYY-MM-dd hh:mm:ss,SSS" ]
            remove_field => [ "timestamp" ]
        }
    }
}
output {
    elasticsearch { hosts => ["192.168.99.100:9200"] }
}
Where is my mistake?
Regards

Access nested JSON Field in Logstash

I have a problem with accessing a nested JSON field in Logstash (latest version).
My config file is the following:
input {
    http {
        port => 5001
        codec => "json"
    }
}
filter {
    mutate {
        add_field => { "es_index" => "%{[statements][authority][name]}" }
    }
    mutate {
        gsub => [
            "es_index", " ", "_"
        ]
    }
    mutate {
        lowercase => ["es_index"]
    }
    ruby {
        init => "
            def remove_dots hash
                new = Hash.new
                hash.each { |k,v|
                    if v.is_a? Hash
                        v = remove_dots(v)
                    end
                    new[ k.gsub('.','_') ] = v
                    if v.is_a? Array
                        v.each { |elem|
                            if elem.is_a? Hash
                                elem = remove_dots(elem)
                            end
                            new[ k.gsub('.','_') ] = elem
                        } unless v.nil?
                    end
                } unless hash.nil?
                return new
            end
        "
        code => "
            event.instance_variable_set(:@data, remove_dots(event.to_hash))
        "
    }
}
output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        hosts => "elasticsearch:9200"
        index => "golab-%{+YYYY.MM.dd}"
    }
}
I have a filter with mutate. I want to add a field that I can use as part of the index name. When I use "%{[statements][authority][name]}", the content in the brackets is treated as a literal string: the text %{[statements][authority][name]} is saved in the es_index field. Logstash seems to think this is a string, but why?
I've also tried the expression "%{statements}". It works as expected: everything in the statements field is passed to es_index. If I use "%{[statements][authority]}", strange things happen: es_index is filled with the exact same output that "%{statements}" produces. What am I missing?
Logstash Output with "%{[statements][authority]}":
{
    "statements" => {
        "verb" => {
            "id" => "http://adlnet.gov/expapi/verbs/answered",
            "display" => {
                "en-US" => "answered"
            }
        },
        "version" => "1.0.1",
        "timestamp" => "2016-07-21T07:41:18.013880+00:00",
        "object" => {
            "definition" => {
                "name" => {
                    "en-US" => "Example Activity"
                },
                "description" => {
                    "en-US" => "Example activity description"
                }
            },
            "id" => "http://adlnet.gov/expapi/activities/example"
        },
        "actor" => {
            "account" => {
                "homePage" => "http://example.com",
                "name" => "xapiguy"
            },
            "objectType" => "Agent"
        },
        "stored" => "2016-07-21T07:41:18.013880+00:00",
        "authority" => {
            "mbox" => "mailto:info@golab.eu",
            "name" => "GoLab",
            "objectType" => "Agent"
        },
        "id" => "0771b9bc-b1b8-4cb7-898e-93e8e5a9c550"
    },
    "id" => "a7e31874-780e-438a-874c-964373d219af",
    "@version" => "1",
    "@timestamp" => "2016-07-21T07:41:19.061Z",
    "host" => "172.23.0.3",
    "headers" => {
        "request_method" => "POST",
        "request_path" => "/",
        "request_uri" => "/",
        "http_version" => "HTTP/1.1",
        "http_host" => "logstasher:5001",
        "content_length" => "709",
        "http_accept_encoding" => "gzip, deflate",
        "http_accept" => "*/*",
        "http_user_agent" => "python-requests/2.9.1",
        "http_connection" => "close",
        "content_type" => "application/json"
    },
    "es_index" => "{\"verb\":{\"id\":\"http://adlnet.gov/expapi/verbs/answered\",\"display\":{\"en-us\":\"answered\"}},\"version\":\"1.0.1\",\"timestamp\":\"2016-07-21t07:41:18.013880+00:00\",\"object\":{\"definition\":{\"name\":{\"en-us\":\"example_activity\"},\"description\":{\"en-us\":\"example_activity_description\"}},\"id\":\"http://adlnet.gov/expapi/activities/example\",\"objecttype\":\"activity\"},\"actor\":{\"account\":{\"homepage\":\"http://example.com\",\"name\":\"xapiguy\"},\"objecttype\":\"agent\"},\"stored\":\"2016-07-21t07:41:18.013880+00:00\",\"authority\":{\"mbox\":\"mailto:info@golab.eu\",\"name\":\"golab\",\"objecttype\":\"agent\"},\"id\":\"0771b9bc-b1b8-4cb7-898e-93e8e5a9c550\"}"
}
You can see that the whole authority object ended up inside es_index; it was not resolved to a single field.
Many thanks in advance
I found a solution. Credits go to jpcarey (Elasticsearch Forum).
I had to remove codec => "json". That leads to a different data structure: statements is now an array and not an object, so I needed to change %{[statements][authority][name]} to %{[statements][0][authority][name]}. That works without problems.
If you follow the given link you'll also find a better implementation of my mutate filters.