How to get the key in a hash value in Puppet configuration

I don't know if it is even possible.
I have a manifest like this:
$some_external_value = 'pew_pew'
$dict = {
  ensure        => $ensure,
  configuration => {
    "random_name-${some_external_value}" => {
      command => "python script.py config/random_name-${some_external_value}.cfg",
    },
    "some_other_name-${some_external_value}" => {
      command => "python script.py config/some_other_name-${some_external_value}.cfg",
    },
    "without-external" => {
      command => "python script.py config/without-external.cfg",
      user    => 'cluster',
    },
  }
}
notice ($dict["configuration"]["some_other_name-${some_external_value}"]["command"])
I get
notice: Scope(Class[main]): python script.py config/some_other_name-pew_pew.cfg
Is there some trick to write the key name just once and then refer to it afterwards?
"some_other_name-${some_external_value}" => {
command => 'python script.py config/${wild_magic_variable_pasting_key_here}.cfg',
},

You can likely get there with a custom parser function.
$orig_dict = {
  ...
  configuration => {
    "random_name-${some_external_value}" => {
      command => 'python script.py config/__KEY__.cfg',
    },
    ...
  }
}
$dict = keysubst($orig_dict)
...where the Ruby function does the work of recursively replacing the __KEY__ token with the respective key value.
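A rough sketch of what that function could look like, assuming the legacy custom-function API (a file under lib/puppet/parser/functions/ in a module, distributed via pluginsync); the keysubst name and __KEY__ token are just the ones used above:
# lib/puppet/parser/functions/keysubst.rb
# Sketch: recursively walk a hash and replace the __KEY__ token in string
# values with the key they are nested under.
module Puppet::Parser::Functions
  newfunction(:keysubst, :type => :rvalue) do |args|
    walk = lambda do |hash|
      hash.each_with_object({}) do |(key, value), result|
        result[key] =
          if value.is_a?(Hash)
            # Depth-first: process the subtree first, then substitute
            # __KEY__ into the string values sitting directly under this key.
            walk.call(value).each_with_object({}) do |(k, v), sub|
              sub[k] = v.is_a?(String) ? v.gsub('__KEY__', key.to_s) : v
            end
          else
            value
          end
      end
    end
    walk.call(args[0])
  end
end
With that in place, $dict = keysubst($orig_dict) should expand each command to reference its enclosing key, e.g. python script.py config/random_name-pew_pew.cfg.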

Related

Remove characters from JSON

I'm trying to parse some JSON with Logstash; currently the file I'd like to ingest has the following structure (simplified):
-4: {"audit":{"key1":"value1","key2":"value2"}}
-4: {"audit":{"key1":"value1","key2":"value2"}}
Therefore I need to remove the -4: prefix in order to properly parse the file as JSON. Unfortunately I cannot use the json codec for the input plugin, because the input is not in proper JSON format. My requirements for the pipeline are therefore:
Remove the -4: prefix
Decode the event as JSON
Apply the proper mutations
I have tried with the following pipeline, which gives me a parse error:
input {
  tcp {
    port => 5001
    codec => multiline {
      pattern => "^-\d:."
      what => previous
    }
    #codec => json_lines
    type => "raw_input"
  }
}
filter {
  if [type] == "raw_input" {
    mutate {
      gsub => ["message", "^-\d:.", ""]
    }
  }
  json {
    source => "message"
  }
  mutate {
    convert => { "[audit][sequenceNumber]" => "integer" }
    add_field => { "test" => "%{[audit][sequenceNumber]}" }
  }
}
output {
  file {
    path => "/var/log/logstash/debug-output.log"
    codec => line { format => "%{message}" }
  }
}
Is it possible to achieve this with Logstash? Any suggestions on how to do it?
I would use the dissect filter:
if [type] == "raw_input" {
  dissect {
    mapping => {
      "message" => "-%{num}: %{msg}"
    }
  }
}
json {
  source => "msg"
}
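With that mapping, the sample line from the question should come out with the prefix split into its own field, roughly like this (field names per the mapping above; surrounding fields omitted):
{
    "num" => "4",
    "msg" => "{\"audit\":{\"key1\":\"value1\",\"key2\":\"value2\"}}"
}
The json filter then parses msg, so [audit][key1] and [audit][key2] end up as regular event fields.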

Using logstash to strip XSSI prefix from JSON response

I have a fairly simple problem, but it's confusing to me. I'm trying to use Logstash to fetch Gerrit data via its REST API. I'm using http_poller and I get a correct response with my configuration, so I'm almost there.
Now I need to strip the XSSI prefix )]}' from the start of Gerrit's JSON response. The question is, how? Should I strip, split, or mutate it, and how should I proceed?
My input configuration:
input {
  http_poller {
    urls => {
      gerrit_projects => {
        method => get
        url => "http://url.to/gerrit/a/projects/"
        headers => { Accept => "application/json" }
        auth => { user => "userid" password => "supresecret" }
      }
    }
    target => "http_poller_data"
    metadata_target => "http_poller_metadata"
    request_timeout => 60
    interval => 60
  }
}
filter {
  if [http_poller_metadata] {
    mutate {
      add_field => {
        "http_poller_host" => "%{http_poller_metadata[host]}"
        "http_poller" => "%{http_poller_metadata[name]}"
      }
    }
  }
  if [http_poller_metadata][runtime_seconds] and [http_poller_metadata][runtime_seconds] > 0.5 {
    mutate { add_tag => "slow_request" }
  }
  if [http_request_failure] or [http_poller_metadata][code] != 200 {
    mutate { add_tag => "bad_request" }
  }
}
output {
  stdout { codec => rubydebug }
}
And parts of the response:
Pipeline main started
JSON parse failure. Falling back to plain-text {:error=>#<LogStash::Json::ParserError: Unexpected character (')' (code 41)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at ... (bunch of lines)...
{
    "http_poller_data" => {
        "message" => ")]}'\n{\"All-Users\":{\"id\":\"All-Users\",....(more valid JSON)...",
        "tags" => [
            [0] "_jsonparsefailure"
        ],
        "@version" => "1",
        "@timestamp" => "2016-12-13T09:48:25.397Z"
    },
    "@version" => "1",
    "@timestamp" => "2016-12-13T09:48:25.397Z",
    "http_poller_metadata" => { ... }
This is my first question on Stack Overflow. Thank you for being kind with your answers!
You can use the mutate filter with the gsub option (link) to remove the )]}' prefix:
mutate {
  gsub => [
    "message", "\)]}'", ""
  ]
}
But gsub replaces all occurrences of a regex, so you have to be sure that the pattern only appears once.
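One caveat: since the input above sets target => "http_poller_data", the debug output in the question shows the body in a nested message field, so the gsub reference may need the nested field syntax, something like:
mutate {
  gsub => [
    "[http_poller_data][message]", "\)]}'", ""
  ]
}
A json filter with source => "[http_poller_data][message]" can then parse the cleaned text.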
I use "sed 1d" to remove the ")]}'" prefix and "jq" to process the JSON output. For example, to get the state of a Gerrit project I execute:
curl -s --header 'Content-Type:application/json' --request GET --netrc https://<GERRIT-SERVER>/a/projects/?r=<GERRIT-PROJECT> | sed 1d | jq --raw-output ".[] | .state"
ACTIVE

Symfony3 Forms: obtain Form with choices, default data etc. as JSON

I have a Symfony3 application set up and would like to rebuild the frontend based on React now.
One of the entities is User, and each user can have one or more Groups, so the HTML form shows a list of checkboxes from which the admin can select the groups attached to a User.
In UserType.php this looks like this:
public function buildForm(FormBuilderInterface $builder, array $options)
{
    $builder
        ->add('username', TextType::class)
        ->add('password', TextType::class)
        ->add('email', EmailType::class)
        ->add('groups', EntityType::class, [
            'class' => Group::class,
            'choice_label' => 'name',
            'expanded' => true,
            'multiple' => true//,
            //'data' => $builder->getData()->getGroups()
        ]);
}
To render the form using React, it would be extremely handy to get a JSON response which could look like this:
{
    "user": {
        …
        "groups": [<gid 1>, …],
        "groups_available": [
            {
                "id": <gid 1>,
                "name": …
            },
            …
        ]
    }
}
So the groups array contains the ids of all the groups the user is attached to, and groups_available holds a list of all available groups.
Right now I am using FOSRestBundle, and in the controller it looks like this:
public function getUserformAction($id = null)
{
    // if the id is null, create a new user
    // else get the existing one
    …
    $form = $this->createForm(UserType::class, $user);
    $view = $form->createView();
    return $this->handleView($view);
}
How can I do that?
You should try the following code:
->add('groups', EntityType::class, array(
    // if Group is in AppBundle, or use the required bundle name
    'class' => 'AppBundle:Group',
    'query_builder' => function (EntityRepository $er) {
        return $er->createQueryBuilder('u')
            ->orderBy('u.name', 'ASC');
    },
    'choice_label' => 'name',
    'multiple' => true,
    'expanded' => true,
));
You can also get a reference from here
After digging in the source and with the help of the debugger, I managed to do it in a more or less robust and generic way, like so:
protected function getFormStructure(Form $form)
{
    return $this->iterateFormview($form->createView(), []);
}

private function iterateFormview(FormView $view, array $result)
{
    foreach ($view as $child) {
        $vars = $child->vars;
        $data = ['value' => $vars['value']];
        if (isset($vars['choices'])) {
            $data['choices'] = [];
            foreach ($vars['choices'] as $choice) {
                array_push($data['choices'], [
                    'label' => $choice->label,
                    'value' => $choice->value
                ]);
            }
        }
        $result[$vars['full_name']] = $data;
        if (count($child) > 0) {
            $result = $this->iterateFormview($child, $result);
        }
    }
    return $result;
}
Result (as JSON):
{
    …
    "user[groups]": {
        "value": "",
        "choices": [
            {
                "value": 100,
                "label": "the name"
            },
            …
        ]
    }
}
I guess this routine needs to be extended if I need to support more types… But for now this will do it.
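For completeness, a sketch of how this can be wired into the controller action from the question (assuming FOSRestBundle's view()/handleView() helpers; user loading and error handling elided):
public function getUserformAction($id = null)
{
    // ...load the existing user or create a new one, as before...
    $form = $this->createForm(UserType::class, $user);

    // Serialize the form structure instead of handing over a FormView.
    return $this->handleView($this->view($this->getFormStructure($form)));
}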

Symfony - ELK - interpret JSON logged via Monolog

I'm logging and analysing my logs with the ELK stack on a Symfony3 application.
From the Symfony application I want to log JSON objects that might be a little bit deep.
Is there any way to have Kibana interpret my JSON as JSON and not as a string?
Here is an example of the way I'm logging:
$this->logger->notice('My log message', array(
    'foo' => 'bar',
    'myDeepJson1' => $deepJson1,
    'myDeepJson2' => $deepJson2
));
And here is my logstash.conf. I used the Symfony pattern that I found here: https://github.com/eko/docker-symfony
input {
  redis {
    type => "symfony"
    db => 1
    key => "monolog"
    data_type => "list"
    host => "redis"
    port => 6379
  }
}
filter {
  if [type] == "symfony" {
    grok {
      patterns_dir => "./patterns"
      match => [ "message", "%{SYMFONY}" ]
    }
    date {
      match => [ "date", "YYYY-MM-dd HH:mm:ss" ]
    }
    if [log_type] == "app" {
      json {
        source => "log_context"
      }
    }
  }
}
output {
  if [type] == "symfony" {
    elasticsearch {
      hosts => ["172.17.0.1:9201"]
      index => "azureva-logstash"
    }
  }
}
Actually, everything I'm logging is in the log_context variable, but Monolog transforms the array into JSON, so my $deepJson variables are double-encoded, and there's no way to log a multidimensional array in the context...
Any help would be appreciated. Thanks!
Once you have retrieved the requestJson from log_context, you have to replace all the \" with " in order to be able to parse it with the json plugin.
You can do it with the mutate filter and its gsub option (documentation).
You can then parse the resulting JSON with the json plugin.
You can update your configuration with:
if [log_type] == "app" {
  json {
    source => "log_context"
  }
  mutate {
    gsub => ["requestJson", '\\"', '"']
  }
  json {
    source => "requestJson"
    target => "requestJsonDecode"
  }
}
}
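To illustrate with an assumed value: a context field that arrives double-encoded, such as
"requestJson" => "{\"user\":{\"id\":1}}"
turns into plain {"user":{"id":1}} after the gsub, which the second json filter can then decode into requestJsonDecode.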

How can I break up JSON data with Logstash and Kibana

I have a log file with a bunch of lines of json data. For example, here is one line:
{"name":"sampleApplicationName","hostname":"sampleHostName","pid":000000,"AppModule":"sampleAppModuleName","msg":"testMessage","time":"2016-02-23T19:33:10.468Z","v":0}
I want Logstash to break up the different components of the JSON string so that I can create visualizations in Kibana based on those components. I have tried playing around with the indexer file and tried countless variations, using both the json filter and grok patterns, but I can't get anything to work. Any help is much appreciated.
Below is an example config that I use. Try pasting your JSON line into the command prompt to validate that it is working fine.
input {
  stdin {}
}
filter {
  json {
    source => "message"
  }
  mutate {
    add_field => {
      "[@metadata][tenant-id]" => "%{[tenant-id]}"
      "[@metadata][data-type]" => "%{[data-type]}"
      "[@metadata][data-id]" => "%{[data-id]}"
    }
  }
  if [data-type] == "build" {
    mutate {
      add_field => { "[@metadata][action]" => "index" }
    }
  }
}
output {
  stdout { codec => rubydebug { metadata => true } }
  file { path => "/tmp/jenkins-logstash.log" }
  elasticsearch {
    action => "%{[@metadata][action]}"
    hosts => "XXX:9200"
    index => "tenant-%{[@metadata][tenant-id]}"
    document_type => "builds"
    document_id => "%{[@metadata][data-id]}"
    workers => 1
  }
}
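For instance, you can run the config against stdin (path to the Logstash binary assumed):
bin/logstash -f test.conf
Then paste the sample JSON line; the rubydebug output should show name, hostname, pid, AppModule, msg, time and v as separate fields, ready for Kibana visualizations.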