I have a JSON response (is it a hash? an array? a JS object?) where every entry is information on a person and follows this format:
{"type"=>"PersonSummary",
"id"=>"123", "properties"=>{"permalink"=>"personname",
"api_path"=>"people/personname"}}
I would like to go through every entry and output only the "id".
I've parsed the entire JSON response "response" into "result":
result = JSON.parse(response)
Then I'd like to go through result and print the "id" and "api_path" of each person:
result.each do |print id AND api_path|
How do I go about doing this in Ruby?
The only time you need to use JSON.parse is when you have a string that needs parsing into a Hash (or Array). For example:
result = JSON.parse('{ "type" : "PersonSummary", "id" : 123, "properties" : { "permalink" : "personname", "api_path" : "people/personname" } }')
Once you have the Hash, values can be accessed by key. Note that JSON.parse produces string keys by default, so use result['id'] (result[:id] will return nil unless you parse with symbolize_names: true). You can also iterate through the hash using the following code.
If you need to access the api_path value you would do so by using result['properties']['api_path']
result = { 'type' => 'PersonSummary', 'id' => 123, 'properties' => { 'permalink' => 'personname', 'api_path' => 'people/personname' } }
result.each do |key, value|
puts "Key: #{key}\t\tValue: #{value}"
end
You could even do something like puts value if key == 'id' if you just want to show certain values.
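Putting it together for the original question: assuming the parsed response is an array of person hashes like the one shown (the sample data below is adapted from the question), printing each id and api_path looks like this:

```ruby
require 'json'

# Sample response adapted from the question: an array of person entries
response = '[{"type":"PersonSummary","id":"123",' \
           '"properties":{"permalink":"personname","api_path":"people/personname"}}]'

result = JSON.parse(response)

# Each entry is a Hash with string keys; nested values are reached key by key
result.each do |person|
  puts "#{person['id']} #{person['properties']['api_path']}"
end
# prints: 123 people/personname
```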
I have a log whose message field contains this line:
"message":"name="Alert notification" event_id=123 alert_name="alert 1234" ipaddr="192.168.0.1" object_name="localhost.local""
I want to add new fields with the values after the = signs, so it will look like this:
"alert_name":"alert 1234",
"object_name":"localhost.local",
"ipaddr":"192.168.0.1"
I've tried using this kind of grok filter, but it grabs the whole string ("alert_name":"alert_name"="alert 1234"):
grok {
match => { "message" => "%{WORD:method1} %{WORD:method2} %{WORD:method3}" }
}
How do I pass only the value after = to the new field?
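One approach worth trying instead of grok (a sketch, not tested against your exact log format): Logstash's kv filter is built for key=value pairs and splits on = by default. The field names below come from the example message:

```
filter {
  kv {
    source       => "message"
    # keep only the keys from the example; drop this option to keep all pairs
    include_keys => ["alert_name", "object_name", "ipaddr", "event_id"]
  }
}
```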
Trying to create a new Jira ticket with a specific requestType, but it is nested two levels deep. I tried a few possible alterations, but no luck. Here's the code I have:
require 'jira-ruby' # https://github.com/sumoheavy/jira-ruby
options = {
:username => jira_username,
:password => jira_password,
:site => 'https://jiraurl/rest/api/2/',
:context_path => '',
:auth_type => :basic,
:read_timeout => 120
}
client = JIRA::Client.new(options)
issue = client.Issue.build
fields_options = {
"fields" =>
{
"summary" => "Test ticket creation",
"description" => "Ticket created from Ruby",
"project" => {"key" => "AwesomeProject"},
"issuetype" => {"name" => "Task"},
"priority" => {"name" => "P1"},
"customfield_23070" =>
{
"requestType" => {
"name" => "Awesome Request Type"
}
}
}
}
issue.save(fields_options)
"errors"=>{"customfield_23070"=>"Operation value must be a string"}
I also tried passing a JSON object to customfield_23070:
"customfield_23070": { "requestType": { "name": "Awesome Request Type" } }
Still no luck; I get the same error message.
If it helps, this is how customfield_23070 looks in our Jira:
Does anyone know how to set requestType in this case, please? Any help is greatly appreciated!!
It seems that for custom fields with specific data types (string/number), you must pass the value as:
"customfield_1111": 1
or:
"customfield_1111": "string"
instead of:
"customfield_1111":{ "value": 1 }
or:
"customfield_1111":{ "value": "string" }
I'm not sure, but you can try these possible examples:
eg.1:
"customfield_23070"=>{"name"=>"requestType","value"=>"Awesome Request Type"}
eg.2:
"customfield_23070"=>{"requestType"=>"Awesome Request Type"}
eg.3:
"customfield_23070"=>{"value"=>"Awesome Request Type"}
eg.4:
"customfield_23070"=>{"name"=>"Awesome Request Type"}
For reference, there are two methods, depending on the fields you are interacting with.
Have a look at the Atlassian article 'updating-an-issue-via-the-jira-rest-apis-6848604' for the fields that can be updated via verb operations; for the other fields you can use the examples above.
You can use both methods within the same call:
{
"update": {"description": [{"set": "Description by API Update - lets do this thing"}]},
"fields": {"customfield_23310": "TESTING0909"}
}
OK, I think I found how to do it.
You need to provide a string, and that string is the GUID of the RequestType.
In order to get that GUID, you need to run the following in a ScriptRunner console:
import com.atlassian.jira.component.ComponentAccessor
def issue = ComponentAccessor.issueManager.getIssueByCurrentKey("ISSUE-400546") //Issue with the desired Request Type
def cf = ComponentAccessor.customFieldManager.getCustomFieldObjectByName("Tipo de solicitud del cliente") //Change it to the name of your request type field
issue.getCustomFieldValue(cf)
Source: https://community.atlassian.com/t5/Jira-Software-questions/how-to-set-request-type-value-in-while-create-jira-issue/qaq-p/1106696
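With that GUID in hand, the save call from the question reduces to passing the string directly. A sketch (the GUID below is a hypothetical placeholder; substitute the value returned by the console script):

```ruby
# customfield_23070 takes the RequestType GUID as a plain string,
# not a nested hash. The GUID here is a hypothetical placeholder.
fields_options = {
  "fields" => {
    "summary"           => "Test ticket creation",
    "description"       => "Ticket created from Ruby",
    "project"           => { "key" => "AwesomeProject" },
    "issuetype"         => { "name" => "Task" },
    "customfield_23070" => "a1b2c3d4-e5f6-7890-abcd-ef1234567890"
  }
}
# issue.save(fields_options)  # requires a live JIRA::Client, as in the question
```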
The code below returns all the counties to me as strings; I can see that by using the inspect method.
def self.all_counties
response['ChargeDevice'].each do |charger|
puts charger['ChargeDeviceLocation']['Address']['County'].inspect
end
end
What would be the right way to store every returned string in one array so I can manipulate it later?
JSON
"ChargeDeviceLocation" => {
"Latitude" =>"51.605591",
"Longitude" =>"-0.339510",
"Address" => {
"County" =>"Greater London",
"Country" =>"gb"
  }
}
This works if the response has all the keys for every item:
counties = response['ChargeDevice'].map do |r|
r.dig('ChargeDeviceLocation', 'Address', 'County')
end
Something like this will give you nils when the tree doesn't have entries for all items:
counties = response['ChargeDevice'].map do |r|
r.fetch('ChargeDeviceLocation', {}).
fetch('Address', {}).
fetch('County', nil)
end
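A runnable illustration of the dig approach, using a minimal response shaped like the JSON in the question (the second entry is deliberately missing its County key):

```ruby
# Minimal stand-in for the parsed API response from the question
response = {
  'ChargeDevice' => [
    { 'ChargeDeviceLocation' => { 'Address' => { 'County' => 'Greater London' } } },
    { 'ChargeDeviceLocation' => { 'Address' => {} } }
  ]
}

# dig returns nil for missing keys instead of raising
counties = response['ChargeDevice'].map do |r|
  r.dig('ChargeDeviceLocation', 'Address', 'County')
end
# counties => ["Greater London", nil]
```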
You could also use JSONPath (and ruby JSONPath gem).
require 'jsonpath'
counties = JsonPath.new('$..County').on(response.to_json)
I have the following SQL statement inside a function:
my $sth = $dbh->prepare(qq[SELECT device_uuid,device_name FROM ].DB_SCHEMA().qq[.user_device WHERE user_id = ?]);
$sth->execute($user_id) || die $dbh->errstr;
The results are being fetched using the following statement:
while(my $data = $sth->fetchrow_arrayref()) {
}
My question is: how can I create and return a JSON structure containing an object for every row being fetched? Something like this:
{
object1:{
"device_uuid1":"id1",
"device_name1":"name1"
},
object2:{
"device_uuid2":"id2",
"device_name2":"name2"
},
object3:{
"device_uuid3":"id3",
"device_name3":"name3"
}
}
The total number of JSON objects will equal the number of rows returned by the SQL statement.
I have managed to build the structure like this:
$VAR1 = [{"device_name":"device1","device_id":"device_id1"},{"device_name":"device2","device_id":"device_id2"}]
How can I iterate through the array ref and get the "device_name" and "device_id" values?
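Given that structure (an array ref of hash refs), iterating is straightforward. A sketch, assuming $VAR1 holds the structure shown above:

```perl
# $VAR1 is an array ref of hash refs, as built above
for my $device (@{$VAR1}) {
    print "$device->{device_id}: $device->{device_name}\n";
}
```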
For your needs, the JSON module should work well. What you need to do is have a scalar variable defined as below and push an element onto it for each iteration of the while loop:
my $json = JSON->new->utf8->space_after->encode({});
while(my $data = $sth->fetchrow_arrayref()) {
#Push new element here in $json using incr_parse method
#or using $json_text = $json->encode($perl_scalar)
}
Hope this helps you.
Finally, what I did was to create an array and push the fetched rows, which are returned as hash refs:
my @device = ();
while(my $data = $sth->fetchrow_hashref()) {
    push(@device, $data);
}
Last, I convert the @device array to JSON (passing a reference to it) and return the outcome:
return encode_json(\@device);
The statement handle method fetchall_arrayref() can return an array reference where each element in the referenced array is a hash reference containing details of one row in the resultset. This seems to me to be exactly the data structure that you want. So you can just call that method and pass the returned data structure to a JSON encoding function.
# Passing a hash ref to fetchall_arrayref() tells it to
# return each row as a hash reference.
my $json = encode_json($sth->fetchall_arrayref({}));
Your sample JSON is incorrect. JSON actually maps nicely onto Perl data structures: [] denotes an array, {} denotes key-value pairs (very similar to a hash).
I would rather strongly suggest, though, that what you've asked for is probably not what you want: you've seemingly gone for globally unique keys (device_uuid1, device_name2, ...), which isn't good style when they're nested.
Why? Well, with plain (non-suffixed) keys you can do things like this:
print $my_data{$_}->{'name'} for keys %my_data;
Far better to go for something like:
#!/usr/bin/env perl
use strict;
use warnings;
use Data::Dumper;
use JSON;
my %my_data = (
object1 => {
uuid => "id1",
name => "name1"
},
object2 => {
uuid => "id2",
name => "name2"
},
object3 => {
uuid => "id3",
name => "name3"
},
);
print Dumper \%my_data;
print to_json ( \%my_data, { 'pretty' => 1 } )."\n";
Now, that does assume your 'object1' is a unique key - if it isn't, you can instead do something like this - an array of anonymous hashes (for bonus points, it preserves ordering)
my #my_data = (
{ object1 => {
uuid => "id1",
name => "name1"
}
},
{ object2 => {
uuid => "id2",
name => "name2"
}
},
{ object3 => {
uuid => "id3",
name => "name3"
}
},
);
Now, how to take your example and extend it? Easy peasy really - assemble what you want to add to your structure in your loop, and insert it into the structure:
while(my $data = $sth->fetchrow_arrayref()) {
my $objectname = $data -> [0]; #assuming it's this element!
my $uuid = $data -> [1];
my $name = $data -> [2];
my $new_hash = { uuid => $uuid, name => $name };
$my_data{$objectname} = $new_hash;
}
Is there a way to split a logstash (1.4.2) event into multiple other events?
My input looks like this:
{ "parts" => ["one", "two"],
"timestamp" => "2014-09-27T12:29:17.601Z"
"one.key=> "1", "one.value"=>"foo",
"two.key" => "2", "two.value"=>"bar"
}
And I'd like to create two events with the following content:
{ "key" => "1", "value" => "foo", "timestamp" => "2014-09-27T12:29:17.601Z" }
{ "key" => "2", "value" => "bar", "timestamp" => "2014-09-27T12:29:17.601Z" }
The problem is that I can't know the actual "parts" in advance...
Thanks for your help :)
Updating a very old answer because there is a better way to do this in newer versions of logstash without resorting to a custom filter.
You can do this using a ruby filter and a split filter:
filter {
ruby {
code => '
arrayOfEvents = Array.new()
parts = event.get("parts")
timestamp = event.get("timestamp")
parts.each { |part|
arrayOfEvents.push({
"key" => event.get("#{part}.key"),
"value" => event.get("#{part}.value"),
"timestamp" => timestamp
})
event.remove("#{part}.key")
event.remove("#{part}.value")
}
puts arrayOfEvents # debug output; safe to remove
event.remove("parts")
event.set("event",arrayOfEvents)
'
}
split {
field => 'event'
}
mutate {
rename => {
"[event][key]" => "key"
"[event][value]" => "value"
"[event][timestamp]" => "timestamp"
}
remove_field => ["event"]
}
}
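The core transformation inside that ruby filter block can be sketched in plain Ruby (outside Logstash, using the input from the question) to show the array that the split filter then receives:

```ruby
# The incoming event from the question, as a plain Hash
event = {
  "parts"     => ["one", "two"],
  "timestamp" => "2014-09-27T12:29:17.601Z",
  "one.key"   => "1", "one.value" => "foo",
  "two.key"   => "2", "two.value" => "bar"
}

# Build one sub-event per entry in "parts"
new_events = event["parts"].map do |part|
  {
    "key"       => event["#{part}.key"],
    "value"     => event["#{part}.value"],
    "timestamp" => event["timestamp"]
  }
end
# new_events:
# [{"key"=>"1", "value"=>"foo", "timestamp"=>"2014-09-27T12:29:17.601Z"},
#  {"key"=>"2", "value"=>"bar", "timestamp"=>"2014-09-27T12:29:17.601Z"}]
```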
My original answer was:
You need to resort to a custom filter for this (you can't call yield from a ruby code filter which is what's needed to generate new events).
Something like this (dropped into lib/logstash/filters/custom_split.rb):
# encoding: utf-8
require "logstash/filters/base"
require "logstash/namespace"
# custom code to break up an event into multiple
class LogStash::Filters::CustomSplit < LogStash::Filters::Base
config_name "custom_split"
milestone 1
public
def register
# Nothing
end # def register
public
def filter(event)
return unless filter?(event)
if event["parts"].is_a?(Array)
event["parts"].each do |key|
e = LogStash::Event.new("timestamp" => event["timestamp"],
"key" => event["#{key}.key"],
"value" => event["#{key}.value"])
yield e
end
event.cancel
end
end
end
And then just put filter { custom_split {} } into your config file.
For future reference, and based on @alcanzar's answer, it is now possible to do things like this:
ruby {
code => "
# somefield is an array
array = event.get('somefield')
# drop the current event (this was my use case, I didn't need the feeding event)
event.cancel
# iterate over to construct new events
array.each { |a|
# creates a new logstash event
generated = LogStash::Event.new({ 'foo' => 'something' })
# puts the event in the pipeline queue
new_event_block.call(generated)
}
"
}