I am taking information from my MongoDB database (@bs). @bs has tons of information that I'm not interested in, so what I need is to cycle through all the information and create a new object with only the information I need.
For that, I created a new array (@final) and I'm adding the information I need to @final. The information seems to be getting there; however, when I convert it to JSON it's not a valid JSON object. What I intend @final to look like as JSON is this:
{ Something: [ {Email: "xxx@xxx.com", At: "date", ....}, {...}, ....] }
But when I do to_json I get [["At: date","Email: mail_test@tidgdfp.org","Message-id: .....
@bs = coll.find("headers.from" => email, "date" => {"$gte" => initial_date, "$lte" => Time.now.utc})
@bs = @bs.to_a.map { |obj| obj.delete("completo"); obj.delete("_id"); obj.delete("date"); obj["headers"].delete("content_type"); obj }
@final = Array.new
@bs.each do |a|
  elem = Array.new
  elem << "At: #{a["date"]}"
  elem << "Email: #{a["headers"]["to"]}"
  elem << "Message: #{a["headers"]["message_id"]}"
  elem << "Type: #{a["headers"]["status"]}"
  @final << elem
end
puts @final
@final = @final.to_json
puts @final["Email"]
Please help.
Thanks
In your loop, create a hash rather than an array. to_json should make this a JSON Object.
@bs.each do |a|
  @final << { :At => a['date'], :Email => a['headers']['to'], :Message => a['headers']['message_id'], :Type => a['headers']['status'] }
end
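Putting it together, a minimal sketch of the full loop (Something is the wrapper key from your desired output, and the header fields are the same ones you already extract; note that if you still need the date here, don't delete it from the documents earlier):
require 'json'

@final = []
@bs.each do |a|
  @final << {
    :At      => a["date"],   # will be nil if you deleted "date" above
    :Email   => a["headers"]["to"],
    :Message => a["headers"]["message_id"],
    :Type    => a["headers"]["status"]
  }
end

json = { :Something => @final }.to_json
# => {"Something":[{"At":"...","Email":"...","Message":"...","Type":"..."}, ...]}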
I have a field called query_data defined as text in my MySQL database.
In my model I defined this field with serialize :query_data, JSON.
The JSON format I would like to save and retrieve looks like this:
{:items => [
{:id => 1},
{:id => 2},
{:id => 3}
]}
I have a collection (in this case, called items) that contains an array of objects.
I was wondering: what's the best way to add or delete an item?
E.g., remove {:id => 2} from my items list and add {:id => 4} to it.
Ruby on Rails has some nice methods to move seamlessly between JSON and Ruby.
thing = {:items => [
{:id => 1},
{:id => 2},
{:id => 3}
]}
thing.to_json # "{\"items\":[{\"id\":1},{\"id\":2},{\"id\":3}]}"
thing.to_json is essentially what's happening in the serializer. If you want them back to Ruby, you can just do:
@items = @thing.query_data
JSON.parse(@items) # => {"items"=>[{"id"=>1}, {"id"=>2}, {"id"=>3}]}
Now that we can easily move between the two, let's just use Ruby syntax to handle adding and deleting keys.
thing = {:items => [
{:id => 1},
{:id => 2},
{:id => 3}
]}
thing[:items] = thing[:items].append({:id => 4}) # adding a new item
thing[:items] = thing[:items].select { |item| item[:id] != 2 } # removing an item
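After those two operations, the updated hash serializes as you'd expect:
thing[:items]  # => [{:id=>1}, {:id=>3}, {:id=>4}]
thing.to_json  # => "{\"items\":[{\"id\":1},{\"id\":3},{\"id\":4}]}"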
First: The second argument to serialize should be the class of object you're storing in the field. You should have serialize :query_data, Hash instead.
Besides that, there aren't really any established best practices for working with serialized data. It really just depends too much on the structure of your data. You might as well ask, "what's the best way to add or delete an item from a hash?"
But since this is a hash, you should keep dirty attribute tracking in mind. If you were to do something like:
items = my_model.query_data[:items]
items.reject! {|item| item[:id] == 2}
items << {id: 4}
then the model wouldn't know that query_data changed and should be updated on save.
my_model.changed?
# => false
my_model.save
# Won't actually save changes to db.
To avoid this, you can:
A) Make sure you only ever set my_model.query_data directly, or
B) Explicitly call my_model.query_data_will_change! after changing that field so that it will be properly updated on save, as sketched below.
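Option B might look like this (a small sketch, reusing the mutation example above):
items = my_model.query_data[:items]
items.reject! { |item| item[:id] == 2 }
items << { id: 4 }

my_model.query_data_will_change!  # mark the serialized attribute dirty
my_model.changed?                 # => true
my_model.save                     # now persists the mutated hash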
Based on @veridian-dynamics' answer (thanks for your help!), here's what I did.
Model:
class MyModel < ApplicationRecord
  serialize :item_data, JSON
end
Controller:
class ItemController < ApplicationController
  before_action :authenticate_user!

  def add_item
    begin
      mymodel = MyModel.find_or_create_by(id: params[:model_id])
      if mymodel.item_data.blank?
        item = {:items => []}
      else
        item = mymodel.item_data.deep_symbolize_keys
      end
      bookmark_exist = item[:items].any? { |i| i[:id] == params[:id] }
      if !bookmark_exist
        item[:items] = item[:items].append({id: params[:id]}) # adding a new item
      end
      mymodel.item_data = item
      mymodel.save
      return render :json => item, :status => 200
    rescue Exception => e
      puts "ERROR: #{e.message}"
      return render :json => {:errors => e.message}, :status => 400
    end
  end

  def delete_item
    begin
      mymodel = MyModel.find_by(id: params[:model_id])
      if mymodel.present? && mymodel.item_data.present?
        item = mymodel.item_data.deep_symbolize_keys
        item[:items] = item[:items].select { |itm| itm[:id] != params[:id] } # remove an item
        mymodel.item_data = item
        mymodel.save
        return render :json => item, :status => 200
      end
    rescue Exception => e
      puts "ERROR: #{e.message}"
      return render :json => {:errors => e.message}, :status => 400
    end
  end
end
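For completeness, hypothetical routes for these two actions could look like this (the paths are assumptions, not from the question):
# config/routes.rb (hypothetical paths)
post   'models/:model_id/items/:id', to: 'item#add_item'
delete 'models/:model_id/items/:id', to: 'item#delete_item'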
I'm trying to merge some objects into one array.
I produced this output with:
q = [["99","99","99"],["9"]]
o = [["b","1"],["c","3"],["d","1"],["c","30"]]
puts q.flatten.zip(o).to_json
=> [["99",["b","1"]],["99",["c","3"]],["99",["d","1"]],["9",["c","30"]]]
I'm looking for the best way to get:
[{"99"=>{"b"=>"1", "c"=>"3", "d"=>"1"}},{"9"=>{"c"=>"30"}}]
a = [["9",["b","8"]],["9",["c","2"]],["9",["d","6"]]]
a.group_by(&:first).transform_values{|a| a.map(&:last).to_h}
# => {"9"=>{"b"=>"8", "c"=>"2", "d"=>"6"}}
a.group_by(&:first).transform_values{|a| a.map(&:last).to_h}.map{|k, v| {k => v}}
# => [{"9"=>{"b"=>"8", "c"=>"2", "d"=>"6"}}]
Looking for something like this?
some_array = [["9",["b","8"]], ["9",["c","2"]], ["9",["d","6"]]]
some_hash = some_array.each_with_object(Hash.new{ |h,k| h[k] = {} }) do |(k, (sub_key, sub_val)), hash|
hash[k][sub_key] = sub_val
end
p some_hash
#=> {"9"=>{"b"=>"8", "c"=>"2", "d"=>"6"}}
I have a problem with JSON format when I want to use it with a board.
In my JSON file each piece of information is wrapped in double brackets while I should only have single brackets. Is it possible to delete one of them?
Code:
elsif params[:which] == "BigSize"
  res = []
  big = PrintType.where("width > 70").where("width <= 120").where("height > 118.9").where("height <= 150").pluck('DISTINCT artwork_id')
  res << [big]
  render :json => res.to_json(include: { :images => { :except => :img_orig } })
You are pushing an array into an array with res << [big].
To stop creating two brackets, just write res << big, or if the nesting is necessary for another reason, flatten res before you convert it to JSON.
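A quick illustration with dummy ids (a sketch, not your actual data):
big = [1, 2, 3]      # e.g. the plucked artwork ids

res = []
res << [big]
res.to_json          # => "[[[1,2,3]]]"  -- double-wrapped

res = []
res << big
res.to_json          # => "[[1,2,3]]"

res.flatten.to_json  # => "[1,2,3]"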
How can I populate %select options in the HAML view with JSON parameters I have in the Sinatra controller?
In the Sinatra controller I have:
response = JSON.parse(curl_resp)
nestedData = response["data"][0]
nestedData.each do |c|
names = c["attributes"]["names"]
end
return haml :newPage, :locals => {:name => example: name in names}
and these are the %select options in the newPage.haml view:
%select{:name => "select names"}
%option{:value => "id1"} #{locals[:name]}.[0]
%option{:value => "id2"} #{locals[:name]}.[1]
%option{:value => "id3"} #{locals[:name]}.[2]
%option{:value => "id4"} #{locals[:name]}.[3]
this is a sample JSON I get from curl:
{"data":[
{"id":"id1","attributes":{"name":"gnu"}},
{"id":"id2","attributes":{"name":"Alice"}},
{"id":"id3","attributes":{"name":"testsubject"}},
{"id":"id4","attributes":{"name":"testissuer"}}
]}
If your requirement is to iterate over the entire dataset and display <option> tags, you can use something like:
# app.rb
get '/' do
  # This is obtained from JSON.parse-ing the incoming data. I've used the JSON
  # value directly.
  @json = {
    data: [
      {id: "id1", attributes: {name: "gnu"}},
      {id: "id2", attributes: {name: "Alice"}},
      {id: "id3", attributes: {name: "testsubject"}},
      {id: "id4", attributes: {name: "testissuer"}}
    ]
  }
  haml :index
end
And in the view:
/ index.haml
%select
  - @json[:data].each do |data_item|
    %option{ value: data_item[:id] }
      = data_item[:attributes][:name]
This way, you don't have to hard-code the number of option tags in the template, which would make it more complicated.
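One caveat (an assumption based on the question's controller): if @json comes from JSON.parse(curl_resp) rather than a Ruby literal, the keys will be strings, not symbols, so the view would access them with string keys:
# app.rb (hypothetical; assumes require 'json' and a curl_resp string as in the question)
get '/' do
  @json = JSON.parse(curl_resp)  # keys are strings after parsing
  haml :index
end
And in the view:
/ index.haml
%select
  - @json["data"].each do |data_item|
    %option{ value: data_item["id"] }
      = data_item["attributes"]["name"]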
Logstash is awesome. I can send it JSON like this (multi-lined for readability):
{
"a": "one"
"b": {
"alpha":"awesome"
}
}
And then query for that line in kibana using the search term b.alpha:awesome. Nice.
However I now have a JSON log line like this:
{
"different":[
{
"this": "one",
"that": "uno"
},
{
"this": "two"
}
]
}
And I'd like to be able to find this line with a search like different.this:two (or different.this:one, or different.that:uno)
If I was using Lucene directly I'd iterate through the different array, and generate a new search index for each hash within it, but Logstash currently seems to ingest that line like this:
different: {this: one, that: uno}, {this: two}
Which isn't going to help me searching for log lines using different.this or different.that.
Anyone got any thoughts as to a codec, filter, or code change I can make to enable this?
You can write your own filter (copy & paste, rename the class name and the config_name, and rewrite the filter(event) method) or modify the current JSON filter (source on GitHub).
You can find the JSON filter (Ruby class) source code under logstash-1.x.x\lib\logstash\filters in a file named json.rb. The JSON filter parses the content as JSON as follows:
begin
  # TODO(sissel): Note, this will not successfully handle json lists
  # like your text is '[ 1,2,3 ]' JSON.parse gives you an array (correctly)
  # which won't merge into a hash. If someone needs this, we can fix it
  # later.
  dest.merge!(JSON.parse(source))

  # If no target, we target the root of the event object. This can allow
  # you to overwrite @timestamp. If so, let's parse it as a timestamp!
  if !@target && event[TIMESTAMP].is_a?(String)
    # This is a hack to help folks who are mucking with @timestamp during
    # their json filter. You aren't supposed to do anything with
    # "@timestamp" outside of the date filter, but nobody listens... ;)
    event[TIMESTAMP] = Time.parse(event[TIMESTAMP]).utc
  end

  filter_matched(event)
rescue => e
  event.tag("_jsonparsefailure")
  @logger.warn("Trouble parsing json", :source => @source,
               :raw => event[@source], :exception => e)
  return
end
You can change the parsing procedure to modify the original JSON:
json = JSON.parse(source)
if json.is_a?(Hash)
  json.each do |key, value|
    if value.is_a?(Array)
      value.each_with_index do |object, index|
        # modify as you need
        object["index"] = index
      end
    end
  end
end
# save modified json
......
dest.merge!(json)
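For example, given the question's document, the modification above tags each array element with its position before the merge:
require 'json'

source = '{"different":[{"this":"one","that":"uno"},{"this":"two"}]}'
json = JSON.parse(source)
json.each do |key, value|
  if value.is_a?(Array)
    value.each_with_index { |object, index| object["index"] = index }
  end
end
puts json.to_json
# => {"different":[{"this":"one","that":"uno","index":0},{"this":"two","index":1}]}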
Then you can modify your config file to use your new/modified JSON filter, and place it in \logstash-1.x.x\lib\logstash\config.
This is my elastic_with_json.conf with a modified json.rb filter:
input {
  stdin {}
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    host => localhost
  }
  stdout {}
}
If you want to use your new filter, you can configure it with its config_name:
class LogStash::Filters::Json_index < LogStash::Filters::Base
  config_name "json_index"
  milestone 2
  ....
end
and configure it
input {
  stdin {}
}
filter {
  json_index {
    source => "message"
  }
}
output {
  elasticsearch {
    host => localhost
  }
  stdout {}
}
Hope this helps.
For a quick and dirty hack, I used the Ruby filter and the code below; no need to use the out-of-the-box 'json' filter anymore.
input {
  stdin {}
}
filter {
  grok {
    match => ["message", "(?<json_raw>.*)"]
  }
  ruby {
    init => "
      def parse_json(obj, pname = nil, event)
        obj = JSON.parse(obj) unless obj.is_a? Hash
        obj = obj.to_hash unless obj.is_a? Hash
        obj.each { |k, v|
          p = pname.nil? ? k : pname
          if v.is_a? Array
            v.each_with_index { |oo, ii|
              parse_json_array(oo, ii, p, event)
            }
          elsif v.is_a? Hash
            parse_json(v, p, event)
          else
            p = pname.nil? ? k : [pname, k].join('.')
            event[p] = v
          end
        }
      end

      def parse_json_array(obj, i, pname, event)
        obj = JSON.parse(obj) unless obj.is_a? Hash
        pname_ = pname
        if obj.is_a? Hash
          obj.each { |k, v|
            p = [pname_, i, k].join('.')
            if v.is_a? Array
              v.each_with_index { |oo, ii|
                parse_json_array(oo, ii, p, event)
              }
            elsif v.is_a? Hash
              parse_json(v, p, event)
            else
              event[p] = v
            end
          }
        else
          n = [pname_, i].join('.')
          event[n] = obj
        end
      end
    "
    code => "parse_json(event['json_raw'].to_s, nil, event) if event['json_raw'].to_s.include? ':'"
  }
}
output {
  stdout { codec => rubydebug }
}
Test JSON structure:
{"id":123, "members":[{"i":1, "arr":[{"ii":11},{"ii":22}]},{"i":2}], "im_json":{"id":234, "members":[{"i":3},{"i":4}]}}
and this is what's output:
{
"message" => "{\"id\":123, \"members\":[{\"i\":1, \"arr\":[{\"ii\":11},{\"ii\":22}]},{\"i\":2}], \"im_json\":{\"id\":234, \"members\":[{\"i\":3},{\"i\":4}]}}",
"#version" => "1",
"#timestamp" => "2014-07-25T00:06:00.814Z",
"host" => "Leis-MacBook-Pro.local",
"json_raw" => "{\"id\":123, \"members\":[{\"i\":1, \"arr\":[{\"ii\":11},{\"ii\":22}]},{\"i\":2}], \"im_json\":{\"id\":234, \"members\":[{\"i\":3},{\"i\":4}]}}",
"id" => 123,
"members.0.i" => 1,
"members.0.arr.0.ii" => 11,
"members.0.arr.1.ii" => 22,
"members.1.i" => 2,
"im_json" => 234,
"im_json.0.i" => 3,
"im_json.1.i" => 4
}
The solution I liked is the Ruby filter because it doesn't require us to write another filter. However, that solution creates fields at the "root" of the JSON, and it's hard to keep track of how the original document looked.
I came up with something similar that's easier to follow and is a recursive solution so it's cleaner.
ruby {
  init => "
    def arrays_to_hash(h)
      h.each do |k, v|
        # If v is nil, an array is being iterated and the value is k.
        # If v is not nil, a hash is being iterated and the value is v.
        value = v || k
        if value.is_a?(Array)
          # 'value' is replaced with 'value_hash' later.
          value_hash = {}
          value.each_with_index do |element, i|
            value_hash[i.to_s] = element
          end
          h[k] = value_hash
        end
        if value.is_a?(Hash) || value.is_a?(Array)
          arrays_to_hash(value)
        end
      end
    end
  "
  code => "arrays_to_hash(event.to_hash)"
}
It converts arrays to hashes, with each array index as the key. More details: http://blog.abhijeetr.com/2016/11/logstashelasticsearch-best-way-to.html
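As a small standalone check outside Logstash (a sketch), the function turns array indices into hash keys, which is what makes paths like members.0.i searchable:
doc = { "members" => [{ "i" => 1 }, { "i" => 2 }] }
arrays_to_hash(doc)
doc  # => {"members"=>{"0"=>{"i"=>1}, "1"=>{"i"=>2}}}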