Populate select options in a haml view with JSON data

How can I populate %select options in a Haml view with JSON data that I have in a Sinatra controller?
In the Sinatra controller I have:
response = JSON.parse(curl_resp)
nestedData = response["data"][0]
nestedData.each do |c|
  names = c["attributes"]["names"]
end
return haml :newPage, :locals => {:name => example: name in names}
and these are the %select options in the newPage.haml view:
%select{:name => "select names"}
  %option{:value => "id1"} #{locals[:name]}.[0]
  %option{:value => "id2"} #{locals[:name]}.[1]
  %option{:value => "id3"} #{locals[:name]}.[2]
  %option{:value => "id4"} #{locals[:name]}.[3]
This is a sample of the JSON I get from curl:
{"data":[
{"id":"id1","attributes":{"name":"gnu"}},
{"id":"id2","attributes":{"name":"Alice"}},
{"id":"id3","attributes":{"name":"testsubject"}},
{"id":"id4","attributes":{"name":"testissuer"}}
]}

If your requirement is to iterate over the entire dataset and display <option> tags, you can use something like:
# app.rb
get '/' do
  # This is obtained from JSON.parse-ing the incoming data. I've used the JSON
  # value directly.
  @json = {
    data: [
      {id: "id1", attributes: {name: "gnu"}},
      {id: "id2", attributes: {name: "Alice"}},
      {id: "id3", attributes: {name: "testsubject"}},
      {id: "id4", attributes: {name: "testissuer"}}
    ]
  }
  haml :index
end
And in the view:
/ index.haml
%select
  - @json[:data].each do |data_item|
    %option{ value: data_item[:id] }
      = data_item[:attributes][:name]
This way you don't have to hard-code the number of option tags in the template, which keeps it simpler.
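For the specific JSON in the question, a minimal sketch of the same idea (assuming curl_resp holds the raw response body shown above; the /names route and the :options local are just illustrative) could be:
# app.rb -- sketch only; curl_resp is assumed to contain the JSON string from curl
require 'sinatra'
require 'json'

get '/names' do
  response = JSON.parse(curl_resp)
  # build [id, name] pairs such as ["id1", "gnu"] for the dropdown
  options = response["data"].map { |item| [item["id"], item["attributes"]["name"]] }
  haml :newPage, :locals => { :options => options }
end
/ newPage.haml
%select{:name => "names"}
  - options.each do |id, name|
    %option{:value => id}= name
This avoids both the hard-coded option count and the intermediate names variable from the question.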

Related

How to add an element into an array of a Serialize Field using Ruby on Rails

I have a field called query_data defined as text in my MySQL database.
In my model I defined this field as serialize :query_data, JSON.
The JSON I would like to save and retrieve looks like this:
{:items => [
{:id => 1},
{:id => 2},
{:id => 3}
]}
I have a collection (in this case called items) that contains an array of objects.
I was wondering what the best way is to add or delete an item.
For example: remove {:id => 2} from my items list and add {:id => 4} to it.
Ruby on Rails has some nice methods to move seamlessly between JSON and Ruby.
thing = {:items => [
{:id => 1},
{:id => 2},
{:id => 3}
]}
thing.to_json # "{\"items\":[{\"id\":1},{\"id\":2},{\"id\":3}]}"
thing.to_json is essentially what's happening in the serializer. If you want it back in Ruby, you can just do:
@items = @thing.query_data
JSON.parse(@items) # {"items"=>[{"id"=>1}, {"id"=>2}, {"id"=>3}]}
Now that we can easily move between the two, let's just use Ruby syntax to deal with adding and deleting keys.
thing = {:items => [
{:id => 1},
{:id => 2},
{:id => 3}
]}
thing[:items] = thing[:items].append({:id => 4}) # adding a new item
thing[:items] = thing[:items].select { |item| item[:id] != 2 } # removing an item
First: The second argument to serialize should be the class of object you're storing in the field. You should have serialize :query_data, Hash instead.
Besides that, there aren't really any established best practices for working with serialized data. It really just depends too much on the structure of your data. You might as well ask, "what's the best way to add or delete an item from a hash?"
But since this is a hash you should make sure to keep dirty attributes in mind. If you were to do something like:
items = my_model.query_data[:items]
items.reject! {|item| item[:id] == 2}
items << {id: 4}
then the model wouldn't know that query_data changed and should be updated on save.
my_model.changed?
# => false
my_model.save
# Won't actually save changes to db.
To avoid this, you can:
A) Make sure you only ever set my_model.query_data directly
B) Explicitly call my_model.query_data_will_change! after changing that field so that it will be properly updated on save.
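A minimal sketch of option B, assuming a my_model record whose serialized query_data hash is mutated in place:
items = my_model.query_data[:items]
items.reject! { |item| item[:id] == 2 }   # remove {:id => 2}
items << { id: 4 }                        # add {:id => 4}
my_model.query_data_will_change!          # mark the serialized attribute as dirty
my_model.changed?                         # => true
my_model.save                             # the in-place changes are now persisted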
Based on @veridian-dynamics' answer (thanks for your help!), here is what I did.
Model:
class MyModel < ApplicationRecord
  serialize :item_data, JSON
end
Controller:
class ItemController < ApplicationController
  before_action :authenticate_user!

  def add_item
    begin
      mymodel = MyModel.find_or_create_by(id: params[:model_id])
      if mymodel.item_data.blank?
        item = {:items => []}
      else
        item = mymodel.item_data.deep_symbolize_keys
      end
      bookmark_exist = item[:items].any? { |i| i[:id] == params[:id] }
      if !bookmark_exist
        item[:items] = item[:items].append({id: params[:id]}) # adding a new item
      end
      mymodel.item_data = item
      mymodel.save
      return render :json => item, :status => 200
    rescue Exception => e
      puts "ERROR: #{e.message}"
      return render :json => {:errors => e.message}, :status => 400
    end
  end

  def delete_item
    begin
      mymodel = MyModel.find_by(id: params[:model_id])
      if mymodel.present? && mymodel.item_data.present?
        item = mymodel.item_data.deep_symbolize_keys
        item[:items] = item[:items].select { |itm| itm[:id] != params[:id] } # remove an item
        mymodel.item_data = item
        mymodel.save
        return render :json => item, :status => 200
      end
    rescue Exception => e
      puts "ERROR: #{e.message}"
      return render :json => {:errors => e.message}, :status => 400
    end
  end
end

Logstash indexing JSON arrays

Logstash is awesome. I can send it JSON like this (multi-lined for readability):
{
  "a": "one",
  "b": {
    "alpha": "awesome"
  }
}
And then query for that line in kibana using the search term b.alpha:awesome. Nice.
However I now have a JSON log line like this:
{
"different":[
{
"this": "one",
"that": "uno"
},
{
"this": "two"
}
]
}
And I'd like to be able to find this line with a search like different.this:two (or different.this:one, or different.that:uno)
If I was using Lucene directly I'd iterate through the different array, and generate a new search index for each hash within it, but Logstash currently seems to ingest that line like this:
different: {this: one, that: uno}, {this: two}
Which isn't going to help me search for log lines using different.this or different.that.
Anyone got any thoughts as to a codec, filter or code change I can make to enable this?
You can write your own filter (copy & paste the JSON filter, rename the class name and the config_name, and rewrite the filter(event) method) or modify the current JSON filter (source on GitHub).
You can find the JSON filter (Ruby class) source code at logstash-1.x.x\lib\logstash\filters\json.rb. The JSON filter parses the content as JSON as follows:
begin
  # TODO(sissel): Note, this will not successfully handle json lists
  # like your text is '[ 1,2,3 ]' JSON.parse gives you an array (correctly)
  # which won't merge into a hash. If someone needs this, we can fix it
  # later.
  dest.merge!(JSON.parse(source))

  # If no target, we target the root of the event object. This can allow
  # you to overwrite @timestamp. If so, let's parse it as a timestamp!
  if !@target && event[TIMESTAMP].is_a?(String)
    # This is a hack to help folks who are mucking with @timestamp during
    # their json filter. You aren't supposed to do anything with
    # "@timestamp" outside of the date filter, but nobody listens... ;)
    event[TIMESTAMP] = Time.parse(event[TIMESTAMP]).utc
  end

  filter_matched(event)
rescue => e
  event.tag("_jsonparsefailure")
  @logger.warn("Trouble parsing json", :source => @source,
               :raw => event[@source], :exception => e)
  return
end
You can modify the parsing procedure to transform the original JSON, for example:
json = JSON.parse(source)
if json.is_a?(Hash)
  json.each do |key, value|
    if value.is_a?(Array)
      value.each_with_index do |object, index|
        # modify as you need
        object["index"] = index
      end
    end
  end
end
# save modified json
......
dest.merge!(json)
Then you can modify your config file to use your new/modified JSON filter, and place the config in \logstash-1.x.x\lib\logstash\config.
This is my elastic_with_json.conf with the modified json.rb filter:
input {
  stdin {
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    host => localhost
  }
  stdout {
  }
}
If you want to use your own new filter instead, you can give it a config_name:
class LogStash::Filters::Json_index < LogStash::Filters::Base
  config_name "json_index"
  milestone 2
  ....
end
and configure it
input {
  stdin {
  }
}
filter {
  json_index {
    source => "message"
  }
}
output {
  elasticsearch {
    host => localhost
  }
  stdout {
  }
}
Hope this helps.
For a quick and dirty hack, I used the ruby filter and the code below; no need to use the out-of-the-box 'json' filter anymore.
input {
  stdin {}
}
filter {
  grok {
    match => ["message", "(?<json_raw>.*)"]
  }
  ruby {
    init => "
      def parse_json obj, pname=nil, event
        obj = JSON.parse(obj) unless obj.is_a? Hash
        obj = obj.to_hash unless obj.is_a? Hash
        obj.each {|k,v|
          p = pname.nil? ? k : pname
          if v.is_a? Array
            v.each_with_index {|oo,ii|
              parse_json_array(oo,ii,p,event)
            }
          elsif v.is_a? Hash
            parse_json(v,p,event)
          else
            p = pname.nil? ? k : [pname,k].join('.')
            event[p] = v
          end
        }
      end

      def parse_json_array obj, i, pname, event
        obj = JSON.parse(obj) unless obj.is_a? Hash
        pname_ = pname
        if obj.is_a? Hash
          obj.each {|k,v|
            p = [pname_,i,k].join('.')
            if v.is_a? Array
              v.each_with_index {|oo,ii|
                parse_json_array(oo,ii,p,event)
              }
            elsif v.is_a? Hash
              parse_json(v,p,event)
            else
              event[p] = v
            end
          }
        else
          n = [pname_, i].join('.')
          event[n] = obj
        end
      end
    "
    code => "parse_json(event['json_raw'].to_s,nil,event) if event['json_raw'].to_s.include? ':'"
  }
}
output {
  stdout { codec => rubydebug }
}
Test json structure
{"id":123, "members":[{"i":1, "arr":[{"ii":11},{"ii":22}]},{"i":2}], "im_json":{"id":234, "members":[{"i":3},{"i":4}]}}
and this is what's output:
{
"message" => "{\"id\":123, \"members\":[{\"i\":1, \"arr\":[{\"ii\":11},{\"ii\":22}]},{\"i\":2}], \"im_json\":{\"id\":234, \"members\":[{\"i\":3},{\"i\":4}]}}",
"#version" => "1",
"#timestamp" => "2014-07-25T00:06:00.814Z",
"host" => "Leis-MacBook-Pro.local",
"json_raw" => "{\"id\":123, \"members\":[{\"i\":1, \"arr\":[{\"ii\":11},{\"ii\":22}]},{\"i\":2}], \"im_json\":{\"id\":234, \"members\":[{\"i\":3},{\"i\":4}]}}",
"id" => 123,
"members.0.i" => 1,
"members.0.arr.0.ii" => 11,
"members.0.arr.1.ii" => 22,
"members.1.i" => 2,
"im_json" => 234,
"im_json.0.i" => 3,
"im_json.1.i" => 4
}
I liked the ruby filter solution because it doesn't require writing another filter. However, it creates fields at the root of the event, which makes it hard to keep track of how the original document looked.
I came up with something similar that's easier to follow and is recursive, so it's cleaner.
ruby {
  init => "
    def arrays_to_hash(h)
      h.each do |k,v|
        # If v is nil, an array is being iterated and the value is k.
        # If v is not nil, a hash is being iterated and the value is v.
        value = v || k
        if value.is_a?(Array)
          # 'value' is replaced with 'value_hash' later.
          value_hash = {}
          value.each_with_index do |v, i|
            value_hash[i.to_s] = v
          end
          h[k] = value_hash
        end
        if value.is_a?(Hash) || value.is_a?(Array)
          arrays_to_hash(value)
        end
      end
    end
  "
  code => "arrays_to_hash(event.to_hash)"
}
It converts arrays to hashes, with each array index becoming a key. More details: http://blog.abhijeetr.com/2016/11/logstashelasticsearch-best-way-to.html
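As an illustration of what the conversion does, using the members array from the earlier test document, a field like
"members" => [{"i" => 1}, {"i" => 2}]
becomes
"members" => {"0" => {"i" => 1}, "1" => {"i" => 2}}
so Elasticsearch sees ordinary nested fields and you can query members.0.i or members.1.i without any custom filter code.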

Rails 3: How to return a big JSON document

I want to return about 90k items in a JSON document but I'm getting this error when I make the call:
Timeout::Error in ApisController#api_b
time's up!
Rails.root: /root/api_b
I am simply running "rails s" with the default rails server.
What's the way to make this work and return the document?
Thanks
@bs.each do |a|
  puts "dentro do bs.each"
  @final << { :Email => a['headers']['to'], :At => a['date'], :subject => a['headers']['subject'], :Type => a['headers']['status'], :Message_id => a['headers']['message_id'] }
end
where @bs is the BSON object from MongoDB. The timeout happens in the @final << ... line.
If you are experiencing timeouts from Rails and it is possible to cache the data (e.g. the data changes infrequently), I would generate the response in the background using resque or delayed_job and then have Rails dump that to the client. Or, if the data cannot be cached, use a lightweight Rack handler like Sinatra and Metal to generate the responses.
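A rough sketch of the caching approach (the class name, queue, cache key, and HeavyQuery placeholder below are made up for illustration, not from the question):
# big_report_job.rb -- sketch only; HeavyQuery stands in for the slow Mongo query
class BigReportJob
  @queue = :reports   # Resque queue name

  def self.perform
    data = HeavyQuery.run
    Rails.cache.write("big_report_json", data.to_json)
  end
end

# controller: serve the last generated document instead of building it per request
def api_b
  cached = Rails.cache.read("big_report_json")
  if cached
    render :json => cached
  else
    Resque.enqueue(BigReportJob)   # kick off generation in the background
    head :accepted                 # 202: tell the client to try again shortly
  end
end
This keeps the request itself fast; the 90k-item document is only rebuilt by the worker.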
Edited to reflect sample data
I was able to run the following code in a Rails 3.0.9 instance against a high performance Mongo 1.8.4 instance. I was using Mongo 1.3.1, bson_ext 1.3.1, webrick 1.3.1 and Ruby 1.9.2p180 x64. It did not time out but it took some time to load. My sample Mongo DB has 100k records and contains no indexes.
before_filter :profile_start
after_filter :profile_end

def index
  db = @conn['sample-dbs']
  collection = db['email-test']
  @final = []
  @bs = collection.find({})
  @bs.each do |a|
    puts "dentro do bs.each"
    @final << { :Email => a['headers']['to'], :At => a['date'], :subject => a['headers']['subject'], :Type => a['headers']['status'], :Message_id => a['headers']['message_id'] }
  end
  render :json => @final
end

private

def profile_start
  RubyProf.start
end

def profile_end
  RubyProf::FlatPrinter.new(RubyProf.stop).print
end
A more efficient way to dump out the records would be
@bs = collection.find({}, {:fields => ["headers", "date"]})
@final = @bs.map { |a| { :Email => a['headers']['to'], :At => a['date'], :subject => a['headers']['subject'], :Type => a['headers']['status'], :Message_id => a['headers']['message_id'] } }
render :json => @final
My data generator
100000.times do |i|
  p i
  @coll.insert({:date => Time.now(), :headers => {"to" => "me@foo.com", "subject" => "meeeeeeeeee", "status" => "ffffffffffffffffff", "message_id" => "1234634673"}})
end

Rails 3: Create a valid JSON Object from an Array of data

I am taking information from my MongoDB database (@bs). @bs has tons of information I'm not interested in, so I need to cycle through all of it and create a new object with the information I do need.
For that, I created a new array (@final) and I'm adding the information to it. The information seems to be getting there; however, when I convert it to JSON it's not a valid JSON object. What I intend @final to look like as JSON is this:
{ Something: [ {Email: "xxx@xxx.com", At: "date", ....}, {...}, ....] }
But when I do to_json I get [["At: date","Email: mail_test@tidgdfp.org","Message-id: .....
@bs = coll.find("headers.from" => email, "date" => {"$gte" => initial_date, "$lte" => Time.now.utc})
@bs = @bs.to_a.map { |obj| obj.delete("completo"); obj.delete("_id"); obj.delete("date"); obj.delete("headers" => "content_type"); obj }
@final = Array.new
@bs.each do |a|
  elem = Array.new
  elem << "At: #{a["date"]}"
  elem << "Email: #{a["headers"]["to"]}"
  elem << "Message: #{a["headers"]["message_id"]}"
  elem << "Type: #{a["headers"]["status"]}"
  @final << elem
end
puts @final
@final = @final.to_json
puts @final["Email"]
Please help.
Thanks
In your loop, create a hash rather than an array. to_json should make this a JSON Object.
@bs.each do |a|
  @final << { :At => a['date'], :Email => a['headers']['to'], :Message => a['headers']['message_id'], :Type => a['headers']['status'] }
end
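To get the exact { Something: [...] } shape from the question, the array can then be wrapped in a hash before rendering, for example:
render :json => { :Something => @final }
# => {"Something":[{"At":"...","Email":"...","Message":"...","Type":"..."}, ...]}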

rails 3 json custom json formatting

I have a collection of @clients with attributes id and email.
I want to render this JSON format:
[
  {"id":" 1","label":"johndoe@yahoo.com","value":"1"},
  {"id":" 2","label":"paulsmith@gmail.com.com","value":"2"}
]
in clients_controller I defined the following method
def search
@clients = Client.where(:user_id => current_user.id).select('id','email')
render :partial => "clients/search"
end
and here is the view _search.json.erb
[
<%= raw @clients.map { |client| '{"id":"' + " #{client.id}" + '","label":"' + "#{client.email}" + '","value":"' + "#{client.id}" + '"}' }.join(",") %>
]
This is working, but I find it ugly... is there a more elegant way to generate a custom JSON format in a view?
Use a helper function you call from the view to format the output, or a library function you call from the controller. Example (of the latter):
def search
  @clients = Client.where(:user_id => current_user.id).select('id','email')
  respond_to do |format|
    format.html
    format.json do
      render :json => custom_json_for(@clients)
    end
  end
end

private

def custom_json_for(value)
  list = value.map do |client|
    { :id => " #{client.id}",
      :label => client.email.to_s,
      :value => client.id.to_s }
  end
  list.to_json
end
You just need to use the to_json method. In your case it's:
@clients.to_json(:only => [:id, :label, :value])
You could use the jBuilder gem from GitHub.
For clients_controller:
def search
  @clients = Client.where(:user_id => current_user.id)
end
and search.json.jbuilder
json.id @clients.id
json.label @clients.email
json.value @clients.id
For more info, see the Jbuilder episode on RailsCasts.
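Note that if @clients is a collection rather than a single record, a Jbuilder template would typically iterate it with json.array!; a sketch matching the format in the question:
# search.json.jbuilder -- sketch for a collection
json.array! @clients do |client|
  json.id " #{client.id}"
  json.label client.email
  json.value client.id.to_s
end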
You can use https://github.com/dewski/json_builder/ to customize your JSON response in the view and keep it separate from the controller. It's handy when you need to add attributes that depend on the current user, like:
[{:attending => event.attending?(current_user)}]