This compiles:
let inputFile = open_in("test.txt");
let line = try(input_line(inputFile)) {
| End_of_file => "end of file"
};
print_endline(line);
But not this:
let inputFile = open_in("test.txt");
try(input_line(inputFile)) {
| line => print_endline(line)
| exception End_of_file => print_endline("end of file")
};
For the latter I get an error: "Exception patterns must be at the top level of a match case"
I'm confused because it seems like an identical pattern to the one in the docs (https://reasonml.github.io/docs/en/exception.html)
let theItem = "a";
let myItems = ["b","a","c"];
switch (List.find((i) => i === theItem, myItems)) {
| item => print_endline(item)
| exception Not_found => print_endline("No such item found!")
};
Which compiles without error.
Changing the order of the match cases, or removing the "exception" keyword, doesn't change the error.
What does this error mean? I'm not sure what "top level" means.
`try` is used for exception handling, similar to try/catch in JavaScript: its cases match only exceptions, so the `exception` keyword has no place there. In your case you want to pattern match on the result and also catch an exception (which ReasonML allows), so you can just use `switch`:
let inputFile = open_in("test.txt");
switch(input_line(inputFile)) {
| line => print_endline(line)
| exception End_of_file => print_endline("end of file")
};
Related
I have a large JSON object. Among other things, the object describes a tree-type relation of how its component objects are connected hierarchically. Each object knows who its children are, but does not (directly) know who its parent is. `@my_hash` below exemplifies the structure. Every object has an id (101, 102, etc.), a name ("one", "two", etc.) and can have 0, 1 or more children. I am trying to build the path to every object; e.g. the object named "five" should, as a result of the code, have a path of "/one/two/four". Basically I am trying to build a sort of directory structure from the hierarchy of the objects.
The code below works, but it looks quite long'ish, not very elegant, not very Ruby'ish.
I would be grateful for suggestions on how to do this more efficiently and elegantly. I also have a hunch that my code may not be very robust, i.e. may not deal well with exceptions.
Any thoughts or help are appreciated.
On a side note, I am just learning Ruby and so far have mainly programmed in Perl.
class Tree
  def initialize
    @my_hash = {
      101 => ["one", [102, 107]],
      102 => ["two", [103, 104]],
      103 => ["three", []],
      104 => ["four", [105, 106]],
      105 => ["five", []],
      106 => ["six", []],
      107 => ["seven", [108]],
      108 => ["eight", []],
    }
    @child_to_parent_node = {}
    @id_to_name = {}
    @my_path_hash = {}
    @my_hash.keys.each do |key|
      @my_path_hash[key] = ""
    end
    @parent_path_id = []
  end

  def map_child_to_parent
    @my_hash.each do |key, value|
      @id_to_name.store(key, value[0])
      node_name, children = value[0], value[1]
      children.each do |child_id|
        @child_to_parent_node.store(child_id, node_name)
      end
    end
  end

  def build_path(id)
    parent = @child_to_parent_node[id]
    parent.nil? ? return : @parent_path_id << parent
    id = @id_to_name.key(parent)
    build_path(id)
    @parent_path_id
  end

  def update_tree
    @id_to_name.keys.each do |id|
      tmp_array = self.build_path(id)
      path = ""
      if (tmp_array.nil?)
        path = "/"
      else
        tmp_array.reverse.each do
          path = path + "/" + tmp_array.pop
        end
      end
      puts "id: #{id} path: #{path}"
    end
  end
end

my_tree = Tree.new
my_tree.map_child_to_parent
my_tree.update_tree
In fact, to solve this task you have to traverse the tree from leaf to root, right? So your representation of the tree is very inconvenient for this particular task. If you are OK trading off some memory for a cleaner solution, I'd create an auxiliary structure that contains the parent of each node. Let's say:
@parents = @my_hash.each_with_object({}) do |(pid, props), acc|
  _, children = props
  children.each { |cid| acc[cid] = pid }
end
#=> {102=>101, 107=>101, 103=>102, 104=>102, 105=>104, 106=>104, 108=>107}
Now, the task can be solved in a quite concise way with a couple of auxiliary functions. For example:
def id_by_name(name)
  id, _ = @my_hash.find { |k, v| v.first == name }
  id
end

def name_by_id(id)
  @my_hash[id].first
end

def path_to(node)
  path = [node]
  id = id_by_name(node)
  path.unshift(name_by_id(id)) while id = @parents[id]
  path.join("/")
end

path_to "five" #=> "one/two/four/five"
Please, note: the solution is still very inefficient - mostly because to fetch a node's id by its name we have to iterate over the whole initial hash in the worst case. This is the price we pay for the data structure that doesn't fit the task well.
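If lookups by name happen often, that scan can be removed entirely by also precomputing a name => id index once. Here is a standalone sketch over the same data, written with locals so it runs outside the class; the `name_to_id` index is my addition, not part of the original code:

```ruby
# Same data as @my_hash above, as a local so the sketch is self-contained.
my_hash = {
  101 => ["one",   [102, 107]],
  102 => ["two",   [103, 104]],
  103 => ["three", []],
  104 => ["four",  [105, 106]],
  105 => ["five",  []],
  106 => ["six",   []],
  107 => ["seven", [108]],
  108 => ["eight", []],
}

# child id => parent id, exactly as in the answer above.
parents = my_hash.each_with_object({}) do |(pid, (_, children)), acc|
  children.each { |cid| acc[cid] = pid }
end

# name => id, built once; replaces the linear scan in id_by_name.
name_to_id = my_hash.each_with_object({}) do |(id, (name, _)), acc|
  acc[name] = id
end

def path_to(node, name_to_id, parents, my_hash)
  path = [node]
  id = name_to_id[node]                              # O(1) instead of a scan
  path.unshift(my_hash[id].first) while id = parents[id]
  path.join("/")
end

path_to("five", name_to_id, parents, my_hash)  # => "one/two/four/five"
```

This assumes names are unique, which the original `id_by_name` implicitly assumed as well (it returns the first match).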
Given this example:
let value = try (lazy raise(Exception())).Value with | _ -> false
Why is the exception not captured in the try block? Keep in mind that I raise the exception like this just to prove the example. The point is that if I have lazy code that throws an exception, I seem unable to capture it in the try block. Do I have to be specific in my match of the exception, or do I have to capture the exception inside the lazy expression itself?
The following prints both the exception and the value, which is false.
let value =
    try
        (lazy raise(System.Exception())).Value
    with
    | exn ->
        printfn "%A" exn
        false

printfn "%A" value
I'm not able to repro your results. What are you seeing?
let value =
    try
        (lazy raise(Exception())).Value
        true
    with
    | ex ->
        printfn "got exception"
        false
Gives me
got exception
val value : bool = false
Edit:
Adding debugging image
Here below is my code to find a document by ObjectID:
def find(selector: JsValue, projection: Option[JsValue], sort: Option[JsValue],
         page: Int, perPage: Int): Future[Seq[JsValue]] = {
  var query = collection.genericQueryBuilder.query(selector).options(
    QueryOpts(skipN = page * perPage)
  )
  projection.map(value => query = query.projection(value))
  sort.map(value => query = query.sort(value.as[JsObject]))
  // this is the line where the call crashes
  query.cursor[JsValue].collect[Vector](perPage).transform(
    success => success,
    failure => failure match {
      case e: LastError => DaoException(e.message, Some(DATABASE_ERROR))
    }
  )
}
Now let's suppose we invoke this method with an invalid ObjectID:
// ObjectId 53125e9c2004006d04b605abK is invalid (ends with a K)
find(Json.obj("_id" -> Json.obj("$oid" -> "53125e9c2004006d04b605abK")), None, None, 0, 25)
The call above causes the following exception when executing query.cursor[JsValue].collect[Vector](perPage) in the find method:
Caused by: java.util.NoSuchElementException: JsError.get
at play.api.libs.json.JsError.get(JsResult.scala:11) ~[play-json_2.10.jar:2.2.1]
at play.api.libs.json.JsError.get(JsResult.scala:10) ~[play-json_2.10.jar:2.2.1]
at play.modules.reactivemongo.json.collection.JSONGenericHandlers$StructureBufferWriter$.write(jsoncollection.scala:44) ~[play2-reactivemongo_2.10-0.10.2.jar:0.10.2]
at play.modules.reactivemongo.json.collection.JSONGenericHandlers$StructureBufferWriter$.write(jsoncollection.scala:42) ~[play2-reactivemongo_2.10-0.10.2.jar:0.10.2]
at reactivemongo.api.collections.GenericQueryBuilder$class.reactivemongo$api$collections$GenericQueryBuilder$$write(genericcollection.scala:323) ~[reactivemongo_2.10-0.10.0.jar:0.10.0]
at reactivemongo.api.collections.GenericQueryBuilder$class.cursor(genericcollection.scala:342) ~[reactivemongo_2.10-0.10.0.jar:0.10.0]
at play.modules.reactivemongo.json.collection.JSONQueryBuilder.cursor(jsoncollection.scala:110) ~[play2-reactivemongo_2.10-0.10.2.jar:0.10.2]
at reactivemongo.api.collections.GenericQueryBuilder$class.cursor(genericcollection.scala:331) ~[reactivemongo_2.10-0.10.0.jar:0.10.0]
at play.modules.reactivemongo.json.collection.JSONQueryBuilder.cursor(jsoncollection.scala:110) ~[play2-reactivemongo_2.10-0.10.2.jar:0.10.2]
at services.common.mongo.MongoDaoComponent$MongoDao$$anon$1.find(MongoDaoComponent.scala:249) ~[classes/:na]
... 25 common frames omitted
Any idea? Thanks.
Logstash is awesome. I can send it JSON like this (multi-lined for readability):
{
  "a": "one",
  "b": {
    "alpha": "awesome"
  }
}
And then query for that line in kibana using the search term b.alpha:awesome. Nice.
However I now have a JSON log line like this:
{
  "different": [
    {
      "this": "one",
      "that": "uno"
    },
    {
      "this": "two"
    }
  ]
}
And I'd like to be able to find this line with a search like different.this:two (or different.this:one, or different.that:uno)
If I was using Lucene directly I'd iterate through the different array, and generate a new search index for each hash within it, but Logstash currently seems to ingest that line like this:
different: {this: one, that: uno}, {this: two}
Which isn't going to help me searching for log lines using different.this or different.that.
Anyone got any thoughts as to a codec, filter or code change I can make to enable this?
You can write your own filter (copy & paste, rename the class name and the config_name, and rewrite the filter(event) method), or modify the current JSON filter (source on GitHub).
You can find the JSON filter (Ruby class) source code in the path logstash-1.x.x\lib\logstash\filters, named json.rb. The JSON filter parses the content as JSON as follows:
begin
  # TODO(sissel): Note, this will not successfully handle json lists
  # like your text is '[ 1,2,3 ]' JSON.parse gives you an array (correctly)
  # which won't merge into a hash. If someone needs this, we can fix it
  # later.
  dest.merge!(JSON.parse(source))

  # If no target, we target the root of the event object. This can allow
  # you to overwrite @timestamp. If so, let's parse it as a timestamp!
  if !@target && event[TIMESTAMP].is_a?(String)
    # This is a hack to help folks who are mucking with @timestamp during
    # their json filter. You aren't supposed to do anything with
    # "@timestamp" outside of the date filter, but nobody listens... ;)
    event[TIMESTAMP] = Time.parse(event[TIMESTAMP]).utc
  end

  filter_matched(event)
rescue => e
  event.tag("_jsonparsefailure")
  @logger.warn("Trouble parsing json", :source => @source,
               :raw => event[@source], :exception => e)
  return
end
You can modify the parsing procedure to modify the original JSON
json = JSON.parse(source)
if json.is_a?(Hash)
  json.each do |key, value|
    if value.is_a?(Array)
      value.each_with_index do |object, index|
        # modify as you need
        object["index"] = index
      end
    end
  end
end
# save modified json
......
dest.merge!(json)
Then you can modify your config file to use your new or modified JSON filter, placed in \logstash-1.x.x\lib\logstash\config.
This is my elastic_with_json.conf with a modified json.rb filter:
input {
  stdin {
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    host => localhost
  }
  stdout {
  }
}
If you want to use your new filter, you can configure it with the config_name:
class LogStash::Filters::Json_index < LogStash::Filters::Base
  config_name "json_index"
  milestone 2
  ....
end
and configure it
input {
  stdin {
  }
}
filter {
  json_index {
    source => "message"
  }
}
output {
  elasticsearch {
    host => localhost
  }
  stdout {
  }
}
Hope this helps.
For a quick and dirty hack, I used the Ruby filter and the code below; no need to use the out-of-the-box 'json' filter anymore.
input {
  stdin {}
}
filter {
  grok {
    match => ["message", "(?<json_raw>.*)"]
  }
  ruby {
    init => "
      def parse_json obj, pname=nil, event
        obj = JSON.parse(obj) unless obj.is_a? Hash
        obj = obj.to_hash unless obj.is_a? Hash
        obj.each {|k,v|
          p = pname.nil? ? k : pname
          if v.is_a? Array
            v.each_with_index {|oo,ii|
              parse_json_array(oo,ii,p,event)
            }
          elsif v.is_a? Hash
            parse_json(v,p,event)
          else
            p = pname.nil? ? k : [pname,k].join('.')
            event[p] = v
          end
        }
      end

      def parse_json_array obj, i, pname, event
        obj = JSON.parse(obj) unless obj.is_a? Hash
        pname_ = pname
        if obj.is_a? Hash
          obj.each {|k,v|
            p = [pname_,i,k].join('.')
            if v.is_a? Array
              v.each_with_index {|oo,ii|
                parse_json_array(oo,ii,p,event)
              }
            elsif v.is_a? Hash
              parse_json(v,p,event)
            else
              event[p] = v
            end
          }
        else
          n = [pname_, i].join('.')
          event[n] = obj
        end
      end
    "
    code => "parse_json(event['json_raw'].to_s,nil,event) if event['json_raw'].to_s.include? ':'"
  }
}
output {
  stdout { codec => rubydebug }
}
Test json structure
{"id":123, "members":[{"i":1, "arr":[{"ii":11},{"ii":22}]},{"i":2}], "im_json":{"id":234, "members":[{"i":3},{"i":4}]}}
and this is what's output:
{
"message" => "{\"id\":123, \"members\":[{\"i\":1, \"arr\":[{\"ii\":11},{\"ii\":22}]},{\"i\":2}], \"im_json\":{\"id\":234, \"members\":[{\"i\":3},{\"i\":4}]}}",
"#version" => "1",
"#timestamp" => "2014-07-25T00:06:00.814Z",
"host" => "Leis-MacBook-Pro.local",
"json_raw" => "{\"id\":123, \"members\":[{\"i\":1, \"arr\":[{\"ii\":11},{\"ii\":22}]},{\"i\":2}], \"im_json\":{\"id\":234, \"members\":[{\"i\":3},{\"i\":4}]}}",
"id" => 123,
"members.0.i" => 1,
"members.0.arr.0.ii" => 11,
"members.0.arr.1.ii" => 22,
"members.1.i" => 2,
"im_json" => 234,
"im_json.0.i" => 3,
"im_json.1.i" => 4
}
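The dotted-key flattening those init functions perform can be sketched as plain Ruby outside Logstash. This is a simplified, hypothetical `flatten_json` helper of my own (not part of any Logstash API), assuming the input is a top-level JSON object:

```ruby
require 'json'

# Recursively flatten nested hashes and arrays into dotted keys,
# e.g. {"members" => [{"i" => 1}]} becomes {"members.0.i" => 1}.
def flatten_json(obj, prefix = nil, out = {})
  case obj
  when Hash
    obj.each { |k, v| flatten_json(v, prefix ? "#{prefix}.#{k}" : k.to_s, out) }
  when Array
    obj.each_with_index { |v, i| flatten_json(v, "#{prefix}.#{i}", out) }
  else
    out[prefix] = obj  # scalar leaf: record it under the accumulated key
  end
  out
end

doc = JSON.parse('{"id":123, "members":[{"i":1},{"i":2}]}')
flatten_json(doc)
# => {"id"=>123, "members.0.i"=>1, "members.1.i"=>2}
```

The filter above does the same thing, except it writes each flattened key directly onto the Logstash event instead of into a hash.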
The solution I liked is the Ruby filter because it doesn't require writing another filter. However, that solution creates fields at the "root" of the JSON, and it's hard to keep track of how the original document looked.
I came up with something similar that's easier to follow, and it's recursive, so it's cleaner.
ruby {
  init => "
    def arrays_to_hash(h)
      h.each do |k,v|
        # If v is nil, an array is being iterated and the value is k.
        # If v is not nil, a hash is being iterated and the value is v.
        value = v || k
        if value.is_a?(Array)
          # 'value' is replaced with 'value_hash' later.
          value_hash = {}
          value.each_with_index do |v, i|
            value_hash[i.to_s] = v
          end
          h[k] = value_hash
        end
        if value.is_a?(Hash) || value.is_a?(Array)
          arrays_to_hash(value)
        end
      end
    end
  "
  code => "arrays_to_hash(event.to_hash)"
}
It converts arrays to hashes with each key as the index number. More details: http://blog.abhijeetr.com/2016/11/logstashelasticsearch-best-way-to.html
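A quick way to sanity-check the helper outside Logstash is to run the same function on a plain Ruby hash. This standalone sketch is a lightly condensed copy of the init block above, with no Logstash event involved:

```ruby
# Same arrays_to_hash idea as the init block above, runnable standalone:
# every array becomes a hash keyed by the stringified element index.
def arrays_to_hash(h)
  h.each do |k, v|
    # When h is an array, each yields the element as k and v is nil.
    value = v || k
    if value.is_a?(Array)
      value_hash = {}
      value.each_with_index { |elem, i| value_hash[i.to_s] = elem }
      h[k] = value_hash
    end
    arrays_to_hash(value) if value.is_a?(Hash) || value.is_a?(Array)
  end
end

doc = { "id" => 123, "members" => [{ "i" => 1 }, { "i" => 2 }] }
arrays_to_hash(doc)
# => {"id"=>123, "members"=>{"0"=>{"i"=>1}, "1"=>{"i"=>2}}}
```

Unlike the dotted-key approach, this keeps the original nesting, so the document shape stays recognizable in Elasticsearch.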
While adding a TO_JSON method (to convert a blessed reference via JSON.pm) into CGI::Cookie, if I do this:
package CGI::Cookie;
sub TO_JSON {
    return {
        map { name    => $_->name,
              value   => $_->value,
              domain  => $_->domain,
              path    => $_->path,
              expires => $_->expires }
        shift
    }
}
syntax error at XXX.pm line 76, near "shift "
syntax error at XXX.pm line 77, near "}"
Compilation failed in require at (eval 50) line 3.
But if I do this:
package CGI::Cookie;
sub TO_JSON {
    return {
        map { ''.'name' => $_->name,
              value     => $_->value,
              domain    => $_->domain,
              path      => $_->path,
              expires   => $_->expires }
        shift
    }
}
it works
Can't for the life of me figure out why. Also, just quoting "name" doesn't help; I have to concatenate an empty string for it to work.
I'm mystified.
The Perl grammar is a bit ambiguous when it comes to blocks and anonymous hashrefs. When Perl cannot guess correctly, you can force the correct interpretation:
a hashref with +{ ... }
a code block with {; ... }
Forcing the block after map to be a code block resolves the issue. Previously Perl thought the block was an anonymous hash and complained about the missing comma before shift: map can be of the form map EXPR, LIST, and a hashref is a valid expression.
The sub misuses map to assign one element to $_. It would be better written as:
sub TO_JSON {
    my $o = shift;  # my $_ should work as well, but that is beside the point
    return +{
        name    => $o->name,
        value   => $o->value,
        domain  => $o->domain,
        path    => $o->path,
        expires => $o->expires,
    };
}
But it could be abbreviated to
sub TO_JSON {
    my $o = shift;
    return +{
        map { $_ => $o->$_() } qw/name value domain path expires/
    };
}