I'm storing a config file in version control (GitLab) which contains information to be read by my Ruby app. This info is stored as nested JSON objects (an object containing objects containing objects).
(Update: added more detail and examples for clarity, as requested.)
From within my app I can successfully GET the file, which returns the following JSON object (some bits trimmed with ... for readability):
{"file_name"=>"approval_config.json", "file_path"=>"approval_config.json", "size"=>1331, "encoding"=>"base64", "content_sha256"=>"1c21cbb...fa453fe", "ref"=>"master", "blob_id"=>"de...915", "commit_id"=>"07e...4ff", "last_commit_id"=>"07e...942f", "content"=>"ogICAg...AgICB"}
I can JSON-parse the above object and access the content property on it. The value of the content property is a Base64-encoded string containing the actual contents of my file in GitLab. I can successfully decode this and see the JSON string stored in GitLab:
"{"G000":{"1":{"max":"4000","name":"Matthew Lewis","id":"ord-matthewl","email":"matthew.lewis#companyx.com"},"2":{"max":"4000","name":"Brendan Jones","id":"ord-brendanj","email":"brendan.jones#companyx.com"},"3":{"max":"20000","name":"Henry Orson","id":"ord-henryo","email":"henry.orson#companyx.com"},"4":{"max":"10000000","name":"Chris Adams","id":"ord-chrisa","email":"chris.adams#companyx.com"}},"G15":{"1":{"max":"4000","name":"Mike Butak","id":"ord-mikebu","email":"mike.butak#companyx.com"},"2":{"max":"4000","name":"Joseph Lister","id":"ord-josephl","email":"joseph.lister#companyx.com"},"3":{"max":"20000","name":"Mike Geisler","id":"ord-mikeg","email":"mike.geisler#companyx.com"},"4":{"max":"10000000","name":"Samuel Ahn","id":"ord-samuela","email":"samuel.ahn#companyx.com"}}}"
THIS string (above) I cannot JSON-parse. I get an "unexpected token at '{" (JSON::ParserError) error.
While writing this update it occurs to me that this "un-parsable" string is simply what I put in the file to begin with. Perhaps the method I used to stringify the file's contents in the first place is the issue. I simply pasted a valid JavaScript object into my browser's console, ran JSON.stringify on it, copied the result from the console, and pasted it into my file in GitLab. Perhaps I need to serialize it with Ruby's equivalent, JSON.generate, instead?
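For what it's worth, Ruby's counterpart to JavaScript's JSON.stringify is JSON.generate. A minimal sketch (with a made-up hash standing in for the real config) of why pasted outer quotes break parsing:

```ruby
require 'json'

# Illustrative stand-in for the real config hash
hash = { "G000" => { "1" => { "max" => "4000" } } }

# Ruby's equivalent of JavaScript's JSON.stringify:
json = JSON.generate(hash)
JSON.parse(json) == hash          # => true, round-trips cleanly

# Wrapping the JSON text in an extra pair of quotes (as happens when the
# stringified value is pasted together with its surrounding quotes)
# leaves the inner quotes unescaped, so parsing fails:
quoted = '"' + json + '"'
begin
  JSON.parse(quoted)
rescue JSON::ParserError
  # raises: unexpected token, just like the error above
end
```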
Based on feedback from @ToddA.Jacobs, I tried the following in my Ruby script:
require 'rest-client'
require 'json'
require 'base64'
data = RestClient.get 'https://gitlab.companyx.net/api/v4/projects/3895/repository/files/approval_config.json?ref=master', {'PRIVATE-TOKEN':'*********'}
# get the encoded data stored on the 'content' key:
content = JSON.parse(data)['content']
# decode it:
config = Base64.decode64(content)
# print some logs
$evm.log(:info, config)
$evm.log(:info, "config is a Hash? :" + config.is_a?(Hash).to_s) #prints false
$evm.log(:info, "config is a string? :" + config.is_a?(String).to_s) #prints true
hash = JSON.parse(config)
example = hash.dig "G000" "4" "id"
$evm.log(:info, "print example on next line")
$evm.log(:info, example)
That last line produces:
The following error occurred during method evaluation: NoMethodError: undefined method 'gsub' for nil:NilClass (drbunix:///tmp/automation_engine20200903-3826-1nbuvl) /usr/local/lib/ruby/gems/2.5.0/gems/manageiq-password-0.3.0/lib/manageiq/password.rb:89:in 'sanitize_string'
Remove Outer Quotes
Your input format is invalid: you're nesting unescaped double quotes, and somehow expecting that to work. Just leave off the outer quotes. For example:
require 'json'
json = <<~'EOF'
{"G000":{"1":{"max":"4000","name":"Matthew Lewis","id":"ord-matthewl","email":"matthew.lewis#companyx.com"},"2":{"max":"4000","name":"Brendan Jones","id":"ord-brendanj","email":"brendan.jones#companyx.com"},"3":{"max":"20000","name":"Henry Orson","id":"ord-henryo","email":"henry.orson#companyx.com"},"4":{"max":"10000000","name":"Chris Adams","id":"ord-chrisa","email":"chris.adams#companyx.com"}},"G15":{"1":{"max":"4000","name":"Mike Butak","id":"ord-mikebu","email":"mike.butak#companyx.com"},"2":{"max":"4000","name":"Joseph Lister","id":"ord-josephl","email":"joseph.lister#companyx.com"},"3":{"max":"20000","name":"Mike Geisler","id":"ord-mikeg","email":"mike.geisler#companyx.com"},"4":{"max":"10000000","name":"Samuel Ahn","id":"ord-samuela","email":"samuel.ahn#companyx.com"}}}
EOF
hash = JSON.parse(json)
hash.dig "G000", "4", "id"
#=> "ord-chrisa"
hash.dig "G15", "4", "id"
#=> "ord-samuela"
This question was answered by users on another post I opened: Why can Ruby not parse local JSON file?
Ultimately the issue was not Ruby failing to parse my JSON. Rather it was the logging function being unable to log the hash.
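In plain Ruby, the fix amounts to serializing the hash before handing it to the logger (the error trace above shows the logger's sanitize step calling gsub on its argument, which requires a String). The $evm.log call itself is ManageIQ-specific, so it is shown commented out in this sketch:

```ruby
require 'json'

# Illustrative subset of the parsed config
hash = JSON.parse('{"G000":{"4":{"id":"ord-chrisa"}}}')

# Serialize the Hash to a String before logging it:
message = JSON.generate(hash)     # or hash.to_s / hash.inspect
# $evm.log(:info, message)        # ManageIQ-specific logging call

hash.dig("G000", "4", "id")       # => "ord-chrisa"
```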
Some of my logs contain JSON in their message field. I use the json filter as follows:
json {
  skip_on_invalid_json => true
  source => "message"
  target => "json"
}
This tries to parse the message field and, if it contains valid JSON, adds the result under the json field.
Unfortunately, from time to time I receive logs whose message field contains just a single string like "some random message". In these logs the string from message ends up in json and messes up the index mapping.
I tried to filter this out by adding:
prune {
  blacklist_values => { "json" => "/.+/" }
}
But this seems to always remove the json field.
Is there a way to parse the message field or keep the json field only when it contains an object and not a single string?
You could do it using a ruby filter that tests the field you are interested in:
ruby {
  code => '
    s = event.get("json")
    if s and s.instance_of? String
      event.remove("json")
    end
  '
}
That will not remove [json] if it is a hash or array.
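The same type test can be seen in plain Ruby; drop_if_string is a hypothetical helper mirroring the filter's logic:

```ruby
# Return nil (i.e. "remove the field") only for bare strings; parsed
# JSON objects arrive as Hash or Array and are kept.
def drop_if_string(value)
  value.instance_of?(String) ? nil : value
end

drop_if_string("some random message")   # => nil (field removed)
drop_if_string({ "user" => "alice" })   # kept
drop_if_string([1, 2, 3])               # kept
```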
I am parsing a JSON file with codec => json in the input and json { source => "message" } in the filter.
I have also tried alternating the two.
The parsed fields cannot be read by Logstash using if [comment]. This does not work despite being able to see the field and its values with stdout { codec => rubydebug } as the output.
I just found out that the fields I am trying to work with are actually sub-fields. I was trying to access them like top-level fields.
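Assuming the parsed fields landed under a parent field (called json here as an assumption; the comment field name is taken from the question), conditionals need the full bracketed path rather than the bare name:

```
# [comment] alone only matches a top-level field; a sub-field must be
# addressed through its parent:
if [json][comment] {
  mutate { add_tag => ["has_comment"] }
}
```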
I'm using a service to load my form data into an array in my Angular 2 app.
The data is stored like this:
arr = []
arr.push({title:name})
When I do a console.log(arr), it is shown as Object. What I need is to see it
as [ { 'title':name } ]. How can I achieve that?
You can use the following:
JSON.stringify({ data: arr}, null, 4);
This will nicely format your data with indentation.
To print out readable information, you can use console.table(), which is much easier to read than raw JSON:
console.table(data);
This function takes one mandatory argument, data, which must be an array or an object, and one additional optional parameter, columns. It logs data as a table: each element in the array (or each enumerable property, if data is an object) becomes a row in the table.
Example:
First convert your JSON string to an object using JSON.parse(), then print it in the console by passing the parsed result to console.table().
e.g.
const data = JSON.parse(jsonString);
console.table(data);
Try using the json pipe in the HTML template. As the JSON info was needed only for debugging purposes, this method was suitable for me. Sample given below:
<p>{{arr | json}}</p>
You could log each element of the array separately:
arr.forEach(function(e){console.log(e)});
Since your array has just one element, this is the same as logging {'title':name}
You can print any object:
console.log(this.anyObject);
But when you concatenate it into a string:
console.log('any object ' + this.anyObject);
it will print:
any object [object Object]
because the concatenation converts the object using its default toString().
Using Logstash 1.4.2 with Elasticsearch 1.3 (I'm aware it's not the latest ES version available) on Ubuntu 14.04 LTS.
We have an event stream which contains JSON inside one of its fields named "message".
We'd like to replace the event fields by the JSON of that field if it's found.
We'd also like to remove the ORIGINAL "message" field (the one which contains the JSON string) if found and parsed.
The problem is that the JSON object inside the field's text could define a new "message" field, which we have to retain.
The following always removes the "message" field after parsing it:
json {
  source => "message"
  remove_field => [ "message" ]
}
This is wrong: we want to keep it in case there was a "message" field inside the value of the original "message" field.
I tried to do the following trick, but it seems to still remove the "message" field from the result:
mutate {
  rename => [ "message", "___temp_logstash_filter_message___" ]
}
json {
  source => "___temp_logstash_filter_message___"
}
mutate {
  remove_field => [ "___temp_logstash_filter_message___" ]
}
i.e. I try to rename the original "message" field to an arbitrary internal name which I don't expect to appear in the input value, parse the JSON string using that temporary name as a source, then remove the renamed original field.
That way I was hoping to distinguish between the "original" message field and any "message" field which may be contained inside its JSON value. But this doesn't seem to make a difference - the "message" field is still missing from the result.
Is there a way to achieve what I need?
Thanks.
Instead of renaming the field, copy the original field into a new one.
This can be done with:
filter {
  ...
  ruby {
    code => "event['new_field'] = event['old_field']"
  }
  ...
}
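Putting it together with the original goal, a sketch of the whole filter chain (json_raw is an arbitrary scratch-field name; the event['…'] syntax is the Ruby event API of the Logstash 1.x line used in the question):

```
filter {
  # Copy (not rename) the original field into a scratch field
  ruby {
    code => "event['json_raw'] = event['message']"
  }
  # Drop the original "message"; if the JSON itself contains a
  # "message" key, the json filter below will re-create it
  mutate {
    remove_field => [ "message" ]
  }
  # Parse the copy into the event root
  json {
    source => "json_raw"
  }
  # Remove the scratch field
  mutate {
    remove_field => [ "json_raw" ]
  }
}
```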