Let's say I have a hash @hash = {a: 1, b: 2}.
I know json.some_key @hash will output it as JSON under some_key, but let's say I have some filtering or transformation work to do:
json.some_key do
  @hash.each do |key, value|
    json.send(key, value) # try to output {"a":1} when key == "a"
  end
end
Unfortunately the above doesn't work; json can't handle send. How do I do it? (Essentially it's the same as outputting under a key, but the key itself is only known at runtime.)
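For what it's worth, if this is the Jbuilder DSL, its set! method exists for exactly this case; a minimal sketch, assuming @hash is available in the view:

json.some_key do
  @hash.each do |key, value|
    json.set!(key, value) # writes {"a":1,"b":2} under "some_key"
  end
end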
I have searched what I can and I don't seem to be finding the answer I need. Granted I may not be wording it properly. I have tried using .find or even .rindex to count backwards, but no such luck. The value I receive from the JSON looks something like this:
"AdditionalData":"<Data><Entry Key=\"utm_campaign\" Value=\"j2c\" />
<Entry Key=\"utm_medium\" Value=\"cpc\" /><Entry Key=\"utm_source\"
Value=\"j2c\" /><Entry Key=\"job_id\" Value=\"300_xxxx_10703\" /></Data>"
I need to be able to grab the value for the key "job_id", so the "300_xxxx_10703" above. This value will change per object returned by the JSON response. Any help would be appreciated, and please let me know if this is already out there and I just missed it.
If the response format remains the same with every request, you could use a plain regular expression to fetch your data, even without parsing the JSON. Example:
response = "<Data><Entry Key=\"utm_campaign\" Value=\"j2c\" /><Entry Key=\"utm_medium\" Value=\"cpc\" /><Entry Key=\"utm_source\" Value=\"j2c\" /><Entry Key=\"job_id\" Value=\"300_xxxx_10703\" /></Data>"
match = response.match(%r{job_id\\?"\s+Value=\\?"(.+?)\\?"}i) # non-greedy, so it stops at the first closing quote
match[1] if match # => "300_xxxx_10703"
If the response format can change (for example, if the order of the attributes of the Entry element can change), then you need to parse the JSON and use an XML parser such as Nokogiri to fetch the required attribute. Code example:
require 'json'
require 'nokogiri'

parsed_response = JSON.parse(response)
# Parse as XML so the case of element and attribute names ("Entry", "Key") is preserved
doc = Nokogiri::XML(parsed_response['AdditionalData'])
job_id = nil
doc.css('Entry').each do |el|
  if el['Key'] == 'job_id'
    job_id = el['Value']
    break
  end
end
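A shorter variant of the same lookup, assuming the same parsed document, uses a CSS attribute selector:

entry = doc.at_css('Entry[Key="job_id"]')
job_id = entry['Value'] if entry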
I would like to dump a nested datastructure in ruby to json (I am aware of the Marshal module but I need a standard format) and be able to load/parse the datastructure again. Catch: I use structs (or easier for the example: hashes) as keys of hashes. Example:
require 'json'
h = {{hello: 123} => 123}
JSON.parse(JSON.generate(h)) #=> {"{:hello=>123}"=>123}
So the problem is that JSON.generate(h) serialises the key {:hello=>123} as a string, and when I parse the result again, it remains a string.
How can I solve this and regain the original structure after generate/parse?
JSON only allows strings as object keys. For this reason to_s is called for all keys.
You'll have the following options to solve your issue:
The best option is changing the data structure so it can properly be serialized to JSON.
You'll have to handle the stringified key yourself. A Hash converted to a string produces perfectly valid Ruby syntax, which can be turned back into a Hash with Kernel#eval, as Andrey Deineko suggested in the comments.
result = json.transform_keys { |key| eval(key) }
# json.transform_keys(&method(:eval)) is the same as the above.
The Hash#transform_keys method is relatively new (available since Ruby 2.5.0) and might currently not be in your development environment. You can replace it with a simple Enumerable#map if needed.
result = json.map { |k, v| [eval(k), v] }.to_h
Note: if the incoming JSON contains any user-generated content, I highly suggest you stay away from eval, since you might allow the user to execute arbitrary code on your server.
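As a sketch of the first option (restructuring the data instead of eval'ing keys), you could encode each key itself as a JSON string before generating, and decode it again after parsing. This assumes the keys are plain hashes that are themselves JSON-serializable:

require 'json'

h = { { hello: 123 } => 123 }

# Encode every key as a JSON string before serializing the outer hash
serialized = JSON.generate(h.transform_keys { |k| JSON.generate(k) })

# Decode the keys again after parsing
restored = JSON.parse(serialized).transform_keys do |k|
  JSON.parse(k, symbolize_names: true)
end

restored == h # => true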
I need a standard format
YAML is a standard format that would suffice here:
▶ require 'yaml'
▶ h = {{hello: 123} => 123}
#⇒ {{:hello=>123}=>123}
▶ YAML.dump h
#⇒ "---\n? :hello: 123\n: 123\n"
▶ YAML.load _
#⇒ {{:hello=>123}=>123}
Note for newer Rubies: since Psych 4 (Ruby 3.1) YAML.load is safe by default and rejects symbol keys; pass permitted_classes: [Symbol] or use YAML.unsafe_load there.
As already pointed out by mudasobwa, YAML is a good tool: it also allows you to store custom class objects:
require 'yaml'

class MyCaptain
  attr_accessor :name, :ship

  def initialize(name, ship)
    @name = name
    @ship = ship
  end
end
kirk = MyCaptain.new('James T. Kirk', 'USS Enterprise NCC-1701')
picard = MyCaptain.new('Jean-Luc Picard', 'Enterprise NCC-1701D')
captains = [kirk, picard]
File.open("my_captains.yml","w") do |file|
file.write captains.to_yaml
end
p YAML.load_file('my_captains.yml')
#=> [#<MyCaptain:0x007f889d0973b0 @name="James T. Kirk", @ship="USS Enterprise NCC-1701">, #<MyCaptain:0x007f889d096b40 @name="Jean-Luc Picard", @ship="Enterprise NCC-1701D">]
(The same Psych 4 caveat applies here: on Ruby 3.1+ use YAML.load_file('my_captains.yml', permitted_classes: [MyCaptain]) or YAML.unsafe_load_file.)
So I have a JSON file of a somewhat known format { String => JSON::Type, ... }. So it is basically of type Hash(String, JSON::Type). But when I try to read it from file into memory like so: JSON.parse(File.read(@cache_file)).as(Hash(String, JSON::Type)), I always get an exception: can't cast JSON::Any to Hash(String, JSON::Type)
I'm not sure how I am supposed to handle the data if I can't cast it.
What I basically want to do is the following:
save JSON::Type data under a String key
replace JSON::Type data with other JSON::Type data under a String key
And of course read from / write to file...
Here's the whole thing I've got so far:
class Cache
  def initialize(@cache_file = "/tmp/cache_file.tmp")
  end

  def cache(cache_key : (String | Symbol))
    mutable_cache_data = data
    value = mutable_cache_data[cache_key.to_s] ||= yield.as(JSON::Type)
    File.write @cache_file, mutable_cache_data
    value
  end

  def clear
    File.delete @cache_file
  end

  def data
    unless File.exists? @cache_file
      File.write @cache_file, {} of String => JSON::Type
    end
    JSON.parse(File.read(@cache_file)).as(Hash(String, JSON::Type))
  end
end
puts Cache.new.cache(:something) { 10 } # => 10
puts Cache.new.cache(:something) { 'a' } # => 10
TL;DR I want to read a JSON file into a Hash(String => i_dont_care), replace a value under a given key name and serialize it to file again. How do I do that?
JSON.parse returns a JSON::Any, not a Hash, so you can't cast it. You can however access the underlying raw value with JSON.parse(file).raw and cast that as a hash.
Then your code basically works (I've fixed a few errors): https://carc.in/#/r/28c1
You can use Hash(String, JSON::Type).from_json(File.read(@cache_file)). Hopefully you can restrict the type of JSON::Type down to something more sensible too. JSON::Any and JSON.parse_raw are very much a last resort compared to simply representing your schema with Hash, Array and custom types using JSON.mapping.
I think I've written myself in a corner. Basically, I have an array of hashes, like so.
my_hashes = [{"colorName"=>"first", "hexValue"=>"#f00"}, {"colorName"=>"green", "hexValue"=>"#0f0"},
{"colorName"=>"blue", "hexValue"=>"#00f"}, {"colorName"=>"cyan", "hexValue"=>"#0ff"},
{"colorName"=>"magenta", "hexValue"=>"#f0f"}, {"colorName"=>"yellow", "hexValue"=>"#ff0"},
{"colorName"=>"black", "hexValue"=>"#000"}]
I need to use JSON.parse to eventually be able to transform these hashes into CSV format. The only problem is I can't get JSON.parse to work as long as the "=>" symbol is present. I've tried just doing a regular gsub('=>', ':') but it appears that I cannot use it as this is an array of hashes. I've tried variations of the following method:
my_hashes.each do |hash|
  hash.each do |key, value|
    key.gsub!('=>', ':')
    value.gsub!('=>', ':')
  end
end
I need these hash values to stay intact, so even if I transform them into strings, they'll still have the '=>' symbol when I transform them back. Any advice?
Changing => to : wouldn't turn a Ruby hash into a JSON object. In fact you cannot change a hash like that at all, because the written representation of a hash is not the same as the interpreted version in memory.
That doesn't mean your problem is unsolvable, though: you need a JSON representation of a Ruby hash, and to_json gives you exactly that:
my_hashes = [
{"colorName"=>"first", "hexValue"=>"#f00"},
{"colorName"=>"green", "hexValue"=>"#0f0"},
{"colorName"=>"blue", "hexValue"=>"#00f"},
{"colorName"=>"cyan", "hexValue"=>"#0ff"},
{"colorName"=>"magenta", "hexValue"=>"#f0f"},
{"colorName"=>"yellow", "hexValue"=>"#ff0"},
{"colorName"=>"black", "hexValue"=>"#000"}
]
require 'json'
my_hashes.to_json
#=> "[{"colorName":"first","hexValue":"#f00"},{"colorName":"green","hexValue":"#0f0"},{"colorName":"blue","hexValue":"#00f"},{"colorName":"cyan","hexValue":"#0ff"},{"colorName":"magenta","hexValue":"#f0f"},{"colorName":"yellow","hexValue":"#ff0"},{"colorName":"black","hexValue":"#000"}]"
my_hashes=[{"colorName"=>"first", "hexValue"=>"#f00"}]
new_data = my_hashes.to_json.gsub(/\=\>/, ':')
data = Json.parse new_data
I'm using Postgrex in Elixir, and when it returns query results, it returns them in the following struct format:
%{columns: ["id", "email", "name"], command: :select, num_rows: 2, rows: [{1, "me#me.com", "Bobbly Long"}, {6, "email#tts.me", "Woll Smoth"}]}
It should be noted I am using Postgrex directly WITHOUT Ecto.
The columns (table headers) are returned as a collection, but the results (rows) are returned as a list of tuples (which seems odd, as they could get very large).
I'm trying to find the best way to programmatically create JSON objects for each result in which the JSON key is the column title and the JSON value the corresponding value from the tuple.
I've tried creating maps from both, merging and then serialising to JSON objects but it seems there should be an easier/better way of doing this.
Has anyone dealt with this before? What is the best way of creating a JSON object from a separate collection and tuple?
Something like this should work:
result = Postgrex.query!(...)
Enum.map(result.rows, fn row ->
  Enum.zip(result.columns, Tuple.to_list(row))
  |> Enum.into(%{})
  |> JSON.encode
end)
This will result in a list of JSON objects, one per row in the result set.