I am new to Ruby on Rails and I want to read data from a JSON file in a specified directory, but I keep getting an error in the chap3 action (about the file name):
Errno::ENOENT in TopController#chap3. No such file or directory @ rb_sysopen - links.json.
In the browser console, I get the message
Failed to load resource: the server responded with a status of 500 (Internal Server Error)
How can I fix that?
Code:
require "json"

class TopController < ApplicationController
  def index
    @message = "おはようございます!" # "Good morning!"
  end

  def chap3
    data = File.read('links.json')
    datahash = JSON.parse(data)
    puts datahash.keys
  end

  def getName
    render plain: "名前は、#{params[:name]}" # "The name is #{params[:name]}"
  end

  def database
    @members = Member.all
  end
end
JSON file:
{ "data": [
{"link1": "http://localhost:3000/chap3/a.html"},
{"link2": "http://localhost:3000/chap3/b.html"},
{"link3": "http://localhost:3000/chap3/c.html"},
{"link4": "http://localhost:3000/chap3/d.html"},
{"link5": "http://localhost:3000/chap3/e.html"},
{"link6": "http://localhost:3000/chap3/f.html"},
{"link7": "http://localhost:3000/chap3/g.html"}]}
I would change these two lines
data = File.read('links.json')
datahash = JSON.parse(data)
in the controller to
data = Rails.root.join('app/controllers/links.json').read
datahash = JSON.parse(data)
Note: I would consider moving this kind of configuration file into the /config folder and creating a simple Ruby class to handle it (see the sketch below). Additionally, you might want to store paths instead of full URLs with a host: localhost:3000 works in the development environment, but in production you will need to return non-localhost URLs anyway.
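As an illustration of that note, here is a minimal sketch of the config-folder approach, assuming the file is moved to config/links.json; the class name LinksConfig and its methods are my own, not part of Rails:
require 'json'

# app/models/links_config.rb -- a tiny wrapper around config/links.json
class LinksConfig
  def self.data
    @data ||= JSON.parse(Rails.root.join('config', 'links.json').read)
  end

  # Returns the array of link hashes, e.g. [{"link1" => "..."}, ...]
  def self.links
    data.fetch('data', [])
  end
end

# The controller action could then be:
def chap3
  puts LinksConfig.links
end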
To use the content of the file in the controller in Rails:
@data = File.read("#{Rails.root}/app/controllers/links.json")
In my application, a new file uploaded to storage is read and its data is appended to a main file. The new file contains 2 lines: a header, and an array of comma-separated values. The main file will grow to a maximum of 265 MB; each new file will be at most 30 MB.
import os
import numpy as np

# download_files_storage, read_new_file, blob_exists and upload_files_storage
# are helper functions defined elsewhere; the bucket, blob and file name
# variables (bucket_name, blob_path, segment_no, main_file_name, main_file,
# blob_with_the_main_file, main_file_blob) are also defined elsewhere.

def write_append_to_ecg_file(filename, ecg, patientdata):
    file1 = open('/tmp/' + filename, "w+")
    file1.write(":".join(patientdata))
    file1.write('\n')
    file1.write(",".join(ecg.astype(str)))
    file1.close()

def storage_trigger_function(data, context):
    # Download the segment file
    download_files_storage(bucket_name, new_file_name, storage_folder_name=blob_path)
    # Read the segment file
    data_from_new_file, meta = read_new_file(new_file_name, scale=1, fs=125, include_meta=True)
    print("Length of ECG data from segment {} file {}".format(segment_no, len(data_from_new_file)))
    os.remove(new_file_name)
    # Check if the main ECG file exists
    file_exists = blob_exists(bucket_name, blob_with_the_main_file)
    print("File status {}".format(file_exists))
    data_from_main_file = []
    if file_exists:
        download_files_storage(bucket_name, main_file_name, storage_folder_name=blob_with_the_main_file)
        data_from_main_file, meta = read_new_file(main_file_name, scale=1, fs=125, include_meta=True)
        print("ECG data from main file {}".format(len(data_from_main_file)))
        os.remove(main_file_name)
        data_from_main_file = np.append(data_from_main_file, data_from_new_file)
        print("data after appending {}".format(len(data_from_main_file)))
        write_append_to_ecg_file(main_file, data_from_main_file, meta)
        token = upload_files_storage(bucket_name, main_file, storage_folder_name=main_file_blob, upload_file=True)
    else:
        write_append_to_ecg_file(main_file, data_from_new_file, meta)
        token = upload_files_storage(bucket_name, main_file, storage_folder_name=main_file_blob, upload_file=True)
The GCF is deployed with:
gcloud functions deploy storage_trigger_function --runtime python37 --trigger-resource patch-us.appspot.com --trigger-event google.storage.object.finalize --timeout 540s --memory 8192MB
For the first file, I was able to read the file and write its data to the main file. But after uploading the 2nd file, it gives Function execution took 70448 ms, finished with status: 'connection error'. On uploading the 3rd file, it gives Function invocation was interrupted. Error: memory limit exceeded. Even though the function is deployed with 8192 MB of memory, I am getting this error. Can I get some help with this?
I searched online and read through the documentation, but have not been able to find an answer. I am fairly new to Ruby, and as part of learning it I wanted to write the script below.
The script essentially does a carrier lookup on a list of numbers that are provided through a CSV file. The CSV file has just one column, with the header "number".
Everything runs fine UNTIL the API gives me an output that is different from the others. In this example, it tells me that one of the numbers in my file is not a valid US number. This then causes my script to stop running.
I am looking to see if there is a way to either ignore it (I read about begin/rescue, but was not able to get it to work) or, ideally, either write those errors to a separate file or just put the data into the main file.
Any help would be much appreciated. Thank you.
Ruby Code:
require 'csv'
require 'uri'
require 'net/http'
require 'json'

CSV.foreach('data1.csv', headers: true) do |row|
  number = row['number'].to_i

  # Carrier lookup request for this number
  uri = URI("https://api.message360.com/api/v3/carrier/lookup.json?PhoneNumber=#{number}")
  req = Net::HTTP::Post.new(uri)
  req.basic_auth 'XXX', 'XXX'
  res = Net::HTTP.start(uri.hostname, uri.port, :use_ssl => true) { |http|
    http.request(req)
  }

  # Append the carrier fields to the output CSV
  json = JSON.parse(res.body)
  new = json["Message360"]["Carrier"].values
  CSV.open("new.csv", "ab") do |csv|
    csv << new
  end
end
File Data:
number
5556667777
9998887777
Example of a good response (parsed into a Ruby hash):
{"Message360"=>{"ResponseStatus"=>1, "Carrier"=>{"ApiVersion"=>"3", "CarrierSid"=>"XXX", "AccountSid"=>"XXX", "PhoneNumber"=>"+19495554444", "Network"=>"Cellco Partnership dba Verizon Wireless - CA", "Wireless"=>"true", "ZipCode"=>"92604", "City"=>"Irvine", "Price"=>0.0003, "Status"=>"success", "DateCreated"=>"2018-05-15 23:05:15"}}}
The response that causes the script to stop:
{
  "Message360": {
    "ResponseStatus": 0,
    "Errors": {
      "Error": [
        {
          "Code": "ER-M360-CAR-111",
          "Message": "Allowed Only Valid E164 North American Numbers.",
          "MoreInfo": []
        }
      ]
    }
  }
}
It would appear you can just check json["Message360"]["ResponseStatus"] first for a 0 or 1 to indicate failure or success.
I'd probably add a rescue to help catch any other errors (malformed JSON, network issue, etc.)
CSV.foreach('data1.csv', headers: true) do |row|
  number = row['number'].to_i
  ...
  json = JSON.parse(res.body)
  if json["Message360"]["ResponseStatus"] == 1
    new = json["Message360"]["Carrier"].values
    CSV.open("new.csv", "ab") do |csv|
      csv << new
    end
  else
    # handle bad response
  end
rescue StandardError => e
  # request failed for some reason, log e and the number?
end
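If you also want the failing numbers in their own file (as mentioned in the question), one option is to append them to a second CSV from the else and rescue branches. This is only a sketch; the file name errors.csv and the columns written are my own choice, not part of the API:
def log_error(number, message)
  # Append the number and the reason it failed to a separate CSV
  CSV.open("errors.csv", "ab") do |csv|
    csv << [number, message]
  end
end

# in the else branch:   log_error(number, json["Message360"]["Errors"].to_json)
# in the rescue branch: log_error(number, e.message)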
I am very new to SketchUp and Ruby; I have worked with Java and C#, but this is my first time with Ruby.
Now I have a problem: I need to serialize the whole scene into one JSON file (the scene hierarchy, and for each object its name, material and position). How can I do this?
I have already done this for Unity3D (C#) without a problem.
I tried this:
def main
  avr_entities = Sketchup.active_model.entities # all objects
  ambiens_dictionary = {}
  ambiens_list = []
  avr_entities.each do |root|
    if root.is_a?(Sketchup::Group) || root.is_a?(Sketchup::ComponentInstance)
      if root.name == ""
        UI.messagebox("this is a group #{root.definition.name}")
        if root.entities.count > 0
          root.entities.each do |leaf|
            if leaf.is_a?(Sketchup::Group) || leaf.is_a?(Sketchup::ComponentInstance)
              UI.messagebox("this is a leaf #{leaf.definition.name}")
            end
          end
        end
      else
        # UI.messagebox("this is a leaf #{root.name}")
      end
    end
  end
end
Have you tried the JSON library?
require 'json'
source = { a: [ { b: "hello" }, 1, "world" ], c: 'hi' }
source.to_json # => "{\"a\":[{\"b\":\"hello\"},1,\"world\"],\"c\":\"hi\"}"
I used the code below to answer another question, but it might also work here.
The code can run outside of SketchUp for testing in the terminal. Just make sure to follow these steps:
Copy the code below and paste it into a Ruby file (for example: file.rb).
Run the script in the terminal with ruby file.rb.
The script will write data to a JSON file and also read the content of that JSON file.
The path to the JSON file is relative to the Ruby file created in step one. If the script can't find the JSON file, it will create it for you.
module DeveloperName
  module PluginName
    require 'json'
    require 'fileutils'

    class Main
      def initialize
        path = File.dirname(__FILE__)
        @json = File.join(path, 'file.json')
        @content = { 'hello' => 'hello world' }.to_json
        json_create(@content)
        json_read(@json)
      end

      def json_create(content)
        File.open(@json, 'w') { |f| f.write(content) }
      end

      def json_read(json)
        if File.exist?(json)
          file = File.read(json)
          data_hash = JSON.parse(file)
          puts "Json content: #{data_hash}"
        else
          msg = 'JSON file not found'
          UI.messagebox(msg, MB_OK)
        end
      end
      # # #
    end

    DeveloperName::PluginName::Main.new
  end
end
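Building on the traversal in the question, here is a minimal sketch of how the hierarchy, names, materials and positions could be collected into nested hashes and then serialized with to_json. The helper names entity_to_hash and scene_to_json are my own, and it assumes SketchUp 2015+ (where Group#definition is available); treat it as a starting point rather than tested plugin code:
require 'json'

# Convert one group/component (and its children, recursively) into a Hash.
def entity_to_hash(entity)
  {
    'name'     => entity.name.empty? ? entity.definition.name : entity.name,
    'material' => entity.material ? entity.material.name : nil,
    'position' => entity.transformation.origin.to_a,
    'children' => entity.definition.entities
                        .select { |e| e.is_a?(Sketchup::Group) || e.is_a?(Sketchup::ComponentInstance) }
                        .map { |child| entity_to_hash(child) }
  }
end

# Serialize every top-level group/component in the active model to one JSON string.
def scene_to_json
  roots = Sketchup.active_model.entities.select do |e|
    e.is_a?(Sketchup::Group) || e.is_a?(Sketchup::ComponentInstance)
  end
  { 'scene' => roots.map { |r| entity_to_hash(r) } }.to_json
end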
I am trying to save data from a Hash to a file. I convert it to JSON and dump it into the file.
When I try to parse the file back into a Hash, I get JSON::ParserError.
Code to convert Hash to JSON file: (works fine)
user = {:email => "cumber@cc.cc", :passwrd => "hardPASSw0r|)"}

student_file = File.open("students.txt", "a+") do |f|
  f.write JSON.dump(user)
end
After adding a few values one by one, the file looks something like this:
{"email":"test1@gmail.com","passwrd":"qwert123"}{"email":"test3@gmail.com","passwrd":"qwert12345"}{"email":"cumber@cc.cc","passwrd":"hardPASSw0r|)"}
I tried the following code to convert back to Hash but it doesn't work:
file = File.read('students.txt')
data_hash = JSON.parse(file)
I get
/System/Library/Frameworks/Ruby.framework/Versions/2.0/usr/lib/ruby/2.0.0/json/common.rb:155:in `parse': 757: unexpected token at '{"email":"test3@gmail.com","passwrd":"qwert12345"}{"email":"cumber@cc.cc","passwrd":"hardPASSw0r|)"}' (JSON::ParserError)
    from /System/Library/Frameworks/Ruby.framework/Versions/2.0/usr/lib/ruby/2.0.0/json/common.rb:155:in `parse'
    from hash_json.rb:25:in `<main>'
My goal is to be able to add and remove values from the file.
How do I fix this, where was my mistake? Thank you.
This should work:
https://repl.it/EXGl/0
# as advised by @EricDuminil, in some environments you need to require 'json' too
require 'json'

user = {:email => "cumber@cc.cc", :passwrd => "hardPASSw0r|)"}

student_file = File.open("students.txt", "w") do |f|
  f.write(user.to_json)
end

file = File.read('students.txt')
puts "saved content is: #{JSON.parse(file)}"
P.S. I hope this is only an example; never store passwords in plain text! NEVER ;-)
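The ParserError in the question comes from writing several JSON objects back to back, which is not one valid JSON document. As an illustration of one way to keep appending while staying parseable (the helper names append_user and read_users are made up for this sketch), you can store one JSON object per line and parse each line separately:
require 'json'

# Append one user as a single line of JSON
def append_user(user)
  File.open('students.txt', 'a') { |f| f.puts(user.to_json) }
end

# Read the file back as an array of hashes, one hash per line
def read_users
  return [] unless File.exist?('students.txt')
  File.readlines('students.txt').map { |line| JSON.parse(line) }
end

append_user(:email => "cumber@cc.cc", :passwrd => "hardPASSw0r|)")
p read_users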
I'm using a CHI interface to memcached (or File in development) in my Dancer app, but I'm getting an error from the serializer when I cache an object. I have the following in my Dancer config:
engines:
  JSON:
    allow_blessed: 1
    convert_blessed: 1
What else do I need?
Error message:
Error while loading bin/app.pl: encountered object 'C3M::CMF=HASH(0x3ef8aa8)', but neither allow_blessed nor convert_blessed settings are enabled at /usr/lib/perl5/site_perl/5.10/CHI/Serializer/JSON.pm line 19.
CHI::Serializer::JSON doesn't use the same serializer as Dancer::Serializer::JSON. Dancer::Serializer::JSON reads setting('engines') from config.yml, but there is no way to pass configuration options to CHI::Serializer::JSON.
Workaround:
use JSON;
use CHI::Serializer::JSON;

# Build a JSON encoder with the options CHI's serializer doesn't expose
my $JSON = JSON->new->utf8->canonical;
$JSON->allow_blessed(1);
$JSON->convert_blessed(1);

# Override CHI::Serializer::JSON's serialize/deserialize to use that encoder
*CHI::Serializer::JSON::serialize   = sub { $JSON->encode( $_[1] ) };
*CHI::Serializer::JSON::deserialize = sub { $JSON->decode( $_[1] ) };