Is there a Ruby gem (or similar) for MySQL connection pooling that isn't part of Rails? I simply have a Ruby script (again, I don't do anything with Rails).
Seamless Database Pool is supposed to work "with any ActiveRecord application", and ActiveRecord is easy to use without Rails. I've used ActiveRecord in a plain Ruby app, and it was just a matter of configuring the logger and the database connection, something like
ActiveRecord::Base.logger = App.logger
dbconfig = YAML::load(File.open("#{APP_ROOT}/config/database.yml"))
ActiveRecord::Base.establish_connection(dbconfig[ENV["APP_ENV"]])
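The `dbconfig[ENV["APP_ENV"]]` lookup is plain YAML-and-hash work. A self-contained sketch, using an illustrative inline config instead of a real `database.yml` (all keys and values here are made up):

```ruby
require 'yaml'

# Illustrative database.yml-style content; adapter/database names are made up.
yml = <<~YAML
  development:
    adapter: mysql2
    database: myapp_dev
    pool: 5
  production:
    adapter: mysql2
    database: myapp
    pool: 10
YAML

dbconfig = YAML.load(yml)
env = ENV['APP_ENV'] || 'development'
# This per-environment hash is what establish_connection receives.
puts dbconfig[env].inspect
```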
I haven't used Seamless Database Pool outside of Rails, but after a quick search I couldn't find any connection poolers aimed at plain Ruby apps, so it might be your best bet.
If you use JRuby (with ActiveRecord and the JDBC adapter) you can configure a J2EE container to handle the DB connection pool and pass the pool into database.yml via JNDI (there are examples of this with Oracle).
There are also some slides covering MySQL pools via ActiveRecord:
ActiveRecord::Base.establish_connection(
  :adapter  => 'mysql',
  :username => 'root',
  :password => '123456',
  :database => 'database',
  :pool     => 5 # <- connection pool size
)
You could also wrap the connection_pool gem around individual connections. Examples from its docs (using Redis):
Example usage with a block (faster):

@pool = ConnectionPool.new { Redis.new }
@pool.with do |redis|
  redis.lpop('my-list') if redis.llen('my-list') > 0
end

Example usage replacing an existing connection (slower):

$redis = ConnectionPool.wrap { Redis.new }
def do_work
  $redis.lpop('my-list') if $redis.llen('my-list') > 0
end
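Under the hood, connection_pool is essentially a thread-safe checkout/checkin queue. If you want to see the mechanics without any gems, here is a toy version using only the stdlib (`TinyPool` is an invented name for illustration, not the gem's API):

```ruby
# Toy illustration of the checkout/checkin pattern that pooling gems
# implement. The "connections" here are plain objects standing in for
# e.g. Mysql2::Client or Redis instances.
class TinyPool
  def initialize(size, &block)
    @q = Queue.new                 # Queue is thread-safe in Ruby's stdlib
    size.times { @q.push(block.call) }
  end

  # Check a connection out, yield it to the block, and always return it
  # to the pool, even if the block raises.
  def with
    conn = @q.pop                  # blocks if all connections are in use
    begin
      yield conn
    ensure
      @q.push(conn)
    end
  end
end

pool = TinyPool.new(2) { Object.new }
pool.with { |conn| "each block gets exclusive use of connection #{conn.object_id}" }
```

The real gem adds a checkout timeout and `ConnectionPool.wrap`, but the blocking `Queue#pop` above is the essence of why `pool.with` is safe under concurrency.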
We have a server with MySQL on port 3306. We have certificates and a key, and we try to connect to this server, but we see this problem:
Peer certificate CN='SomeName' did not match expected CN='someIP'
I've read a lot of articles and can't find an answer for PDO in PHP. The most interesting part is that SQLyog can connect with the same settings.
I've read that verify_peer_name can be disabled (I hope I understand what peer names are...), but only when using the openssl_* functions or mysqli, not PDO. Neither option works for me; I need PDO.
What I tried:
switching between PHP versions. It helped, but I need 5.6 or higher, and PHP 7.0 gives the same error.
finding other versions of OpenSSL and PDO; I quickly realized that's a bad idea :)
looking for settings in php.ini, but there are none for my problem, only for creating SSL connections.
My code for connection:
$dbInfo = array(
    'dsn'  => 'mysql:host=123.45.67.890;dbname=someDB;port=3306',
    'user' => 'user',
    'pass' => 'userpassword'
);

$con = new PDO(
    $dbInfo['dsn'], $dbInfo['user'], $dbInfo['pass'],
    array(
        PDO::MYSQL_ATTR_SSL_CIPHER => 'AES256-SHA',
        PDO::MYSQL_ATTR_SSL_CA     => 'SSLCert/ca-cert.pem',
        PDO::MYSQL_ATTR_SSL_KEY    => 'SSLCert/client-key.pem',
        PDO::MYSQL_ATTR_SSL_CERT   => 'SSLCert/client-cert.pem',
    )
);

echo 'Connection OK!';
We got this working for our internal self-signed certs by using machine(+domain) names rather than IP addresses as the CN and in the connection settings.
So, put 'dbServer1.company.local' as the CN for the server certificate and use the same 'dbServer1.company.local' address as the host part of the DSN for the PDO connection. If you like, you can just use 'dbServer1' but make sure you use it in both places.
This will get you going:
$pdo_options = array(
    PDO::MYSQL_ATTR_SSL_KEY  => 'path/to/client-key.pem',
    PDO::MYSQL_ATTR_SSL_CERT => 'path/to/client-cert.pem',
    PDO::MYSQL_ATTR_SSL_CA   => 'path/to/ca.pem'
);

$con = new PDO('mysql:host=dbServer1.company.local;dbname=someDB', 'someUser', 'somePass', $pdo_options);
We manage our own DNS, so resolving dbServer1.company.local is not an issue. If your webserver cannot resolve it, or you don't/can't manage the DNS entry, then hack something like the following into your /etc/hosts file:
10.5.5.20 dbServer1.company.local
or
10.5.5.20 dbServer1
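This CN-versus-hostname rule isn't specific to PHP; any TLS client makes the same comparison during the handshake. As a sanity check, Ruby's OpenSSL stdlib can demonstrate why dialing by IP fails against a cert issued for a hostname (the certificate below is a throwaway self-signed one built in memory, reusing the CN from the answer above):

```ruby
require 'openssl'

# Build a throwaway self-signed certificate with a hostname CN, then
# check candidate "dialed" names against it the way a TLS client would.
key  = OpenSSL::PKey::RSA.new(2048)
name = OpenSSL::X509::Name.parse('/CN=dbServer1.company.local')

cert = OpenSSL::X509::Certificate.new
cert.version    = 2
cert.serial     = 1
cert.subject    = name
cert.issuer     = name                # self-signed
cert.public_key = key.public_key
cert.not_before = Time.now
cert.not_after  = Time.now + 3600
cert.sign(key, OpenSSL::Digest.new('SHA256'))

# Matches: the dialed name equals the CN.
puts OpenSSL::SSL.verify_certificate_identity(cert, 'dbServer1.company.local') # true
# Fails: an IP never matches a hostname CN -- the error from the question.
puts OpenSSL::SSL.verify_certificate_identity(cert, '10.5.5.20')               # false
```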
I am trying to create a small REST API using Ruby with the Sinatra gem, running on the Thin server. The point is to get an idea of how easy or hard it is to build such a REST API consisting of micro web services, and to compare this with other programming languages/technologies available on Amazon's AWS. I created one quite easily; here's the code (just a minimal working project, with no optimization yet):
require 'sinatra'
require 'mysql'
require 'json'

set :environment, :development

db_host = 'HOST_URL'
db_user = 'USER'
db_pass = 'PASSWORD'
db_name = 'DB_NAME'
db_enc  = 'utf8'
select  = 'SELECT * FROM table1 LIMIT 30'

db = Mysql.init
db.options Mysql::SET_CHARSET_NAME, db_enc
db = db.real_connect db_host, db_user, db_pass, db_name

get '/brands' do
  rs = db.query select
  # db.close
  result = []
  rs.each_hash do |row|
    result.push row
  end
  result.to_json
end
Running this with ruby my_ws.rb starts Sinatra on Thin, no problem.
Using curl from my terminal (curl --get localhost:4567/brands) is also not a problem, returning the desired JSON response.
The real problem, which I've been tackling for a few hours now (and searching on Google, plus reading lots of resources here on SO), appears when I benchmark the micro WS using Siege with more than one concurrent user:
sudo siege -b http://localhost:4567/brands -c2 -r2
This should run in benchmark mode, issuing 2 concurrent requests (-c2 switch) 2 times (-r2 switch). In this case I always get an error in the console stating Mysql::ProtocolError - invalid packet: sequence number mismatch(102 != 2(expected)), where the number 102 is different on each run. If I run the benchmark with only one concurrent user (i.e. no concurrency at all), I can run it even 1000 times with no errors (sudo siege -b http://localhost:4567/brands -c1 -r1000).
I tried adding manual threading into my code like:
get '/brands' do
  th = Thread.new do
    rs = db.query select
    # db.close
    result = []
    rs.each_hash do |row|
      result.push row
    end
    result.to_json
  end
  th.join
  th.value
end
but with no help.
From what I have found:
Sinatra is multithreaded by default
Thin is also multithreaded when started from Sinatra by running ruby script.rb
multithreading seems to have no effect on the DB queries - it looks like no concurrency is possible there
I'm using the ruby-mysql gem, as I found it's newer than the (plain) mysql gem, but in the end I have no idea which one to use (I found old articles recommending mysql and others recommending ruby-mysql instead).
Any idea on how to run concurrent requests to my REST API? I need to benchmark and compare it with other languages (PHP, Python, Scala, ...).
The problem is solved with two fixes.
The first is replacing the mysql adapter with mysql2.
The second was the real cause of the problem: the MySQL connection was created once at startup, before execution could even reach the thread (i.e. before the route's code ran), which (logically) caused connection locks.
Now, with mysql2 and the DB connection moved inside the route, it all works perfectly, even for 250 concurrent requests! The final code:
require 'sinatra'
require 'mysql2'
require 'json'

set :environment, :development

db_host = 'HOST_URL'
db_user = 'USER'
db_pass = 'PASSWORD'
db_name = 'DB_NAME'
db_enc  = 'utf8'
select  = 'SELECT * FROM table1 LIMIT 30'

get '/brands' do
  result = []
  Mysql2::Client.new(:host => db_host, :username => db_user, :password => db_pass, :database => db_name, :encoding => db_enc).query(select).each do |row|
    result.push row
  end
  result.to_json
end
Running sudo siege -b http://localhost:4567/brands -c250 -r4 gives me now:
Transactions:                   1000 hits
Availability:                 100.00 %
Elapsed time:                   1.54 secs
Data transferred:               2.40 MB
Response time:                  0.27 secs
Transaction rate:             649.35 trans/sec
Throughput:                     1.56 MB/sec
Concurrency:                  175.15
Successful transactions:        1000
Failed transactions:               0
Longest transaction:            1.23
Shortest transaction:           0.03
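The sequence-number mismatch can actually be reproduced without MySQL at all. The sketch below (plain Ruby, no gems; FakeConn is an invented stand-in, not a real driver) models a connection that, like the MySQL protocol, expects consecutive sequence numbers: shared across threads it fails the same way the mysql gem did, while one connection per thread never does.

```ruby
# A stand-in "connection" that, like the MySQL wire protocol, expects
# packets to arrive with consecutive sequence numbers.
class FakeConn
  class ProtocolError < StandardError; end

  def initialize
    @seq = 0
  end

  def query
    expected = @seq
    sleep 0.002 # simulated network latency; lets other threads interleave
    raise ProtocolError, 'sequence number mismatch' if @seq != expected
    @seq += 1
  end
end

# One connection shared by every thread, like the original Sinatra app.
shared = FakeConn.new
shared_failures = Queue.new
10.times.map do
  Thread.new do
    begin
      5.times { shared.query }
    rescue FakeConn::ProtocolError => e
      shared_failures << e
    end
  end
end.each(&:join)

# One connection per thread, like the per-request Mysql2::Client fix.
private_failures = Queue.new
10.times.map do
  Thread.new do
    conn = FakeConn.new
    begin
      5.times { conn.query }
    rescue FakeConn::ProtocolError => e
      private_failures << e
    end
  end
end.each(&:join)

puts "shared connection failures:     #{shared_failures.size}" # almost always > 0
puts "per-thread connection failures: #{private_failures.size}"
```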
I have a configuration where, in addition to the local postgresql database, my Rails app also accesses a remote AWS database. My problem is that, even in tests that don't involve the remote database, the app establishes a connection to the remote database every time, so my tests run sloooooowly.
Is there a clean way to disable access to the remote database server for rspec tests that don't need it? (And enable it for those tests that do need it?)
The best I can think of is to partition my rspec tests into two separate parts -- those that don't need to access the remote db and those that do -- and use environment variables to enable or disable parts of config/database.yaml accordingly.
To make this clear, my config/database.yaml file contains (in part):
# file: config/database.yaml
# Define connections to the external database
remote:
  adapter: mysql
  database: remote
  username: <%= ENV['PRODUCTION_DB_USERNAME'] || 'root' %>
  password: <%= ENV['PRODUCTION_DB_PASSWORD'] || '' %>
  host: awsserver-production-mysql.abcdef1234567890.us-west-2.rds.amazonaws.com
  port: 3306

test:
  adapter: postgresql
  encoding: unicode
  database: MyApp_test
  pool: 5
  username: <%= ENV['POSTGRESQL_DB_USERNAME'] || 'MyApp' %>
  password: <%= ENV['POSTGRESQL_DB_PASSWORD'] || '' %>
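The <%= ENV[...] || 'root' %> defaults above work because Rails runs database.yaml through ERB before handing it to the YAML parser. That two-step pipeline is easy to reproduce with the stdlib:

```ruby
require 'erb'
require 'yaml'

# A one-line database.yml-style fragment using the same ERB default pattern.
template = "username: <%= ENV['PRODUCTION_DB_USERNAME'] || 'root' %>"

rendered = ERB.new(template).result   # ERB pass: substitutes the env var (or default)
config   = YAML.load(rendered)        # YAML pass: parses the rendered text
puts config['username']               # 'root' unless PRODUCTION_DB_USERNAME is set
```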
(NOTE: In case you're wondering, using mocks won't help: the remote db connection is established even before the tests start to run. And vcr only intercepts HTTP connections -- database connections use a different mechanism.)
Update:
I've found examples of how to dynamically establish a connection:
def connect_to_myapp_database
  ActiveRecord::Base.establish_connection(
    :adapter  => "mysql",
    :database => 'myapp',
    :username => ENV['MYAPP_DB_USERNAME'],
    :password => ENV['MYAPP_DB_PASSWORD'],
    :host     => 'mayapp-mysql.abcdefg123456.us-west-2.rds.amazonaws.com',
    :port     => 3306
  )
end
which works fine -- I can use this to connect to the external database just for the tests that need it. But this raises the question: how do I disconnect from the external database once I've done this?
TL;DR: Use nulldb and a test-aware parent class
Use nulldb when you're testing and the real db otherwise. Here's how:
First, include this in your Gemfile:
group :development, :test do
  gem 'activerecord-nulldb-adapter', :git => 'git://github.com/nulldb/nulldb.git'
end
and then run the usual bundle install.
Define a base class for all models that are backed in the external database:
class ExternalModel < ActiveRecord::Base
  if Rails.env.test?
    establish_connection(:adapter => :nulldb)
  else
    establish_connection(:myapp)
  end

  def readonly?
    true
  end
end
Then all the external models inherit from ExternalModel (I should have done this from the start):
class ExternalUser < ExternalModel
  ...
end
When run in the test environment, it won't try to connect to the external database. Of course, attempts to access an instance of ExternalUser will fail, but you can selectively establish a connection with the external database during integration testing, or stub or mock references to the external model otherwise.
Most importantly, all my tests run really fast now.
[n00b alert] I'm probably doing this all wrong... RSpec outputs this failure:

1) ... # skipped irrelevant info
   Failure/Error: @graph.read_db('example1')
     Not connected to any DB. # error msg
   # ./prim.rb:135:in 'read_db'
   # ./prim_spec.rb:171:in 'block (2 levels) in <top (required)>'
I have set up a MySQL database on the same machine. The program implements an algorithm for computing a graph's minimum spanning tree, and has methods for file I/O, database I/O using ActiveRecord, etc. It all WORKS WELL except the RSpec tests.
Code (irrelevant parts left out):
prim_spec.rb
describe PGraph, "online" do
  before(:all) do
    ActiveRecord::Base.establish_connection(
      :adapter  => "mysql2",
      :host     => "localhost",
      :username => "root",
      :password => "xxxxx",
      :database => "rubytasks")
    # the exact same statement works perfectly when running the program itself, but fails in RSpec
  end

  before(:each) do
    @graph = PGraph.new
  end

  it "should correctly retrieve data from database" do
    @graph.read_db('example1') # line 171
    # business part goes here
  end
end
prim.rb
class PGraph
  def read_db(graphID)
    # the error which is raised (line 135):
    raise "Not connected to any DB." unless ActiveRecord::Base.connected?
    # reading goes here
  end
end
Connection and PGraph manipulation is performed in ui.rb.
So, umm, what's the correct way to access a real DB (I'm lazy) for testing, or is the problem elsewhere? Preferably something simple, since this is just a school assignment, and without messing with Rails or other gems.
PS: I'm using the most recent versions of all gems and the server, on Windows 7 x86, Ruby 1.9.2. Thanks.
I'm guessing not everything is loaded properly when you run your RSpec tests. Are all the classes that set up your database connection loaded, with the right parameters, when running RSpec?
How do we monitor that connectivity between the Rails app and the database is established? Will Rails try to re-establish the connection with MySQL if it closes?
AFAIK yes. You could also periodically check a special controller you'd create that reports whether the DB is up or down (using ActiveRecord::Base.connected?, for instance).
Edit Tue Mar 30:
I believe you could instead write a Metal and periodically check its result (using a rake task or whatever monitoring tool you're used to):
class ConnectivityChecker
  def self.call(env)
    if ActiveRecord::Base.connected?
      [200, { 'Content-Type' => 'text/html' }, ['Connected.']]
    else
      [500, { 'Content-Type' => 'text/html' }, ['Database is not reachable.']]
    end
  end
end
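To see what the checker returns without booting Rails, you can stub ActiveRecord::Base.connected? and call the class as a plain Rack endpoint (the module below is a hand-rolled stub standing in for the real ActiveRecord, purely for illustration):

```ruby
# Stub standing in for ActiveRecord so the checker can run in isolation.
module ActiveRecord
  class Base
    def self.connected?
      true # flip to false to exercise the 500 branch
    end
  end
end

class ConnectivityChecker
  def self.call(env)
    if ActiveRecord::Base.connected?
      [200, { 'Content-Type' => 'text/html' }, ['Connected.']]
    else
      [500, { 'Content-Type' => 'text/html' }, ['Database is not reachable.']]
    end
  end
end

status, headers, body = ConnectivityChecker.call({})
puts status     # 200
puts body.first # Connected.
```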