Ruby mysql2 error when executing statements in rapid succession

I have a strange issue using the Mysql2 client in Ruby. When trying to execute the following:
client.query("CREATE DATABASE ...; INSERT INTO ..."); #SQL truncated for brevity
client.query("SELECT 1 FROM ...") #SQL truncated for brevity
Ruby throws an error that the table I'm selecting from doesn't exist. However if I try the following:
client.query("CREATE DATABASE ...; INSERT INTO ..."); #SQL truncated for brevity
sleep 1
client.query("SELECT 1 FROM ...") #SQL truncated for brevity
The query works with no problems. It seems as though I need to give the MySQL server some time to load the data before I'm able to query it. Can anyone explain why this is happening and how to programmatically overcome this without using sleep?
Update
I initialize the client like so:
Mysql2::Client.new({
  :adapter  => "mysql2",
  :host     => ip_address,
  :username => db_username,
  :password => db_password,
  :flags    => Mysql2::Client::MULTI_STATEMENTS
})
I checked the query_options attribute and :async is set to false. I have also tried explicitly setting :async => false, to no avail.
The same issue happens if I use
Model.connection.execute(SQL HERE)
Note, this is all executed from within a Rails unit test.
Thanks

For some reason, the only thing that ended up working without needing the sleep 1 in between is the following:
@model = Model.new
Mysql2::Client.default_query_options[:connect_flags] |= Mysql2::Client::MULTI_STATEMENTS
@model.connection.reconnect!
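For anyone hitting the same thing: with MULTI_STATEMENTS enabled, mysql2 returns only the first result set of the batch from query, and the remaining result sets have to be drained with next_result/store_result (or discarded with abandon_results!) before the connection will accept another query. A rough sketch, with a made-up table standing in for the truncated SQL above:
client = Mysql2::Client.new(
  :host     => ip_address,
  :username => db_username,
  :password => db_password,
  :flags    => Mysql2::Client::MULTI_STATEMENTS
)

# hypothetical stand-in for the truncated CREATE/INSERT batch
client.query("CREATE TABLE demo (id INT); INSERT INTO demo VALUES (1)")

# drain every remaining result set before sending the next query
client.store_result while client.next_result

client.query("SELECT 1 FROM demo") # now safe to query again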

Related

Ruby Mysql2 Client not taking backslash while insert

We are using production and staging databases in our application.
Our requirement is to insert every record into the staging database whenever a record is added to the production database, so that both servers stay consistent and hold the same data.
I have used a Mysql2 client pool to connect to the staging server and insert the record that was added to production.
Here is my code:
def create
  @aperson = Person.new
  @person = @aperson.save
  if @person && Rails.env == "production"
    # add the new person to staging
    client = Mysql2::Client.new(:host => dbconfig[:host], :username => dbconfig[:username],
                                :password => dbconfig[:password], :database => dbconfig[:database])
    @person_result = client.query('INSERT INTO user_types(user_name, regex, code) Values ("myname" , "\.myregex\." , "ns" );')
  end
end
Here "#person_result" record is inserted to mysql table but the "regex" column eliminates "\" slashes.
like : user_name = myname, regex = .myregex., code = ns
when I manually execute the "Insert" query in mysql command line it inserts as it is along with \ slash. but not through "client.query"
Why does \ slash is eliminated. please help me here.
Thanks.
The \ is most likely being stripped by the Mysql2 client as part of a SQL-injection-protection preprocessing step.
Have you tried either doubling the backslash or using the escape method to properly escape the string?
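If you go the escape route, a sketch along these lines (reusing the client and values from the question) keeps the backslashes intact, because escape doubles them before they reach MySQL:
regex = '\.myregex\.'   # the literal value we want stored, backslashes included
sql = "INSERT INTO user_types (user_name, regex, code) " \
      "VALUES ('myname', '#{client.escape(regex)}', 'ns')"
@person_result = client.query(sql)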
Try doubling the backslashes so they survive MySQL's string-literal escaping:
@person_result = client.query('INSERT INTO user_types (user_name, regex, code) VALUES ("myname", "\\\\.myregex\\\\.", "ns")')
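The doubling is needed because MySQL itself treats \ as an escape character inside quoted string literals, so the SQL text has to contain \\ for every backslash you want stored, and in a single-quoted Ruby string that means typing \\\\. A quick, hypothetical check of what actually ended up in the column:
client.query('SELECT regex FROM user_types WHERE user_name = "myname"').each do |row|
  puts row['regex']   # expected: \.myregex\.
end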

Inserting data from CSV into MySQL DB is very slow

Trying to insert data from a CSV file into a MySQL DB using Ruby, and it's very slow. Note that this is not a Rails application, just a stand-alone Ruby script.
Here is my code:
def add_record(data1, data2, time)
  date = DateTime.strptime(time, "%m/%d/%y %H:%M")
  <my table>.create(data1: data1, data2: data2, time: date)
end

def parse_file(file)
  path = @folder + "\\" + file
  CSV.foreach(path, headers: :first_row) do |line|
    add_record(line[4], line[5], line[0])
  end
end

def analyze_data
  Dir.foreach(@folder) do |file|
    next if file == '.' or file == '..'
    parse_file file
  end
end
And my connection:
@connection = ActiveRecord::Base.establish_connection(
  :adapter  => "mysql2",
  :host     => "localhost",
  :database => <db>,
  :username => "root",
  :password => <pw>
)
Any help appreciated.
Use LOAD DATA INFILE.
Here is a nice article on performance and strategies titled Testing the Fastest Way to Import a Table into MySQL. Don't let the MySQL version mentioned in the title or inside the article scare you away. Jumping to the bottom and picking up some conclusions:
The fastest way you can import a table into MySQL without using raw files is the LOAD DATA syntax. Use parallelization for InnoDB for better results, and remember to tune basic parameters like your transaction log size and buffer pool. Careful programming and importing can make a >2-hour problem become a 2-minute process. You can temporarily disable some security features for extra performance.
You might just find your times greatly reduced.
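A rough sketch of what that can look like from plain Ruby with mysql2. The file path, table name, and column layout are assumptions based on the CSV indexes in the question, and both the client (:local_infile => true) and the server (local_infile enabled) have to allow local infile:
require 'mysql2'

client = Mysql2::Client.new(
  :host         => "localhost",
  :username     => "root",
  :password     => "secret",   # placeholder
  :database     => "mydb",     # placeholder
  :local_infile => true        # required for LOAD DATA LOCAL INFILE
)

client.query(<<-SQL)
  LOAD DATA LOCAL INFILE '/path/to/file.csv'
  INTO TABLE records
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  IGNORE 1 LINES
  (@raw_time, @skip1, @skip2, @skip3, data1, data2)
  SET `time` = STR_TO_DATE(@raw_time, '%m/%d/%y %H:%i')
SQL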
Use the zdennis/activerecord-import gem. You can insert tons of records quickly.
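A sketch of the activerecord-import route, assuming a Record model exists for the table in the question; one multi-row INSERT per file is dramatically faster than one INSERT per CSV line:
require 'csv'
require 'activerecord-import'   # load after ActiveRecord and the connection are set up

def parse_file(file)
  rows = []
  CSV.foreach(File.join(@folder, file), headers: :first_row) do |line|
    rows << [line[4], line[5], DateTime.strptime(line[0], "%m/%d/%y %H:%M")]
  end
  # Record and its columns are assumptions standing in for <my table> above
  Record.import [:data1, :data2, :time], rows, validate: false
end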

Parallel mysql I/O in Ruby

Good day to you. I'm writing a cron job that will hopefully split a huge MySQL table across several threads and do some work on them. This is a minimal sample of what I have at the moment:
require 'mysql'
require 'parallel'

@db = Mysql.real_connect("localhost", "root", "", "database")
@threads = 10

Parallel.map(1..@threads, :in_processes => 8) do |i|
  begin
    @db.query("SELECT url FROM pages LIMIT 1 OFFSET #{i}")
  rescue Mysql::Error => e
    @db.reconnect()
    puts "Error code: #{e.errno}"
    puts "Error message: #{e.error}"
    puts "Error SQLSTATE: #{e.sqlstate}" if e.respond_to?("sqlstate")
  end
end
@db.close
The threads don't need to return anything; they get their share of the job and do it. Only they don't. Either the connection to MySQL is lost during the query, or the connection doesn't exist ("MySQL server has gone away"?!), or "no _dump_data is defined for class Mysql::Result" and then Parallel::DeadWorker.
How do I do this right?
The map method expects a result; I don't need one, so I switched to each:
Parallel.each(1..@threads, :in_processes => 8) do |i|
This also solves the problem with MySQL: I just needed to open the connection inside the parallel process, which is possible with the each loop. Of course, the connection should be closed inside the process as well.
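Put together, a minimal sketch of that approach, using the same table and credentials as the question; each worker process opens and closes its own connection:
require 'mysql'
require 'parallel'

threads = 10

Parallel.each(1..threads, :in_processes => 8) do |i|
  db = Mysql.real_connect("localhost", "root", "", "database") # one connection per process
  begin
    db.query("SELECT url FROM pages LIMIT 1 OFFSET #{i}")
  rescue Mysql::Error => e
    puts "Error code: #{e.errno}"
    puts "Error message: #{e.error}"
  ensure
    db.close
  end
end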

Dashing: Ruby: CentOS: Not closing MySQL processes

I am having trouble with my server.
It is a CentOS (Red Hat) Linux server and runs "Dashing", a Ruby/Sinatra-based dashboard.
I am trying to close the active connections listed by SHOW PROCESSLIST; on my MySQL database.
Example.rb file:
require 'mysql2'

SCHEDULER.every '10s' do
  db = Mysql.new('host_name', 'database_name', 'password', 'table')
  mysql1 = "SELECT `VAR` from `TABLE` ORDER BY `VAR` DESC LIMIT 1"
  result1 = db.query(mysql1)
  result1.each do |row|
    strrow1 = row[0]
    $num1 = strrow1.to_i
  end
  ...
  db.close
  LINK[0] = { label: 'LABEL', value: $num1 }
  ...
  send_event('LABEL FOR HTML', { items: LINK.values })
end
However, after a few clicks back and forth, it is clear that the database does not drop the connections, but instead keeps them. This causes the browser to slow down to the point that loading a page becomes impossible and the output of the log reads:
"max_user_connections" reached
Can anyone think of a way to fix this?
It is a best practice for DB/file/handle work to live in a begin/rescue/ensure block. It could be that something is going wrong and Rufus/Dashing is just being quiet about the error, since they trap exceptions and go on their merry way; this would prevent your db connection from closing. The symptoms you are having could come from a similar problem, and either way it's a good idea.
SCHEDULER.every '10s' do
  begin
    db = Mysql.new('host_name', 'database_name', 'password', 'table')
    # .... stuff ....
  rescue
    # what happens if an error happens? log it, toss it, ignore it?
  ensure
    db.close if db   # guard in case Mysql.new itself raised
  end
  # ... more stuff if you want ...
end
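Beyond that, since the job fires every 10 seconds, another option worth considering is to open one client when the job file loads and reuse it on every tick, so connections never pile up in SHOW PROCESSLIST. A sketch using the mysql2 API the file already requires; the credentials and column names are the question's placeholders, and :reconnect copes with the server dropping an idle connection:
require 'mysql2'

DB = Mysql2::Client.new(:host => 'host_name', :username => 'user_name',
                        :password => 'password', :database => 'database_name',
                        :reconnect => true)

SCHEDULER.every '10s' do
  row = DB.query("SELECT `VAR` FROM `TABLE` ORDER BY `VAR` DESC LIMIT 1").first
  num1 = row ? row['VAR'].to_i : 0
  send_event('LABEL FOR HTML', { items: [{ label: 'LABEL', value: num1 }] })
end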

Ruby + mysql2 querying fails with variables

Working with Ruby 2.0, the Qt4 gem and the Mysql2 gem. I need to take the text of two line edits and use it in a query, which has been a failure so far.
client = Mysql2::Client.new(:host => "localhost", :username => "root", :password => "123456", :database => "school")

# text of both line edits saved into local variables
tName = @leName.text()
tPass = @lePass.text()

# then
res = client.query("SELECT usr_name, usr_pass, usr_tipo FROM user WHERE usr_name = tName AND usr_pass = tPass")
The only thing that fails is that query. I've tried making the variables instance variables (@tName, @tPass), or putting them inside #{}, which searches the user table for columns named tName and tPass; I've also tried wrapping them in '', but that only searches for a user literally named tName.
I want the query to search for usr_name = "text inside tName". What am I doing wrong?
EDIT: in case you are wondering, tName and tPass are strings, and the fields usr_name and usr_pass are varchar(50).
Looks like you didn't interpolate the variables. Do the following:
res = client.query("SELECT usr_name, usr_pass, usr_tipo
                    FROM user
                    WHERE usr_name = '#{tName}' AND usr_pass = '#{tPass}'")
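Since tName and tPass come straight from user input, a prepared statement is a safer variant of the same query (supported by mysql2 0.4+; same client and columns as above):
stmt = client.prepare("SELECT usr_name, usr_pass, usr_tipo FROM user " \
                      "WHERE usr_name = ? AND usr_pass = ?")
res  = stmt.execute(tName, tPass)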