Ruby On Rails: Testing deletes tables - mysql

I'm creating an application in RoR and I'm implementing unit testing in all my models.
When I run each test on its own (by running ruby test/unit/some_test.rb), all tests pass.
But when I run all the tests together (by running rake test:units), some tables from both databases (development and test) are deleted.
I'm using raw SQL (MySQL) to create the tables because I need composite primary keys and physical constraints, so I figured that would be the best approach. Could this be the cause?
All my tests are in this form:
require File.dirname(__FILE__) + '/../test_helper'
require File.dirname(__FILE__) + '/../../app/models/order'

class OrderTestCase < Test::Unit::TestCase
  def setup
    @order = Order.new(
      :user_id => 1,
      :total => 10.23,
      :date => Date.today,
      :status => 'processing',
      :date_concluded => Date.today,
      :user_address_user_id => 3,
      :user_address_address_id => 5,
      :creation_date => Date.today,
      :update_date => Date.today
    )
  end

  ################ Happy Path
  def test_happy_path
    assert @order.valid?, @order.errors.full_messages
  end
...
The errors I get when running the tests are something like this:
3) Error:
test_empty_is_primary(AddressTestCase):
ActiveRecord::StatementInvalid: Mysql::Error: Table 'shopshop_enterprise_test.addresses' doesn't exist: SHOW FIELDS FROM addresses
/test/unit/address_test.rb:9:in `new'
/test/unit/address_test.rb:9:in `setup'
Any guesses?
Thanks!
PS: When using Postgres as the database engine, everything works fine with rake test:units! (of course, with the appropriate changes so the SQL statements work with Postgres)

Related

Ruby Mysql2 Client not taking backslash while insert

We are using production and staging databases in our application.
Our requirement is to insert a record into the staging database whenever a record is added to the production database, so that both servers stay consistent and hold the same data.
I have used a Mysql2 client pool to connect to the staging server and insert the record that was added to production.
Here is my code:
def create
  @aperson = Person.new
  @person = @aperson.save
  if @person && Rails.env == "production"
    # add_new_person_to_staging
    client = Mysql2::Client.new(:host => dbconfig[:host], :username => dbconfig[:username], :password => dbconfig[:password], :database => dbconfig[:database])
    @person_result = client.query('INSERT INTO user_types(user_name, regex, code) Values ("myname" , "\.myregex\." , "ns" );')
  end
end
Here "#person_result" record is inserted to mysql table but the "regex" column eliminates "\" slashes.
like : user_name = myname, regex = .myregex., code = ns
when I manually execute the "Insert" query in mysql command line it inserts as it is along with \ slash. but not through "client.query"
Why does \ slash is eliminated. please help me here.
Thanks.
The \ is most likely being consumed as an escape character when the string literal is parsed, so it never makes it into the stored value.
Have you looked at trying either a double backslash or using the escape method to properly escape the string?
Try doubling each backslash so that one survives MySQL's string-literal parsing, something like:
@person_result = client.query('INSERT INTO user_types(user_name, regex, code) VALUES ("myname", "\\\\.myregex\\\\.", "ns")')
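Alternatively, a minimal sketch using Mysql2::Client#escape (reusing the dbconfig hash from the question; the table, column names and values are just the ones from the question):

require 'mysql2'

client = Mysql2::Client.new(:host => dbconfig[:host], :username => dbconfig[:username],
                            :password => dbconfig[:password], :database => dbconfig[:database])

regex = '\.myregex\.'                      # the value we actually want stored, backslashes included
sql = "INSERT INTO user_types(user_name, regex, code) " \
      "VALUES ('myname', '#{client.escape(regex)}', 'ns')"
@person_result = client.query(sql)         # escape doubles the backslashes, so MySQL stores them as-is

Newer mysql2 versions also support prepared statements (client.prepare(...).execute(...)), which sidestep manual escaping entirely.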

Inserting data from CSV into MySQL DB is very slow

Trying to insert data from a CSV file into a MySQL DB using Ruby, and it's very slow. Note that this is not a Rails application, just a stand-alone Ruby script.
Here is my code:
def add_record(data1, data2, time)
  date = DateTime.strptime(time, "%m/%d/%y %H:%M")
  <my table>.create(data1: data1, data2: data2, time: date)
end

def parse_file(file)
  path = @folder + "\\" + file
  CSV.foreach(path, {headers: :first_row}) do |line|
    add_record(line[4], line[5], line[0])
  end
end

def analyze_data
  Dir.foreach @folder do |file|
    next if file == '.' or file == '..'
    parse_file file
  end
end
And my connection:
@connection = ActiveRecord::Base.establish_connection(
  :adapter  => "mysql2",
  :host     => "localhost",
  :database => <db>,
  :username => "root",
  :password => <pw>
)
Any help appreciated.
Use LOAD DATA INFILE.
Here is a nice article on performance and strategies titled Testing the Fastest Way to Import a Table into MySQL. Don't let the MySQL version in the title or inside the article scare you away. Jumping to the bottom and picking up some conclusions:
The fastest way you can import a table into MySQL without using raw files is the LOAD DATA syntax. Use parallelization for InnoDB for better results, and remember to tune basic parameters like your transaction log size and buffer pool. Careful programming and importing can make a >2-hour problem become a 2-minute process. You can temporarily disable some security features for extra performance.
You might just find your times greatly reduced.
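For the script above, a rough sketch of what that could look like with the mysql2 gem; the table name readings, its column names, and the assumption that the CSV has six columns (time first, the two data values in columns 5 and 6) are guesses based on the code in the question:

require 'mysql2'

client = Mysql2::Client.new(
  :host         => "localhost",
  :username     => "root",
  :password     => "secret",       # placeholder, like <pw> in the question
  :database     => "mydb",         # placeholder, like <db> in the question
  :local_infile => true            # LOCAL also has to be allowed on the server side
)

client.query(<<~SQL)
  LOAD DATA LOCAL INFILE '/path/to/file.csv'
  INTO TABLE readings
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  IGNORE 1 LINES
  (@time, @skip1, @skip2, @skip3, @data1, @data2)
  SET data1 = @data1,
      data2 = @data2,
      time  = STR_TO_DATE(@time, '%m/%d/%y %H:%i')
SQL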
Use the zdennis/activerecord-import gem. You can insert tons of records quickly.
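As a rough sketch of how that could replace the per-row create calls in the script above (Reading is only a stand-in for the elided <my table> model):

require 'csv'
require 'date'
require 'activerecord-import'

def parse_file(file)
  path = @folder + "\\" + file
  rows = []
  CSV.foreach(path, headers: :first_row) do |line|
    rows << Reading.new(data1: line[4], data2: line[5],
                        time: DateTime.strptime(line[0], "%m/%d/%y %H:%M"))
  end
  # one multi-row INSERT per file instead of one INSERT per CSV line
  Reading.import rows, validate: false
end

Batching each file into a single INSERT removes most of the per-row round-trip and transaction overhead that makes the original loop slow.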

get ids as well for all models with seed_dump gem

Hi, I am using the seed_dump gem to create a seeds.rb from existing data, but I'm stuck on one thing: I want to get the ids for all models as well. How can I do this? For example, currently if I run
rake db:seed:dump
I just get code like this:
Product.create(title: "title", description: "text")
but I want this:
Product.create(id: 1, title: "title", description: "text")
How can I do that?
rake db:seed:dump EXCLUDE=[]
This overwrites the default exclude of [:id, :created_at, :updated_at] so that it includes the id
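Note that an empty EXCLUDE also brings created_at and updated_at back into the dump. If you only want the id, seed_dump also accepts a comma-separated exclude list (assuming a reasonably recent version of the gem):
rake db:seed:dump EXCLUDE=created_at,updated_at
which should produce lines like:
Product.create(id: 1, title: "title", description: "text")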
Create your own export. Assuming that your model's name is Country:
lib/tasks/export.rake
namespace :export do
  desc "Exports data for use in a seeds.rb."
  task :seeds_format => :environment do
    Country.order(:id).all.each do |country|
      puts "Country.create(#{country.serializable_hash.
        delete_if { |key, value| ['created_at', 'updated_at'].include?(key) }.
        to_s.gsub(/[{}]/, '')})"
    end
  end
end
You could run that with this command:
rake export:seeds_format > db/seeds.rb
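With this task, each line of the generated db/seeds.rb is the record's serializable_hash with the braces and timestamps stripped and the id kept, so (for a made-up country) the output looks roughly like:
Country.create("id"=>1, "name"=>"Portugal", "code"=>"PT")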

Ruby mysql2 error when executing statements in rapid succession

I have a strange issue using the Mysql2 client in Ruby. When trying to execute the following:
client.query("CREATE DATABASE ...; INSERT INTO ..."); #SQL truncated for brevity
client.query("SELECT 1 FROM ...") #SQL truncated for brevity
Ruby throws an error that the table I'm selecting from doesn't exist. However if I try the following:
client.query("CREATE DATABASE ...; INSERT INTO ..."); #SQL truncated for brevity
sleep 1
client.query("SELECT 1 FROM ...") #SQL truncated for brevity
The query works with no problems. It seems as though I need to give the MySQL server some time to load the data before I'm able to query it. Can anyone explain why this is happening and how to programmatically overcome this without using sleep?
Update
I initialize the client like so:
Mysql2::Client.new({
  :adapter  => "mysql2",
  :host     => ip_address,
  :username => db_username,
  :password => db_password,
  :flags    => Mysql2::Client::MULTI_STATEMENTS
})
I checked the 'query_options' attribute and async is set to false. I have tried explicitly setting the async => false flag to no avail.
The same issue happens if I use
Model.connection.execute(SQL HERE)
Note, this is all executed from within a Rails unit test.
Thanks
For some reason, the only thing that ended up working without needing the sleep 1 in between is the following:
@model = Model.new
Mysql2::Client.default_query_options[:connect_flags] |= Mysql2::Client::MULTI_STATEMENTS
@model.connection.reconnect!
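If reconnecting with MULTI_STATEMENTS alone does not help, another thing worth checking (my own suggestion, not part of the answer above) is whether every result set of the multi-statement batch is consumed before the next query is sent; mysql2 exposes next_result/store_result (and abandon_results!) for exactly that:

client = Mysql2::Client.new(
  :host     => ip_address,
  :username => db_username,
  :password => db_password,
  :flags    => Mysql2::Client::MULTI_STATEMENTS
)

# hypothetical batch standing in for the truncated SQL in the question
client.query("CREATE TABLE tmp_things (id INT); INSERT INTO tmp_things VALUES (1)")

# drain every pending result set before issuing the next statement
while client.next_result
  client.store_result
end

client.query("SELECT 1 FROM tmp_things")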

Cannot search on a particular index

Model:
class TechRequest < ActiveRecord::Base
  ...
  define_index do
    ...
    indexes :hot_request
    indexes :status_id, :as => :current_status_id
    ...
    has :hot_request, :as => :hot_request
    set_property :delta => true
  end
DB:
hot_request - tinyint(1)
When I execute the controller code:
@query_string = '(@hot_request 1)(@current_status_id 1 | 2 | 3)'
@tech_requests = TechRequest.search @query_string, :match_mode => :extended
the following error is thrown:
ThinkingSphinx::SphinxError: index tech_request_core,tech_request_delta: query error: no field 'tech_hot_request' found in schema
from D:/Current/TechAssistTest/vendor/plugins/thinking-sphinx/lib/thinking_sphinx/search.rb:392:in 'populate'
from D:/Current/TechAssistTest/vendor/plugins/thinking-sphinx/lib/thinking_sphinx/search.rb:508:in 'call'
from D:/Current/TechAssistTest/vendor/plugins/thinking-sphinx/lib/thinking_sphinx/search.rb:508:in 'retry_on_stale_index'
from D:/Current/TechAssistTest/vendor/plugins/thinking-sphinx/lib/thinking_sphinx/search.rb:379:in 'populate'
from D:/Current/TechAssistTest/vendor/plugins/thinking-sphinx/lib/thinking_sphinx/search.rb:167:in 'method_missing'
from D:/ruby/lib/ruby/1.8/irb.rb:302:in 'output_value'
from D:/ruby/lib/ruby/1.8/irb.rb:151:in 'eval_input'
from D:/ruby/lib/ruby/1.8/irb.rb:263:in 'signal_status'
from D:/ruby/lib/ruby/1.8/irb.rb:147:in 'eval_input'
from D:/ruby/lib/ruby/1.8/irb.rb:146:in 'eval_input'
from D:/ruby/lib/ruby/1.8/irb.rb:70:in 'start'
from D:/ruby/lib/ruby/1.8/irb.rb:69:in 'catch'
from D:/ruby/lib/ruby/1.8/irb.rb:69:in 'start'
from D:/ruby/bin/irb:13
The search works fine when I use hot_request as an attribute. The search also works fine when I use @query_string = '(@current_status_id 1 | 2 | 3)'.
I've just run into similar-looking problems. There are two possible reasons for this error that I can see. The first is that, according to http://sphinxsearch.com/forum/view.html?id=2103, you can use an SQL column as a field or as an attribute, but not both (without cloning it). The other, which had me baffled for a while, is that you may need to specify the type, so if hot_request is actually an integer, you probably need something like
indexes hot_request, :as => :hr, :type => :integer
or you get that cryptic error message.
Hope this helps someone ...
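Since hot_request is a tinyint flag rather than free text, another option (my own suggestion, not from the answer above) is to keep it only as a Sphinx attribute and filter on it with :with instead of putting it into the extended query string:

define_index do
  indexes :status_id, :as => :current_status_id
  has :hot_request
  set_property :delta => true
end

@query_string  = '(@current_status_id 1 | 2 | 3)'
@tech_requests = TechRequest.search @query_string,
                                    :match_mode => :extended,
                                    :with => { :hot_request => 1 }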