Currently we're relying on Sphinx's PHP library to manage our faceted search, which depends on the ability to use Sphinx's multi-queries feature.
The latest Sphinx search documentation describes how to perform the same multi-query procedure in SphinxQL, via MySQL. It gives an example using PHP.
http://sphinxsearch.com/docs/manual-2.0.4.html#sphinxql-multi-queries
Do any MySQL gems exist for ruby that support multi-queries in this way?
I'm looking at the mysql2 gem, which seems to be the latest thing, but it doesn't appear to support it. Am I still at a loss when it comes to Sphinx multi-queries in ruby?
If not, I'm going to write a client that supports them at work in the next few days anyway, but obviously SphinxQL would make this much easier. I'd also rather not have my gem connect over two different protocols just to support RT indexes (which can only be written to via SphinxQL). It seems like SphinxQL is basically where it's at.
It appears the ruby-mysql gem supports this: https://github.com/tmtm/ruby-mysql/blob/master/lib/mysql.rb#L406-419
I assume it is processing this correctly. It says 'execute', but in reality it appears to simply be fetching the next set of results from a query that was already executed.
# execute next query if multiple queries are specified.
# === Return
# true if next query exists.
def next_result
  return false unless more_results
  check_connection
  @fields = nil
  nfields = @protocol.get_result
  if nfields
    @fields = @protocol.retr_fields nfields
    @result_exist = true
  end
  return true
end
There is reference to it here too: http://zetcode.com/db/mysqlrubytutorial/ (Under 'Multiple Statements')
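Assuming the ruby-mysql style API above (store_result / next_result), the consumption loop for a multi-statement batch might look like the sketch below. each_result_set is a hypothetical helper name, not part of the gem, and the exact return-value semantics of next_result should be checked against the gem itself:

```ruby
# Sketch: draining every result set returned by a multi-statement query.
# Assumes the client exposes store_result (fetch the current result set)
# and next_result (advance; true if another set exists), as ruby-mysql does.
def each_result_set(client)
  results = []
  results << client.store_result            # first set is available immediately
  results << client.store_result while client.next_result
  results
end
```

For Sphinx, each entry would then be one result set of the multi-query batch, in the order the statements were sent.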
I ended up writing my own gem, which includes a little wrapper around MySQL. Not a fully-fledged mysql client, but a minimal bridge in order to support SphinxQL.
You can see the gem here: https://github.com/d11wtq/oedipus
And the C extension here: https://github.com/d11wtq/oedipus/blob/master/ext/oedipus/oedipus.c
Related
I tried sqlSave() in RODBC, but it runs super slowly. Is there an alternative way of doing this?
You could look at the RMySQL package. I'm using it and it offers quite a bit of convenience for loading and reading data from a MySQL database. That said, it is limited in the queries you can use (e.g. HAVING is not possible, IIRC). I can't say it's super-quick, and my data isn't that big, but it handles several tens of MB of text fine. It depends on what you expect. It is convenient, however:
con <- dbConnect(MySQL(), user="user", password="pass",
                 dbname="mydb", host="localhost", client.flag=CLIENT_MULTI_STATEMENTS)
dbListTables(con)
yourtable <- dbReadTable(con,"sometable")
# write it back
dbWriteTable(con,"yourTableinMySQL",yourtable,overwrite=T)
# watch out with the overwrite argument it does what it says :)
dbDisconnect(con)
yourtable will be a data.frame. Sometimes it bugs me that the column modes are not set as I'd expect, but I have a custom-made function for that. I just need to improve it, then I'll post it here.
http://cran.r-project.org/web/packages/RMySQL/
You need to have the MySQL client libraries installed before you try installing RMySQL. Without knowing which OS and version you use, it is really not possible to give a better answer.
Is there any MySQL library for Ruby that supports parameterization? The documentation for mysql2 gives this example:
escaped = client.escape("gi'thu\"bbe\0r's")
results = client.query("SELECT * FROM users WHERE group='#{escaped}'")
And that seems kind of clunky and screw-up-able to me.
Sequel does, too. But for MySQL it only simulates them:
The MySQL ruby driver does not support bound variables, so the bound variable methods fall back to string interpolation.
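What that simulation amounts to can be sketched in a few lines. bind_params is a hypothetical helper, the escaping shown is MySQL's backslash style, and a real driver's escape function should always be used instead of hand-rolled escaping:

```ruby
# Sketch of simulated parameter binding: escape each value, then splice it
# into the SQL string in place of a ? placeholder.
# Real drivers escape according to the connection's character set rules.
def bind_params(sql, *values)
  values.each do |v|
    escaped = v.to_s.gsub(/['\\]/) { |c| "\\#{c}" }  # backslash-escape ' and \
    sql = sql.sub('?') { "'#{escaped}'" }            # block form avoids \-expansion in sub
  end
  sql
end
```

The point of real (server-side) bound variables is that the value never travels through the SQL string at all, which is why the simulated form is weaker.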
Apparently DBI does
http://ruby-dbi.rubyforge.org/
v. Rails 2.3.8
What I'm looking to achieve is to dynamically modify outbound links in Rails so that the changes are cached using fragment caching. How would you go about doing this?
Note: This time I am intentionally not including my own ideas and source code here, as I'd like to hear suggestions without bias.
Thanks.
Here is the solution:
In ActionController::Caching::Fragments.fragment_for, change these lines:
pos = buffer.length
block.call
write_fragment(name, buffer[pos..-1], options)
to this:
pos = buffer.length
fragment = Nokogiri::HTML::fragment(block.call)
fragment.css('a').each do |a|
  unless a['href'].nil?
    a.set_attribute('rel', 'nofollow') unless (a['href'].starts_with?('/') || a['href'].starts_with?("http://#{ENV['BASE_URL']}"))
  end
end
buffer[pos..-1] = fragment.to_html
write_fragment(name, buffer[pos..-1], options)
Please note that:
I use ENV['BASE_URL'] to store the base URL of the site (loaded from the database during initialization).
You must have the Nokogiri gem installed.
This solution works for Rails 2.3.8 - I have not tested in version 3.
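The link test in the patch above can be pulled out into a plain predicate, which also makes it easy to unit-test on its own. This is a hypothetical helper, with base_url standing in for ENV['BASE_URL'], and it uses plain Ruby's start_with? rather than ActiveSupport's starts_with?:

```ruby
# True when a href points off-site and should get rel="nofollow".
# Mirrors the condition used inside the fragment_for patch: relative links
# and links to our own base URL are left alone, everything else is external.
def external_link?(href, base_url)
  return false if href.nil?
  !(href.start_with?('/') || href.start_with?("http://#{base_url}"))
end
```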
I'm creating a set of SQL INSERT statements for a database that doesn't exist yet, and I'm saving them to file.
How can I use Perl's powerful DBI module to create those INSERT statements without connecting to a specific database? In particular, it looks like using the $dbh->quote() function requires that I instantiate $dbh with a connection to a database.
Unfortunately, the actual quote() behaviour isn't a portable characteristic, so each driver does it differently. Unless you connect with a driver, you don't know which quoting format to use in practice. (There is one module that might do this without a connection, DBIx::Abstract, but it is not especially current.)
The quote() method is actually implemented by the corresponding driver class, in the DBD::* namespace. You might attempt to load the driver you need and call the function directly (see http://search.cpan.org/~timb/DBI-1.616/lib/DBI/DBD.pm#Writing_DBD::Driver::db::quote) but this feels grubby.
I'd still make a DBI connection, if only so that you get the right quoting format. You don't need to actually send it any statements, but you will then know that the quoting format is correct for the database you will use.
From DBI::quote:
For most database types, at least those that conform to SQL standards, quote would return 'Don''t' (including the outer quotation marks). For others it may return something like 'Don\'t'
That is, the behavior of DBI::quote varies from database to database, and it doesn't make sense to call it in a database-independent way.
Make a trivial connection to a database of the same type you are writing for, or learn your database's quoting conventions and implement a quote method yourself. See the DBI source for a reference implementation.
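As a sketch of what "implement a quote method yourself" could look like (in Ruby rather than Perl, for illustration), here is the SQL-standard quote-doubling convention. sql_quote is a hypothetical helper, and a real driver's quote() should be preferred whenever one is available:

```ruby
# Minimal SQL-standard string quoting: double any embedded single quotes
# and wrap the result in single quotes; nil becomes the SQL NULL literal.
def sql_quote(value)
  return 'NULL' if value.nil?
  "'" + value.to_s.gsub("'", "''") + "'"
end
```

A MySQL-flavoured variant would backslash-escape instead of doubling, which is exactly the divergence the DBI documentation warns about.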
Usually you would use DBI by specifying a database like so:
my $dbh = DBI->connect("DBI:mysql:database=$db_name;host=127.0.0.1");
However, your database does not yet exist so you cannot connect to it. You can use DBI without specifying a database like so:
my $dbh = DBI->connect("DBI:mysql:;host=127.0.0.1");
You could use DBD::CSV or DBD::AnyData as a dummy database. SQLite is good for this purpose, too.
A hidden advantage of using SQLite here is that it's a semi-real database, and will tend to make you write code in a way that's decoupled from any specific database.
According to perl -MDBI -E 'say join(q{,},DBI->available_drivers);'
in clean Debian chroot with only DBI (package "libdbi-perl") installed the following drivers are available right away:
DBM,ExampleP,File,Gofer,Proxy,Sponge
The minimum DBI connect statement that works for me is
my $dbh=DBI->connect("DBI:DRIVER:"); # DRIVER is one of [DBM,File,ExampleP,Sponge,mysql,SQLite]
That is enough to use $dbh->quote() with no database whatsoever.
DBM and File escape q{'} as q{\'} ("mysql" style);
ExampleP and Sponge escape q{'} as q{''} ("SQLite" style).
You can also use:
DBD::_::db->quote()
to access the quote function without setting up a database handle. I don't believe it is specific to MySQL, though.
How can I log all executed SQL statements to the console/stdout when using Ruby MySQL?
Like the log/development.log output from rails.
If you mean Ruby/MySQL, it provides a debug() method that does the same thing as mysql_debug(): if your client library has been compiled with debugging support, you can get DBUG to trace things for you. This might give you what you're after (with a bit of clean-up).
Another approach would be to capture the MySQL packets using tcpdump and decode them with maatkit.
A third approach would be to alias Mysql#query and Mysql#real_query with your own methods that do the logging. Something like this (untested! trivial example!):
class Mysql
  alias_method :old_query, :query
  alias_method :old_real_query, :real_query

  def query(sql, &block)
    $stderr.puts "SQL: #{sql}"
    old_query(sql, &block)
  end

  def real_query(sql)
    $stderr.puts "SQL: #{sql}"
    old_real_query(sql)
  end
end
Rails uses an ORM, ActiveRecord, which has its own logger attached to it.
I don't think the mysql gem has a logger built in...
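The alias-and-wrap idea in the answer above is generic; here it is exercised on a stand-in class so the mechanics are visible without a MySQL connection. FakeClient and the $sql_log global are illustrative only; reopening the real Mysql class works the same way:

```ruby
# A stub standing in for the MySQL client class.
class FakeClient
  def query(sql)
    "result of: #{sql}"
  end
end

$sql_log = []

# Reopen the class, keep the original method under a new name,
# and replace it with a version that logs before delegating.
class FakeClient
  alias_method :old_query, :query
  def query(sql)
    $sql_log << sql
    old_query(sql)
  end
end
```

With the real gem you would log to $stderr or a Logger instead of an array, but the wrapping is identical.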