MySQL performance

We are developing our database in MySQL with the InnoDB engine. The database contains a VARCHAR column with entries of about 3,000 characters each, and we need to provide search on this column. To speed this up, we want to add an index on the column. Can you share some information in this regard?
Which type of index should we use to speed up the search? Is there anything else we should take care of for performance?

If by search you mean you'll be performing a query like this:
SELECT * FROM cars WHERE car LIKE '%{search_str}%'
Then I am afraid that even if you add an index to the car column, MySQL will still have to perform a full table scan, and your query may potentially be very slow.
If you are planning to support a significant amount of data to be searched and expect high QPS numbers, I would recommend having a look at Apache's Lucene project, which can drastically speed up any search query. It also supports full-text search.

Like ducky says, if you're going to query the column using a SQL LIKE, your query is going to be very slow, no matter what index you put on the column.
There are two options:
Switch to the MyISAM storage engine instead of InnoDB and use full-text search on this column. This is done by placing a FULLTEXT index on the column. More Information.
Use a tool like Lucene for full-text searching
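A sketch of option 1, reusing the table and column names from the example query above (your real schema will differ):

```sql
-- FULLTEXT on InnoDB only arrived in MySQL 5.6, hence the engine switch.
ALTER TABLE cars ENGINE = MyISAM;
ALTER TABLE cars ADD FULLTEXT INDEX ft_car (car);

-- Full-text search instead of LIKE '%...%':
SELECT * FROM cars
WHERE MATCH(car) AGAINST('search_str');
```

Note that MATCH ... AGAINST matches whole words, so it is not a drop-in replacement for an arbitrary substring LIKE.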

Related

MySQL search FTS vs Multiple Queries

Working on a project where schema is something like this:
id, key, value
The key and value columns are varchar, and the table is InnoDB.
A user can search on the basis of key-value pairs. What's the best way to query this in MySQL? The options I can think of are:
For each key => value pair, form a query and perform an inner join to get the ids matching all criteria.
Or, in the background, populate a MyISAM table (id, info) with a FULLTEXT index on info, and run a single query using LIKE '%key:value%key2:value2%'. The benefit of this is that later on, if the website becomes popular and the table has a hundred thousand rows, I can easily port the code to Lucene; but for now, MySQL.
The pattern you're talking about is called relational division.
Option #1 (the self-join) is a much faster solution if you have the right indexes.
I compared the performance for a couple of solutions to relational division in my presentation
SQL Query Patterns, Optimized. The self-join solution worked in 0.005 seconds even against a table with millions of rows.
Option #2 with fulltext isn't correct anyway as you've written it, because you wouldn't use LIKE with fulltext search. You'd use MATCH(info) AGAINST('...' IN BOOLEAN MODE). I'm not sure you can use patterns in key:value format anyway. MySQL FTS prefers to match words.
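A sketch of the self-join in option #1, against a hypothetical EAV table attrs(id, key, value) with illustrative keys and values:

```sql
-- Find ids matching key1=value1 AND key2=value2 via a self-join.
-- An index on (`key`, value, id) makes each join leg an index lookup.
SELECT a1.id
FROM attrs AS a1
JOIN attrs AS a2 ON a2.id = a1.id
WHERE a1.`key` = 'color' AND a1.value = 'red'
  AND a2.`key` = 'size'  AND a2.value = 'large';
```

Each additional key=value criterion adds one more join to the same table.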
@Bill Karwin
If you're going to do this for 1 condition, it will be super fast with this EAV-like schema, but if you do it for many (esp. with mixed ANDs and ORs) it will probably fall apart. The best you can hope for is some sort of super fast index merge, and that's elusive. You're going to get a temporary table in most DBMSes if you do anything fancy. I think I remember reading you're no fan of EAV, though, and maybe I'm misunderstanding you.
As I recall, a DBMS is also free to do multiple scans and then handle this with a disposable bitmap index. But fulltext indexes keep the document lists sorted and do a low-cost merge across all criteria with a FTS planner that starts strategically with the rarer keywords. That's all they do to execute "word1 & word2" all day. They're optimized for this sort of thing.
So if you have lots of simple facts, a FTS index is one decent way to do it I think. Am I missing something? You just need to change the facts to something indexable like COLORID_3, then search for "COLORID_3 & SOMETHINGELSEID_5."
If the queries involve no merging or sorting, I suspect it will be pretty much a wash. Nothing here but us BTREEs ...
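The "facts as indexable tokens" idea from the comment above could be sketched like this (table and token names are illustrative):

```sql
-- Store each fact as a word in a MyISAM table so the FULLTEXT planner
-- can merge the sorted document lists across criteria.
CREATE TABLE item_facts (
  id INT NOT NULL PRIMARY KEY,
  info TEXT,                     -- e.g. 'COLORID_3 SOMETHINGELSEID_5'
  FULLTEXT INDEX ft_info (info)
) ENGINE = MyISAM;

-- "word1 & word2" style query: items matching both facts.
SELECT id FROM item_facts
WHERE MATCH(info) AGAINST('+COLORID_3 +SOMETHINGELSEID_5' IN BOOLEAN MODE);
```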

Use sphinx vs MySQL on no text search query

I have a question:
Suppose I have one big table with a relationship to a smaller table of users.
The idea is to search that really big table for dates later than a given date, order by a score (a BIGINT, for example), and obtain the related user info at the same time.
The result of this query can change every 10 minutes or so.
So, there is no text search, but I have a really big table. Should I use sphinx (or other search engine) or should I just use some MySQL indexes?
If I use Sphinx, I'm sure I can get really fast results; but maybe keeping the index refreshed, even with delta indexing, doesn't make a big difference over MySQL indexing. At the same time, the changes in the table are not necessarily new inserts but updates, and I have read that real-time indexing and delta indexes can cause problems.
Maybe it would be better to use MySQL indexes, helped by some kind of caching to avoid unnecessary queries.
Just use MySQL, you definitely don't need Sphinx for what you are doing.
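A sketch of the plain-MySQL approach (table and column names assumed, not from the question):

```sql
-- Index the date column so the range filter is an index scan;
-- the matched rows are then sorted by score.
ALTER TABLE big_table ADD INDEX idx_created (created_at);

SELECT b.*, u.name
FROM big_table AS b
JOIN users AS u ON u.id = b.user_id
WHERE b.created_at > '2012-01-01'
ORDER BY b.score DESC
LIMIT 50;
```

Because the WHERE clause is a range, MySQL cannot use the same index to avoid the sort on score; if the sort becomes the bottleneck, caching the result (which only changes every 10 minutes or so) is the natural fix.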

User search use MySQL LIKE or integrate a search engine (such as solr, sphinx, lucene etc.)?

In my MySQL database I have a user table of roughly 37,000 users.
When a user searches for another user on the site, I perform a simple wildcard query (i.e. LIKE '{name}%') to return the users found.
Would it be more efficient and quicker to use a search engine such as Solr to do my LIKE searches? Furthermore, I believe in Solr I can use wildcard queries (http://www.lucidimagination.com/blog/2009/09/08/auto-suggest-from-popular-queries-using-edgengrams/).
To be honest, it's not that slow at the moment using a LIKE query, but as the number of users grows it'll become slower. Any tips or advice is greatly appreciated.
We had a similar situation about a month ago; our database is roughly around 33k rows, and because our engine was InnoDB we could not use the MySQL full-text search feature (that, and it being quite blunt).
We decided to implement Sphinx (http://www.sphinxsearch.com) and we're really impressed with the results (I've become quite a 'fanboy' of it).
If we do a large index search across many columns (loads of left joins) over all our rows, we actually halve the query response time compared to the MySQL LIKE counterpart.
Although we haven't used it for long, if you're going to build for future scalability I'd recommend Sphinx.
You can speed this up by requiring the search word to have a minimum of 3 characters before starting the search, and by indexing your search column with an index prefix length of 3 characters.
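A sketch of that prefix-index idea, using the user table from the question (column name assumed):

```sql
-- A 3-character prefix index: small, and enough to narrow the
-- candidate rows when the pattern has at least 3 leading characters.
ALTER TABLE users ADD INDEX idx_name_prefix (user_name(3));

-- A left-anchored LIKE can use the prefix index:
SELECT id, user_name FROM users WHERE user_name LIKE 'joh%';
```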
It's actually already built-in to MySQL: http://dev.mysql.com/doc/refman/5.0/en/fulltext-search.html
We're using Solr for this purpose, since you can search in 1-2 ms even with millions of documents indexed. We're mirroring our MySQL instance with the Data Import Handler and then searching on Solr.
As Neville pointed out, full-text searches are built into MySQL, but Solr's performance is much better, since it was built from the ground up as a full-text search engine.

MySQL: Optimizing Searches with LIKE or FULLTEXT

I am building a forum and I am looking for the proper way to build a search feature that finds users by their name or by the title of their posts. What I have come up with is this:
SELECT users.id, users.user_name, users.user_picture
FROM users, subject1, subject2
WHERE users.id = subject1.user_id
AND users.id = subject2.user_id
AND (users.user_name LIKE '%{$keywords}%'
OR subject1.title1 LIKE '%{$keywords}%'
OR subject2.title2 LIKE '%{$keywords}%')
ORDER BY users.user_name ASC
LIMIT 10
OFFSET {$offset}
The LIMIT and OFFSET are for pagination. My question is, would doing a LIKE search through multiple tables greatly slow down performance when the number of rows reaches a significant amount?
I have a few alternatives:
One, perhaps I can rewrite that query to have the LIKE searches done inside a subquery that only returns indexed user_ids. Then, I would find the remaining user information based on that. Would that increase performance by much?
Second, I suppose I can have the $keyword string appear before the first wildcard as in LIKE {$keyword}%. This way, I can index the user_name, title1, and title2 columns. However, since I will be trading accuracy for speed here, how much of a difference in performance would this make? Will it be worth sacrificing this much accuracy to index these columns?
Third, perhaps I can give users 3 search fields to choose from, and have each search through only one table. Would this increase performance by much?
Lastly, should I consider using a FULLTEXT search instead of LIKE? What are the performance differences between the two? Also, my tables are using the InnoDB storage engine, and I am not able to use the FULLTEXT index unless I switch to MyISAM. Will there be any major differences in switching to MyISAM?
Pagination is another performance issue I am worried about, because in order to do pagination, I would need to find the total number of results the query returns. At the moment, I am basically doing the query I just mentioned TWICE because the first time it is used only to COUNT the results.
There are two things in your query that will prevent MySQL from using indexes. First, your patterns start with a wildcard (%), and MySQL can't use indexes to search for patterns that start with a wildcard. Second, you have OR in your WHERE clause, which also prevents MySQL from using indexes; you need to rewrite your query using UNION to avoid the OR. Without an index, MySQL needs to do a full table scan every time, and the time needed for that grows linearly with the number of rows in your table. So yes, as you put it, it "would greatly slow down performance when the number of rows reach a significant amount", and I'd say your only real scalable option is to use FULLTEXT search.
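A sketch of the UNION rewrite described above, using the tables from the question; note the patterns must be left-anchored (LIKE 'keyword%') for the indexes to apply:

```sql
-- One indexed branch per searchable column, merged with UNION
-- (UNION also deduplicates users matching more than one branch).
SELECT users.id, users.user_name, users.user_picture
FROM users
WHERE users.user_name LIKE 'keyword%'
UNION
SELECT users.id, users.user_name, users.user_picture
FROM users JOIN subject1 ON users.id = subject1.user_id
WHERE subject1.title1 LIKE 'keyword%'
UNION
SELECT users.id, users.user_name, users.user_picture
FROM users JOIN subject2 ON users.id = subject2.user_id
WHERE subject2.title2 LIKE 'keyword%'
ORDER BY user_name ASC
LIMIT 10 OFFSET 0;
```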
Most of your questions are explained here: http://use-the-index-luke.com/sql/where-clause/searching-for-ranges/like-performance-tuning
InnoDB/fulltext indexing is announced for MySQL 5.6, but that will probably not help you right now.
How about starting with EXPLAIN <select-statement>? http://dev.mysql.com/doc/refman/5.6/en/explain.html
Switching to MyISAM should work seamlessly. The only downside is that MyISAM locks the whole table on inserts/updates, which can slow down tables with many more inserts than selects. A rule of thumb, in my opinion: use MyISAM when you don't need foreign keys and the table has far more selects than inserts, and use InnoDB when the table has far more inserts/updates than selects (e.g. a statistics table).
In your case I guess switching to MyISAM is the better choice, as a FULLTEXT index is far more powerful and faster.
It also gives you the possibility to use certain query modifiers, like excluding words ("cat -dog") and similar. But keep in mind that it's no longer possible to look for words ending with a phrase, as with a LIKE search ("*bar"); "foo*" will work, though.
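A minimal sketch of those modifiers in use (table and column names assumed, with a FULLTEXT index already on title):

```sql
-- Boolean mode: must contain 'cat', must not contain 'dog',
-- and 'foo*' matches words starting with 'foo' (trailing wildcard only).
SELECT id, title
FROM posts
WHERE MATCH(title) AGAINST('+cat -dog foo*' IN BOOLEAN MODE);
```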

Wildcard search on column(s) in a large table (>10.000.000 rows) in MySQL

Which techniques would you use to implement a search on the contents of a column in a very big table in MySQL? Say, for instance, that you have 10,000,000 emails stored in a table in the database and would like to implement a subject search that allows searching for one or more words present in the email subject. If the user searched for "christmas santa", it should find emails with subjects like "Santa visits us this christmas" and "christmas, will santa ever show".
My idea is to process all the words in the subjects (stripping all numbers, special signs, commas, etc.) and save each word in an index table with a unique index on the word column. Then I would link that to the email table via a many-to-many relationship table.
Is there a better way to perform wildcard searches on very big tables?
Are there databases that natively support this kind of search?
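The hand-rolled inverted index described in the question could look roughly like this (table and column names are illustrative):

```sql
-- One row per distinct word, plus a many-to-many link to emails.
CREATE TABLE words (
  word_id INT AUTO_INCREMENT PRIMARY KEY,
  word VARCHAR(64) NOT NULL,
  UNIQUE KEY uk_word (word)
);

CREATE TABLE email_words (
  email_id INT NOT NULL,
  word_id  INT NOT NULL,
  PRIMARY KEY (email_id, word_id)
);

-- "christmas santa": emails containing both words.
SELECT ew1.email_id
FROM email_words ew1
JOIN words w1 ON w1.word_id = ew1.word_id AND w1.word = 'christmas'
JOIN email_words ew2 ON ew2.email_id = ew1.email_id
JOIN words w2 ON w2.word_id = ew2.word_id AND w2.word = 'santa';
```

This is essentially what a FULLTEXT index or an engine like Sphinx/Lucene builds for you, which is why the answers below point in that direction.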
You could use FULLTEXT indexes if you are using MyISAM as the storage engine. However, MySQL in general is not very good with text search.
A much better option would be to go with a dedicated text indexing solution such as Lucene or Sphinx. Personally I'd recommend Sphinx - it has great integration with PHP and MySQL and is very, very fast (can be used to speed up even ordinary queries - performs very fast grouping and ordering).
Wikipedia has a nice list of different indexing engines - here.
MySQL's MyISAM tables support a FULLTEXT index, which helps in this kind of search.
But it's not the speediest technology available for this kind of search. And you can't use it on data stored in InnoDB tables.
I've heard some good things about Sphinx Search, but I haven't used it yet.
Here's another blog about Sphinx: http://capttofu.livejournal.com/13037.html
While a MySQL FULLTEXT index is possible, I suspect I would look at using something designed to be a search engine, like Lucene.
This sounds like a full-text search, which SQL Server supports.
But your idea is generally sound. You're effectively computing an "index" on your table in advance to speed up searches.
You want to look at the MATCH...AGAINST function.
See, for example: Using MySQL Full-text Searching
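A minimal sketch of MATCH ... AGAINST for the email-subject case (MyISAM table assumed):

```sql
ALTER TABLE emails ADD FULLTEXT INDEX ft_subject (subject);

SELECT id, subject
FROM emails
WHERE MATCH(subject) AGAINST('christmas santa');
```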
check "full text search" in MySQL docs (AFAIK, all current DBMS support this)