Full Text Search with Node.js - MySQL

I want to implement full-text search (FTS) queries in my Node.js application. The database I am using is MySQL. I know that MySQL has built-in support for FTS, but sadly it does not handle singular/plural forms, synonyms, or inflected words.
There are other FTS libraries available that can work with MySQL. The following are the two I am interested in:
Lucene
Sphinx Search
I am fairly sure that Sphinx Search has an npm package and can be used with Node.js, but I am not sure whether Lucene can.
Please let me know if Lucene can be used with Node.js, and if so, point me to the documentation for it.
Thanks !
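(For context, the singular/plural gap in MySQL's built-in FTS can be partly worked around with boolean mode and trailing wildcards, so that `run*` matches "run", "runs" and "running". Below is a minimal sketch from Node; the `posts` table, its columns, and the FULLTEXT index are assumptions, not anything MySQL ships with.)

```javascript
// Workaround for missing stemming: rewrite each search term into a
// boolean-mode prefix query, e.g. "run shoe" -> "run* shoe*".
function toPrefixQuery(terms) {
  return terms.trim().split(/\s+/).map(t => t + '*').join(' ');
}

// Usage with the `mysql` npm package (connection details omitted);
// assumes a table `posts` with FULLTEXT KEY (title, body):
//
//   const mysql = require('mysql');
//   const connection = mysql.createConnection({ /* ... */ });
//   connection.query(
//     'SELECT * FROM posts WHERE MATCH(title, body) AGAINST(? IN BOOLEAN MODE)',
//     [toPrefixQuery('run shoe')],
//     (err, rows) => { /* ... */ }
//   );

console.log(toPrefixQuery('run shoe')); // 'run* shoe*'
```

This only approximates stemming (it catches "runs"/"running" but not irregular forms like "ran"), which is why a real FTS engine is still worth considering.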

You have several alternatives, such as query-engine, and several other tools built on Lucene.
Also, if you want to use FTS with Node, you could have a look at Norch, as suggested in this answer on a similar topic.
Best,

Maintainer of search-index and Norch here. They might be what you are looking for. You can even use your MySQL database as a back end if you want.
https://github.com/fergiemcdowall/search-index (the lib)
https://github.com/fergiemcdowall/norch (the server)

Related

Can search-index be used for a project running on Node.js with MySQL as the database?

I have been looking for a suitable implementation of full text search for my application running on Node.js using MySQL as database. I have considered various options:
clucene - however, this has not been updated for a long time
lunr.js - more suitable for smaller data searches as it uses front end processing
Currently, the most promising one seems to be search-index, but I am not sure whether it can be used when the project has MySQL as its DB instead of LevelDB. According to what I found online, any database that is levelUP-compliant (i.e. has a "Down" component) should work.
Does anyone have experience implementing search-index with MySQL? It would be great to hear any suggestions about this, or about any other full-text search packages that work with Node.js and MySQL.
Yes, you can absolutely use search-index with MySQL; in fact they make a great fit.
Use the mysqljs package, and check out: https://github.com/mysqljs/mysql#piping-results-with-streams2 . This will allow you to do the following:
const mysql = require('mysql');
const connection = mysql.createConnection({ /* host, user, password, database */ });

connection.query('SELECT * FROM posts')
  .stream({highWaterMark: 5})
  .pipe(mySearchIndex.defaultPipeline())
  .pipe(mySearchIndex.add()) // <- MySQL rows are now searchable in mySearchIndex

Indexing MySQL data with ElasticSearch

I would like to get some feedback from anyone who's had experience indexing MySQL data with ElasticSearch for full-text searching. How did you accomplish this? I've been researching this a bit and unfortunately I've noticed that ElasticSearch has no official plugin to accomplish this although I've come across three different 3rd party tools:
elasticsearch-river-jdbc
go-mysql-elasticsearch
elasticsearch-river-mysql
I'm unsure which one would be best in terms of performance, although I suspect the Go tool might have an advantage due to its compiled nature and the fact that it reads the MySQL binary logs. I would appreciate any advice or examples anyone could provide.
Thanks!
UPDATE:
You can now use Logstash to do it; there is a useful blog post about it.
Well, you can use MySQL UDF functions in MySQL triggers to execute external scripts, which will index the newly updated or inserted data in Elasticsearch. That's one way. Check the MySQL docs for how-tos on UDFs.
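If none of those tools fits, a hand-rolled alternative is to select the rows yourself and bulk-index them with the official Elasticsearch client. Below is a sketch of the reusable part, building an Elasticsearch bulk-request body from MySQL rows; the `posts` table, its columns, and the index/type names are assumptions.

```javascript
// Turn MySQL rows into an Elasticsearch bulk-request body:
// alternating action lines ({index: ...}) and document sources.
function toBulkBody(index, type, rows) {
  const body = [];
  for (const row of rows) {
    body.push({ index: { _index: index, _type: type, _id: row.id } });
    const doc = Object.assign({}, row); // copy the row, drop the id column
    delete doc.id;
    body.push(doc);
  }
  return body;
}

// Usage with the official `elasticsearch` client (cluster details omitted):
//
//   const elasticsearch = require('elasticsearch');
//   const client = new elasticsearch.Client({ host: 'localhost:9200' });
//   connection.query('SELECT * FROM posts', (err, rows) => {
//     if (err) throw err;
//     client.bulk({ body: toBulkBody('posts', 'post', rows) });
//   });

console.log(toBulkBody('posts', 'post', [{ id: 1, title: 'hello' }]));
```

Unlike the binlog-based tools, this only gives you one-shot indexing; keeping the index in sync on updates is what the trigger/UDF or Logstash approaches above solve.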

Locomotive with mysql

I am planning to use Locomotive for a project, but the official website says it cannot work with MySQL. However, my requirement is to use it with MySQL.
Has anyone used it with MySQL? Any pointers or advice would be great. Thanks.
Cheers,
Abi
Locomotive uses Mongoid - which is an ORM for MongoDB. As such, you cannot use it with MySQL at all, since it leverages some features (dynamic attributes), which traditional RDBMS databases (such as MySQL) do not support.
That being said, MongoDB is pretty easy to install, so if you can get around your requirement of MySQL, then you should have no problems.
Decoupling Locomotive from MongoDB would be no small task at the moment, which is not to say that it is impossible.
Regarding your requirement of MySQL, if you are working on incorporating Locomotive into a larger web-application, one thing to consider is that your non-Locomotive models can still live in MySQL regardless of where Locomotive keeps its data.
From what I know it won't be possible to use Locomotive with MySQL only, as it relies heavily on MongoDB.
However, it's not a problem to build your app alongside Locomotive and use whatever ORM you like.
Just configure your database.yml and the magic will happen ;)
Cheers,
Gregory Horion

Search engine for web app - multilingual and multi-database

I am working on a website project. We have a MySQL and a MongoDB database.
We want to add a full-text search engine over these databases (and if it can also be linked with PostgreSQL, even better).
These databases contain multilingual text, but we cannot determine the language in advance.
I have looked at Solr, ElasticSearch and Sphinx; what is your advice on this topic?
Solr and Sphinx have stemming, but I am not sure we can use it without knowing the content's language...
Elastic is full JSON, which could be a better fit as we use MongoDB more and more...
It doesn't matter which search engine you use: stemming is highly language-dependent. IMHO you'll have to somehow detect the language in order to feed the text to the proper stemmer.
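As a toy illustration of that detect-then-route idea, here is a stopword-counting language guesser. This is only a sketch: the stopword lists are tiny assumptions, and a real system would use a proper detection library or service.

```javascript
// Minimal language guess by stopword counting -- illustrates
// "detect the language, then route to the matching stemmer/analyzer".
const STOPWORDS = {
  en: ['the', 'and', 'is', 'of', 'to'],
  fr: ['le', 'la', 'et', 'est', 'de'],
};

function guessLanguage(text) {
  const words = text.toLowerCase().split(/\W+/);
  let best = null;
  let bestScore = 0;
  for (const [lang, stops] of Object.entries(STOPWORDS)) {
    const score = words.filter(w => stops.includes(w)).length;
    if (score > bestScore) { best = lang; bestScore = score; }
  }
  return best; // null when no stopword matched
}

console.log(guessLanguage('the cat is on the mat'));    // 'en'
console.log(guessLanguage('le chat est sur le tapis')); // 'fr'
```

Once you have a language tag per document, each engine lets you pick a per-language analyzer or stemmer at index time.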
There is a product from Basis Technologies called the Rosette Language Platform that does autodetection of languages that you might look into.
Solr supports JSON for results (and indexing?) if that is a key integration mechanism. I would put "JSON" support a bit further down the list of things to scorecard on, and focus on how relevant the results from search engine X will be for your domain.

Spider that tosses results into mysql

Looking to use Sphinx for site search, but not all of my site is in MySQL. Rather than reinvent the wheel, I am just wondering if there's an open-source spider that easily puts its findings into a MySQL database so that Sphinx can then index them.
Thanks for any advice.
There's also the XML pipe datasource that can feed documents to Sphinx. Not sure if it would be any easier to output your site's content as XML than to insert it into the DB, but it's an option.
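For what it's worth, emitting that XML from a crawler is not much code. Here is a sketch of an xmlpipe2-style emitter in Node; the `title`/`content` field names and the page objects are assumptions about what your spider collects.

```javascript
// Escape the characters that would break XML content.
function escapeXml(s) {
  return s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
}

// Emit crawled pages in Sphinx's xmlpipe2 format: a schema block
// followed by one <sphinx:document> per page.
function toXmlPipe2(pages) {
  const docs = pages
    .map((p, i) =>
      `<sphinx:document id="${i + 1}">` +
      `<title>${escapeXml(p.title)}</title>` +
      `<content>${escapeXml(p.content)}</content>` +
      `</sphinx:document>`)
    .join('\n');
  return '<?xml version="1.0" encoding="utf-8"?>\n' +
    '<sphinx:docset>\n' +
    '<sphinx:schema><sphinx:field name="title"/><sphinx:field name="content"/></sphinx:schema>\n' +
    docs + '\n</sphinx:docset>\n';
}

process.stdout.write(toXmlPipe2([{ title: 'Home', content: 'Hello & welcome' }]));
```

You would point a Sphinx source of type xmlpipe2 at this script's stdout, so the crawl results never need to touch MySQL at all.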
If you're not 100% set on using Sphinx, you could consider Lucene, like this site does. This should work regardless of the underlying technology (database-driven or static pages).
I am also currently looking to implement a site search. This question may also help.