Can MySQL interact with Elasticsearch to find a match?

I have a WordPress website with a MySQL database of real estate properties listed for sale, and an Elasticsearch database of addresses I have curated.
Can MySQL query the Elasticsearch database to find out whether there is a matching address, then send the result back to WordPress to move the property into an "xyz" category?
If not, does anyone know a way to make this type of process happen?
Thanks

I don't know of any way MySQL can interact directly with ElasticSearch.
Typically you would develop code to interact with both ElasticSearch and MySQL in turn. This is done with two separate queries: the first queries ElasticSearch to find a matching address, and the second runs against MySQL, using the information returned by the first query as a parameter.
WordPress in this context is a client application for both MySQL and ElasticSearch. But each has its own protocol and connector. MySQL and ElasticSearch don't share data or queries.
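For illustration, here is a rough sketch of that two-query flow in TypeScript, using the @elastic/elasticsearch (8.x) and mysql2 clients. In a WordPress setup you would write the equivalent in PHP; the index, table and column names ("curated_addresses", "properties", "category") are placeholders rather than anything from the question.
```typescript
import { Client } from "@elastic/elasticsearch";
import mysql from "mysql2/promise";

const es = new Client({ node: "http://localhost:9200" });

async function categorizeIfCuratedAddress(propertyAddress: string) {
  // Query 1: ask Elasticsearch whether the curated index contains this address.
  const result = await es.search({
    index: "curated_addresses", // placeholder index name
    query: { match: { address: propertyAddress } },
  });
  if (result.hits.hits.length === 0) return; // no curated match, nothing to do

  // Query 2: use that answer as a parameter for a MySQL update.
  const db = await mysql.createConnection({
    host: "localhost",
    user: "wp",
    password: "secret",
    database: "wordpress",
  });
  try {
    // "properties" and "category" stand in for however your WordPress schema
    // actually stores listings and their categories.
    await db.execute("UPDATE properties SET category = ? WHERE address = ?", [
      "xyz",
      propertyAddress,
    ]);
  } finally {
    await db.end();
  }
}
```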
Update: I found a way to use an ElasticSearch data source via MySQL's FEDERATED engine, through a proxy that translates MySQL's protocol into ODBC requests to the ElasticSearch node. This is probably way too complex for your needs, but it is possible. See https://www.cdata.com/kb/tech/elasticsearch-odbc-mysql-federated-tables.rst
By analogy, this is like asking "can I drive my car while making my car drive another car?" Yes, you can — just weld your car's steering rods to some long metal struts that operate the hands of a puppet version of you sitting in the other car.
I don't recommend it.

Related

How to synchronize MySQL database with Amazon OpenSearch service

I am new to the Amazon OpenSearch service, and I wish to know if there's any way I can sync a MySQL DB with OpenSearch in real time. I thought of Logstash, but it seems like it doesn't support delete and update operations, which means those changes might not reach my OpenSearch cluster.
I'm going to comment for Elasticsearch as that is the tag used for this question.
You can:
Read from the database (SELECT * FROM table)
Convert each record to a JSON document
Send the JSON document to Elasticsearch, preferably using the _bulk API.
Logstash can help with that. But I'd recommend modifying the application layer if possible and sending data to Elasticsearch in the same "transaction" as you send your data to the database.
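As a rough illustration of those three steps (a sketch, not anything from the original answer), assuming the mysql2 and @elastic/elasticsearch (8.x) clients and a made-up products table and index:
```typescript
import { Client } from "@elastic/elasticsearch";
import mysql from "mysql2/promise";
import { RowDataPacket } from "mysql2";

const es = new Client({ node: "http://localhost:9200" });

async function reindexTable() {
  const db = await mysql.createConnection({
    host: "localhost",
    user: "app",
    password: "secret",
    database: "mydb",
  });

  // 1. Read from the database.
  const [rows] = await db.query<RowDataPacket[]>("SELECT * FROM products");

  // 2. Convert each record to a JSON document, paired with a bulk action line.
  const operations = rows.flatMap((row) => [
    { index: { _index: "products", _id: String(row.id) } },
    row, // the row object is serialized as the JSON document
  ]);

  // 3. Send everything through the _bulk API.
  const response = await es.bulk({ operations, refresh: true });
  if (response.errors) console.error("Some documents failed to index");

  await db.end();
}
```
For the application-layer approach recommended above, the equivalent would be a single es.index() call for the affected document right after (or inside) the code path that writes it to the database.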
I shared most of my thoughts here: http://david.pilato.fr/blog/2015/05/09/advanced-search-for-your-legacy-application/
Also have a look at this "live coding" recording.
Side note: if you want to run Elasticsearch, have a look at Cloud by Elastic, also available if needed from the AWS Marketplace, Azure Marketplace and Google Cloud Marketplace.
Cloud by Elastic is one way to have access to all of the features, all managed by us. Think of what is already there, like Security, Monitoring, Reporting, SQL, Canvas, Maps UI, Alerting, and the built-in solutions named Observability, Security and Enterprise Search, plus whatever is coming next :) ...
Disclaimer: I'm currently working at Elastic.
Keep a column that indicates when the row was last modified; then you will be able to push updates to OpenSearch. Similarly, for deletes, have a column indicating whether the row is deleted (a soft delete), plus the date it was deleted.
With this DB design, you can send "update" or "delete" actions to OpenSearch/Elasticsearch based on the last-modified and deleted dates, and later run a scheduled maintenance job to permanently delete the soft-deleted rows from the database table.
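A minimal sketch of that incremental sync, assuming the @opensearch-project/opensearch and mysql2 clients and invented column names (updated_at, is_deleted, deleted_at) on a hypothetical products table; lastSync would be persisted somewhere between runs:
```typescript
import { Client } from "@opensearch-project/opensearch";
import mysql from "mysql2/promise";
import { RowDataPacket } from "mysql2";

const os = new Client({ node: "https://localhost:9200" });

async function syncSince(lastSync: Date) {
  const db = await mysql.createConnection({
    host: "localhost",
    user: "app",
    password: "secret",
    database: "mydb",
  });

  // Rows touched since the last sync, including soft-deleted ones.
  const [rows] = await db.query<RowDataPacket[]>(
    "SELECT * FROM products WHERE updated_at > ? OR deleted_at > ?",
    [lastSync, lastSync]
  );

  // Soft-deleted rows become bulk "delete" actions, everything else "index".
  const body = rows.flatMap((row) =>
    row.is_deleted
      ? [{ delete: { _index: "products", _id: String(row.id) } }]
      : [{ index: { _index: "products", _id: String(row.id) } }, row]
  );

  if (body.length > 0) {
    const response = await os.bulk({ body });
    if (response.body.errors) console.error("Some bulk actions failed");
  }
  await db.end();
}
```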
Lastly, this article might be of help to you: How to keep Elasticsearch synchronized with a relational database using Logstash and JDBC

Implementing a search with Elasticsearch using MySQL data

I am new to Elasticsearch. I was using MySQL full-text features until now.
I want to keep MySQL as my primary database and use Elasticsearch alongside it as the search engine for my website. I ran into several problems when thinking about it. The main problem is syncing between the MySQL database and Elasticsearch.
Some say to use Logstash. But even if I use it, would I need to write separate functions in my program for database transactions and Elasticsearch indexing?
You will need to run a periodic job doing a full reindex and/or send individual document updates for ES indexing. Logstash sounds ill-suited for the purpose; you just need the usual ES API to index stuff.
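As a sketch of what "just the usual ES API" can look like (assuming the @elastic/elasticsearch 8.x client and a made-up products index), the per-document path is a single index call, and the periodic job is just a scheduled wrapper around a full reindex like the bulk example shown earlier:
```typescript
import { Client } from "@elastic/elasticsearch";

const es = new Client({ node: "http://localhost:9200" });

// Individual document update: call this right after the MySQL write succeeds.
async function indexOne(product: { id: number; name: string; price: number }) {
  await es.index({ index: "products", id: String(product.id), document: product });
}

// Periodic full reindex as a safety net (e.g. once a day); the body would be
// the SELECT-and-_bulk routine from the earlier sketch.
async function fullReindex(): Promise<void> {
  // ... read all rows from MySQL and push them with es.bulk()
}
setInterval(() => fullReindex().catch(console.error), 24 * 60 * 60 * 1000);
```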

Elastic Stack: difference between mysqlbeat and the Logstash jdbc input

Is there any difference, or a recommendation for using the first or the second one?
Both bring MySQL data to Elastic.
Thanks in advance:
https://github.com/adibendahan/mysqlbeat
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html
Push vs pull: Logstash's jdbc input pulls data from remote SQL servers on a cron schedule, while the beat pushes the results of its query to Elasticsearch.
Location: Another thing to consider is that this beat will be running on every SQL server, vs having one Logstash with multiple jdbc input plugin blocks. There are pros and cons to this depending on your scale. If you have thousands of databases, it isn't scalable to have Logstash querying each one, particularly if the list of databases and queries are constantly changing. It would be much easier to manage one beat per SQL server. If you have a simple setup with just a few databases, it would probably be faster to use the Logstash input plugin, because then you wouldn't have another service to maintain (the beat).

Increase app performance with Elasticsearch on NodeJS

I didn't find exactly what I'm looking for, and nobody could explain it to me, so I'm asking here. I hope this is not a duplicate…
I have an application that runs on NodeJS with the Sequelize ORM and a MySQL database. The project started one year ago.
Now I would like to increase performance, etc., by installing Elasticsearch, but I'm not sure if I can.
Can I just tell my Elasticsearch server where my database is and have it magically do the mapping on its own?
Or do I have to insert my data into Elasticsearch myself?
All my infrastructure is hosted on AWS, and I'm using RDS. Maybe that will help?
In fact, I don't know where to put the Elasticsearch layer. Do you have any ideas to help me?
I have already worked with Symfony and FOSElasticaBundle, but it works so well that I don't know how it does it.
UPDATE:
You can use the ES HTTP API.
You can bulk index the existing data from your database.
You have to handle changes in your database, so you can use the observer pattern, with methods such as created(), saved() and deleted(); in these methods you handle the ES actions to create, update or delete a document (see the sketch after this list).
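Since the question mentions Sequelize, here is one hedged way to wire that up, assuming Sequelize v6, the @elastic/elasticsearch 8.x client and an invented Product model; Sequelize's afterSave and afterDestroy hooks play the role of the saved() and deleted() observers mentioned above:
```typescript
import { Sequelize, DataTypes, Model } from "sequelize";
import { Client } from "@elastic/elasticsearch";

const es = new Client({ node: "http://localhost:9200" });
const sequelize = new Sequelize("mysql://app:secret@localhost:3306/mydb");

// A made-up model; replace with your real Sequelize models.
class Product extends Model {
  declare id: number;
  declare name: string;
  declare price: number;
}
Product.init(
  { name: DataTypes.STRING, price: DataTypes.FLOAT },
  { sequelize, modelName: "Product" }
);

// Mirror every write to Elasticsearch: create/update -> index, destroy -> delete.
Product.afterSave(async (product) => {
  await es.index({
    index: "products",
    id: String(product.id),
    document: product.toJSON(),
  });
});

Product.afterDestroy(async (product) => {
  await es.delete({ index: "products", id: String(product.id) });
});
```
Keep in mind these hooks only fire for writes that go through Sequelize instances, not for raw SQL or bulk operations, so a periodic reconciliation job is still a good idea.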
Elasticsearch (ES) is a speed machine with a lot of possibilities (advanced queries, stats, etc.), not a database (read why here: https://www.elastic.co/blog/found-elasticsearch-as-nosql).
It's distributed (can work/share work via multiple servers)
It's document based (json documents, for example with nested, parent/child relationships)
It's mapping based
It's indexing based
More here: https://www.elastic.co/products/elasticsearch
You can use types in ES (like tables in your database).
ES uses indexed data, based on the mapping. Therefore, you still need a database, and you index its data (denormalized) into ES. You can then query, filter, aggregate and analyze the data in your index.
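To make "mapping based" and "denormalized" concrete, here is a small sketch using the @elastic/elasticsearch 8.x client with an invented orders index, where the customer data is embedded in each order document instead of being joined at query time:
```typescript
import { Client } from "@elastic/elasticsearch";

const es = new Client({ node: "http://localhost:9200" });

async function setupAndIndex() {
  // The mapping declares field types up front, much like a table schema would.
  await es.indices.create({
    index: "orders",
    mappings: {
      properties: {
        total: { type: "float" },
        placed_at: { type: "date" },
        customer: {
          // denormalized: copied from the customers table into every order
          properties: {
            name: { type: "text" },
            city: { type: "keyword" },
          },
        },
      },
    },
  });

  // One denormalized document: the customer is embedded, not joined.
  await es.index({
    index: "orders",
    id: "1",
    document: {
      total: 99.5,
      placed_at: "2023-01-15",
      customer: { name: "Jane Doe", city: "Paris" },
    },
  });
}
```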

Join tables from multiple remote databases

I have a MySQL database on server A, a Postgres database on server B and another MySQL database on server C. I need a way to join tables from the three servers to get a combined result. Is there a way to do this in Ruby? If not Ruby, any other language will also suffice.
I need to join somewhere around a few thousand rows of data. The joined data needs to be pushed to Elasticsearch; I was going to use the _bulk API in Elasticsearch to push it.
If it's a one-off, just download the data from two of the DBs and upload it to the third. Do your work there.
If it's a regular thing, it might be worthwhile putting the effort into linking the databases properly. PostgreSQL offers Foreign Data Wrapper plugins that let you talk to a different database (PostgreSQL or others). One of them is mysql_fdw.
You define an entry for each remote server, user mappings between the local user and the remote user, and then describe each table you want to access. After that you can treat the foreign tables as local (although performance will generally be much worse, of course).
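For illustration only, this is roughly what that mysql_fdw setup looks like, written here as a TypeScript script that runs the SQL through the pg client (you could equally paste the same statements into psql); the server name, credentials and table/column names are all made up:
```typescript
import { Client } from "pg";

async function linkMysqlServer() {
  const pg = new Client({ connectionString: "postgres://app:secret@server-b/mydb" });
  await pg.connect();

  // Install the wrapper and point it at the remote MySQL server (server A).
  await pg.query(`CREATE EXTENSION IF NOT EXISTS mysql_fdw`);
  await pg.query(`
    CREATE SERVER server_a
      FOREIGN DATA WRAPPER mysql_fdw
      OPTIONS (host 'server-a.example.com', port '3306')`);

  // Map the local Postgres user to a MySQL account on server A.
  await pg.query(`
    CREATE USER MAPPING FOR CURRENT_USER
      SERVER server_a
      OPTIONS (username 'app', password 'secret')`);

  // Describe one remote table; after this it can be joined like a local table.
  await pg.query(`
    CREATE FOREIGN TABLE properties_a (
      id integer,
      address text
    )
    SERVER server_a
    OPTIONS (dbname 'realestate', table_name 'properties')`);

  // Example cross-server join, treating the foreign table as if it were local.
  const { rows } = await pg.query(
    `SELECT l.id, l.name, a.address
       FROM local_listings l
       JOIN properties_a a ON a.id = l.property_id`
  );
  console.log(rows.length, "joined rows ready for the _bulk API");

  await pg.end();
}
```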
In theory you could write your own fdw plugin to link to elasticsearch too but there doesn't seem to be one currently available.
You can do it with LINQ and Entity Framework in Microsoft .NET.