Good implementation of sending data to a REST API? - mysql

Each day hundreds of thousands of items are inserted, updated and deleted on our service (backend using .NET and a MySQL database).
Now we are integrating our service with another service using their RESTful API. Each time an item is inserted, updated or deleted on our service, we also need to connect to their web service and use POST, PUT or DELETE.
What is a good implementation for a case like this?
Connecting to their API each time a user inserts an item on our service seems like a poor idea, as it would make the experience quite slow for the user.
Another idea was to update our database as usual, then set up another server that constantly polls our database and fetches the data that needs to be posted to the RESTful API. Is this the way to go?
How would you solve it? Any guides on implementing something like this would be great! Thanks!

It depends on whether a delay in updating the other service is acceptable. If not, then create an event and put it in the queue of an event processor that can send it to the second service.
If a delay is acceptable, a background batch job can run periodically and send the data.
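Either way, the core of the pattern is the same: record the change locally, then let a separate worker forward it. Here is a minimal sketch of such a worker (in Node.js; the same shape applies in .NET), where the `outbox` table, its columns and the partner URL are all hypothetical:

```javascript
// Outbox-worker sketch: the normal insert/update/delete code path writes a
// row into a hypothetical `outbox` table; this worker forwards pending rows.
const mysql = require('mysql2/promise');

async function drainOutbox() {
  const db = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'shop' });
  const [rows] = await db.execute(
    'SELECT id, method, payload FROM outbox WHERE processed = 0 ORDER BY id LIMIT 100'
  );
  for (const row of rows) {
    // Forward each event to the partner API (URL is a placeholder).
    const res = await fetch('https://partner.example.com/items', {
      method: row.method, // 'POST' | 'PUT' | 'DELETE'
      headers: { 'Content-Type': 'application/json' },
      body: row.method === 'DELETE' ? undefined : row.payload,
    });
    if (res.ok) {
      await db.execute('UPDATE outbox SET processed = 1 WHERE id = ?', [row.id]);
    }
    // On failure the row stays unprocessed and is retried on the next run.
  }
  await db.end();
}

// The user's request returns immediately; this loop does the slow HTTP work.
setInterval(() => drainOutbox().catch(console.error), 10_000);
```

The user-facing insert stays fast because the only extra work on that path is one local row write; all the remote HTTP latency lives in the worker.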

Related

How to get real-time changes into a MySQL database and have an instant response in the frontend

I am trying to build a simple project online. I have a MySQL database where I will store different information, such as fake orders made by fake clients. The application will have a frontend made with JavaScript and HTML/CSS, while the backend will be a Node/Express API that handles all the requests and manages the database.
I would like to know whether there is a way to refresh my page each time a new order is made, and see, for example, a new column in a hypothetical table in my HTML with minimum latency, while avoiding a request from the client every x seconds. Polling like that could be quite expensive in terms of bandwidth and also pretty inefficient.
My thought was that each time I connect to the site, I get subscribed to a sort of list on the server, which broadcasts a trigger to update the frontend whenever the UPDATE function is triggered in the backend. In other words, every time an update happens on the backend, the server sends a trigger to the clients it knows are currently connected; the frontend then asks for the update directly.
This solution seems really complicated to handle and may not be that performant. I was wondering if there were some functionalities of the frontend, the backend, or the database, or any framework, that would allow me to do this.
I would like everything to be as real-time as possible while using the least bandwidth possible. This is because I would like to use the free tier of some online service, and I don't want to consume all the bandwidth.
If you have suggestions for frameworks, functionalities, or protocols, you are welcome. Thank you a lot in advance.
You can use websockets. When a user creates an order, and after it is successfully saved to the database, the backend pushes or publishes the data to the clients subscribed to a specific channel. The logic is not complicated at all; it is called the pub/sub pattern, and you should search for it.
Also, https://socket.io/ is a library used on both the backend and the frontend to deal with websockets.
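To make the pub/sub idea concrete, here is a minimal sketch with Express and socket.io. `saveOrderToMySQL` and `addRowToOrdersTable` are assumed application helpers, and the `order:created` event name is arbitrary:

```javascript
// Server (Node/Express + socket.io): push new orders to subscribed clients.
const express = require('express');
const { createServer } = require('http');
const { Server } = require('socket.io');

const app = express();
app.use(express.json());
const httpServer = createServer(app);
const io = new Server(httpServer);

app.post('/orders', async (req, res) => {
  const order = await saveOrderToMySQL(req.body); // assumed existing DB write
  io.emit('order:created', order);                // publish only after the save succeeds
  res.status(201).json(order);
});

httpServer.listen(3000);

// Browser side (loaded via the socket.io client script):
//   const socket = io();
//   socket.on('order:created', (order) => addRowToOrdersTable(order));
```

Since the server only sends data when something actually changes, this stays well within a free tier's bandwidth compared with polling every x seconds.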

How to synchronize MySQL database with Amazon OpenSearch service

I am new to the Amazon OpenSearch service, and I wish to know if there's any way I can sync a MySQL db with OpenSearch in real time. I thought of Logstash, but it seems it doesn't support delete and update operations, which means my OpenSearch cluster might not stay up to date.
I'm going to comment for Elasticsearch as that is the tag used for this question.
You can:
Read from the database (SELECT * FROM TABLE)
Convert each record to a JSON document
Send the JSON document to Elasticsearch, preferably using the _bulk API.
Logstash can help with that. But I'd recommend modifying the application layer if possible and sending data to Elasticsearch in the same "transaction" as you send it to the database.
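As a rough sketch of those three steps with the official Node.js client (v8-style API; the table name, index name and connection details are placeholders):

```javascript
// SELECT rows, convert them to JSON documents, and index them via _bulk.
const { Client } = require('@elastic/elasticsearch');
const mysql = require('mysql2/promise');

async function reindex() {
  const es = new Client({ node: 'http://localhost:9200' });
  const db = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'shop' });

  // 1. Read from the database.
  const [rows] = await db.execute('SELECT * FROM products');

  // 2 + 3. The bulk body alternates an action line and a document line.
  const operations = rows.flatMap((row) => [
    { index: { _index: 'products', _id: String(row.id) } },
    row,
  ]);
  const result = await es.bulk({ refresh: true, operations });
  if (result.errors) console.error('some documents failed to index');

  await db.end();
}

reindex().catch(console.error);
```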
I shared most of my thoughts there: http://david.pilato.fr/blog/2015/05/09/advanced-search-for-your-legacy-application/
Also have a look at this "live coding" recording.
Side note: if you want to run Elasticsearch, have a look at Cloud by Elastic, also available if needed from the AWS, Azure and Google Cloud marketplaces.
Cloud by Elastic is one way to get access to all the features, all managed by us. Think of what is already there, like Security, Monitoring, Reporting, SQL, Canvas, Maps UI, Alerting, and the built-in solutions named Observability, Security and Enterprise Search, plus what is coming next :) ...
Disclaimer: I'm currently working at Elastic.
Keep a column that indicates when the row was last modified; then you will be able to push updates to OpenSearch. Similarly for deletion, have a column indicating whether the row is deleted (a soft delete), along with the date it was deleted.
With this DB design, you can send the "delete" or "update" actions to OpenSearch/Elasticsearch to update or delete the indexed documents based on the last-modified/deleted date. You can later have a scheduled maintenance job that permanently deletes these rows from the database table.
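A sketch of what the periodic job could look like under that design (the column names `last_modified` and `is_deleted`, the `products` table and the index name are all illustrative):

```javascript
// Incremental sync: upsert modified rows, delete soft-deleted ones.
const { Client } = require('@elastic/elasticsearch');
const mysql = require('mysql2/promise');

async function syncSince(lastRun) {
  const es = new Client({ node: 'http://localhost:9200' });
  const db = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'shop' });

  // Pick up every row touched since the previous run, deletes included.
  const [rows] = await db.execute(
    'SELECT * FROM products WHERE last_modified > ?',
    [lastRun]
  );

  // Soft-deleted rows become delete actions; everything else is re-indexed.
  const operations = rows.flatMap((row) =>
    row.is_deleted
      ? [{ delete: { _index: 'products', _id: String(row.id) } }]
      : [{ index: { _index: 'products', _id: String(row.id) } }, row]
  );
  if (operations.length) await es.bulk({ operations });

  await db.end();
}
```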
Lastly, this article might be of help to you: How to keep Elasticsearch synchronized with a relational database using Logstash and JDBC

What is the best solution to synchronize MySQL and Cloud Firestore

Currently I am using MySQL as the main database, which is not a real-time database. To synchronize client-side data with the server and keep data while offline, I am considering adding a real-time database as a slave database in my architecture. Synchronizing data is not easy, so I want to use Cloud Firestore.
From what I have found so far, there seems to be no practical way to synchronize a non-real-time RDBMS (in my case MySQL) with Cloud Firestore. I cannot migrate the current data to Cloud Firestore because other services depend on it.
If there is no practical solution for this, please suggest the best way. Thanks.
Not sure, but it seems there is no ready-made solution to synchronize them; in a case like this, the synchronization has to be implemented manually on the client side.
I have been thinking about this as well. What I have thought up so far is this.
Use Firestore to create a document.
Write a cloud function that listens to create events on the collection storing that document.
Send the created document over HTTP to a REST endpoint that stores the data in a relational DB (MySQL, Postgres); if the server is down or the status code is anything other than 200, set a boolean flag on the document called syncFailed to true.
Write a worker that periodically fetches documents where syncFailed is true and sends them to the endpoint again until the sync succeeds.
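Steps 2 and 3 could look roughly like this as a Firebase Cloud Function; the collection name, the endpoint URL, and the availability of a global fetch (Node 18+ runtime) are assumptions:

```javascript
// Cloud Function sketch: forward each newly created document to the REST
// endpoint that writes to the relational DB, flagging the document on failure.
const functions = require('firebase-functions');

exports.syncOrderToSql = functions.firestore
  .document('orders/{orderId}')
  .onCreate(async (snap) => {
    try {
      const res = await fetch('https://api.example.com/orders', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(snap.data()),
      });
      if (res.status !== 200) throw new Error(`sync failed: ${res.status}`);
    } catch (err) {
      // Step 4's worker queries for this flag and retries the send.
      await snap.ref.update({ syncFailed: true });
    }
  });
```

Note that updating the document from an onCreate handler does not re-fire onCreate; the same write from an onUpdate handler would, which is exactly the loop the warning below is about.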
THINGS TO CONSIDER
Be careful adopting this approach with update events. You may accidentally create an infinite loop that will cost you all your wealth. If you do add an update listener on a document, make sure you devise a way to avoid creating an infinite loop: you may update a document from an update-listening cloud function, which in turn re-triggers that cloud function, which in turn updates the document, which re-re-triggers the function, and so on. Make sure you use a boolean flag on the document, or some field as a sentinel value, to break out of such a loop.

MySQL trigger notifies a client

I have an Android frontend.
The Android client makes a request to my NodeJS backend server and waits for a reply.
The NodeJS server reads a value from a MySQL database record (without sending it back to the client) and waits for that value to change (another Android client changes it with a different request within 20 seconds); when that happens, the NodeJS server replies to the client with the new value.
Now, my approach was to create a MySQL trigger that notifies the NodeJS server when there is an update in that table, but I don't know how to do it.
To give you an idea, I thought of two easier busy-waiting approaches:
the client sends a request every 100 ms and the server replies with the SELECT of that value; when the client gets a different reply, it means the value has changed;
the client sends a request and the server runs a SELECT query every 100 ms until it gets a different value, then replies to the client with that value.
Both are brute-force approaches, and I would like to avoid them for obvious reasons. Any idea?
Thank you.
Welcome to StackOverflow. Your question is very broad and I don't think I can give you a very detailed answer here. However, I think I can give you some hints and ideas that may help you along the road.
MySQL has no internal way of running external commands as a trigger action. To my knowledge there exists a workaround in the form of an external plugin (UDF) that allows MySQL to do what you want. See Invoking a PHP script from a MySQL trigger and https://patternbuffer.wordpress.com/2012/09/14/triggering-shell-script-from-mysql/
However, I think going this route is a sign of using the wrong architecture or wrong design patterns for what you want to achieve.
The first idea that pops into my mind is this: would it not be possible to introduce some sort of messaging from the second NodeJS request (the one that changes the DB) to the first one (the one that needs an update when the DB value changes)? That way the first NodeJS "process" only needs to query the DB upon real changes, when it receives a message.
Another question would be whether you actually need to use MySQL, or whether some other datastore might be better suited. Redis comes to mind, since with Redis you could implement the messaging to NodeJS at the same time...
In general, polling is not always the wrong choice, especially in high-load environments where you expect each poll to collect some data. Polling makes it impossible to overload the processing capacity of the data-retrieving side, since that process controls the maximum throughput. With pushing, you hand that control to the pushing side, and if there are many such pushing sides, control is hard to achieve.
If I were you, I would look into Redis and learn how elegantly its publish/subscribe mechanism can be used as a messaging system in your context. See https://redis.io/topics/pubsub
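To give an idea of how that could look with node-redis (v4 API) and Express; `updateValueInMySQL`, the channel naming and the 20-second timeout are assumptions taken from the question:

```javascript
// Redis pub/sub sketch: the request that changes the value publishes a
// message; the waiting request subscribes instead of polling MySQL.
const express = require('express');
const { createClient } = require('redis');

const app = express();
app.use(express.json());
const redis = createClient();

// First client: wait (up to 20 s) for the value to change.
app.get('/wait-for-change/:key', async (req, res) => {
  const sub = redis.duplicate(); // a dedicated connection for subscribing
  await sub.connect();
  const timer = setTimeout(async () => {
    await sub.quit();
    res.status(204).end();       // nothing changed in time
  }, 20_000);
  await sub.subscribe(`changed:${req.params.key}`, async (newValue) => {
    clearTimeout(timer);
    await sub.quit();
    res.json({ value: newValue }); // reply as soon as the change arrives
  });
});

// Second client: update MySQL, then announce the change.
app.post('/update/:key', async (req, res) => {
  await updateValueInMySQL(req.params.key, req.body.value); // assumed helper
  await redis.publish(`changed:${req.params.key}`, String(req.body.value));
  res.status(200).end();
});

redis.connect().then(() => app.listen(3000));
```

No trigger and no 100 ms loop: the only DB work happens on the write path, and the waiting request is woken up by the message.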

Access another Web API from my Web API

I have a requirement to make endpoint calls to multiple Web APIs designed by other companies. These calls will be made on a periodic basis, like once an hour or once a day, to post and retrieve data (business-to-business transactions). I am working with the .NET Framework and ServiceStack.
I am not sure what would be the best approach to achieve this type of functionality.
Maybe I could have a Windows Service application that scans through the relevant config tables in SQL Server, generates cURL commands and executes them? I'm not sure whether this would be the correct approach or whether there is something better you would like to propose.
I have never worked with cURL before; these are just initial thoughts.
To achieve this, your backend needs a data structure that holds all the necessary data for the requests (which can be a database table, as you suggest) and a scheduling mechanism. This could be as simple as a timer that, when triggered, picks up the requests and executes them (using the built-in HttpClient, for instance). IMO you should keep this logic within the application itself; there is no need to complicate things by introducing a system-dependent service that then issues curl commands at the OS level.
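A minimal sketch of that timer-plus-in-process-HTTP idea (shown in Node.js; in .NET the equivalent would be a timer plus HttpClient). The `api_jobs` config table and its columns are hypothetical:

```javascript
// Timer-driven job runner: read request definitions from a config table
// and execute them in-process instead of shelling out to curl.
const mysql = require('mysql2/promise');

async function runScheduledCalls() {
  const db = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'b2b' });
  // Hypothetical table: api_jobs(id, url, method, body, interval_minutes, last_run).
  const [jobs] = await db.execute(
    `SELECT * FROM api_jobs
      WHERE last_run IS NULL
         OR last_run < NOW() - INTERVAL interval_minutes MINUTE`
  );
  for (const job of jobs) {
    const res = await fetch(job.url, {
      method: job.method,
      headers: { 'Content-Type': 'application/json' },
      body: job.method === 'GET' ? undefined : job.body,
    });
    // Record the attempt; add retry/error columns as the integration demands.
    await db.execute('UPDATE api_jobs SET last_run = NOW() WHERE id = ?', [job.id]);
    if (!res.ok) console.error(`job ${job.id} failed with status ${res.status}`);
  }
  await db.end();
}

setInterval(() => runScheduledCalls().catch(console.error), 60_000);
```

Keeping the HTTP calls in-process also makes retries, logging and authentication far easier to manage than parsing the output of spawned curl processes.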