In my project, I am automatically storing a lot of data in MySQL. However, I want the same data to be synced into another Node.js application's own data storage. What would be the fastest and easiest way to do this when writing to both storages simultaneously isn't possible?
For example, one Node.js application stores a variable "balance" in MySQL. I would like that same balance to be updated in another Node.js application's own storage, but my current Node.js app is not connected to a socket or any other kind of data transport mechanism. So how could I fetch that data from MySQL in the other Node.js application?
It sounds like your project structure follows the saga pattern.
For the data update you are asking about, you can use Kafka: create a topic and have both Node applications consume messages from that same topic, each updating the data in its own database.
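A minimal sketch of that idea, assuming the kafkajs client (the topic name, broker address, and group IDs below are all illustrative). The key point is that each application subscribes with its own consumer group, so both receive every balance update and write it into their own storage:

```typescript
// Minimal sketch with kafkajs (npm install kafkajs); topic, broker and group names are assumptions.
import { Kafka } from "kafkajs";

const kafka = new Kafka({ clientId: "balance-sync", brokers: ["localhost:9092"] });

// Producer side: the app that writes "balance" to MySQL also publishes the change.
export async function publishBalance(accountId: string, balance: number): Promise<void> {
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "balance-updates",
    messages: [{ key: accountId, value: JSON.stringify({ accountId, balance }) }],
  });
  await producer.disconnect();
}

// Consumer side: each Node application runs this with its own groupId so both get every message.
export async function consumeBalanceUpdates(
  groupId: string,
  apply: (accountId: string, balance: number) => Promise<void>
): Promise<void> {
  const consumer = kafka.consumer({ groupId });
  await consumer.connect();
  await consumer.subscribe({ topic: "balance-updates", fromBeginning: false });
  await consumer.run({
    eachMessage: async ({ message }) => {
      if (!message.value) return;
      const { accountId, balance } = JSON.parse(message.value.toString());
      await apply(accountId, balance); // write the balance into this app's own storage
    },
  });
}
```

Because each app uses a separate groupId, Kafka tracks an independent offset for each of them, so one app restarting or lagging behind does not affect what the other receives.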
Related
In a Flutter app, streams are used: if something changes in the Firestore database, it is reflected in the app too. Can we do the same with MySQL as the backend database for storing data? Will changes in MySQL be reflected in the Flutter app too?
MySQL does not have built-in realtime capabilities like Firestore. Firestore is fairly unique in this way. Typically, with SQL type databases, you have to repeat the query ("polling the database") to find any updates. There might be other middleware products you can use to simulate realtime updates, but you would have to search for and evaluate those for yourself.
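For example, a backend could poll MySQL on a short interval and push any changed rows out to the app over whatever channel it already uses. A minimal polling sketch, assuming the mysql2 driver and an accounts table with an updated_at column (both assumptions):

```typescript
// Polling sketch with mysql2 (npm install mysql2); table and column names are assumptions.
import mysql from "mysql2/promise";

const pool = mysql.createPool({ host: "localhost", user: "app", password: "secret", database: "appdb" });

let lastCheck = new Date(0);

// Re-run the query every few seconds and forward any rows changed since the last check.
async function pollForChanges(): Promise<void> {
  const [rows] = await pool.query(
    "SELECT id, balance, updated_at FROM accounts WHERE updated_at > ? ORDER BY updated_at",
    [lastCheck]
  );
  for (const row of rows as any[]) {
    lastCheck = row.updated_at;
    // Forward the change to the Flutter client, e.g. over a WebSocket the backend exposes.
    console.log("changed:", row.id, row.balance);
  }
}

setInterval(() => pollForChanges().catch(console.error), 5000);
```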
1.) I have an application where I have to fetch all inbox mails and store them locally.
Should I use MongoDB to store them, or is MySQL preferred?
Note: mails will be inserted and updated regularly, so which one is optimal for performance?
2.) How do I configure MySQL and MongoDB together in a Spring Boot application? Is there any risk in using both together, since I need to access both at the service layer?
My project will be deployed in multiple cities. It uses a simple request/response policy: data is loaded from MySQL into Redis when the project boots, read requests get data from Redis, and write requests put data into Redis and MySQL at the same time.
However, I want the different deployed instances to share the same data as soon as possible. I used MySQL to synchronise the data, but it works badly because of the poor network: often, data written in one city cannot be read from another city for hours.
I would appreciate your suggestions. Many thanks.
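For reference, here is a rough sketch of the read/write policy described above, assuming the ioredis and mysql2 clients (both assumptions; the table and key names are made up). It only makes the policy concrete; it does not by itself solve the cross-city replication problem:

```typescript
// Sketch of the described policy: warm Redis from MySQL at boot, read from Redis,
// write to Redis and MySQL together. Assumes ioredis and mysql2; names are illustrative.
import Redis from "ioredis";
import mysql from "mysql2/promise";

const redis = new Redis({ host: "localhost" });
const db = mysql.createPool({ host: "localhost", user: "app", password: "secret", database: "appdb" });

// On boot: load data from MySQL into Redis.
export async function warmCache(): Promise<void> {
  const [rows] = await db.query("SELECT id, value FROM items");
  for (const row of rows as any[]) {
    await redis.set(`item:${row.id}`, row.value); // the value column already holds JSON text
  }
}

// Read path: serve from Redis only.
export async function readItem(id: number): Promise<unknown> {
  const cached = await redis.get(`item:${id}`);
  return cached ? JSON.parse(cached) : null;
}

// Write path: update Redis and MySQL at the same time.
export async function writeItem(id: number, value: unknown): Promise<void> {
  const json = JSON.stringify(value);
  await Promise.all([
    redis.set(`item:${id}`, json),
    db.query(
      "INSERT INTO items (id, value) VALUES (?, ?) ON DUPLICATE KEY UPDATE value = VALUES(value)",
      [id, json]
    ),
  ]);
}
```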
I have one Django application running on two different ports of Apache.
Both use different databases, but the schema is the same (because both are the same application).
I want a mechanism for automatic data synchronization between the two databases.
What are the possible ways to automate data synchronization?
Is there any third-party API/application to do this, or would writing my own code be better?
You can try django-synchro, a Django app for database data synchronization.
I haven't used this app, but it looks like it fits your need:
This app is for synchronization of django objects between databases.
It logs information about objects' manipulations (additions, changes, deletions). When synchronization is launched, all objects logged from the last checkpoint are synced to another database.
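As a rough illustration of that change-log plus checkpoint idea (not django-synchro's actual API; the ChangeLogEntry and Store types below are made up):

```typescript
// Generic sketch of checkpoint-based sync: every write is recorded in a change log,
// and a sync run replays entries newer than the last checkpoint into the other database.
interface ChangeLogEntry {
  table: string;
  rowId: string;
  action: "add" | "change" | "delete";
  payload: unknown;
  loggedAt: Date;
}

interface Store {
  changesSince(checkpoint: Date): Promise<ChangeLogEntry[]>;
  apply(entry: ChangeLogEntry): Promise<void>;
  lastCheckpoint(): Promise<Date>;
  saveCheckpoint(at: Date): Promise<void>;
}

// Replay everything logged on `source` since the last checkpoint into `target`.
export async function synchronize(source: Store, target: Store): Promise<void> {
  const since = await target.lastCheckpoint();
  const pending = await source.changesSince(since);
  for (const entry of pending) {
    await target.apply(entry); // insert, update or delete the mirrored row
  }
  if (pending.length > 0) {
    await target.saveCheckpoint(pending[pending.length - 1].loggedAt);
  }
}
```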
We are planning to develop an application with Amazon DynamoDB. This application collects information from my clients' databases (my clients use MySQL, Oracle, MS SQL, or other relational databases), does some processing in my application, and sends the results back to the client's database. This synchronization process should run continuously (or at a one-minute interval).
I want to know whether any tools (or tricks) are available for synchronization between Amazon DynamoDB and a relational database.
You can consider an Elastic MapReduce job, which reads from DynamoDB, transforms the data, and writes it back to the relational database. (http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/EMRforDynamoDB.html)
Edit: Also look at Data Pipeline (http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-taskrunner-rdssecurity.html)
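For the simple once-a-minute case described in the question, a plain SDK loop that scans DynamoDB and upserts into MySQL is a rough illustration of the same read-transform-write idea the EMR job performs (instead of an EMR cluster). This sketch assumes the AWS SDK v3 (@aws-sdk/client-dynamodb, @aws-sdk/lib-dynamodb) and mysql2; the table and column names are made up:

```typescript
// Rough sketch: scan processed results from DynamoDB and upsert them into the client's
// relational table once a minute. Table, column, and connection details are assumptions.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, ScanCommand } from "@aws-sdk/lib-dynamodb";
import mysql from "mysql2/promise";

const dynamo = DynamoDBDocumentClient.from(new DynamoDBClient({ region: "us-east-1" }));
const db = mysql.createPool({ host: "client-db", user: "sync", password: "secret", database: "clientdb" });

async function syncOnce(): Promise<void> {
  let lastKey: Record<string, any> | undefined;
  do {
    // Page through the DynamoDB table.
    const page = await dynamo.send(
      new ScanCommand({ TableName: "ProcessedResults", ExclusiveStartKey: lastKey })
    );
    for (const item of page.Items ?? []) {
      // Upsert each processed result into the client's relational database.
      await db.query(
        "INSERT INTO results (id, payload) VALUES (?, ?) ON DUPLICATE KEY UPDATE payload = VALUES(payload)",
        [item.id, JSON.stringify(item)]
      );
    }
    lastKey = page.LastEvaluatedKey;
  } while (lastKey);
}

setInterval(() => syncOnce().catch(console.error), 60_000);
```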