What is the best way to synchronize a database via the internet? (MySQL)

I want to build a desktop application, somewhat like a POS. The user can enter data into it and save the data to a local database instead of accessing a remote database on the server. The reason I want to do this is to reduce traffic and make my application more responsive, since it avoids the overhead of accessing a remote database.
I want to build at least 5 clients of this desktop application, and each of them has a local database. Alongside these clients, I will set up a server database which I will use for reports or for online access that displays the status and data of all my clients.
For example, when a specific user uses a client machine, all of his data will be stored in the local database before it is transported to the server database for synchronization. It would seem that this system doesn't give real-time data updates to the server, but this is what I need. Since the server database is only used for reporting purposes, no information on the server is manipulated by a client.

You may be looking for JumpMind's product SymmetricDS. It syncs data from branch offices with a central office, sharing data with all offices and syncing subsets of data with specific offices.
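SymmetricDS is configured by inserting rows into its sym_* configuration tables. A minimal sketch of a store-to-corp push setup, with hypothetical group names ('store', 'corp') and a hypothetical sale table; check the SymmetricDS documentation for the exact columns your version expects:

    -- Two node groups: the POS clients and the central reporting server.
    INSERT INTO sym_node_group (node_group_id) VALUES ('store');
    INSERT INTO sym_node_group (node_group_id) VALUES ('corp');

    -- Stores push ('P') their data to corp.
    INSERT INTO sym_node_group_link
        (source_node_group_id, target_node_group_id, data_event_action)
    VALUES ('store', 'corp', 'P');

    -- Capture changes on the sale table and route them store -> corp.
    INSERT INTO sym_trigger
        (trigger_id, source_table_name, channel_id, create_time, last_update_time)
    VALUES ('sale_trigger', 'sale', 'default', current_timestamp, current_timestamp);

    INSERT INTO sym_router
        (router_id, source_node_group_id, target_node_group_id, router_type,
         create_time, last_update_time)
    VALUES ('store_to_corp', 'store', 'corp', 'default',
            current_timestamp, current_timestamp);

    INSERT INTO sym_trigger_router
        (trigger_id, router_id, initial_load_order, create_time, last_update_time)
    VALUES ('sale_trigger', 'store_to_corp', 1, current_timestamp, current_timestamp);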

Sync multiple local databases to one remotely

I need to create a system with local web servers on Raspberry Pi 4 running Laravel for API calls, websockets, etc. Each RPi will be installed at a different customer's premises.
For this project I want to have the ability to save/sync the database to a remote server (when the local system is connected to the internet).
Multiple locale databases => one remote, customer-based database
The question is how to synchronize the databases, properly identify each customer's data, and render it in a shared remote dashboard.
My first thought was to set a customer_id or a team_id on each table, but it seems dirty.
The other way is to create multiple databases on the remote server for the synchronization, plus one extra database to store customer IDs and database connection information...
Has someone already experimented with something like that? Is there a reliable and clean way to do this?
You refer to locale but I am assuming you mean local.
From what you have said, you have two options at the central site. The central database can either store information from the remote databases in a single table with an additional column that indicates which remote site it came from, or you can set up a separate table (or database) for each remote site.
How do you want to use the data?
If you only ever want to work with the data from one remote site at a time, it doesn't really matter: in both scenarios you need to identify which data you want to work with and build your SQL statement either to filter by the appropriate column or to direct it to the appropriate table(s).
If you want to work on data from multiple remote sites at the same time, then using different tables requires UNION queries to extract the data, and this is unlikely to scale well. In that case you would be better off using a column to mark each record with the remote site it references.
I recommend that you consider using UUIDs as primary keys. It may be that key collisions will not be an issue in your scenario, but if they become one, trying to alter the design retrospectively is likely to be quite a bit of work.
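A minimal sketch of the single-table design, assuming MySQL and hypothetical table and column names (sensor_readings, site_id):

    -- Central table: every row records which remote site produced it,
    -- and the UUID primary key rules out collisions between sites.
    CREATE TABLE sensor_readings (
        id          CHAR(36)      NOT NULL,  -- UUID generated at the remote site
        site_id     INT           NOT NULL,  -- which customer/RPi the row came from
        recorded_at DATETIME      NOT NULL,
        reading     DECIMAL(10,2) NOT NULL,
        PRIMARY KEY (id),
        KEY idx_site_time (site_id, recorded_at)
    );

    -- Cross-site reporting needs no UNION, just a filter or GROUP BY:
    SELECT site_id, COUNT(*) AS rows_received
    FROM sensor_readings
    GROUP BY site_id;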
You also asked how to synchronize the databases. That will depend on the type of connection you have between the sites and the capabilities of your software, but typically you would have the local system periodically talk to a web service at the central site. Assuming you are collecting sensor data or some such, the dialogue would be something like:
Client - Hello Server, my last sensor reading is timestamped xxxx
Server - Hello Client, [ send me sensor readings from yyyy | I don't need any data ]
You can include things like a signature check (for example, an MD5 sum of the records within a time period) if you want, but that may be overkill.
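In SQL terms, the dialogue above boils down to one query on each side (hypothetical names, matching the sketch above):

    -- Server side: report the newest reading already stored for the
    -- calling site, so the client knows where to resume.
    SELECT MAX(recorded_at)
    FROM sensor_readings
    WHERE site_id = 42;               -- id of the calling remote site

    -- Client side: upload everything newer than the timestamp the
    -- server returned.
    SELECT id, recorded_at, reading
    FROM local_readings
    WHERE recorded_at > '2024-06-01 12:00:00'
    ORDER BY recorded_at;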

Local Firebird db replication/sync to online db

I am looking for a tip on how to synchronize data from a local Firebird database into an online DB. A few comments:
On the local machine I use sales software which keeps its data in a Firebird DB. There is an internet connection, but I want to avoid direct DB access (as the PC is turned off after 9pm).
I would like to create an online app (based on Foundation + PHP + a database) in which I will be able to view daily sales and explore past data.
From the local DB I will need to pull data from several different tables, and I would like to keep them in the online/final DB as a single table (with fields: #id, transaction date, transaction value, sales manager).
While I mostly know how to create the frontend of the app, and partially the backend, I still wonder what the best choice of DB would be - MySQL? (that was my first thought). Or should I rather focus on NoSQL?
What's your recommendation on data sync? Should I use SymmetricDS (pretty hard to configure) or an equivalent, or should I write a script which will push data from Firebird into JSON/XML? I defer to your knowledge and best practices.
Set up a scheduled job that will invoke a simple data pump / replication script.
From the script, connect to the source sales DB, retrieve the joined data added since the last replication, and insert it into the "online" database.
You may also keep Firebird as the online DB, as it works great with PHP.
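A minimal sketch of the pump, assuming hypothetical Firebird table names (sales, managers) and the flat online table described in the question:

    -- Firebird side: join the source tables into the flat shape the
    -- online table expects, restricted to rows added since the last run.
    SELECT s.id, s.tx_date, s.tx_value, m.name AS sales_manager
    FROM sales s
    JOIN managers m ON m.id = s.manager_id
    WHERE s.tx_date > :last_replicated_at
    ORDER BY s.tx_date;

    -- Online side (MySQL or Firebird alike): insert each fetched row.
    INSERT INTO sales_flat (id, tx_date, tx_value, sales_manager)
    VALUES (?, ?, ?, ?);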
Firebird, as of version 2.5, also has all the technology built in to implement fully functional replication. We have implemented this in our largest installation, for a big restaurant company with about 0.6 billion records, about 1 million new records daily, and 150 locations where replicated servers work online or offline with the back office software.
If you simply want to upload the data from your local DB to a remote DB, you can rent a virtual server at a provider you like, install Firebird there, and create a secure connection (we use SSH, but any TCP over VPN can be used). Copy your local database to the remote server and, if required, open the Firebird port (3050 or another) in the firewall. When you have a low number of writes on your local database, simply implement a trigger on each table that performs the same insert/update/delete with the same values using the EXECUTE STATEMENT ... ON EXTERNAL feature.
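A sketch of such a trigger, assuming Firebird 2.5+ with a hypothetical sales table and placeholder host, path, and credentials:

    SET TERM ^ ;

    -- Mirror every new sale to the remote server as it is written.
    CREATE TRIGGER trg_sales_mirror FOR sales
    AFTER INSERT
    AS
    BEGIN
      EXECUTE STATEMENT
        ('INSERT INTO sales (id, tx_date, tx_value, manager_id)
          VALUES (:id, :tx_date, :tx_value, :manager_id)')
        (id := NEW.id, tx_date := NEW.tx_date,
         tx_value := NEW.tx_value, manager_id := NEW.manager_id)
      ON EXTERNAL 'remotehost:/data/sales_online.fdb'
      AS USER 'SYSDBA' PASSWORD 'masterkey';   -- placeholder credentials
    END^

    SET TERM ; ^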
When your local database has a higher workload, it is better to have the trigger put the change data (table name and PK values) into a log table and let a second connection upload the records to the target DB, where the same EXECUTE STATEMENT ... ON EXTERNAL can be used.
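A sketch of that variant, again with hypothetical names; the trigger now only does a cheap local insert, and a background job drains the log:

    -- Change log written by the triggers and drained by a background job.
    CREATE TABLE repl_log (
        log_id     BIGINT      NOT NULL PRIMARY KEY,
        table_name VARCHAR(63) NOT NULL,
        pk_value   BIGINT      NOT NULL,
        op         CHAR(1)     NOT NULL,   -- 'I', 'U' or 'D'
        logged_at  TIMESTAMP   DEFAULT CURRENT_TIMESTAMP NOT NULL
    );
    CREATE SEQUENCE seq_repl_log;

    SET TERM ^ ;
    CREATE TRIGGER trg_sales_log FOR sales
    AFTER INSERT OR UPDATE OR DELETE
    AS
    BEGIN
      INSERT INTO repl_log (log_id, table_name, pk_value, op)
      VALUES (NEXT VALUE FOR seq_repl_log, 'SALES',
              COALESCE(NEW.id, OLD.id),
              CASE WHEN INSERTING THEN 'I'
                   WHEN UPDATING THEN 'U'
                   ELSE 'D' END);
    END^
    SET TERM ; ^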
This is just a hint at how to do it; if budget allows, we can do it for you. But stopping the database PC in the evening seems to be typical only of smaller companies.

How to store sensitive data of different clients in SQL server?

I work at a small company and I am trying to figure out a solution for storing sensitive data of multiple clients in Microsoft SQL Server. Actually, I feel like this is a general database question and it is not specific to MSSQL.
Until now we have been using a proprietary database where the client data is stored as db files (flat files) in each client's root directory in the file system. So the operating system permissions guarantee that the application used by client X can never fetch data from client Y's database. Please note that there is no database server/instance/engine here…
However, for my project I want to use a SQL database. But the security folks are expressing concerns about putting data of different clients in a single database.
One option is to create separate database instances for different clients. However, I am not sure this idea is scalable.
So my questions are:
1) Is there any mechanism in MSSQL that enables you to store databases ‘separately’ in different files used by the SQL server?
2) Let’s say I have only one database instance where I have databases of client X and client Y. How can I make sure that client X’s requests can never (accidentally) get misdirected to client Y’s database? I do not want to rely on some parameter in my code to determine which database to fetch from! :)
So, is there any solid authentication scheme to guarantee that my queries could not be misdirected to fetch from an incorrect client table?
I think this is a very common problem and there has to be a good solution for this. What are other companies doing?
Please let me know if there are any good articles to read up on this.
Different databases are always stored in different files in SQL Server, so you don't even have to do anything special for this. However, NTFS permissions will not help you in this case, as the clients never access the files on disk directly.
One possible solution in SQL Server is to create separate sets of Windows user IDs and map those to separate SQL Logins for each customer. You could then only assign those logins access to the appropriate databases. For example, if you were hosting web sites for client X and client Y, you would set up the connection string(s) in the web.config for client X's web site to use the appropriate login(s) for client X's database. Vice versa for client Y. This guarantees that no matter what (barring a hard-coded login), the code from client X's site will never access client Y's database.
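A sketch in T-SQL (SQL Server 2012+ syntax), with hypothetical login, database, and domain names; the point is that Client X's login is only ever mapped into Client X's database:

    -- One Windows login per client at the instance level.
    CREATE LOGIN [CORP\ClientXApp] FROM WINDOWS;

    -- Map it into Client X's database only.
    USE ClientX_DB;
    CREATE USER ClientXApp FOR LOGIN [CORP\ClientXApp];
    ALTER ROLE db_datareader ADD MEMBER ClientXApp;
    ALTER ROLE db_datawriter ADD MEMBER ClientXApp;

    -- No user is created for this login in ClientY_DB, so a connection
    -- using Client X's credentials cannot touch Client Y's data, no
    -- matter what the application code does.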
You can have over 32,000 databases on a single instance of SQL Server, and having separate databases enables a number of improved serviceability scenarios (such as restoring a single customer's DB in case of a data problem without affecting all of your other customers).
http://technet.microsoft.com/en-us/library/ms143432.aspx

Sync database between WP8 app and server

I am creating a WP8 app.
I have created a SQLite database in isolated storage.
Now my data keeps updating, and I want to regularly download the latest data from the server database and update the local database.
The database on the WP8 device cannot be changed on the client side, so data will only ever be merged in one direction.
What is the best way and service to use?
If you do not work with a large database, you might prefer to replace the device database entirely and not worry about merging. This can be as simple as making an export of the server database, transferring it to the device, and then importing it into the device database. The appropriate method of dumping the database on the server side depends on the type of database (e.g. mysqldump in the case of MySQL).
If you do work with a large database, or if you are struggling with bandwidth issues on the device, you might want to use a technique that detects differences. One of the easiest methods is change tracking on the database: every modification is logged with a changed_at timestamp. The device can then remember the last modification it contains, fetch the newer entries, and replicate the changes locally. (For an in-depth explanation, please provide more information about the server environment and data structure.)
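A minimal sketch of that change tracking, assuming MySQL on the server and a hypothetical articles table:

    -- Every row carries a changed_at timestamp that MySQL maintains
    -- automatically on insert and on update.
    ALTER TABLE articles
      ADD COLUMN changed_at TIMESTAMP NOT NULL
      DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP;

    -- The device remembers the newest changed_at it has seen and asks
    -- only for rows that are newer.
    SELECT id, title, body, changed_at
    FROM articles
    WHERE changed_at > '2024-06-01 12:00:00'   -- last sync point from the device
    ORDER BY changed_at;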