Local and remote data store sync - MySQL

I have a situation where I would like a desktop application to be usable whether an internet connection is present or not.
I could use MySQL on a web server, and locally work with either a MySQL database or maybe an MS Access database on the local drive, then just push updates when the connection is restored. My issues are as follows.
Syncing local changes to the remote server. In a multi-site / multi-user scenario, how do I bring the databases back in sync when the connection is restored without losing changes that other users have already made to the server data?
Syncing remote changes to local. Again multi-site / multi-user: how do I keep the local database in sync without losing changes made locally while updating from the server data?
Currently I am using XML files queried with LINQ to XML, but continuing with these files is unsatisfactory, so a better solution is required.
Any help would be appreciated to identify what technology would work best and how to keep them in sync.
Thanks in Advance.

"Jet Replication Objects (JRO)", the replication features of the Access Database Engine, have been deprecated (ref: here). I believe that the related management features have also been completely removed from Access 2013. Therefore, native Access replication should no longer be considered a viable option.
Microsoft's recommendation would be to use SQL Server and its replication features. However, SQL Server Express has limitations on how much it can do (e.g., it can be a "Subscriber" but not a "Publisher" or "Distributor", ref: here) so presumably there would have to be a non-free copy of SQL Server involved somehow.
I haven't yet had the occasion to use MySQL replication myself, but it is probably worth considering. Chances are good that you could still use Access as a front-end (via ODBC linked tables).
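If you do end up hand-rolling the sync instead of using replication, one common pattern is to give every row a client-generated key plus a last-modified timestamp, and to apply changes with a timestamp-guarded upsert so a stale copy never overwrites a newer one. A minimal MySQL sketch; the table and column names here are my own assumptions, not anything from the question:

    CREATE TABLE customer (
        id         CHAR(36) PRIMARY KEY,            -- UUID generated on the client, so offline inserts cannot collide
        name       VARCHAR(100),
        updated_at TIMESTAMP NOT NULL,              -- set by whichever side last changed the row
        is_deleted TINYINT(1) NOT NULL DEFAULT 0    -- soft delete, so deletions replicate too
    );

    -- push a locally changed row; the guards only overwrite the server copy
    -- when the incoming change is newer (assignments run top to bottom, so
    -- updated_at must stay last)
    INSERT INTO customer (id, name, updated_at, is_deleted)
    VALUES ('3f8e0b5a-0000-0000-0000-000000000001', 'ACME Ltd', '2013-05-01 10:15:00', 0)
    ON DUPLICATE KEY UPDATE
        name       = IF(VALUES(updated_at) > updated_at, VALUES(name), name),
        is_deleted = IF(VALUES(updated_at) > updated_at, VALUES(is_deleted), is_deleted),
        updated_at = GREATEST(updated_at, VALUES(updated_at));

Pulling server changes works the same way in the other direction. "Newest timestamp wins" is a crude conflict rule, though, so if concurrent edits to the same row matter, log them for manual review rather than silently overwriting.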

Related

Sync Microsoft Access Database on a local network with a MySQL database on a web server

I'm a front-end web developer trying to devise a solution to sync a Microsoft Access database on a local network with a MySQL database on a web server. I ran across some software that might do this; however, it looks complicated to maintain and seems like it could be an additional point of failure.
There is also a way for MS Access to use an ODBC connection to an external MySQL database. https://dev.mysql.com/doc/connector-odbc/en/connector-odbc-examples-tools-with-access-linked-tables....
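For illustration, the Connect property of such a linked table is a DSN-less string along these lines (driver version, host, and credentials are placeholders):

    ODBC;DRIVER={MySQL ODBC 5.3 Unicode Driver};SERVER=db.example.com;PORT=3306;DATABASE=mydb;UID=appuser;PWD=secret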
If the ODBC connection to the external MySQL database works, is there a way to sync an external DB table with a local DB table, so that when a change occurs in one it is pushed to the other?
Or is there another recommendation to handle this process?
Thank you.

How do I migrate a SQL Anywhere 9 db running on a remote server into a MySQL server on my machine?

I am working at a company that has some CRM software running on a remote Windows XP server; it uses a SQL Anywhere 9 db to store its data. I have access to this remote server with an administrator account.
I would like to extract the db into a .sql file so that I can run the db locally on my machine without affecting the running db in the server (since it is key for the company's day to day operation).
The reason I need this is that we are going to test some BI Software and we need data from this database to test it, but we don't know the structure of the database since the developers of the CRM software didn't give us any documentation on it. So we need to have the database locally so that, without affecting the running CRM, we can:
understand the structure by looking at the DDL
make queries to it to get sample data
I researched a bit, and the most common solution to my problem was to use dbunload on the remote server to unload the db into a reload.sql file that contained what I needed. But most tutorials on the subject mention that I have to stop the db first (which would be catastrophic). If this is the only option, then I guess I am willing to do it on the weekend when the CRM is not used, but I wanted to know if there was another solution first.
If there is no other solution, can you point me to where I can find the proper and safer way to do this?
I have researched a lot, but prior to this day I had never even heard of SQL Anywhere, so I really need all the help I can get. My main concern is doing something that negatively impacts the CRM software.
Thank you.
You can run dbunload across the network, you just have to tell it to do an "external" unload. The default is to do an internal unload which would only work from the machine where the database server is running.
I don't have SQL Anywhere 9 documentation right now to look up the exact switch, but dbunload -? should show you all the possible switches.
Edit:
-an will create a new database and load the data and schema from another database
-xi switch will do external unload and internal reload.
-c parameters to connect to your remote database
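Putting those switches together, an external unload straight into a new local database might look like the line below. Server name, database name, credentials, and paths are placeholders, and I am going from memory on the version 9 syntax, so verify against dbunload -?:

    dbunload -xi -an c:\local\crm_copy.db -c "ENG=crmserver;DBN=crm;UID=dba;PWD=secret;LINKS=tcpip(HOST=10.0.0.5)"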

How to copy new and updated rows from an offline to an online database?

It's one of those cases where you have a desktop application and its database is on a remote server. In my case, it's MySQL and the application is made in Delphi XE3. But when the client wants his data both offline and online (for speed and security), we need to:
Log in with the remote server information (more up to date);
Sync the online database to offline;
Do the tasks on the application and the database;
Sync back the offline database with online.
My question: is there a standard way to do this, with MySQL instructions or some other automatic mechanism, or am I going to have to code all the rules myself?
Luckily there is no need for code here.
Replication has been built into MySQL for many years.
The trick is to set up the remote host as master and the local copy as slave.
All updates go to the master.
And the slave reads from the remote.
The documentation is here: http://dev.mysql.com/doc/refman/5.7/en/replication.html
Here's a tutorial: http://www.howtoforge.com/mysql_master_master_replication
Note that there can really only be one master; otherwise the setup will get too complicated to be workable.
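For reference, a minimal master/slave setup looks something like the sketch below. Host name, account, and binlog coordinates are placeholders; take the real coordinates from SHOW MASTER STATUS on the master.

    # my.cnf on the master (remote host)
    [mysqld]
    server-id = 1
    log-bin   = mysql-bin

    # my.cnf on the slave (local copy)
    [mysqld]
    server-id = 2

    -- on the master: create an account the slave can replicate with
    CREATE USER 'repl'@'%' IDENTIFIED BY 'secret';
    GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

    -- on the slave: point it at the master and start replicating
    CHANGE MASTER TO
        MASTER_HOST = 'master.example.com',
        MASTER_USER = 'repl',
        MASTER_PASSWORD = 'secret',
        MASTER_LOG_FILE = 'mysql-bin.000001',
        MASTER_LOG_POS = 4;
    START SLAVE;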
You could look at client datasets (as you need to update the local version; I do not believe MySQL allows multiple masters). Basically your application connects to MySQL when it is online, and if it goes offline you store the data and pending changes in a local XML file. Once back online, you apply the updates. Downsides of CDS: no SQL locally, and your local changes might conflict with changes made by other users, so the apply-updates step must include logic to reconcile the conflicts. Also, CDS is involved; I'm still trying to get my head around it myself.

Quickly copying a production database to a development environment (SQL Server)

Often I need to pull the production database of some project to my local SQL Server to add features, test things out, etc.
Today my procedure is to create a backup on the production server, somehow get that to my local machine, then create a new database locally and restore the backup on top of it.
It is a pain and takes more time than I like, and I would like to think there must be a better way.
I have access via SQL Server Management Studio to the production database - isn't there an easier way, that requires fewer manual steps?
How do you do it?
I can't think of a quicker way using SQL Server Management Studio. I'd recommend SQL Compare from Red Gate for synchronising the schema; SQL Data Compare can sync the data, but it's not quick for large databases over the internet.
You can use SSIS and copy objects between these environments, assuming you have a direct connection to the production server.
Another way: it's a must to make regular backups of the production databases; a simple maintenance plan can make a full backup at night, for example. If that is the case here, just request a fresh backup from the administrators and then restore it into your environment.
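Restoring that nightly backup locally then takes a couple of statements. File paths and logical file names below are placeholders; run RESTORE FILELISTONLY first to see the real logical names:

    -- discover the logical file names inside the backup
    RESTORE FILELISTONLY FROM DISK = N'C:\Backups\Prod_full.bak';

    -- restore under a new name on the dev server
    RESTORE DATABASE DevCopy
    FROM DISK = N'C:\Backups\Prod_full.bak'
    WITH MOVE 'Prod'     TO N'C:\Data\DevCopy.mdf',
         MOVE 'Prod_log' TO N'C:\Data\DevCopy_log.ldf',
         REPLACE, RECOVERY;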
One more way: because disclosing production data can involve legal issues, you can extract just the database schema and develop against test data. This is also the fastest way to get a database.
A new option is to clone the database. Red Gate SQL Clone is one solution, and Windocks provides SQL Server containers with built-in DB cloning support. Full disclosure, I work for Windocks.
Backup and Restore is slow for my databases, so what I do is:
1. Detach production database
2. Copy files to my dev machine
3. Attach database to dev server.
but nobody should be working on the production database while you do this.
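Be aware that detaching takes the production database offline until you re-attach it. A sketch of the T-SQL involved (database and file names are placeholders):

    -- on production: this takes the database offline
    EXEC sp_detach_db @dbname = N'Prod';

    -- copy Prod.mdf and Prod_log.ldf to the dev machine, then on dev:
    CREATE DATABASE Prod
        ON (FILENAME = N'C:\Data\Prod.mdf'),
           (FILENAME = N'C:\Data\Prod_log.ldf')
        FOR ATTACH;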
My backup procedure is similar: I detach the database and archive the files; it is faster than backing up.

Using MS Access & ODBC to connect to a remote PostgreSQL

I currently have an MS Access application that connects to a PostgreSQL database via ODBC. This successfully runs on a LAN with 20 users (each running their own version of Access). Now I am thinking through some disaster recovery scenarios, and it seems that a quick and easy method of protecting the data is to use log shipping to create a warm-standby.
This led me to think about putting this warm standby at a remote location, but then I have the question:
Is Access connecting to a remote database via ODBC usable?
I.e., the remote database would be maybe in the same country with OK ping times, and I have a 1 Mbit SDSL line.
onnodb,
The PostgreSQL ODBC driver is actively developed, and an Access front-end combined with a PostgreSQL server makes, in my opinion, a great option on a LAN for rapid development. I have been involved in a reasonably big system (100+ PostgreSQL tables, 200+ Access forms, 1000+ Access queries & reports) and it has run excellently for a few years, with ~20 users. Any queries that run slowly because Access is doing something stupid can generally be solved by using views, and any really data-intensive code can easily be moved into PostgreSQL functions and then called from Access.
The main ODBC-related issue we have is that there is no way to kill a slow-running query from Access, so we do often get users simply killing Access, leaving massive queries executing on the server.
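One server-side workaround for those orphaned queries is to cancel them from pg_stat_activity. Column names vary by PostgreSQL version (older releases use procpid and current_query); the sketch below assumes 9.2 or later:

    -- list queries that have been running for more than 10 minutes
    SELECT pid, usename, query_start, query
    FROM   pg_stat_activity
    WHERE  state = 'active'
      AND  now() - query_start > interval '10 minutes';

    -- cancel one of them (pg_terminate_backend kills the whole connection)
    SELECT pg_cancel_backend(12345);  -- 12345 = a pid from the query above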
Yes.
I don't have any experience using Access to hit PostgreSQL from a remote location, but I have successfully used Access as a front-end to SQL Server & DB2 from remote locations.
Ironically, what you don't want to do is use Access to front-end an Access database (mdb) from a remote location over a high-latency link. Since hitting the MDB uses file-based operations, it's pretty easy to end up with a corrupt database if you have anything more than a trivial db.
It depends a lot on the database you're using as a back-end. I've had rather terrible experiences with MySQL as a back-end. Make sure the ODBC driver you're using is actively developed, stable, and complete; this was definitely not the case for MySQL. You may also want to check for any compatibility issues between Access and PostgreSQL. And, of course, it won't hurt to test extensively.
Oh, and I think it'd be absolutely great if you could post back here later with your experiences!
PostgreSQL works great as a backend for MS Access; there are a couple of support functions you should use to make things easier. See here for more info on this:
http://www.amsoftwaredesign.com/smf/index.php?board=8.0