MySQL mirroring on EC2 - mysql

I'm running two MySQL servers, one on production and one on staging; both are EC2 instances.
In the same way, I have two MySQL RDS instances parallel to the production and staging ones.
Here is what I wanted to do.
I would like to mirror the production database to the development server every few hours,
for 1. backup, 2. to run new features against the latest database changes.
I didn't find much information regarding this issue; can anyone help?
Thanks.
Additional information:
I'm running nginx on a Linux server, with a PHP backend.

If you are running on RDS, you have two options.
1. Snapshot and restore your instance. You can automate this, but it can take longer the larger the DB is, and your endpoint will probably change too.
2. Dump the database from production and reload it into development.
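A rough sketch of the second option, assuming hypothetical RDS endpoints, a database called appdb, and passwords exported in the environment (all of these are placeholders):

    #!/bin/bash
    # Mirror the production database into staging/development.
    # Endpoints, users and the database name are placeholders; PROD_PASSWORD
    # and STAGE_PASSWORD are assumed to be exported in the environment.
    set -euo pipefail

    PROD_HOST="prod-db.xxxxxxxx.us-east-1.rds.amazonaws.com"
    STAGE_HOST="staging-db.xxxxxxxx.us-east-1.rds.amazonaws.com"
    DB="appdb"

    # --single-transaction takes a consistent snapshot of InnoDB tables
    # without locking production for the duration of the dump.
    mysqldump --single-transaction --routines --triggers \
      -h "$PROD_HOST" -u dumpuser -p"$PROD_PASSWORD" "$DB" \
      | mysql -h "$STAGE_HOST" -u loaduser -p"$STAGE_PASSWORD" "$DB"

For the "every few hours" part, a cron entry such as 0 */4 * * * /path/to/mirror-staging.sh would run the script every four hours.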

Related

How to Update Mysql database on live server with my local mysql database

Please, I need help. I have an application running well on the Internet, but the network sometimes goes down drastically,
so I imported the online database to my offline version. I want to run the offline version and update the live version on a regular basis, either automatically with PHP code or through phpMyAdmin. Anyone with an idea of how I can do that, please help.
Thanks
You can update the local database automatically whenever the live server gets a chance.
This concept is called database replication (master-slave model).
You can make your local database a slave; a slave doesn't need a static IP, so you can set it up anywhere.
The master database, however, must have a static IP, and you can connect as many slaves as you want to the master database.
The master database automatically sends changes to all slaves.
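A minimal sketch of the master side, assuming classic MySQL 5.x configuration, a database called appdb and a replication account called repl (all names and passwords are placeholders):

    # In my.cnf on the master (restart MySQL afterwards):
    #   [mysqld]
    #   server-id    = 1
    #   log_bin      = mysql-bin
    #   binlog_do_db = appdb      # optional: replicate only this database

    # Create a dedicated replication account (name and password are placeholders).
    mysql -u root -p -e "
      CREATE USER 'repl'@'%' IDENTIFIED BY 'repl_password';
      GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';
      FLUSH PRIVILEGES;"

    # Note the binlog file and position; the slave needs them to start replicating.
    mysql -u root -p -e "SHOW MASTER STATUS;"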

How do I migrate a SQLAnywhere 9 db running in a remote server into a mysql server on my machine?

I am working at a company that has some CRM software running in a remote Windows XP server that uses a SQLAnywhere 9 db to store its data; I have access to this remote server with an administrator account.
I would like to extract the db into a .sql file so that I can run the db locally on my machine without affecting the running db in the server (since it is key for the company's day to day operation).
The reason I need this is that we are going to test some BI Software and we need data from this database to test it, but we don't know the structure of the database since the developers of the CRM software didn't give us any documentation on it. So we need to have the database locally so that, without affecting the running CRM, we can:
understand the structure by looking at the DDL
make queries to it to get sample data
I researched a bit, and the most common solution to my problem was to use dbunload on the remote server to unload the db into a reload.sql file that contained what I needed. But most tutorials on the subject mention that I have to stop the db first (which would be catastrophic). If this is the only option, then I guess I am willing to do it on the weekend when the CRM is not used, but I wanted to know if there was another solution first.
If there is no other solution, can you point me to where I can find the proper and safer way to do this?
I have researched a lot, but prior to this day I have never even heard of SQLAnywhere, so I really need all the help I can get. My main concern is doing something that impacts negatively the CRM software.
Thank you.
You can run dbunload across the network, you just have to tell it to do an "external" unload. The default is to do an internal unload which would only work from the machine where the database server is running.
I don't have SQL Anywhere 9 documentation right now to look up the exact switch, but dbunload -? should show you all the possible switches.
Edit:
-an will create a new database and load the schema and data from another database.
-xi will do an external unload and internal reload.
-c takes the parameters to connect to your remote database.
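Putting those switches together, a sketch of a network unload might look like this; the connection parameters (server name, database name, credentials, host) are placeholders, and the exact switch names should be double-checked against dbunload -? on version 9:

    # Run from any machine with the SQL Anywhere client tools installed;
    # the database server itself keeps running untouched on the remote host.
    dbunload \
      -c "UID=dba;PWD=sql;ENG=crm_server;DBN=crm_db;LINKS=tcpip(HOST=crm-server.example.com)" \
      -an /tmp/crm_copy.db \
      -xi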

how to synchronize two databases of different servers?

I have two databases. One is on a local server and the other is on a production server. I'm continuously working on the local server, and after approval I want to update the production server. With the current setup, I need to take a dump (or copy/export) of the database and then import it into the production database every time. Is there any synchronization method in phpMyAdmin for databases on different servers?
Is it possible to synchronize a model in MySQL Workbench with a live database?
Other than this, how can I do it? I'm able to use queries, the command line and phpMyAdmin itself.
Please suggest any simple method.
The "synchronisation" feature you are looking for is called replication. A replication can be set up between a master and a slave machine. It does not rely on a constant connection, but stores all changes on the master and replays all those changes on the slave once a connection is established.

Amazon AWS RDS Instance organisation

I am migrating to Amazon Web Services. I have a production site and an associated MySQL database, and I also have a staging site to try out changes prior to pushing to production.
In terms of AWS RDS, how do I manage these two databases? Should they both exist in the same instance, or should they each have their own instance?
My staging site does not have to be continuously available, does this change which option is best suited?
I think those databases should live in different instances. The production DB should definitely have its own instance. There are many reasons for that:
What happens if you have a bug in staging and you spam your instance with CPU-eating queries? If it is the same instance, your production DB might be affected.
It is good to have instance monitoring only for production.
You will be able to set configuration on the production instance that should not be set on staging, such as Multi-AZ availability.
You can stop the staging instance whenever you want and restart it with just the data it had when you stopped it (see the sketch after this list).
All in all, there is no reason to mix the two into the same instance. One for production, one for staging.
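For the last point, stopping and starting the staging instance can be scripted with the AWS CLI; staging-db is a placeholder identifier:

    # Stop the staging instance while it is not needed (the storage, and so the data, is kept).
    aws rds stop-db-instance --db-instance-identifier staging-db

    # Bring it back up with the data it had when it was stopped.
    aws rds start-db-instance --db-instance-identifier staging-db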

mysql: keeping local mysql database and remote server database synced up

I am kind of an amateur at web development, but it seems like most people develop on their local machines and then upload to their remote servers when everything is ready. I want to start doing this. I've installed XAMPP (Apache) on my local machine. But in order for this to really work, I need the MySQL databases that already exist on my remote server to be "synced" or "duplicated" on my local machine, and I'm finding this somewhat hard to figure out.
First, should I be using MySQL's replication feature (with my remote server as master and my local machine as slave)? Or is there a better way to do this? Should I be synchronizing instead of replicating?
Second, is anyone willing to give me a quick description of how I achieve this "replication" or "synchronization"?
Thanks
It may be tempting, but it's hopeless to try to keep the database in two places. Instead, always keep the database on the server, because it's much easier to develop and debug your code if it's in just one place. "Syncing" and "replicating" and all that business is just too much trouble, as you are discovering. The DB is going to end up on the server anyway, so you may as well put it there right now.
Also, you will not need a web server on your local machine, which will unburden you.
This next is applicable if you are writing CGI. If you aren't sure whether or not you are writing CGI, then you are not (well, probably not). {
If you maintain just one database and it lives on your server, you'll be able to write one piece of portable code that will run equally well on your local machine and on your server. This is a huge win; take my word for it.
To get this working, you will need the mysql library on your local machine; no other mysql component is needed there. The mysql server will run on your server only.
Read up on mysql "connector" for the language you're using.
}
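To illustrate the point that only the client side lives on the local machine, a quick connectivity check might look like this; host, user and database are placeholders, and from PHP the equivalent connection would go through mysqli or PDO:

    # Run on the local development machine: only the MySQL client is installed here,
    # the server itself keeps running on the remote host.
    mysql -h your-server.example.com -u devuser -p -e "SELECT VERSION(); SHOW DATABASES;"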