Migrate data using SSIS package - ssis

I am new to the SSIS world. I need to migrate my client's old data to our new database. The problem is that the schema is completely different, and in some cases I need to save the data in two or more different tables and arrange the foreign key relationships between them.
Any help is much appreciated.

Create a staging database, basically a 'bridge'. Make it as close to the new database as possible, yet still with the old database in mind. Send the old database's data to the staging database using SSIS; from there you can clean the data without interrupting business processes, and validate it to make sure it balances against the previous database's row counts.
From there you move it straight into the new database, again with SSIS. There should be no issues if the staging database was set up correctly.
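A simple way to do that validation is a row-count comparison between the source and the staging copy. A minimal T-SQL sketch, assuming a linked server named OldServer and a hypothetical Customers table (both names are made up for illustration):

    -- Compare row counts between the old database and the staging copy.
    -- "OldServer", "OldDb", and "Customers" are assumed names; substitute
    -- your own objects.
    DECLARE @SourceCount INT, @StagingCount INT;

    SELECT @SourceCount  = COUNT(*) FROM OldServer.OldDb.dbo.Customers;
    SELECT @StagingCount = COUNT(*) FROM StagingDb.dbo.Customers;

    IF @SourceCount <> @StagingCount
        RAISERROR('Staging load incomplete: %d source rows vs %d staged rows.',
                  16, 1, @SourceCount, @StagingCount);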

Related

How to rename a table and some columns in production

I am using Entity Framework to handle the database in a server application. The server application is deployed to multiple nodes sitting behind a load balancer. I have now discovered the need to rename some of the tables and some of the table columns in the database. The problem is that I will need to do this without any down-time.
If I were writing my own database layer, I could simply check the structure of the database and then run different code depending on whether the database structure had been upgraded or not, potentially achieving 0% downtime.
Is it possible to do something like this in Entity Framework 4, or how do I actually make database changes in production with Entity Framework? I have searched the web for a solution, but without success. I have also considered having 2 Entity Framework instances in my app, but the amount of code I would have to write to switch between them would be far too much and error-prone.
So to summarize the question:
Is there any way to rename tables and columns in a database in a production environment using Entity Framework 4 and without any downtime whatsoever?
EDIT
A new idea I got would be to create a view in place of the old table, with the old naming scheme, referencing the new table. I don't know if this will work. For it to work, Entity Framework needs to accept this view as if it were a normal table. Could this be a good idea? Has anyone tried anything like this with EF4?
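In T-SQL, the idea would look roughly like this. The object names are made up, and whether EF4 accepts the view as a regular table is exactly the open question:

    -- Rename the table (and a renamed column) to the new scheme.
    -- "Customers"/"Clients"/"Name"/"FullName" are hypothetical names.
    EXEC sp_rename 'dbo.Customers', 'Clients';
    EXEC sp_rename 'dbo.Clients.Name', 'FullName', 'COLUMN';
    GO
    -- Expose the old name and shape as a view so existing code keeps working.
    CREATE VIEW dbo.Customers
    AS
        SELECT Id,
               FullName AS Name  -- map the renamed column back to its old name
        FROM dbo.Clients;
    GO

A view over a single base table is updatable in SQL Server, so simple inserts and updates through the old name should still pass through; anything more complex would need INSTEAD OF triggers.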
Okay! Here is the thought I got:
Create a new table in the DB rather than renaming the old one, and build the Entity Framework solution on that.
Deploy this by taking one server off the load balancer, updating it so the Entity Framework code points to the new table structure, and then attaching it to the load balancer again.
Follow the same procedure for all the servers in the load balancer, one by one. Once all are up with the updated patch, sync the old table with the new table and then just delete the old table. I am suggesting syncing these tables so that you keep all the data and avoid the data loss that might happen during the above process.
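That final sync step could be a plain INSERT ... SELECT that copies over any rows written to the old table while the rollout was in progress; a sketch with placeholder table, column, and key names:

    -- Copy rows that exist in the old table but not yet in the new one.
    -- "CustomersOld"/"CustomersNew" and the columns are placeholders.
    INSERT INTO dbo.CustomersNew (Id, FullName)
    SELECT o.Id, o.Name
    FROM dbo.CustomersOld AS o
    WHERE NOT EXISTS (SELECT 1 FROM dbo.CustomersNew AS n WHERE n.Id = o.Id);

    DROP TABLE dbo.CustomersOld;  -- only after verifying the copy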

Modify database schema with MySQL Workbench

Using MySQL Workbench, I created an ERD and database schema. I've deployed the database to my production server and have live data.
I now need to modify my schema. Instead of making changes on the live server database, I would like to modify the ERD, test it, and then create a modification script to deploy on the production server. Obviously, I do not wish to lose data, and thus cannot drop a table or column and then add new ones.
Is this possible to do with MySQL workbench? If so, how?
This is possible, and it's called "Synchronization". It is a two-way merge between a model and a live database. Synchronization doesn't touch the data in the schema, but as usual, when you modify a database structure (removing tables or columns), the associated data is lost, regardless of how you do it. So take care to have proper backups.
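Before running Synchronization against the live server, one quick safeguard is to snapshot any table the merge is about to alter, e.g. (table name is illustrative; note this copies the data but not indexes or foreign keys, so a full mysqldump remains the more complete backup):

    -- Keep a data copy of a table before a destructive schema change.
    CREATE TABLE customers_backup AS SELECT * FROM customers;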

How to merge data from mysql schemas that have diverged?

I have two servers that share an original ancestor codebase, but whose database schemas have diverged during the past couple of months (I'm using MySQL). I'm about to use the second one as my new production server, but I have to update the data (there are new users, there's new data related to those users, etc.). I want the data in the server that's now live (but has the old schema) to be authoritative, yet I want the schema in the new one to be the final one. So it's kind of a weird merge: I want data from the old server to be imported into a new server with a (not vastly) different schema.
I was thinking of simply making a dump of the server with the most up-to-date data, but then loading it wouldn't work since the schema has changed quite a bit.
I was also thinking of dumping the schema of the new server, applying it to a copy of the old one, then dumping the data from the latter and loading it into the new one, but I'm not sure how to go about doing that or whether it's the safest option.
I develop on Mac OS X and both of my servers are Debian.
Applying the schema from the new server to the old one and then migrating the data is the safest option, largely because it forces you to evaluate what specifically has changed and what you want to do about it in terms of data (e.g., where a new column has been added, what do you want to put in it?).
Since you mentioned the schemata are not massively different, simply doing a mysqldump without data (i.e., table definitions only) of each server and manually comparing the results (e.g., with diff) will tell you which columns differ. You can then apply those changes with ALTER on the old database.
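Once the diff shows which columns differ, the changes are ordinary ALTER statements run against the old database; for example (the column name, type, and default here are illustrative):

    -- Apply a difference found by diffing the two schema-only dumps.
    ALTER TABLE users
        ADD COLUMN signup_source VARCHAR(50) NOT NULL DEFAULT 'legacy';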
It's all a little kludgy, but then ultimately there isn't really a non-kludgy way of doing this.
Look here: http://bitbucket.org/idler/mmp - it is a tool for MySQL schema versioning, but only the schema, not the data. First you must migrate your schema, then load your new data.

How to migrate production data with collapsed Rails migrations

What would be potential strategies for getting the old data into a new db structure? One strategy we are thinking of is to write some Ruby which executes some SQL on a per-table basis.
Since this is a one-time task (I am presuming things here), you might want to have two databases: the old one, and a new one created from a fresh migration. Then write a Ruby script to copy data as you want from the old database to the new one.
This lets you retain the old database and hence avoid the downtime associated with recreating the db and reimporting data from a dump. You can keep running your old code until the data is migrated, and as soon as the data migration to the new db completes, update the code and restart the server. Voila! A whole lot of data migration with no downtime! :)
To summarize what I am suggesting:
Keep the old database around; don't recreate it or reimport into it
Create a new database from a fresh migration
Create and run the Ruby script to copy data from the old database to the new one (a per-table SQL sketch follows this list)
Update the application code to work with the new database
Restart the server for the new database to kick in
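The per-table SQL such a script runs can often be a plain cross-database INSERT ... SELECT, which also handles column renames. A minimal sketch, assuming both databases live on the same MySQL server (the database, table, and column names are made up):

    -- Copy one table from the old schema into the new one, remapping
    -- renamed columns along the way. All names here are placeholders.
    INSERT INTO new_app.users (id, email, created_at)
    SELECT id, email_address, created_on
    FROM old_app.users;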

How do I automatically back up MySQL databases from remote centers over the internet to another database?

Hi, I am a beginner application developer. At my new job I have been commissioned to design an architecture that will allow me to automatically back up data from MySQL databases at 3 geographically distant centers, over the internet, to a central database. Can anybody please point me in the right direction and tell me whether it is possible to solve such a problem programmatically? Thank you.
If you just want to back up the actual data, a combination of mysqldump, gzip, and scp will move a backup of the database to a remote location. Or you can use database replication.
If what you want to do is merge the data from the 3 sources into another database, that's an entirely different problem from backing up.
Have each center back up locally, then download the backups using FTP or something.
For merging, you need to restore them each to their own db.
Then create an empty schema with the same table structure, where each table gets an additional field for "SourceSystem".
Then load each db into the central db while updating the "SourceSystem" field. The "SourceSystem" plus the original PK becomes the new PK.
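As a concrete sketch of that merge (the table and column names are illustrative):

    -- Central table with a SourceSystem discriminator; the composite key
    -- (source_system + original id) keeps rows from the 3 centers apart.
    CREATE TABLE central.orders (
        source_system VARCHAR(20)   NOT NULL,
        id            INT           NOT NULL,
        total         DECIMAL(10,2),
        PRIMARY KEY (source_system, id)
    );

    -- Load one restored source database, tagging each row with its origin;
    -- repeat once per center.
    INSERT INTO central.orders (source_system, id, total)
    SELECT 'center_a', id, total
    FROM center_a.orders;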