Database Migration (MSSQL -> MySQL) using Talend's Dynamic Schema

I recently started using Talend Cloud Data Integration (Talend Studio 7.2)
and I'm trying to figure out how to dynamically migrate a whole MSSQL Server database to a blank MySQL database.
So far, I've heard that this is possible with Talend's Dynamic Schema.
However, I've only found a tutorial for a single-table (data) migration, and some posts that don't explain much about the settings and components to use.
Here's my job design
I'm using a global variable to dynamically define the table; however, I'm not sure how I can define the schema dynamically.
I have about 100 tables in each MSSQL database, and they all have different numbers and types of columns, so the tutorial written here doesn't apply.
Can you let me know how I can define my schema dynamically?
I'm hoping for a solution where one job performs the whole database migration, e.g. iterating over the tables in the source DB and generating output to the destination DB.
Please let me know whether I should proceed with Dynamic Schema or with a Job Template, and which job design/components/settings are needed.
Thank you
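Independent of Talend, one generic pattern for this kind of whole-database migration is to read the source's INFORMATION_SCHEMA column metadata and generate the target DDL per table. A minimal sketch (the type map and column-metadata shape are illustrative, not a complete MSSQL -> MySQL mapping):

```javascript
// Sketch: build a MySQL CREATE TABLE statement from column metadata,
// e.g. rows read from MSSQL's INFORMATION_SCHEMA.COLUMNS.
// The type map below is illustrative, not a complete MSSQL -> MySQL mapping.
const TYPE_MAP = {
  nvarchar: (len) => `VARCHAR(${len > 0 ? len : 255})`,
  varchar:  (len) => `VARCHAR(${len > 0 ? len : 255})`,
  int:      () => 'INT',
  bigint:   () => 'BIGINT',
  bit:      () => 'TINYINT(1)',
  datetime: () => 'DATETIME',
  decimal:  (len, precision, scale) => `DECIMAL(${precision},${scale})`,
};

function buildCreateTable(tableName, columns) {
  const colDefs = columns.map((c) => {
    const mapType = TYPE_MAP[c.dataType.toLowerCase()];
    if (!mapType) throw new Error(`Unmapped type: ${c.dataType}`);
    const type = mapType(c.maxLength, c.precision, c.scale);
    const nullable = c.isNullable ? 'NULL' : 'NOT NULL';
    return `  \`${c.name}\` ${type} ${nullable}`;
  });
  return `CREATE TABLE \`${tableName}\` (\n${colDefs.join(',\n')}\n);`;
}

// Hypothetical usage: one metadata query per table, one generated statement.
const ddl = buildCreateTable('Customers', [
  { name: 'Id', dataType: 'int', isNullable: false },
  { name: 'Name', dataType: 'nvarchar', maxLength: 100, isNullable: true },
]);
```

Iterating `INFORMATION_SCHEMA.TABLES` in an outer loop and feeding each table's columns through a function like this gives the "one job migrates everything" shape, whether the loop runs inside Talend or in a small standalone script.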

Related

Data migration tool for MySQL in a new upgraded application with different schema and table structure

We are migrating our existing code base to a new, more robust code base for better performance and reliability. We have a MySQL database with current data. We have now modified the entities in our Spring Boot application, which changes the schema for the new database structure. I'm looking for a tool that will help me migrate all the data from the old MySQL database to a newly created MySQL database, transformed according to the latest schema design. I think I will have to write some code to match the new database architecture, as no tool will do that refactoring according to my requirements. What tool would help achieve this?
Footnotes:
I am working in a microservice architecture.
I have integrated liquibase with maven plugin support.
I have seen Apache Spark and ETL, but they need
Provide your feedback if you have any relevant experience.
We migrated roughly 3,000 users' data from about 160 SQL Server tables to about 75 MySQL tables under the new schema.
Based on our experience, I can suggest the following:
1. Prepare a mapping sheet of columns, table by table.
2. Migrate the data into temporary tables.
3. Test the temporary tables' data against the old data, comparing each table column by column. You can use an ETL tool or Excel for the comparison.
4. If no bugs are found, write a stored procedure or script for the actual migration.
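The mapping-sheet step above can be expressed directly in code: each entry maps an old column to a new one, optionally with a transform. A minimal sketch (all column names here are hypothetical):

```javascript
// Sketch: apply a column mapping sheet (old column -> new column, plus an
// optional transform) to one row of source data. Names are hypothetical.
const userMapping = [
  { from: 'USR_NAME', to: 'username' },
  { from: 'USR_MAIL', to: 'email', transform: (v) => v.trim().toLowerCase() },
  { from: 'USR_CREATED', to: 'created_at' },
];

function mapRow(mapping, oldRow) {
  const newRow = {};
  for (const { from, to, transform } of mapping) {
    const value = oldRow[from];
    newRow[to] = transform ? transform(value) : value;
  }
  return newRow;
}

const migrated = mapRow(userMapping, {
  USR_NAME: 'alice',
  USR_MAIL: ' Alice@Example.COM ',
  USR_CREATED: '2020-01-01',
});
```

Running every source row through `mapRow` into the temporary tables keeps the mapping sheet and the migration code in one place, which also makes the column-by-column comparison in step 3 easier to automate.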

NodeJS application database versioning and data migration

I am planning to use Postgres as the database and Sequelize as the ORM framework for one of my NodeJS projects. I need your suggestions on how to handle database table changes in a production environment.
For example, we deployed version 1 of our application with some database schema.
Later, the schema changes in version 2: say one of the tables is altered, a few columns are dropped, and a few columns are added.
Also, one of the tables is no longer required in version 2.
In production, I already have a database with data. When I install version 2 of my NodeJS application, I need to keep the old data as it is and apply the new version-2 schema.
How can I achieve this? Please point me to some references.
Thanks
Sequelize (or rather, sequelize-cli) has built-in support for migrations.
You can write migrations using the Sequelize JS API (e.g. queryInterface.dropColumn('users', 'address')) or write raw SQL and pass it to sequelize.query().

How to rename a table and some columns in production

I am using Entity Framework to handle the database in a server application. The server application is deployed to multiple nodes sitting behind a load balancer. I have now discovered the need to rename some of the tables and some of the table columns in the database. The problem is that I will need to do this without any down-time.
If I was writing my own database layer I could simply check the structure of the database and then run different code depending on if the database structure was upgraded or not and then I could potentially have 0% downtime.
Is it possible to do something like this in Entity Framework 4, and how do I actually make database changes in production with Entity Framework? I have searched the web for a solution without success. I have also considered having two Entity Framework instances in my app, but the amount of code I would have to write to switch between them would be far too much and error prone.
So to summarize the question:
Is there any way to rename tables and columns in a database in a production environment using Entity Framework 4 and without any downtime whatsoever?
EDIT
A new idea I had is to create a view, with the old naming scheme, in place of the old table, referencing the new table. I don't know if this will work; for it to work, Entity Framework needs to accept this view as if it were a normal table. Could this be a good idea? Has anyone tried anything like this with EF4?
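At the SQL level, the view idea from the edit amounts to a compatibility view with the old table name that aliases the new column names back to the old ones, so un-upgraded nodes keep working during the rollout. A small sketch that generates such a statement (table and column names are hypothetical; note that for writes to keep working, the view must be updatable, which a simple single-table view like this generally is):

```javascript
// Sketch: generate a backward-compatibility view so code that still uses the
// old table/column names keeps working after a rename. Names are hypothetical.
function compatViewSql(oldTable, newTable, columnMap) {
  // columnMap: { oldColumnName: newColumnName }
  const selectList = Object.entries(columnMap)
    .map(([oldCol, newCol]) => `\`${newCol}\` AS \`${oldCol}\``)
    .join(', ');
  return `CREATE VIEW \`${oldTable}\` AS SELECT ${selectList} FROM \`${newTable}\`;`;
}

const viewSql = compatViewSql('Customer', 'customers', {
  CustName: 'name',
  CustEmail: 'email',
});
```

Once every node behind the load balancer has been updated to use the new names directly, the view can be dropped.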
Here is the approach I have in mind:
Create the new table in the DB rather than renaming the old one, and build the Entity Framework solution against it.
Deploy this by taking one server off the load balancer, updating it, pointing the Entity Framework code to the new table structure, and then reattaching it to the load balancer.
Repeat the procedure for each server in the load balancer, one by one. Once all of them are running the updated patch, sync the old table with the new one and then delete the old table. I suggest syncing the tables so that you keep any data that might otherwise be lost during the process.

Importing a MySQL schema to Xcode as a CoreData Data Model

I have an existing MySQL database, I would like to import the schema into Xcode and create a Core Data data model.
Is there a way (tool, process) to import the CREATE statements so I don't have to build the models "by hand"?
As an intermediary step I could convert to SQLite. I'm not worried about relationships, foreign keys, etc.; I just want to auto-generate the Entities (Tables) and Properties (Columns).
Actually, I needed this feature so badly that I decided to write an OS X utility to do it. But then I found a utility in the Mac App Store that (partially) solves this problem (it was free for some time; I don't know its current state). It's called JSONModeler, and what it does is parse a JSON tree and generate the Core Data model and all derived NSManagedObject subclasses automatically. So a typical workflow would be:
Export the tables from MySQL to xml
Convert the xml to json
Feed the utility with that json and get your coredata model
Now, for a more complicated scenario (relationships etc) I guess you would have to tweak your xml so it would reflect a valid object tree. Then JSONModeler will be able to recreate that tree and export it for coredata.
The problem here is that entities are not tables and properties are not columns. Core Data is an object graph management system, not a database system. The difference is subtle but important. Core Data really doesn't have anything to do with SQL; it just sometimes uses SQL as one of its persistence options.
Core Data does use a proprietary SQLite schema, and in principle you can duplicate it, but I don't know of anyone who has succeeded in a robust manner except for very simple SQL databases. Even when they do, it's a lot of work. Further, doing so is unsupported, and the schema might break somewhere down the line.
The easiest and most robust solution is to write a utility app to read in the existing DB and create the object graph as it goes. You only have to run it once and you've got to create the data model anyway so it doesn't take much time.

Getting MySQL code from an existing database

I have a database (mdb file) that I am currently busy with. I would like to know if it is possible to generate MySQL code that would be used to create this database?
There are a couple of tools you can look at to try to do the conversion.
DataPump
Microsoft DTS (now called SQL Server Integration Services)
Another option is to generate the MySQL code from the Access DB's metadata, which you can read via JDBC, ODBC, ADO.NET, or any other database access technology with metadata support. For this option you need to write a piece of code (a script), so it will only make sense if your Access database has a lot of tables with a lot of columns, or if you are planning to do this task several times.
Of course, using one of the mentioned tools will be faster if it works.
You can certainly write DDL to create and populate a MySQL database from the work you've already done in Microsoft Access. Just put it in a text file that you execute with the mysql command-line client in batch mode and you're all set.
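For the "populate" half of that text file, a small script can turn exported rows into an INSERT batch. A minimal sketch (the table name and rows are hypothetical, and the escaping here is deliberately simplistic; a real client library's escaping should be used in practice):

```javascript
// Sketch: turn exported rows into a MySQL INSERT batch you can save to a .sql
// file and run with the mysql command-line client. Escaping is minimal and
// illustrative only; use a real client library's escaping in practice.
function toInsertSql(table, rows) {
  if (rows.length === 0) return '';
  const cols = Object.keys(rows[0]);
  const esc = (v) =>
    v === null ? 'NULL'
    : typeof v === 'number' ? String(v)
    : `'${String(v).replace(/'/g, "''")}'`;
  const values = rows.map((r) => `(${cols.map((c) => esc(r[c])).join(', ')})`);
  return `INSERT INTO \`${table}\` (${cols.map((c) => `\`${c}\``).join(', ')})\nVALUES\n${values.join(',\n')};`;
}

const insertSql = toInsertSql('contacts', [
  { id: 1, name: "O'Brien" },
  { id: 2, name: 'Smith' },
]);
```

Writing the generated statements to `data.sql` and running something like `mysql mydb < data.sql` loads the data in one batch.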
If you intend to keep developing both, you'll want to think about how you'll keep the two in sync.