I am planning to use Postgres as my database and Sequelize as my ORM framework in one of my NodeJS projects. I need your suggestions on how to accomplish database table changes in a production environment.
For example, we deployed our Version-1 application with some database schema.
Later, the schema changes in Version-2. Say one of the tables is altered: a few columns are dropped and a few columns are added.
Also, one of the tables is no longer required at all in Version-2.
In production, I have a database with data. When I install Version-2 of my NodeJS application, I need to keep the old data as it is and apply the new Version-2 schema.
How can I achieve this? Please point me to some references.
Thanks
Sequelize (or rather, sequelize-cli) has built-in support for migrations.
You can write migrations using the Sequelize JS API (e.g. queryInterface.dropColumn('users', 'address')) or write raw SQL and pass it to sequelize.query().
Related
I just recently tried using Talend Cloud Data Integration (Talend Studio 7.2),
and I'm figuring out how to dynamically migrate a whole MSSQL Server database to MySQL (blank DB).
So far, I have heard that it's possible to do that with Talend's Dynamic Schema.
However, I only found a tutorial for a single table's (data) migration, and some posts that didn't specifically explain much about the settings and components to use.
Here's my job design
I'm using a global variable to dynamically define the table; however, I'm not sure how I can define the schema dynamically.
** I have about 100 tables in each database (MSSQL), and they all have different numbers and types of columns, which is not applicable to the tutorial written here.
Can you let me know how I can define my schema dynamically?
I am hoping to find a solution where one job could perform a whole database migration, e.g. iterating over the tables in a source DB and generating output in a destination DB.
Please let me know whether I should proceed with Dynamic Schema or Job Template, and the job design/components/settings needed.
Thank you
We are migrating our existing code base to a new, more robust code base for better performance and reliability. We have a MySQL database with current data. We have now modified the entities in our Spring Boot application, which changes the schema for the new database structure. I am searching for a tool that will help me migrate all the data from the old MySQL database to a newly created MySQL database, transformed according to the latest schema design. I think I will have to write some code to match the new database architecture, as no tool will do that refactoring according to my requirements. Which tool would help me achieve this?
Footnotes:
I am working in a microservice architecture.
I have integrated liquibase with maven plugin support.
I have seen Apache Spark and ETL, but they need
Please provide feedback if you have any relevant experience.
We have migrated the data of about 3000 users from roughly 160 tables (SQL) to roughly 75 tables (MySQL) as per the new schema.
Based on our experience, I can suggest the following:
1. Prepare a mapping sheet of columns, table by table.
2. Migrate the data into temporary tables.
3. Test the temporary tables' data against the old data, comparing each table column by column. You can use an ETL tool or Excel for the comparison.
4. If no bugs are found, write a stored procedure or script for the actual migration.
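To illustrate steps 1 and 2, here is a small sketch (in Node, with made-up table and column names) of how one row of such a mapping sheet can be turned into an `INSERT ... SELECT` statement that fills a temporary table:

```javascript
// Sketch: one mapping-sheet entry describes how an old table's columns
// land in a new (temporary) table. All names here are illustrative.
function buildMigrationSql(mapping) {
  // mapping = { sourceTable, targetTable, columns: { targetCol: sourceCol } }
  const targetCols = Object.keys(mapping.columns);
  const sourceCols = targetCols.map((c) => mapping.columns[c]);
  return (
    `INSERT INTO tmp_${mapping.targetTable} (${targetCols.join(', ')}) ` +
    `SELECT ${sourceCols.join(', ')} FROM ${mapping.sourceTable}`
  );
}

// Example: old `tbl_user` mapped into the new `users` table
const sql = buildMigrationSql({
  sourceTable: 'tbl_user',
  targetTable: 'users',
  columns: { id: 'user_id', full_name: 'name', email: 'email_address' },
});
console.log(sql);
// INSERT INTO tmp_users (id, full_name, email) SELECT user_id, name, email_address FROM tbl_user
```

Generating the SQL from the mapping sheet keeps step 3 honest: the same sheet that drives the migration is the one you verify against, column by column.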
I am converting a project and looking for a way to connect to a DB schema, read it, and generate knexJS migration files to integrate into the new project.
Is there a way to do this, and how?
No, there is no way to extract a migration file for the current schema of a database. If you really want one, it must be written manually.
But you don't even need one. You can just start using knex with your old database dump, and in the future, when changes to the schema are needed, you can write new migrations for those specific incremental changes.
For running migrations, you should use knex's migration functionality.
I have a NodeJS project running with a mongodb database (using mongoose).
Due to a technological constraint, I need to migrate the app from mongodb to mysql. Is there a way to migrate to mysql without having to rewrite all the mongoose model files?
PS: although I'm using mongodb, my queries mostly don't touch the nested documents (I query only by ID or by some first-level attribute), so putting a nested document into a field of a mysql table should still be fine.
I would suggest letting your application run with Mongo for now. Meanwhile, write a wrapper for MySQL that translates your Mongo queries to MySQL, and switch to that wrapper once it's done. Then write another wrapper for Mongo, just in case you need to switch back.
Try to keep all your database-specific function calls in the wrapper, so that you won't need to do this again and again: just write a new wrapper for whatever database you will use and switch.
And you'll probably need to run some sort of job to migrate your data from Mongo to MySQL.
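A minimal sketch of such a wrapper (the class name, table layout, and the handful of supported query shapes are illustrative assumptions, not a complete mongoose replacement): it keeps the Mongo-style call sites intact while producing parameterised SQL underneath.

```javascript
// Sketch of a MySQL wrapper exposing Mongo-style calls. Illustrative only.
class SqlModelWrapper {
  constructor(table, executor) {
    this.table = table;       // MySQL table backing the old collection
    this.executor = executor; // function that runs (sql, params), e.g. via a MySQL driver
  }

  // Mongo's Model.findById(id) -> SELECT by primary key
  findById(id) {
    return this.executor(`SELECT * FROM ${this.table} WHERE id = ?`, [id]);
  }

  // Mongo's Model.find({ field: value }) for first-level attributes only
  find(filter) {
    const keys = Object.keys(filter);
    const where = keys.map((k) => `${k} = ?`).join(' AND ');
    const sql = `SELECT * FROM ${this.table}` + (keys.length ? ` WHERE ${where}` : '');
    return this.executor(sql, keys.map((k) => filter[k]));
  }
}
```

The nested documents mentioned in the question could be kept in a single JSON column and (de)serialised inside the wrapper, since the queries only touch first-level attributes.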
I am using Entity Framework to handle the database in a server application. The server application is deployed to multiple nodes sitting behind a load balancer. I have now discovered the need to rename some of the tables and some of the table columns in the database. The problem is that I will need to do this without any down-time.
If I were writing my own database layer, I could simply check the structure of the database and run different code depending on whether the database structure had been upgraded or not, and then I could potentially have 0% downtime.
Is it possible to do something like this in Entity Framework 4, or how do I actually make database changes in production with Entity Framework? I have searched the web for a solution, but without success. I have also considered having two Entity Framework instances in my app, but the amount of code I would have to write to switch between them would be far too large and error-prone.
So to summarize the question:
Is there any way to rename tables and columns in a database in a production environment using Entity Framework 4 and without any downtime whatsoever?
EDIT
A new idea I had is to create a view, with the old naming scheme, in place of the old table, referencing the new table. I don't know if this will work: for it to work, Entity Framework needs to accept this view as if it were a normal table. Could this be a good idea? Has anyone tried anything like this with EF4?
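If the view route is taken, the idea would look something like this in SQL (all names here are placeholders; note that a view over a single table with no aggregates or joins is typically updatable, which Entity Framework would need for inserts and updates to keep working):

```sql
-- Sketch of the view idea: the table has been renamed, and a view with the
-- old name keeps not-yet-upgraded application nodes working.
CREATE VIEW Customers AS        -- old name the v1 model still maps to
SELECT
    Id,
    FullName AS Name            -- new column exposed under its old name
FROM CustomerAccounts;          -- renamed table used by the v2 model
```

Once every node is running the v2 model, the view can simply be dropped.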
Okay! Here is the thought I had:
Create the new table in the DB rather than renaming the old one, and build the Entity Framework solution against it.
Deploy this by taking one server off the load balancer, updating it, pointing the Entity Framework code to the new table structure, and then attaching it to the load balancer again.
Follow the same procedure for every server in the load balancer, one by one. Once all of them are up with the updated patch, sync the old table with the new table and then delete the old table. I suggest syncing these tables so that you keep any data that might otherwise be lost during the process above.