Are migration files a must in Sequelize? - mysql

I started my project with sequelize-cli by running the init command,
but I deleted the migrations and seeders folders and started creating the models.
I first need to create a DB in Workbench, and then run db.sync() from Sequelize, which creates the tables in the DB. It's running more or less fine.
Can we complete the whole project without using migrations and deploy it to a production environment? Would not using migrations be a disadvantage?

Migrations help you run different versions of your app and DB in different production environments, change your DB structure over time, and add some initial data precisely.
The sync method would be dangerous in such scenarios with different versions, for example when you occasionally run a newer version of your app against an older DB.
If you have a simple DB structure that won't change, or that will be the same on all production environments, then you can use sync.
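For contrast with db.sync(), here is a minimal sketch of what an equivalent sequelize-cli migration could look like; the table name Users and the migration file name are made up for illustration.

// migrations/20200101000000-create-users.js (hypothetical file name)
'use strict';

module.exports = {
  // Applied with `npx sequelize-cli db:migrate`; applied file names are
  // recorded in the SequelizeMeta table, which gives the DB a version history
  up: async (queryInterface, Sequelize) => {
    await queryInterface.createTable('Users', {
      id: { type: Sequelize.INTEGER, autoIncrement: true, primaryKey: true },
      email: { type: Sequelize.STRING, allowNull: false },
      createdAt: { type: Sequelize.DATE, allowNull: false },
      updatedAt: { type: Sequelize.DATE, allowNull: false }
    });
  },

  // Reverted with `npx sequelize-cli db:migrate:undo`; sync() has no
  // equivalent rollback step
  down: async (queryInterface) => {
    await queryInterface.dropTable('Users');
  }
};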

Related

What does rake db:schema:cache:dump actually do? Is it safe to run in production?

Rails has an issue with big schemas: it becomes slower over time as the app grows. To fix this, the schema_cache file was introduced; it is enabled by default and gets updated with every migration.
We can also create and update the file manually using rake db:schema:cache:dump.
Finally, my questions:
What query does it actually run inside the DB?
Does it lock the DB while performing the query?
Finally, is it safe to run in production?
Thank you
After enabling query logging on my DB and running the command, it turns out it just runs
show full fields from table_name
against all the tables in the schema.
The command can be executed in production, and it creates db/schema_cache.yml; mainly, it serializes schema-related data into this file. In a production environment with multiple servers, this file would typically be shared across them to avoid DB calls for
show full fields
These queries are run by the Rails framework when the servers boot; with this file in place, the queries are no longer needed and the YAML file serves the data instead.

Django - recreate database from migration files | Sync production and development databases

My situation is the following:
I have a Django project in a production environment which uses a MySQL database.
Also, I have a copy of the same Django project in a local environment which uses a MySQL database too. What I want to do is set up the following workflow to deploy database changes into production:
Make changes in local models.
Run makemigrations.
Run migrate.
Add, commit and upload the migration file to production.
Simply run "migrate" in the production environment to update the production database.
Actually, for some past reason, both databases aren't completely synced, and to follow this workflow I need both databases to work with the same migration files and stay synced.
Any help on how to do that? How can I create this local database fully synced with production?
I tried the following things:
Create an empty local database and simply apply the "migrate" command.
Export the production database schema and create a local database based on it.
None of the above worked, because when I try to run the Django server locally, it first says that I have 100 unapplied migrations, and when I run migrate it says...
django.db.migrations.exceptions.InvalidBasesError: Cannot resolve bases for []
This can happen if you are inheriting models from an app with migrations (e.g. contrib.auth)
in an app with no migrations; see https://docs.djangoproject.com/en/1.11/topics/migrations/#dependencies for more
Thank you so much.

Heroku Rails mySql (mysql2 gem) migrations

I've been able to make migrations (add columns, etc.) on my local machine with MySQL.
But when trying to push these migrations to Heroku, they continue to fail.
The first table in my migration file gets tagged with:
Mysql2::Error: Table 'xxxxx' already exists
Locally, all my migrations show as:
up 20171127214206 Add tags to business
but when running heroku run rake db:migrate:status, it shows:
down 20171127214206 Add tags to business
I don't mind losing all the data at this point, as I'm working in a development version on Heroku and will merge later with production.
I've been working on this issue for over a day, so any and all advice is appreciated.
According to this and many other sources, it appears you're better off using the postgresql adapter with Heroku. If you want to do that, you need to edit your database.yml to use postgresql in your production environment.

Sails.js: Production env + sails-mysql - database tables not created upon lift

In production mode, when lifting a Sails application, the database tables are not created upon lift, while in dev mode, they are. Right now, when deploying, I'm running in dev mode once first so that the tables can be created and then running in prod mode. Is there a way around this?
No; this is by design. In the production environment, Sails does not do any migration to ensure that data isn't corrupted or lost when lifting.
From the Sails deployment guide:
Sails sets all your models to migrate:safe when run in production, which means no auto-migrations are run on starting up the app. You can set your database up the following way: Create the database on the server and then run your sails app with migrate:alter locally, but configured to use the production server as your db. This will automatically set things up. In case you can't connect to the server remotely, you'll simply dump your local schema and import it into the database server.
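As a rough illustration, the migrate setting lives in your model configuration; the snippet below is only a sketch (shown as config/models.js), and the exact file and options depend on your Sails version.

// config/models.js (a minimal sketch of the migrate setting)
module.exports.models = {
  // 'safe'  : never auto-migrate (what Sails forces in production)
  // 'alter' : try to adjust existing tables to match your models on lift
  // 'drop'  : drop and recreate all tables on every lift (destroys data)
  migrate: 'alter'
};

Pointing this local configuration at the production database through your connection/datastore settings is what the guide means by running locally "configured to use the production server as your db".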

Delivering a modified database from the local env to the production one

I'm working with MySQL databases.
To simplify the problem, let's say I have two environments: the local one (development) and the remote one (production).
In the database, I have some tables that contain configuration data.
How can I cleanly automate the delivery from development to production when I modify the database schema and the content of the configuration tables?
For now, I do it manually by diffing the local and remote databases, but I find that method not very clean, and I believe there is a good practice for doing this.
This might be helpful in cases where you have multiple environments and multiple developers making schema changes very often, and you are using PHP: https://github.com/davejkiger/mysql-php-migrations
Introduce a "version" parameter for your database. This version should be written somewhere in your code and somewhere in your database. Your code will work with the database only if they have equal versions.
Create a wrapper around your MySQL connection. This wrapper should check the versions, and if they are not compatible, it should start an upgrade.
An "upgrade" is the process of sequentially applying a list of *.sql files containing the SQL commands that move your database from one state to another. These can be schema changes or data-manipulation commands.
When you change something in the database, do it only by adding a new *.sql file and incrementing the version.
As a result, when you deploy your database from the development environment to production, your database will be upgraded automatically, in the same way it was upgraded during development.
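Here is a minimal sketch of that idea in Node.js, assuming a hypothetical schema_version table, a migrations directory of numbered files (001.sql, 002.sql, ...), and the mysql2 driver; the names and the missing error handling are simplifications for illustration.

const fs = require('fs');
const path = require('path');
const mysql = require('mysql2/promise');

const APP_DB_VERSION = 3;                                   // version the code expects
const MIGRATIONS_DIR = path.join(__dirname, 'migrations');  // 001.sql, 002.sql, ...

async function upgradeIfNeeded(conn) {
  // Read the "version" parameter stored in the database
  const [rows] = await conn.query('SELECT version FROM schema_version LIMIT 1');
  if (!rows.length) {
    await conn.query('INSERT INTO schema_version (version) VALUES (0)');
  }
  let dbVersion = rows.length ? rows[0].version : 0;

  // Apply the *.sql files sequentially until the DB version matches the code
  while (dbVersion < APP_DB_VERSION) {
    const next = dbVersion + 1;
    const file = path.join(MIGRATIONS_DIR, String(next).padStart(3, '0') + '.sql');
    const sql = fs.readFileSync(file, 'utf8');
    await conn.query(sql);
    await conn.query('UPDATE schema_version SET version = ?', [next]);
    dbVersion = next;
  }
}

async function main() {
  const conn = await mysql.createConnection({
    host: 'localhost', user: 'app', password: 'secret', database: 'app',
    multipleStatements: true // each *.sql file may contain several statements
  });
  await upgradeIfNeeded(conn);
  await conn.end();
}

main();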
I've seen LiquiBase http://www.liquibase.org/ a lot in Java environments.
In most of my projects I use SQLAlchemy (a Python tool to manage databases, plus an ORM). If you have some experience with Python (a little more than beginner level), I highly recommend using it.
You can check out this tool with a little help from that one. It is also very useful for migrating your DB to another RDBMS (for example, MySQL to PostgreSQL or Oracle).