I'm working on a project synchronized with a remote repository. I run this command to update the database on my localhost (development environment):
php app/console doctrine:schema:update --force
but I've read in the Doctrine documentation that it shouldn't be executed in a production environment, so now I have to run the SQL statements directly on my production database.
The problem is that I'm not sure which columns I've changed, so I'm looking for a way to find the differences between those databases. Any ideas?
The better option is to use a DB versioning system (doctrine/doctrine-migrations-bundle).
Another way: make a full DB dump into your dev environment, then run
php app/console doctrine:schema:update --dump-sql
and execute the resulting queries on the live database.
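As a rough sketch of that second approach (the host, user, and database names here are placeholders, not from the question):
# Copy the production schema into the dev database first (credentials omitted)
mysqldump --no-data -h prod-host prod_db | mysql dev_db
# Have Doctrine print the SQL needed to bring that schema in line with the entities
php app/console doctrine:schema:update --dump-sql > schema_diff.sql
# Review schema_diff.sql by hand, then apply it to production
mysql -h prod-host prod_db < schema_diff.sql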
My situation is the following:
I have a Django project in a production environment that uses a MySQL database.
I also have a copy of the same Django project in a local environment, which uses a MySQL database too. What I want to do is set up the following workflow to deploy database changes into production (sketched in shell form after the list):
Make changes in local models.
Run makemigrations.
Run migrate.
Add, commit and upload the migration file to production.
Run simply "migrate" in production environment to update the
production database.
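In shell terms, that workflow might look roughly like this (the app name and commit message are placeholders):
# Locally: generate and apply the migration, then commit it
python manage.py makemigrations myapp
python manage.py migrate
git add myapp/migrations/
git commit -m "Schema change for myapp"
git push
# In production: pull the migration file and apply it
git pull
python manage.py migrate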
However, for historical reasons both databases aren't completely synced, and to follow this workflow I need both databases to work with the same migration files and stay in sync.
Any help on how to do that? How can I create this local database fully synced with production?
I tried the following things:
Create an empty local database and simply apply the "migrate" command.
Export the production database schema and create a local database based on it (see the sketch below).
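For reference, the second attempt in shell form (hosts, users, and database names are placeholders):
# Export only the schema (no rows) from production
mysqldump --no-data -h prod-host -u user -p production_db > schema.sql
# Create the local database from that schema
mysql -u user -p local_db < schema.sql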
None of the above worked, because when I try to run the Django server locally it first says that I have 100 migrations unapplied, and when I run migrate it says:
django.db.migrations.exceptions.InvalidBasesError: Cannot resolve bases for []
This can happen if you are inheriting models from an app with migrations (e.g. contrib.auth)
in an app with no migrations; see https://docs.djangoproject.com/en/1.11/topics/migrations/#dependencies for more
Thank you so much.
I need to migrate ejabberd's Mnesia database to MySQL.
I have tried several things from the UI: when I select a node I get many options, one of which is Backup. On that page there is an option "Export all tables as SQL queries to a file: host (0.0.0.0)". I tried to take an SQL backup that way, but the resulting file is empty.
I also tried these commands:
ejabberdctl export2odbc localhost /var/lib/ejabberd/new_file.sql. This also generates a blank file, with no error.
ejabberdctl export2sql localhost /tmp/sql /var/lib/ejabberd/new.sql. This command does not execute, because export2sql does not exist in my version.
Is there any other way to take an SQL dump from Mnesia?
Versions: ejabberd 16.01, MySQL 5.6.xx
The SQL export command was added in 16.04 (named export_sql) and later renamed to export2sql in 16.06, so there's no way to take a dump directly on your version. You have two alternatives:
If you can upgrade ejabberd, it's straightforward: upgrade the server and take the SQL dump (see the sketch after these steps).
Take a backup of the relevant folders, like the database/spool directory, the config directory, etc.
Upgrade the server to the latest version, or at least to 17.07 (since version 17.06 most of the tables can be exported to an SQL file, but 17.03-17.06 suffer from a bug).
Configure ejabberd to use MySQL as the backend database.
Make sure the following modules have the db_type: sql option:
mod_announce, mod_caps, mod_irc, mod_last, mod_muc, mod_offline, mod_privacy, mod_private, mod_pubsub, mod_roster, mod_shared_roster, mod_vcard, mod_vcard_xupdate
Restore the spool directory and make sure you have the same permissions for all the files and sub-directories as before.
Run export2sql with the host and the SQL filename as parameters.
Note: if you only need the SQL dump, you might want to revert the configuration afterwards.
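Put together, the upgrade path might look roughly like this (the paths and host are placeholders, and the MySQL settings in ejabberd.yml are assumed to be configured already):
# Back up the Mnesia spool and config directories first
cp -a /var/lib/ejabberd /var/lib/ejabberd.bak
cp -a /etc/ejabberd /etc/ejabberd.bak
# ...upgrade ejabberd to >= 17.07 and set db_type: sql for the modules listed above...
# Then export the Mnesia tables to an SQL file
ejabberdctl export2sql example.com /tmp/ejabberd-dump.sql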
If you cannot upgrade the server in place, you can install the latest version of ejabberd on another machine, copy the database directory over, follow the same procedure as above, and take the SQL dump there.
I'm working on changing my environment from Vagrant to Docker and I came across one hitch. With Vagrant I have a bash file that pulls data from FTP and restores it locally, so that I can work with the most up-to-date data.
This is my code:
php artisan db:restore --database=mysql --source=ftp --sourcePath=$(date +'%Y')"/"$(date +'%m')"/"$(date +%m-%d-%Y -d "yesterday")".gz" --compression=gzip
php artisan migrate
Inside my work container I run this, and it won't find the mysql command because MySQL is in a different container. What can I do to fix my workflow?
This answer is about Rails migrations, but you could take a similar approach with Laravel. Create a container that has the required Laravel tools and the required MySQL tools (likely this) to connect to the database.
You could then combine it with a suggestion in this answer and create a migration script that you add to your image. As larsks points out in the comments, this lets you encapsulate the logic for where to get the backup data and how to restore it in one place. Assuming you named this script restore, you could run your restore command like this:
docker run myimage restore
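A minimal sketch of such a restore script (the image name above and the script's contents are assumptions; it simply wraps the commands from the question):
#!/bin/sh
# restore: pull yesterday's dump from FTP and bring the schema up to date
set -e
php artisan db:restore --database=mysql --source=ftp \
    --sourcePath="$(date +'%Y')/$(date +'%m')/$(date +%m-%d-%Y -d yesterday).gz" \
    --compression=gzip
php artisan migrate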
I'm working on a site with the latest version of Laravel, and we have recently moved the MySQL DB from a remote host to a local host. After moving the data and editing the config/database.php file to reflect the change, I tried to run php artisan migrate, but I'm not getting anything back.
There is no response on the terminal and no results in the new database. I've also tried migrate:refresh and migrate:status.
How do I re-run the migrations and get the tables in the new database?
First drop your database table or tables, and then execute
php artisan migrate:refresh
Hopefully it will work for you.
You can just export the DB to a file and import it on the local host; you do not need to run the php artisan migrate command.
You do not want to run migrations if you already have a DB with tables and data.
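A minimal sketch of that export/import (hosts, user, and database names are placeholders):
# Dump the old remote database to a file
mysqldump -h old-remote-host -u user -p app_db > app_db.sql
# Load it into the new local database
mysql -h 127.0.0.1 -u user -p app_db < app_db.sql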
For years I've used an ssh pipe from mysqldump on the live server to mysql on my development machine to get a copy of the current data:
ssh -C <server> mysqldump --opt <live_database_name> | mysql <local_dev_database_name>
where -C enables ssh compression and --opt enables quickness and completeness.
Does anyone have a Rails-ish equivalent rake task for this? Ideally it would take the database names from config/database.yml.
https://gist.github.com/750129
This is not an elegant solution: it's basically a wrapper around your old method, so it's not even compatible with other database drivers.
But it is something you can put in your SCM under lib/tasks to share with the other developers on your team. It also uses config data from your existing config/database.yml file: you define the live DB by simply adding another branch to that file, using the same key names that Rails does.
Maybe it would even make sense to reuse the production database configuration.
Here's one I use for a Postgres database: https://gist.github.com/748222.
There are three tasks: db:download, db:replace and db:restore. db:restore is just a wrapper around the other two.
I'd say you could put together something similar for MySQL pretty quickly as well (see the sketch below). In this case I just use the latest backup instead of creating one at runtime.
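For MySQL, the download/replace pair might boil down to something like this (the server and database names are placeholders):
# db:download: fetch a compressed dump of the live database
ssh live.example.com "mysqldump --opt live_db | gzip" > tmp/live.sql.gz
# db:replace: load it into the local development database
gunzip -c tmp/live.sql.gz | mysql dev_db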
If you are fine with coding in Ruby, you might want to look into seed-fu and activerecord-import.
Oh, I forgot to mention standalone-migrations. It comes with a Rake task for schema migrations.
Good luck!
Instead of using rake to do this, I'd use a Capistrano task, as your Capistrano tasks already know where the production server is, etc.
Theoretically you should be able to create another "instance" in database.yml (for your live server) and put the correct host there (leaving the rest, username/password etc., as-is). In your rake task you would load the database YAML file, read the host, and go on with your command-line slurpiness. It could easily be wrapped up in a rake task, as in the sketch below.
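A rough shell equivalent of that idea, pulling the connection details out of config/database.yml with Ruby one-liners (the 'live' entry is an assumption; the file must define one):
# Read host and database names from config/database.yml
LIVE_HOST=$(ruby -ryaml -e "puts YAML.load_file('config/database.yml')['live']['host']")
LIVE_DB=$(ruby -ryaml -e "puts YAML.load_file('config/database.yml')['live']['database']")
DEV_DB=$(ruby -ryaml -e "puts YAML.load_file('config/database.yml')['development']['database']")
# Same ssh pipe as before, now driven by the config file
ssh -C "$LIVE_HOST" mysqldump --opt "$LIVE_DB" | mysql "$DEV_DB"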