I have a table 'test' with a column 'Name' that has no constraints. I need to ALTER this column to give it a UNIQUE constraint. How should I do it?
Should I use op.alter_column('???') or create_unique_constraint('???')?
Isn't create_unique_constraint for a new column, not an existing one?
To add, you'd need:
https://alembic.sqlalchemy.org/en/latest/ops.html#alembic.operations.Operations.create_unique_constraint
from alembic import op
op.create_unique_constraint('uq_user_name', 'user', ['name'], schema='my_schema')
To drop, you'd need:
https://alembic.sqlalchemy.org/en/latest/ops.html#alembic.operations.Operations.drop_constraint
op.drop_constraint('uq_user_name', 'user', schema='my_schema')
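Put together, a complete Alembic revision file using these two operations might look like the sketch below (the revision identifiers are placeholders, and the schema argument is omitted for a default-schema database):
from alembic import op

# revision identifiers, used by Alembic (placeholder values)
revision = '1234abcd'
down_revision = None

def upgrade():
    # Add the UNIQUE constraint to the existing 'name' column.
    op.create_unique_constraint('uq_user_name', 'user', ['name'])

def downgrade():
    # Drop the constraint again so the migration is reversible.
    # type_='unique' is needed on MySQL (see the note further down).
    op.drop_constraint('uq_user_name', 'user', type_='unique')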
Note: this answer uses SQLAlchemy Migrate (sqlalchemy-migrate), version 0.7.3, not Alembic.
To add a unique constraint, call create() on a UniqueConstraint; to remove one, call drop() on a UniqueConstraint.
Create a migration script. The script can be created in two ways.
# create manage.py
migrate manage manage.py --repository=migrations --url=postgresql://<user>:<password>@localhost:5432/<db_name>
# create the script file
python manage.py script "Add Unique Constraints"
Or, if you don't want to create manage.py, use the command below:
migrate script --repository=migrations --url=postgresql://<user>:<password>@localhost:5432/<db_name> "Add Unique Constraints"
It will create 00x_Add_Unique_Constraints.py.
File: 00x_Add_Unique_Constraints.py
from migrate import UniqueConstraint
from sqlalchemy import MetaData, Table

def upgrade(migrate_engine):
    # Upgrade operations go here. Don't create your own engine; bind
    # migrate_engine to your metadata.
    # Table Name: user_table
    # Column Name: first_name
    metadata = MetaData(bind=migrate_engine)
    user_table = Table('user_table', metadata, autoload=True)
    UniqueConstraint(user_table.c.first_name, table=user_table).create()

def downgrade(migrate_engine):
    # Operations to reverse the above upgrade go here.
    # Table Name: user_table
    # Column Name: first_name
    metadata = MetaData(bind=migrate_engine)
    user_table = Table('user_table', metadata, autoload=True)
    UniqueConstraint(user_table.c.first_name, table=user_table).drop()
Following Mario Ruggier's answer, I tried his example code on my MySQL database. I left out the schema arguments because my database doesn't use a schema.
To create the unique constraint, I used:
from alembic import op
op.create_unique_constraint('uq_user_name', 'user', ['name'])
and to drop it:
op.drop_constraint(constraint_name='uq_user_name', table_name='user', type_='unique')
Notice that I passed a third argument, type_='unique'; without it, MySQL returns an error message along the lines of
No generic 'DROP CONSTRAINT' in MySQL - please specify constraint type ...
I have a model:
class MyModel(models.Model):
    class Meta:
        unique_together = ["a", "b"]
        index_together = ["a", "b"]
    a = models.IntegerField(null=True, blank=True)
    b = models.ForeignKey("othermodel")
Migrations for this model:
class Migration(migrations.Migration):

    dependencies = [
        ('app', 'previous_migration'),
    ]

    operations = [
        migrations.AlterUniqueTogether(
            name='mymodel',
            unique_together=set([('a', 'b')]),
        ),
        migrations.AlterIndexTogether(
            name='mymodel',
            index_together=set([('a', 'b')]),
        ),
    ]
./manage.py sqlmigrate app mymigration
BEGIN;
CREATE INDEX `app_mymodel_id_asdfasfd_idx` ON `app_mymodel` (`a`, `b`);
COMMIT;
And I am using a MySQL database.
Django (1.8.5) now creates an index over both fields together, but of type INDEX rather than UNIQUE, so saving a duplicate does not raise the expected IntegrityError. Manually changing the index gives the correct behaviour.
With just the AlterUniqueTogether migration, I get empty output from ./manage.py sqlmigrate.
How do I tell Django to create a UNIQUE index? Or is there a good reason why the created index is not set up this way?
With Django 1.8 and a fresh installation of the app via syncdb, all the needed indexes are created. I currently cannot reproduce the issue: the old installation has the index created manually, and fresh installations create it correctly both with syncdb and with migrate.
I have not tested more recent Django versions yet, but I assume they work as well.
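If an old installation is still missing the index, one workaround (a sketch, not an official fix; the index name app_mymodel_a_b_uniq and the dependency are placeholders) is to create the UNIQUE index yourself with a RunSQL migration:
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('app', 'previous_migration'),  # placeholder
    ]

    operations = [
        migrations.RunSQL(
            # Create the UNIQUE index that unique_together should have produced
            sql="CREATE UNIQUE INDEX `app_mymodel_a_b_uniq` ON `app_mymodel` (`a`, `b`);",
            reverse_sql="ALTER TABLE `app_mymodel` DROP INDEX `app_mymodel_a_b_uniq`;",
        ),
    ]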
I have a table defined in my app like this:
users = Table('users', metadata,
    Column('user_id', Integer, autoincrement=True, primary_key=True),
    Column('user_name', Unicode(16), unique=True, nullable=False),
    Column('email_address', Unicode(255), unique=True, nullable=False),
    Column('display_name', Unicode(255)),
    Column('password', Unicode(80)),
    Column('created', DateTime, default=datetime.now),
    mysql_engine='InnoDB',
    mysql_charset='utf8',
)
However, after developing for a while, I want to change user_name to a longer length, such as Unicode(255). As I understand it, this definition is only applied at first start-up, so just changing this line wouldn't affect existing databases; they need to be migrated to the new definition. How do I go about converting already-created databases to the new, desired definition?
You are correct that updating your code will not update an existing database schema. You can use Alembic to auto-generate and run schema migrations. Alembic can auto-generate change scripts for you by comparing the schema of your newly edited metadata with the schema from your database. Start here: http://alembic.readthedocs.org/en/latest/
Edit alembic/env.py and modify your configuration to include compare_type=True:
def run_migrations_online():
    ...
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            compare_type=True,  # <------------- THIS LINE HERE
        )
    ...
Disclosure: I got this solution from this helpful guide.
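For the user_name change above, the workflow would then be roughly: run alembic revision --autogenerate -m "widen user_name", review the generated file, and run alembic upgrade head. A sketch of what the generated migration should contain (the exact types Alembic emits may differ by dialect):
import sqlalchemy as sa
from alembic import op

def upgrade():
    # Widen the column from the old length to the new one.
    op.alter_column('users', 'user_name',
                    existing_type=sa.Unicode(16),
                    type_=sa.Unicode(255),
                    existing_nullable=False)

def downgrade():
    # Narrow it back; may truncate data on some databases.
    op.alter_column('users', 'user_name',
                    existing_type=sa.Unicode(255),
                    type_=sa.Unicode(16),
                    existing_nullable=False)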
I need to add a FULLTEXT index to one of my Django model's fields. I understand that there is no built-in functionality for this and that such an index must be added manually in MySQL (our back-end DB).
I want this index to be created in every environment. I understand model changes can be handled with Django South migrations, but is there a way to add such a FULLTEXT index as part of a migration?
In general, if there is any custom SQL that needs to be run, how can I make it part of a migration?
Thanks.
You can write anything as a migration. That's the point!
Once you have South up and running, run python manage.py schemamigration myapp --empty my_custom_migration to create a blank migration that you can customize.
Open the XXXX_my_custom_migration.py file in myapp/migrations/ and write your custom SQL migration in the forwards method, for example using db.execute.
The migration might look something like this:
from south.db import db
from south.v2 import SchemaMigration

class Migration(SchemaMigration):

    def forwards(self, orm):
        db.execute("CREATE FULLTEXT INDEX foo ON bar (foobar)")
        print "Just created a fulltext index..."
        print "And calculated {answer}".format(answer=40 + 2)

    def backwards(self, orm):
        raise RuntimeError("Cannot reverse this migration.")
        # or what have you
$ python manage.py migrate myapp XXXX  # or just: python manage.py migrate
Just created a fulltext index...
And calculated 42
In newer versions of Django, you can create an empty migration to run custom SQL: python3 manage.py makemigrations --empty app_name
Then, in the generated migration:
from django.db import migrations

class Migration(migrations.Migration):

    operations = [
        migrations.RunSQL(
            sql="CREATE FULLTEXT INDEX `index_name` ON table_name (`column_name`);",
            reverse_sql="ALTER TABLE table_name DROP INDEX index_name;",
        ),
    ]
I want to programmatically generate ALTER TABLE statements in SQLAlchemy to add a new column to a table. The column to be added should take its definition from an existing mapped class.
So, given a SQLAlchemy Column instance, can I generate the SQL schema definition(s) I would need for ALTER TABLE ... ADD COLUMN ... and CREATE INDEX ...?
I've played at a Python prompt and been able to see a human-readable description of the data I'm after:
>>> DBChain.__table__.c.rName
Column('rName', String(length=40, convert_unicode=False, assert_unicode=None, unicode_error=None, _warn_on_bytestring=False), table=<Chain>)
When I call engine.create_all() the debug log includes the SQL statements I'm looking to generate:
CREATE TABLE "Chain" (
...
"rName" VARCHAR(40),
...
)
CREATE INDEX "ix_Chain_rName" ON "Chain" ("rName")
I've heard of sqlalchemy-migrate, but that seems to be built around static changes, and I'm looking to dynamically generate schema changes.
(I'm not interested in defending this design, I'm just looking for a dialect-portable way to add a column to an existing table.)
After tracing engine.create_all() with a debugger I've discovered a possible answer:
>>> engine.dialect.ddl_compiler(
... engine.dialect,
... DBChain.__table__.c.rName ) \
... .get_column_specification(
... DBChain.__table__.c.rName )
'"rName" VARCHAR(40)'
The index can be created with:
import sqlalchemy as sa

sColumnElement = DBChain.__table__.c.rName
if sColumnElement.index:
    sIndex = sa.schema.Index(
        "ix_%s_%s" % (rTableName, sColumnElement.name),  # rTableName holds the table's name
        sColumnElement,
        unique=sColumnElement.unique)
    sIndex.create(engine)
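To actually apply the change, the generated fragment can be spliced into a plain ALTER TABLE statement (a sketch; identifier quoting is dialect-specific, so inspect the string before executing it):
# Reuse the compiler trick from above to build the column fragment
sSpec = engine.dialect.ddl_compiler(
    engine.dialect,
    DBChain.__table__.c.rName).get_column_specification(
        DBChain.__table__.c.rName)
# sSpec is now '"rName" VARCHAR(40)'
engine.execute('ALTER TABLE "Chain" ADD COLUMN %s' % sSpec)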
I'm getting an error when I try to dump data to a JSON fixture in Django 1.2.1 on my live server. The live server runs MySQL Server version 5.0.77, and I imported a lot of data into my tables through the phpMyAdmin interface. The website works fine and the Django admin responds as normal. But when I try to actually dump the data of the application that corresponds to the tables, I get this error:
$ python manage.py dumpdata --indent=2 gigs > fixtures/gigs_100914.json
/usr/local/lib/python2.6/site-packages/MySQLdb/__init__.py:34: DeprecationWarning: the sets module is deprecated
from sets import ImmutableSet
Error: Unable to serialize database: Location matching query does not exist.
My Django model for 'gigs' that I'm trying to dump from looks like this in the models.py file:
from datetime import datetime
from django.db import models

class Location(models.Model):
    name = models.CharField(max_length=120, blank=True, null=True)

    class Meta:
        ordering = ['name']

    def __unicode__(self):
        return "%s (%s)" % (self.name, self.pk)

class Venue(models.Model):
    name = models.CharField(max_length=120, blank=True, null=True)
    contact = models.CharField(max_length=250, blank=True, null=True)
    url = models.URLField(max_length=60, verify_exists=False, blank=True, null=True)  # because of single thread problems, I left this off (http://docs.djangoproject.com/en/dev/ref/models/fields/#django.db.models.URLField.verify_exists)

    class Meta:
        ordering = ['name']

    def __unicode__(self):
        return "%s (%s)" % (self.name, self.pk)

class Gig(models.Model):
    date = models.DateField(blank=True, null=True)
    details = models.CharField(max_length=250, blank=True, null=True)
    location = models.ForeignKey(Location)
    venue = models.ForeignKey(Venue)

    class Meta:
        get_latest_by = 'date'
        ordering = ['-date']

    def __unicode__(self):
        return u"%s on %s at %s" % (self.location.name, self.date, self.venue.name)
Like I say, Django is fine with the data. The site works fine and the relationships seem to operate absolutely fine. When I run the command to see what SQL Django is using:
$ python manage.py sql gigs
/usr/local/lib/python2.6/site-packages/MySQLdb/__init__.py:34: DeprecationWarning: the sets module is deprecated
from sets import ImmutableSet
BEGIN;CREATE TABLE `gigs_location` (
`id` integer AUTO_INCREMENT NOT NULL PRIMARY KEY,
`name` varchar(120)
)
;
CREATE TABLE `gigs_venue` (
`id` integer AUTO_INCREMENT NOT NULL PRIMARY KEY,
`name` varchar(120),
`contact` varchar(250),
`url` varchar(60)
)
;
CREATE TABLE `gigs_gig` (
`id` integer AUTO_INCREMENT NOT NULL PRIMARY KEY,
`date` date,
`details` varchar(250),
`location_id` integer NOT NULL,
`venue_id` integer NOT NULL
)
;
ALTER TABLE `gigs_gig` ADD CONSTRAINT `venue_id_refs_id_3d901b6d` FOREIGN KEY (`venue_id`) REFERENCES `gigs_venue` (`id`);
ALTER TABLE `gigs_gig` ADD CONSTRAINT `location_id_refs_id_2f8d7a0` FOREIGN KEY (`location_id`) REFERENCES `gigs_location` (`id`);COMMIT;
I've triple-checked the data and gone through to make sure all the relationships and data are OK after importing. But I'm still getting this error, three days on... I'm stuck on what to do about it. I can't imagine the DeprecationWarning is the problem here. I really need to dump this data back out as JSON.
Many thanks for any help at all.
Could be something similar to this.
Run it with:
python manage.py dumpdata --indent=2 -v 2 --traceback gigs
To see the underlying error.
I once ran into a similar problem where the error message was as mystifying as yours. The cause was a lack of memory on my server. It seems that generating JSON dumps is quite memory-expensive. I had only 60 MB of memory (at djangohosting.ch), and that was not enough to dump a MySQL DB whose plain mysqldump was only 1 MB.
I was able to find this out by watching the Python process hit the 60 MB limit with the top command in a second terminal while running manage.py dumpdata in the first.
My solution: take the MySQL dump and load it on my desktop PC before generating the JSON dump. That said, for backup purposes, the MySQL dumps are enough.
The command to take a MySQL dump is the following (a bare -p prompts for the password; to pass it inline, use -p[password] with no space):
mysqldump -u [username] -p [database_name] > [dump_file_name].sql
That said, your problem could be completely different. You should really look at every table that has a foreign key to your Location table and check whether any row points to a previously deleted location. Unfortunately, MySQL's default MyISAM tables do not enforce referential integrity, so you cannot count on it.
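For the models in this question, a quick orphan check from the Django shell might look like this (a sketch, assuming the app label is gigs; Python 2 syntax to match the Django 1.2 era):
from gigs.models import Gig, Location, Venue

# Gig rows whose location_id no longer points at an existing Location
orphans = Gig.objects.exclude(
    location_id__in=Location.objects.values_list('id', flat=True))
print orphans.values_list('id', 'location_id')
# Repeat with Venue / venue_id to check the other foreign key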
You can --exclude the particular app that is causing the problem; its database tables will still be there. It worked for me:
python manage.py dumpdata --exclude app_name > backedup_data.json
This error shows up because there is a mismatch between your DB's schema and your models.
You can try to find it manually, or you can just go ahead and install django-extensions
pip install django-extensions
and use the sqldiff command, which will show you exactly where the problem is:
python manage.py sqldiff -a -t
First and foremost, make your models match what your DB has. Then create the migrations and fake-apply them:
python manage.py makemigrations && python manage.py migrate --fake
That alone should let you run a dump: as soon as Django is satisfied that the DB's schema matches your models, it will let you.
Moving forward, you can update your models and re-run the migrations as usual:
python manage.py makemigrations && python manage.py migrate