Changing a defined column using SQLAlchemy into a longer-length Unicode - MySQL

I have a table defined in my app like this:
users = Table('users', metadata,
    Column('user_id', Integer, autoincrement=True, primary_key=True),
    Column('user_name', Unicode(16), unique=True, nullable=False),
    Column('email_address', Unicode(255), unique=True, nullable=False),
    Column('display_name', Unicode(255)),
    Column('password', Unicode(80)),
    Column('created', DateTime, default=datetime.now),
    mysql_engine='InnoDB',
    mysql_charset='utf8',
)
However, after developing for a while, I want to change user_name to a longer length, such as Unicode(255). As I understand it, this definition is only applied at first start-up, so just changing this line wouldn't work for existing databases; they need to be migrated to the new definition. How do I go about converting already-created databases to the new, desired definition?

You are correct that updating your code will not update an existing database schema. You can use Alembic to generate and run schema migrations: it can auto-generate change scripts for you by comparing your newly edited metadata with the schema of your database. Start here: http://alembic.readthedocs.org/en/latest/

Edit alembic/env.py and modify your configuration to include compare_type=True, so that autogenerate detects column type changes such as this length change:
def run_migrations_online():
    ...
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            compare_type=True,  # <------------- THIS LINE HERE
        )
        ...
Disclosure: I got this solution from this helpful guide.
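For this particular change, the upgrade() of the auto-generated migration would contain something along these lines (a sketch; the exact output depends on your Alembic version and database dialect):

import sqlalchemy as sa
from alembic import op

def upgrade():
    # Widen user_name from 16 to 255 characters.
    op.alter_column('users', 'user_name',
                    existing_type=sa.Unicode(length=16),
                    type_=sa.Unicode(length=255),
                    existing_nullable=False)

def downgrade():
    # Shrink it back (note: this may truncate data on MySQL).
    op.alter_column('users', 'user_name',
                    existing_type=sa.Unicode(length=255),
                    type_=sa.Unicode(length=16),
                    existing_nullable=False)

After reviewing the script, apply it with alembic upgrade head.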

Related

Alembic migrate as_uuid for UUIDField

I'm trying to migrate a SQLAlchemy model (using Flask-SQLAlchemy if that helps). I use a model that looks like this:
class Model:
    uuid = db.Column(UUID(as_uuid=True))

# With an occasional foreign key
class Model1:
    db.Column(UUID(as_uuid=True), db.ForeignKey("model.uuid"), nullable=False)
I found it easier to use as_uuid=False, but alembic does not automatically migrate this for us. What are the alembic commands I should write to get this working?

'Relation does not exist' error after transferring to PostgreSQL

I have transferred my project from MySQL to PostgreSQL and tried to drop the column as a result of a previous issue (Integer error transferring from MySQL to PostgreSQL), because even after I removed the problematic column from models.py and saved, the error didn't disappear.
I tried both with and without quotes.
ALTER TABLE "UserProfile" DROP COLUMN how_many_new_notifications;
Or:
ALTER TABLE UserProfile DROP COLUMN how_many_new_notifications;
Getting the following:
ERROR: relation "UserProfile" does not exist
Here's the model, if it helps:
class UserProfile(models.Model):
    user = models.OneToOneField(User)
    how_many_new_notifications = models.IntegerField(null=True, default=0)

User.profile = property(lambda u: UserProfile.objects.get_or_create(user=u)[0])
I suppose it might have something to do with the mixed-case name, but I have found no solution in any of the similar questions.
Yes, PostgreSQL is a case-aware database, but Django is smart enough to know that. It converts all field names, and generally the model name, to lower-case identifiers when creating the table. However, the real problem here is that your table name will be prefixed by the app name. Generally, Django table names look like:
<appname>_<modelname>
You can find out exactly what it is by running:
from myapp.models import UserProfile
print(UserProfile._meta.db_table)
Obviously this needs to be typed into the Django shell, which is invoked by ./manage.py shell. The result of this print statement is what you should use in your query.
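For illustration, assuming the app is called myapp (so the default table name would be myapp_userprofile), the column could then be dropped from the Django shell like this; a minimal sketch, where the table name is whatever the print above actually returned:

from django.db import connection

# "myapp_userprofile" is an assumed name -- substitute the value
# printed by UserProfile._meta.db_table.
with connection.cursor() as cursor:
    cursor.execute(
        "ALTER TABLE myapp_userprofile DROP COLUMN how_many_new_notifications"
    )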
Client: DataGrip
Database engine: PostgreSQL
For me, opening a new console worked, because apparently the IDE's cache was not recognizing the table I had created.
Steps to operate with the tables of a database:
Database (Left side panel of the IDE) >
Double Click on PostgreSQL - #localhost >
Double Click on the name of the database >
Right click on public schema >
New > Console
GL

EF and Code First Migrations with MySQL - dbo.tablename does not exist

I have set up Entity Framework to use MySQL and created a migration.
When I run update-database I get the error "table 'dbname.dbo.tablename' does not exist". Running in -Verbose mode I see the statement that causes the error:
alter table `dbo.Comments` drop foreign key `FK_dbo.Comments_dbo.Comments_Comment_ID`
When I run the query directly in MySQL Workbench it throws the same error.
The problem seems to be the dbo. prefix in the migration set. Anything in the form dbo.tablename won't run, saying that the table does not exist. E.g. select * from dbo.tablename fails but select * from tablename works. The database was generated by Entity Framework and the code-first migrations were generated by EF too.
However the migrations generate everything with the dbo. prefix, which does not work.
Does anyone have a solution to this?
I was having this problem just today as well; found my answer here:
MySqlMigrationCodeGenerator
You have to set:
CodeGenerator = new MySqlMigrationCodeGenerator();
In your context's configuration class. This will get rid of the schema gibberish for MySQL. My Configuration class looks like this:
internal sealed class Configuration : DbMigrationsConfiguration<YourContext>
{
    public Configuration()
    {
        AutomaticMigrationsEnabled = false;
        SetSqlGenerator("MySql.Data.MySqlClient", new MySqlMigrationSqlGenerator());
        SetHistoryContextFactory("MySql.Data.MySqlClient", (conn, schema) => new MySqlHistoryContext(conn, schema));
        CodeGenerator = new MySqlMigrationCodeGenerator();
    }
}
I had this same issue and applied the solution posted by Sorio, and it generated a lot of problems.
CodeGenerator = new MySqlMigrationCodeGenerator();
This line can change all the SQL that you generate; for example, in my case all the foreign key constraints disappeared from the SQL applied to the database.
I am still without a solution, because for me this is not acceptable, and I recommend that you check the SQL generated with the command below before using this solution:
update-database -script

Accessing models in alembic migrations

I'm using Alembic migrations for a Flask + SQLAlchemy project and things work as expected until I try to query the models in Alembic.
from models import StoredFile

def upgrade():
    ### commands auto generated by Alembic - please adjust! ###
    op.add_column('stored_file', sa.Column('mimetype', sa.Unicode(length=32)))
    for sf in StoredFile.query.all():
        sf.mimetype = guess_type(sf.title)
The above code gets stuck after adding the column and never comes out. I guess StoredFile.query is trying to use a different database connection than the one being used by Alembic. (But why? Am I missing something in env.py?)
I could solve it by using op.get_bind().execute(...), but the question is: how can I use the models directly in Alembic?
You should not use classes from models in your Alembic migrations. If you need to use model classes, redefine them in each migration file to make the migration self-contained. The reason is that multiple migrations can be deployed in one command, and it's possible that between the time a migration was written and the time it is actually run in production, the model classes have been changed to match a "later" migration.
For example, see this example from the documentation for Operations.execute:
from sqlalchemy.sql import table, column
from sqlalchemy import String
from alembic import op

account = table('account',
    column('name', String)
)

op.execute(
    account.update().
        where(account.c.name == op.inline_literal('account 1')).
        values({'name': op.inline_literal('account 2')})
)
Tip: You don't need to include the full model class, only the parts that are necessary for the migration.
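Applied to the stored_file table from the question, a self-contained data migration might look roughly like this (a sketch assuming an id primary key column, SQLAlchemy 1.4-style select(), and guess_type from the standard mimetypes module):

import sqlalchemy as sa
from alembic import op
from sqlalchemy.sql import table, column
from mimetypes import guess_type

def upgrade():
    op.add_column('stored_file', sa.Column('mimetype', sa.Unicode(length=32)))

    # Lightweight table definition -- only the columns this migration touches.
    stored_file = table('stored_file',
        column('id', sa.Integer),
        column('title', sa.Unicode),
        column('mimetype', sa.Unicode),
    )

    connection = op.get_bind()
    for row in connection.execute(sa.select(stored_file.c.id, stored_file.c.title)):
        connection.execute(
            stored_file.update()
            .where(stored_file.c.id == row.id)
            .values(mimetype=guess_type(row.title)[0])
        )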
I had the same problem. When you use StoredFile.query you are using a different session than Alembic is using. It tries to query the database, but the table is locked because you're altering it. So the upgrade just sits there and waits forever, because you have two sessions waiting for each other. Based on @SowingSadness's response, this worked for me:
import sqlalchemy as sa
from alembic import op
from sqlalchemy.orm import sessionmaker

from models import StoredFile

def upgrade():
    ### commands auto generated by Alembic - please adjust! ###
    op.add_column('stored_file', sa.Column('mimetype', sa.Unicode(length=32)))

    connection = op.get_bind()
    SessionMaker = sessionmaker(bind=connection.engine)
    session = SessionMaker(bind=connection)

    for sf in session.query(StoredFile):
        sf.mimetype = guess_type(sf.title)

    session.flush()
    op.other_operations()

Alembic: How to add unique constraint to existing column

I have a table 'test' having a column 'Name' with no constraints. I need to ALTER this column by giving it a UNIQUE constraint. How should I do it?
Should I use op.alter_column('???') or create_unique_constraint('???')?
Isn't create_unique_constraint for a new column rather than an existing one?
To add, you'd need:
https://alembic.sqlalchemy.org/en/latest/ops.html#alembic.operations.Operations.create_unique_constraint
from alembic import op
op.create_unique_constraint('uq_user_name', 'user', ['name'], schema='my_schema')
To drop, you'd need:
https://alembic.sqlalchemy.org/en/latest/ops.html#alembic.operations.Operations.drop_constraint
op.drop_constraint('uq_user_name', 'user', schema='my_schema')
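Put together in a migration file, the pair might look roughly like this (a sketch; drop the schema argument if your database doesn't use one):

from alembic import op

def upgrade():
    # Add a UNIQUE constraint to the existing "name" column.
    op.create_unique_constraint('uq_user_name', 'user', ['name'], schema='my_schema')

def downgrade():
    # Remove the constraint again.
    op.drop_constraint('uq_user_name', 'user', schema='my_schema')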
Note: this answer uses SQLAlchemy Migrate (sqlalchemy-migrate), version 0.7.3, rather than Alembic.
To add unique constraints, use create() on UniqueConstraint.
To remove unique constraints, use drop() on UniqueConstraint.
Create a migration script. The script can be created in two ways.
# create manage.py
migrate manage manage.py --repository=migrations --url=postgresql://<user>:<password>@localhost:5432/<db_name>
# create the script file
python manage.py script "Add Unique Constraints"
Or, if you don't want to create manage.py, use the command below:
migrate script --repository=migrations --url=postgresql://<user>:<password>@localhost:5432/<db_name> "Add Unique Constraints"
It will create 00x_Add_Unique_Constraints.py.
File: 00x_Add_Unique_Constraints.py
from migrate import UniqueConstraint
from sqlalchemy import MetaData, Table

def upgrade(migrate_engine):
    # Upgrade operations go here. Don't create your own engine; bind
    # migrate_engine to your metadata.
    # Table Name: user_table
    # Column Name: first_name
    metadata = MetaData(bind=migrate_engine)
    user_table = Table('user_table', metadata, autoload=True)
    UniqueConstraint(user_table.c.first_name, table=user_table).create()

def downgrade(migrate_engine):
    # Operations to reverse the above upgrade go here.
    # Table Name: user_table
    # Column Name: first_name
    metadata = MetaData(bind=migrate_engine)
    user_table = Table('user_table', metadata, autoload=True)
    UniqueConstraint(user_table.c.first_name, table=user_table).drop()
Following Mario Ruggier's answer, I tried his example code on my MySQL database. I didn't use the schema arguments because my database didn't have a schema.
To create the unique constraint, I used:
from alembic import op
op.create_unique_constraint('uq_user_name', 'user', ['name'])
and, to drop the unique constraint:
op.drop_constraint(constraint_name='uq_user_name', table_name='user', type_='unique')
Notice the difference: I passed an additional argument, type_='unique', because without it MySQL returns an error message that states something like:
No generic 'DROP CONSTRAINT' in MySQL - please specify constraint type ...