Alembic migrate as_uuid for UUIDField - sqlalchemy

I'm trying to migrate a SQLAlchemy model (using Flask-SQLAlchemy if that helps). I use a model that looks like this:
class Model(db.Model):
    uuid = db.Column(UUID(as_uuid=True))

# With an occasional foreign key
class Model1(db.Model):
    model_uuid = db.Column(UUID(as_uuid=True), db.ForeignKey("model.uuid"), nullable=False)
I found it easier to use as_uuid=False, but Alembic does not generate a migration for this change automatically. What Alembic operations should I write to get this working?
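For context (an illustration, not from the question): as_uuid only controls the Python-side type that SQLAlchemy hands back; with PostgreSQL's UUID type the column's DDL is UUID either way, which is why Alembic's autogenerate typically sees no schema change to emit. The Python-side difference, sketched with the stdlib:

```python
import uuid

# The same stored value, as each setting would return it:
s = "12345678-1234-5678-1234-567812345678"

as_uuid_true = uuid.UUID(s)   # as_uuid=True  -> uuid.UUID objects
as_uuid_false = s             # as_uuid=False -> plain strings

# The two representations round-trip into each other.
assert str(as_uuid_true) == as_uuid_false
assert uuid.UUID(as_uuid_false) == as_uuid_true
```

So if the goal is only to switch to as_uuid=False, often no migration is needed at all; only the code that consumes the values changes.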

Related

'Table doesn't exist' on django makemigrations

In a Django 1.11 application that uses MySQL, I have 3 apps, and in one of them I have a 'Country' model:
class Country(models.Model):
    countryId = models.AutoField(primary_key=True, db_column='country_id')
    name = models.CharField(max_length=100)
    code = models.CharField(max_length=3)

    class Meta:
        db_table = 'country'
When I try to makemigrations I get this error:
django.db.utils.ProgrammingError: (1146, "Table 'dbname.country' doesn't exist")
Even if I run makemigrations for another app that is unrelated to this model and its database table (./manage.py makemigrations another_app), I still get this error.
I've had this problem, and it was because I was initializing a default value somewhere in a model using... the database that I had just dropped. In a nutshell, I had something like forms.ChoiceField(choices=get_some_data(), ...), where get_some_data() used the database to retrieve some default values.
I wish you had posted the traceback, because in my case it was pretty obvious from the traceback that get_some_data() was using the ORM (something like somemodel.objects.filter(...)).
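A common fix for that pattern (a sketch; get_some_data is the hypothetical helper from above): Django's forms.ChoiceField also accepts a callable for choices, so passing get_some_data instead of get_some_data() defers the query until the form is actually used. The difference, in plain Python:

```python
calls = []

def get_some_data():
    calls.append(1)  # stands in for a database query
    return [("us", "United States")]

# choices=get_some_data() would run the query at import time;
# choices=get_some_data defers it:
choices = get_some_data
assert calls == []  # nothing has hit the "database" yet

# Only when the form actually needs the values does the query run:
assert choices() == [("us", "United States")]
assert calls == [1]
```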
Somehow, Django thinks you've already created this table and are now trying to modify it, while in fact you've dropped the table externally and started over. If that's the case, delete all of the migration files in your apps' migrations folders (keeping __init__.py) and start over with ./manage.py makemigrations.
Review whether you have any dependencies: it's possible that some model needs the Country model, in the same app or another app, like:
class OtherModel(models.Model):
    country = models.ForeignKey(Country)
1. If so, check that INSTALLED_APPS in settings.py lists the apps in the correct order: declare the app containing Country first, then the apps that depend on it.
2. If the dependent model is in the same app, it must be declared after the Country model in models.py.
3. Check whether the error traceback on the console mentions similar errors in models.py or forms.py.
4. Check that you run makemigrations and migrate for the apps in the correct order: python manage.py makemigrations app_of_country other_app_name

Django unique_together creates an index without UNIQUE constraint

I have a model:
class MyModel(models.Model):
    class Meta:
        unique_together = ["a", "b"]
        index_together = ["a", "b"]

    a = models.IntegerField(null=True, blank=True)
    b = models.ForeignKey("othermodel")
Migrations for this model:
class Migration(migrations.Migration):
    dependencies = [
        ('app', 'previous_migration'),
    ]

    operations = [
        migrations.AlterUniqueTogether(
            name='mymodel',
            unique_together=set([('a', 'b')]),
        ),
        migrations.AlterIndexTogether(
            name='mymodel',
            index_together=set([('a', 'b')]),
        ),
    ]
./manage.py sqlmigrate app mymigration
BEGIN;
CREATE INDEX `app_mymodel_id_asdfasfd_idx` ON `app_mymodel` (`a`, `b`);
COMMIT;
I am using a MySQL database.
Django (1.8.5) creates an index over both fields together, but of type INDEX rather than UNIQUE, which does not produce the expected IntegrityError when saving a duplicate. Manually changing the index to UNIQUE gives the correct behaviour.
With just the AlterUniqueTogether migration, I get empty output from ./manage.py sqlmigrate.
How do I tell Django to create a UNIQUE index? Or is there a good reason why the created index is not set up this way?
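The behavioural difference described here can be demonstrated with the stdlib sqlite3 module (the question's database is MySQL, but plain vs. UNIQUE indexes behave the same way):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A plain INDEX (what the migration generated) does not enforce uniqueness:
conn.execute("CREATE TABLE m1 (a INTEGER, b INTEGER)")
conn.execute("CREATE INDEX m1_ab_idx ON m1 (a, b)")
conn.execute("INSERT INTO m1 VALUES (1, 2)")
conn.execute("INSERT INTO m1 VALUES (1, 2)")  # duplicate accepted

# A UNIQUE index rejects the duplicate with an integrity error:
conn.execute("CREATE TABLE m2 (a INTEGER, b INTEGER)")
conn.execute("CREATE UNIQUE INDEX m2_ab_uniq ON m2 (a, b)")
conn.execute("INSERT INTO m2 VALUES (1, 2)")
try:
    conn.execute("INSERT INTO m2 VALUES (1, 2)")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True
assert duplicate_rejected
```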
With Django 1.8 and a fresh installation of the app via syncdb, all needed indexes are created. I currently cannot reproduce the issue, as the old installation has the manually created index, and fresh installations create it correctly both with syncdb and with migrate.
I have not tested more recent Django versions yet, but I assume they work as well.

Questions about using Flask+MySQL+Flask-SQLAlchemy

I am going to build a site using Flask+MySQL+Flask-SQLAlchemy, however, after reading some tutorials, I have some questions:
Flask-SQLAlchemy can be imported in at least two ways:
http://pythonhosted.org/Flask-SQLAlchemy/quickstart.html
from flask.ext.sqlalchemy import SQLAlchemy
OR http://flask.pocoo.org/docs/patterns/sqlalchemy/
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.ext.declarative import declarative_base
The first way seems much more convenient. Why did the Pocoo team choose the second way?
http://pythonhosted.org/Flask-SQLAlchemy/queries.html The insert examples here are too simple. How can I perform an INSERT IGNORE or an INNER JOIN? And if I want to write native SQL statements, how do I do that with SQLAlchemy?
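For the native-SQL part, plain SQLAlchemy exposes text() plus execute() on a connection or session; a minimal sketch, assuming SQLAlchemy is installed (shown against in-memory SQLite, but the same calls work with a MySQL URL):

```python
from sqlalchemy import create_engine, text

# In-memory SQLite for the demo; for MySQL just change the URL,
# e.g. "mysql://user:password@localhost/yourdatabase".
engine = create_engine("sqlite://")
with engine.connect() as conn:
    conn.execute(text("CREATE TABLE t (x INTEGER)"))
    conn.execute(text("INSERT INTO t (x) VALUES (:val)"), {"val": 1})
    total = conn.execute(text("SELECT SUM(x) FROM t")).scalar()

assert total == 1
```

Inside Flask-SQLAlchemy the equivalent call is db.session.execute(text(...)). MySQL-only prefixes such as INSERT IGNORE can also be attached to a Core insert via .prefix_with('IGNORE').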
I need some good examples of MySQL+Flask-SQLAlchemy, while most examples use SQLite+Flask-SQLAlchemy.
I have been coding with MySQL+Flask-SQLAlchemy, and I host my applications on pythonanywhere.com. As @Sean Vieira said, your code will run on any relational database; the only thing you need to change is the connection string for the database of your liking. For example, using MySQL, this is saved in a file called config.py (you can use any other name):
SQLALCHEMY_DATABASE_URI = 'mysql://username:password@localhost/yourdatabase'
SQLALCHEMY_POOL_RECYCLE = 280
SQLALCHEMY_POOL_SIZE = 20
SQLALCHEMY_TRACK_MODIFICATIONS = True
then in your main app you import it like this:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
# Create an Instance of Flask
application = Flask(__name__)
# Include config from config.py
application.config.from_object('config')
application.secret_key = 'some_secret_key'
# Create an instance of SQLAlchemy
db = SQLAlchemy(application)
All you need then is to make sure your models correspond to a database table, like the model below:
class Isps(db.Model):
    __tablename__ = "isps"

    isp_id = db.Column('isp_id', db.Integer, primary_key=True)
    isp_name = db.Column('isp_name', db.String(80), unique=True)
    isp_description = db.Column('isp_description', db.String(180))
    # services = db.relationship('Services', backref="isps", cascade="all, delete-orphan", lazy='dynamic')

    def __init__(self, isp_name, isp_description):
        self.isp_name = isp_name
        self.isp_description = isp_description

    def __repr__(self):
        return '<Isps %r>' % self.isp_name
You can then learn the power of SQLAlchemy to write optimised queries.
Flask-SQLAlchemy was written by Armin (the creator of Flask) to make it simple to work with SQLAlchemy. The pattern described in Flask's documentation is what you would use if you did not choose to use the Flask-SQLAlchemy extension.
As for MySQL vs. SQLite, the whole point of the SQLAlchemy ORM is to make it possible (for the most part) to ignore which database you are running against. SomeModel.query.filter(SomeModel.column == 'value') will work the same regardless of which database you are connecting to.

Accessing models in alembic migrations

I'm using Alembic migrations for a Flask+SQLAlchemy project, and things work as expected until I try to query the models inside an Alembic migration.
from models import StoredFile

def upgrade():
    ### commands auto generated by Alembic - please adjust! ###
    op.add_column('stored_file', sa.Column('mimetype', sa.Unicode(length=32)))
    for sf in StoredFile.query.all():
        sf.mimetype = guess_type(sf.title)
The above code gets stuck after adding the column and never finishes. I guess StoredFile.query is trying to use a different database connection than the one Alembic is using. (But why? Am I missing something in env.py?)
I could work around it by using op.get_bind().execute(...), but the question is: how can I use the models directly in Alembic?
You should not use classes from models in your alembic migrations. If you need to use model classes, you should redefine them in each migration file to make the migration self-contained. The reason is that multiple migrations can be deployed in one command, and it's possible that between the time the migration was written and until it is actually performed in production, the model classes have been changed in accordance with a "later" migration.
For example, see this example from the documentation for Operations.execute:
from sqlalchemy.sql import table, column
from sqlalchemy import String
from alembic import op

account = table('account',
    column('name', String)
)

op.execute(
    account.update(). \
        where(account.c.name == op.inline_literal('account 1')). \
        values({'name': op.inline_literal('account 2')})
)
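The documentation snippet above can be exercised outside a migration with plain SQLAlchemy Core, which shows what op.execute() ultimately runs (a sketch assuming SQLAlchemy is installed; in-memory SQLite stands in for the real database, and the op.inline_literal calls are replaced with plain Python literals, since that helper belongs to Alembic's migration context):

```python
from sqlalchemy import String, create_engine, text
from sqlalchemy.sql import table, column

# The same lightweight table stub the migration uses: no metadata,
# just enough structure to build UPDATE/INSERT statements.
account = table('account', column('name', String))

engine = create_engine("sqlite://")
with engine.connect() as conn:
    conn.execute(text("CREATE TABLE account (name VARCHAR)"))
    conn.execute(text("INSERT INTO account (name) VALUES ('account 1')"))
    # The statement a migration would hand to op.execute():
    conn.execute(
        account.update()
        .where(account.c.name == 'account 1')
        .values({'name': 'account 2'})
    )
    name = conn.execute(text("SELECT name FROM account")).scalar()

assert name == 'account 2'
```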
Tip: You don't need to include the full model class, only the parts that are necessary for the migration.
I had the same problem. When you use StoredFile.query you are using a different session than the one Alembic is using. It tries to query the database, but the table is locked because you're altering it, so the upgrade just sits there and waits forever: the two sessions are waiting for each other. Based on @SowingSadness's response, this worked for me:
from sqlalchemy.orm import sessionmaker
from models import StoredFile

def upgrade():
    ### commands auto generated by Alembic - please adjust! ###
    op.add_column('stored_file', sa.Column('mimetype', sa.Unicode(length=32)))

    connection = op.get_bind()
    SessionMaker = sessionmaker(bind=connection.engine)
    session = SessionMaker(bind=connection)

    for sf in session.query(StoredFile):
        sf.mimetype = guess_type(sf.title)
    session.flush()

    op.other_operations()

Changing a defined column using sqlalchemy into a longer-length unicode

I have a table defined in my app like this:
users = Table('users', metadata,
    Column('user_id', Integer, autoincrement=True, primary_key=True),
    Column('user_name', Unicode(16), unique=True, nullable=False),
    Column('email_address', Unicode(255), unique=True, nullable=False),
    Column('display_name', Unicode(255)),
    Column('password', Unicode(80)),
    Column('created', DateTime, default=datetime.now),
    mysql_engine='InnoDB',
    mysql_charset='utf8',
)
However, after developing for a while, I want to change user_name to a longer length, such as Unicode(255). As I understand it, this definition runs at first start-up, so just changing this line won't update existing databases; they need to be migrated to the new definition. How do I go about converting already-created databases to the new, desired definition?
You are correct that updating your code will not update an existing database schema. You can use Alembic to auto-generate and run schema migrations. Alembic can auto-generate change scripts for you by comparing the schema of your newly edited metadata with the schema from your database. Start here: http://alembic.readthedocs.org/en/latest/
Edit alembic/env.py and modify your configuration to include compare_type=True:
def run_migrations_online():
    ...
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            compare_type=True,  # <------------- THIS LINE HERE
        )
    ...
Disclosure: I got this solution from this helpful guide.
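With compare_type enabled, running alembic revision --autogenerate should pick up the length change; the resulting migration would look roughly like this (a sketch, not verbatim autogenerate output; adjust table/column names and the existing_* flags to your schema):

```python
"""Widen users.user_name from Unicode(16) to Unicode(255)."""
from alembic import op
import sqlalchemy as sa


def upgrade():
    op.alter_column('users', 'user_name',
                    existing_type=sa.Unicode(length=16),
                    type_=sa.Unicode(length=255),
                    existing_nullable=False)


def downgrade():
    op.alter_column('users', 'user_name',
                    existing_type=sa.Unicode(length=255),
                    type_=sa.Unicode(length=16),
                    existing_nullable=False)
```

Then alembic upgrade head applies the ALTER TABLE against the existing database.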