I'm using Alembic migrations for a Flask + SQLAlchemy project, and things work as expected until I try to query the models inside an Alembic migration:
from mimetypes import guess_type
from alembic import op
import sqlalchemy as sa
from models import StoredFile

def upgrade():
    ### commands auto generated by Alembic - please adjust! ###
    op.add_column('stored_file', sa.Column('mimetype', sa.Unicode(length=32)))
    for sf in StoredFile.query.all():
        sf.mimetype = guess_type(sf.title)
The above code gets stuck after adding the column and never finishes. I guess StoredFile.query is trying to use a different database connection than the one being used by Alembic. (But why? Am I missing something in env.py?)
I could solve it by using op.get_bind().execute(...), but the question is: how can I use the models directly in Alembic?
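For reference, a rough sketch of that op.get_bind().execute(...) workaround (the id primary key and title column are assumptions about the model; guess_type is mimetypes.guess_type):
from mimetypes import guess_type
from alembic import op
import sqlalchemy as sa

def upgrade():
    op.add_column('stored_file', sa.Column('mimetype', sa.Unicode(length=32)))
    connection = op.get_bind()
    # Read the existing rows over the same connection Alembic is using,
    # then update each one with a plain SQL statement instead of the ORM.
    for sf_id, title in connection.execute(sa.text("SELECT id, title FROM stored_file")):
        connection.execute(
            sa.text("UPDATE stored_file SET mimetype = :m WHERE id = :i"),
            {"m": guess_type(title)[0], "i": sf_id},
        )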
You should not use classes from models in your Alembic migrations. If you need model classes, redefine them in each migration file so that the migration is self-contained. The reason is that multiple migrations can be deployed in one command, and between the time a migration was written and the time it actually runs in production, the model classes may have changed to match a later migration.
For example, see this snippet from the documentation for Operations.execute:
from sqlalchemy.sql import table, column
from sqlalchemy import String
from alembic import op

account = table('account',
    column('name', String)
)
op.execute(
    account.update(). \
        where(account.c.name == op.inline_literal('account 1')). \
        values({'name': op.inline_literal('account 2')})
)
Tip: You don't need to include the full model class, only the parts that are necessary for the migration.
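Applied to the stored_file table from the question, a self-contained migration might look roughly like this (a sketch, assuming SQLAlchemy 1.4+ and that the table has id and title columns):
from mimetypes import guess_type
from alembic import op
import sqlalchemy as sa
from sqlalchemy.sql import column, table

# Migration-local description of the table: only the columns this
# migration touches, not the full application model.
stored_file = table(
    'stored_file',
    column('id', sa.Integer),
    column('title', sa.Unicode),
    column('mimetype', sa.Unicode),
)

def upgrade():
    op.add_column('stored_file', sa.Column('mimetype', sa.Unicode(length=32)))
    connection = op.get_bind()
    for row in connection.execute(sa.select(stored_file.c.id, stored_file.c.title)):
        connection.execute(
            stored_file.update()
            .where(stored_file.c.id == row.id)
            .values(mimetype=guess_type(row.title)[0])
        )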
I had the same problem. When you use StoredFile.query you are using a different session than the one Alembic is using. It tries to query the database, but the table is locked because you're altering it, so the upgrade just sits there and waits forever: the two sessions are waiting on each other. Based on @SowingSadness's response, this worked for me:
from mimetypes import guess_type
from alembic import op
from sqlalchemy.orm import sessionmaker
import sqlalchemy as sa
from models import StoredFile

def upgrade():
    ### commands auto generated by Alembic - please adjust! ###
    op.add_column('stored_file', sa.Column('mimetype', sa.Unicode(length=32)))

    # Bind an ORM session to the connection Alembic is already using.
    connection = op.get_bind()
    SessionMaker = sessionmaker(bind=connection.engine)
    session = SessionMaker(bind=connection)

    for sf in session.query(StoredFile):
        sf.mimetype = guess_type(sf.title)
    session.flush()

    op.other_operations()
Related
I'm trying to migrate a SQLAlchemy model (using Flask-SQLAlchemy if that helps). I use a model that looks like this:
class Model(db.Model):
    uuid = db.Column(UUID(as_uuid=True), primary_key=True)

# With an occasional foreign key
class Model1(db.Model):
    model_uuid = db.Column(UUID(as_uuid=True), db.ForeignKey("model.uuid"), nullable=False)
I found it easier to use as_uuid=False, but Alembic does not generate this migration for us automatically. What are the Alembic commands I should write to get this working?
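A hand-written sketch of such a type change might look like the following, assuming PostgreSQL and the table/column names implied by the models above. Note that as_uuid by itself only affects how values are returned to Python; the database column type stays UUID, so autogenerate will typically detect no change unless you also change the column type itself:
from alembic import op
import sqlalchemy as sa

def upgrade():
    # Store the value as text instead of a native UUID; postgresql_using
    # tells PostgreSQL how to cast the existing rows. The foreign-key column
    # on model1 (and its constraint) would need a matching change.
    op.alter_column(
        'model', 'uuid',
        type_=sa.String(36),
        postgresql_using='uuid::text',
    )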
In PyCharm I created a MySQL schema using pymysql on my computer. Now I want to use Peewee to create tables and write the SQL queries. However, I always receive an error message (see below) when trying to connect to the DB.
The user has sufficient rights to create tables in the DB schema as it works flawlessly with pymysql (creating tables as well as the schema works fine).
I looked at similar questions on Stack Overflow but couldn't find this problem. Moreover, it didn't come up in any of the tutorials I looked at, so I'm not entirely sure what could be causing the error. Below is a minimal working example.
from peewee import *
import peewee

user = 'root'
password = 'root'
db_name = 'peewee_demo'

# The schema with the name 'peewee_demo' exists
db = MySQLDatabase(db_name, user=user, passwd=password)

class Book(peewee.Model):
    author = peewee.CharField()
    title = peewee.TextField()

    class Meta:
        database = db

db.connect()  # Code fails here
Book.create_table()

book = Book(author="me", title='Peewee is cool')
book.save()

for book in Book.filter(author="me"):
    print(book.title)
I would expect the above code to connect to MySQL and then create a new table in the schema "peewee_demo". But instead, the code throws an error message when trying to connect to the DB:
/usr/bin/python3.6: Relink `/lib/x86_64-linux-gnu/libsystemd.so.0' with `/lib/x86_64-linux-gnu/librt.so.1' for IFUNC symbol `clock_gettime'
/usr/bin/python3.6: Relink `/lib/x86_64-linux-gnu/libudev.so.1' with `/lib/x86_64-linux-gnu/librt.so.1' for IFUNC symbol `clock_gettime'
Do you have any ideas on how to fix this issue?
Thanks in advance
As #coleifer pointed out in his comment, the error was probably related to a shared library issue in Python. After setting up a new virtual environment and installing all required packages, everything runs perfectly fine.
I just added the answer to be able to close the question. If #coleifer converts his comment into an answer, I'll delete mine and accept his.
I am going to build a site using Flask + MySQL + Flask-SQLAlchemy. However, after reading some tutorials, I have some questions:
Flask-SQLAlchemy can be imported in at least two ways:
http://pythonhosted.org/Flask-SQLAlchemy/quickstart.html
from flask.ext.sqlalchemy import SQLAlchemy
OR http://flask.pocoo.org/docs/patterns/sqlalchemy/
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.ext.declarative import declarative_base
The first way seems much more convenient. Why did the Pocoo team choose the second way?
http://pythonhosted.org/Flask-SQLAlchemy/queries.html The insert examples here are too simple. How can I perform an INSERT IGNORE or an INNER JOIN? And if I want to write raw SQL statements, how do I do that with SQLAlchemy?
I need some good examples of MySQL + Flask-SQLAlchemy, while most examples use SQLite + Flask-SQLAlchemy.
I have been coding with MySQL + Flask-SQLAlchemy and I host my applications on pythonanywhere.com. As @Sean Vieira said, your code will run on any relational database; the only thing you need to change is the connection string for the database of your liking. Using MySQL, for example, this is saved in a file called config.py (you can use any other name):
SQLALCHEMY_DATABASE_URI = 'mysql://username:password@localhost/yourdatabase'
SQLALCHEMY_POOL_RECYCLE = 280
SQLALCHEMY_POOL_SIZE = 20
SQLALCHEMY_TRACK_MODIFICATIONS = True
Then, in your main app, you import it like this:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
# Create an Instance of Flask
application = Flask(__name__)
# Include config from config.py
application.config.from_object('config')
application.secret_key = 'some_secret_key'
# Create an instance of SQLAlchemy
db = SQLAlchemy(application)
All you need then is to make sure your models correspond to a database table, like the model below:
class Isps(db.Model):
    __tablename__ = "isps"

    isp_id = db.Column('isp_id', db.Integer, primary_key=True)
    isp_name = db.Column('isp_name', db.String(80), unique=True)
    isp_description = db.Column('isp_description', db.String(180))
    # services = db.relationship('Services', backref="isps", cascade="all, delete-orphan", lazy='dynamic')

    def __init__(self, isp_name, isp_description):
        self.isp_name = isp_name
        self.isp_description = isp_description

    def __repr__(self):
        return '<Isps %r>' % self.isp_name
You can then use the power of SQLAlchemy to write optimised queries.
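For instance, a few queries against the Isps model above might look like this (a sketch, assuming the db instance from the config shown earlier; the values are placeholders, and the raw SQL at the end addresses the "native SQL" part of the question):
from sqlalchemy import text

# Plain ORM queries through the Flask-SQLAlchemy session.
all_isps = Isps.query.all()
one_isp = Isps.query.filter_by(isp_name='Some ISP').first()

# Dropping down to raw SQL when the ORM gets in the way,
# e.g. for MySQL-specific statements such as INSERT IGNORE.
db.session.execute(
    text("INSERT IGNORE INTO isps (isp_name, isp_description) "
         "VALUES (:name, :description)"),
    {"name": "Another ISP", "description": "added via raw SQL"},
)
db.session.commit()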
Flask-SQLAlchemy was written by Armin (the creator of Flask) to make it simple to work with SQLAlchemy. The pattern described in Flask's documentation is what you would use if you did not choose to use the Flask-SQLAlchemy extension.
As for MySQL vs. SQLite, the whole point of the SQLAlchemy ORM is to make it possible (for the most part) to ignore what database you are running against. SomeModel.query.filter(SomeModel.column == 'value') will work the same, regardless of what database you are connecting to.
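To illustrate the point, a minimal sketch (the URIs and the model are placeholders): only the connection string changes between SQLite and MySQL, the model and query code stay identical.
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
# Swap this single line between SQLite and MySQL; everything below is unchanged.
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///dev.db'
# app.config['SQLALCHEMY_DATABASE_URI'] = 'mysql://user:password@localhost/mydb'
db = SQLAlchemy(app)

class SomeModel(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    column = db.Column(db.String(80))

with app.app_context():
    db.create_all()
    rows = SomeModel.query.filter(SomeModel.column == 'value').all()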
I have a table defined in my app like this:
users = Table('users', metadata,
    Column('user_id', Integer, autoincrement=True, primary_key=True),
    Column('user_name', Unicode(16), unique=True, nullable=False),
    Column('email_address', Unicode(255), unique=True, nullable=False),
    Column('display_name', Unicode(255)),
    Column('password', Unicode(80)),
    Column('created', DateTime, default=datetime.now),
    mysql_engine='InnoDB',
    mysql_charset='utf8',
)
However, after developing for a while, I want to change user_name to a longer length, such as Unicode(255). As I understand it, this definition is only run at first start-up, so just changing the line won't affect existing databases; they need to be migrated to the new definition. How do I go about converting already-created databases to the new, desired definition?
You are correct that updating your code will not update an existing database schema. You can use Alembic to auto-generate and run schema migrations. Alembic can auto-generate change scripts for you by comparing the schema of your newly edited metadata with the schema from your database. Start here: http://alembic.readthedocs.org/en/latest/
Edit alembic/env.py and modify your configuration to include compare_type=True:
def run_migrations_online():
    ...
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            compare_type=True,  # <------------- THIS LINE HERE
        )
        ...
Disclosure: I got this solution from this helpful guide.
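With compare_type enabled, alembic revision --autogenerate should pick up the Unicode(16) to Unicode(255) change and emit something along these lines (a sketch; the exact output varies by Alembic version and backend):
from alembic import op
import sqlalchemy as sa

def upgrade():
    # Widen user_name from 16 to 255 characters.
    op.alter_column(
        'users', 'user_name',
        existing_type=sa.Unicode(length=16),
        type_=sa.Unicode(length=255),
        existing_nullable=False,
    )

def downgrade():
    op.alter_column(
        'users', 'user_name',
        existing_type=sa.Unicode(length=255),
        type_=sa.Unicode(length=16),
        existing_nullable=False,
    )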
I am trying to switch between 2 mysql servers at runtime. I do not need to maintain both connections alive all the time.
This is what I am doing
from django.conf import settings
from django.db import connection
from django.contrib.auth.models import User
connection.close()
setattr(settings, 'DATABASE_HOST', 'mysql1.com')
list1 = User.objects.all()
connection.close()
setattr(settings, 'DATABASE_HOST', 'mysql2.com')
list2 = User.objects.all()
I have the following settings.py:
DATABASE_HOST = '' # localhost
DATABASE_NAME = 'test'
...
The database name is the same on all servers, and only the contents of the tables differ.
I should get list1 != list2 as the users are different on both servers.
The issue is that I always get the list of users from the default database defined in settings.py (which is running on localhost) instead of the lists from the mysql1 and mysql2 servers.
Any idea what I am doing wrong here?
Laurent
My guess, from the information given, would be an error in the lines where you set DATABASE_HOST in your pseudo-code above (the setattr(settings, ...) calls).
Other than that, I'm not sure how you've configured your database to switch based on your criteria, as you haven't explained this. If you are doing it per model, it may be worth considering how Django would know that, or even using external connections (manually loading the database driver and running commands by hand prior to the render stage) while keeping the main connection for the default database.
I'd question the whole approach, mostly because I'm not sure how you're actually differentiating the two databases, or why. Could you provide a bit more information on how you're doing this? I assume the host values you're setting in the two setattr calls above are different. I don't need the values; I'm just making sure you haven't copied the block and forgotten to edit it (we've all been there).
Note: I'd post this as a comment if I could, but I think the solution may be in how you're pulling the variables. Finally, you could try adding the database name (just the server IP or whatever) to the output, if you're in 'dev'/debug (offline/non-production) mode, to check if it's actually making it to the second server.
For reference, the Django documentation explicitly states you shouldn't do this -- Altering settings at runtime.
There is a lot of talk within the Django community about the ORM supporting multiple connections/databases at once, and there's good reference info out there on it. Check out this blog post: Easy Multi-Database Support for Django, and this Django wiki page: Multiple Database Support.
In the blog post, Eric Florenzano does something like this in his settings.py file:
DATABASES = dict(
    primary = dict(
        DATABASE_NAME=DATABASE_NAME,
        # ...
    ),
    secondary = dict(
        DATABASE_NAME='secondary.db',
        # ...
    ),
)
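With Django's built-in multiple-database support (the DATABASES setting plus QuerySet.using()), the runtime switching from the question becomes something like this sketch, assuming aliases named 'mysql1' and 'mysql2' are configured (engine/name/host values here are placeholders; USER and PASSWORD are omitted):
# settings.py: one alias per MySQL server
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'test',
        'HOST': 'localhost',
    },
    'mysql1': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'test',
        'HOST': 'mysql1.com',
    },
    'mysql2': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'test',
        'HOST': 'mysql2.com',
    },
}

# elsewhere: pick the server per query instead of mutating settings
from django.contrib.auth.models import User

list1 = list(User.objects.using('mysql1').all())
list2 = list(User.objects.using('mysql2').all())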