How to use flask-migrate with other declarative_bases
I'm trying to implement python-social-auth in Flask. I've ironed out tons of kinks whilst trying to interpret about 4 tutorials and a full Flask-book at the same time, and feel I've reached sort of an impasse with Flask-migrate.

I'm currently using the following code to create the tables necessary for python-social-auth to function in a flask-sqlalchemy environment.

from social.apps.flask_app.default import models
models.PSABase.metadata.create_all(db.engine)

Now, they're obviously using some form of their own Base, not related to my actual db object. This in turn causes Flask-Migrate to completely miss these tables and drop them in migrations. Obviously I can remove these table drops from every migration, but I can imagine it being one of those things that at some point gets forgotten, and all of a sudden I have no OAuth ties anymore.

I've gotten this solution to work by using (and modifying) the manage.py command syncdb, as suggested by the python-social-auth Flask example.

Miguel Grinberg, the author of Flask-Migrate replies here to an issue that seems to very closely resemble mine.

The closest I could find on Stack Overflow was this, but it doesn't shed much light on the matter for me, and the answer was never accepted (and I couldn't get it to work despite several attempts).

For reference, here is my manage.py:

#!/usr/bin/env python

from flask_script import Server, Manager, Shell
from flask_migrate import Migrate, MigrateCommand


from app import app, db

manager = Manager(app)
manager.add_command('runserver', Server())
manager.add_command('shell', Shell(make_context=lambda: {
    'app': app,
    'db_session': db.session
}))

migrate = Migrate(app, db)
manager.add_command('db', MigrateCommand)

@manager.command
def syncdb():
    from social.apps.flask_app.default import models
    models.PSABase.metadata.create_all(db.engine)
    db.create_all()

if __name__ == '__main__':
    manager.run()

And to clarify, the db init / migrate / upgrade commands only create my user table (and the migration one obviously), but not the social auth ones, while the syncdb command works for the python-social-auth tables.

I understand from the GitHub response that this isn't supported by Flask-Migrate, but I'm wondering if there's a way to fold the PSABase tables into the db object passed to Migrate so they get picked up.

Any suggestions welcome.

(Also, first-time poster. I feel I've done a lot of research and tried quite a few solutions before I finally came here to post. If I've missed something obvious in the guidelines of SO, don't hesitate to point that out to me in a private message and I'll happily oblige)

Particularize answered 3/2, 2016 at 23:39 Comment(0)
The problem is that you have two sets of models, each with a different SQLAlchemy metadata object. The models from PSA were generated directly from SQLAlchemy, while your own models were generated through Flask-SQLAlchemy.

Flask-Migrate only sees the models that are defined via Flask-SQLAlchemy, because the db object that you give it only knows about the metadata for those models; it knows nothing about the other PSA models that bypassed Flask-SQLAlchemy.

So yeah, the end result is that each time you generate a migration, Flask-Migrate/Alembic finds these PSA tables in the db and decides to delete them, because it does not see any models for them.

I think the best solution for your problem is to configure Alembic to ignore certain tables. For this you can use the include_object configuration in the env.py module stored in the migrations directory. Basically, you are going to write a function that Alembic will call every time it comes upon a new entity while generating a migration script. The function will return False when the object in question is one of these PSA tables, and True for everything else.
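As a sketch of what that function could look like in migrations/env.py (the table names in PSA_TABLES below are the python-social-auth defaults as best I know them; treat them as an assumption and adjust them to match your installation):

```python
# migrations/env.py

# Assumed default table names for python-social-auth's SQLAlchemy models;
# check your database and adjust this set if yours differ.
PSA_TABLES = {
    "social_auth_association",
    "social_auth_code",
    "social_auth_nonce",
    "social_auth_partial",
    "social_auth_usersocialauth",
}

def include_object(object, name, type_, reflected, compare_to):
    """Called by Alembic for every schema object during autogenerate.

    Returns False to exclude the third-party PSA tables from the
    migration, and True to keep everything else.
    """
    if type_ == "table" and name in PSA_TABLES:
        return False
    return True

# Then wire it into Alembic where env.py calls context.configure(), e.g.:
# context.configure(connection=connection,
#                   target_metadata=target_metadata,
#                   include_object=include_object)
```

With this in place, autogenerate will still diff your own tables normally, but will no longer emit drop statements for the PSA tables it finds in the database.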

Update: Another option, which you included in the answer you wrote, is to merge the two metadata objects into one; then the models from your application and from PSA are inspected by Alembic together.

I have nothing against the technique of merging multiple metadata objects into one, but I think it is not a good idea for an application to track migrations for models that aren't yours. Many times Alembic will not be able to capture a migration accurately, so you may need to make minor corrections to the generated script before you apply it. For models that are yours, you are capable of detecting these inaccuracies when they show up in migration scripts, but when the models aren't yours I think you can miss things, because you will not be familiar enough with the changes that went into those models to do a good review of the Alembic-generated script.

For this reason, I think it is a better idea to use my proposed include_object configuration to leave the third party models out of your migrations. Those models should be migrated according to the third party project's instructions instead.

Violent answered 4/2, 2016 at 2:47 Comment(6)
Thank you for the answer, Miguel. Right, I almost forgot about Alembic completely. Couldn't I just skip Flask-Migrate and use that directly? It looks to me like it could be tricked into thinking this is a multi-database setup (which it basically is, from Alembic's perspective).Particularize
Hmm, not sure if you'll get different results with Alembic. If you set up a multidb configuration (which you can also set up with Flask-Migrate, btw), then the problem will show up on both sides. The migrations for your regular db will still try to delete the PSA models, but in addition to that, the migrations for the PSA side will try to delete your tables as well. One way to avoid this mess is to use two separate databases, then the tables from one side will not be visible to the other.Violent
Ah, yes, they'll contradict each other. Wouldn't be too bad with separate db's, but could be easy to miss for users during troubleshooting I think. I guess I could just implement the social auth stuff myself. It just makes me a bit sad to be so close to the finish line. It's all working except for the migrations, which is a must, making me have to rip it all out again and start over :PParticularize
What is the problem with the include_object option? That should address your problem, by keeping the PSA models out of the migration evaluator.Violent
By keeping them out, wouldn't I expose myself to changes to the PSA models that wouldn't be migrated in, or do I rather expose myself to risk by including them? Wouldn't changes to the PSA models be detected by the migrator now that I've properly included them (my own answer)?Particularize
Thanks for the edit, Miguel, this clarifies what you mean. Accepting this answer since it really led me to all the conclusions and solutions I actually needed.Particularize

After Miguel's helpful answer here, I got some new keywords to research. I ended up at a helpful GitHub page which had further references to, among others, the Alembic Bitbucket site, which helped immensely.

In the end I did this to my Alembic migration env.py file:

from sqlalchemy import engine_from_config, pool, MetaData

[...]

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
from flask import current_app
config.set_main_option('sqlalchemy.url',
                       current_app.config.get('SQLALCHEMY_DATABASE_URI'))

def combine_metadata(*args):
    m = MetaData()
    for metadata in args:
        for t in metadata.tables.values():
            t.tometadata(m)
    return m

from social.apps.flask_app.default import models

target_metadata = combine_metadata(
    current_app.extensions['migrate'].db.metadata,
    models.PSABase.metadata)

This seems to work absolutely perfectly.
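A note for readers on newer SQLAlchemy: Table.tometadata() was renamed to Table.to_metadata() in SQLAlchemy 1.4, and the old spelling is deprecated there. A version-tolerant sketch of the same helper, picking whichever method the installed version provides:

```python
from sqlalchemy import MetaData

def combine_metadata(*metadatas):
    """Copy every table from the given MetaData objects into one combined
    MetaData, so Alembic can diff all of them together."""
    combined = MetaData()
    for metadata in metadatas:
        for table in metadata.tables.values():
            # SQLAlchemy >= 1.4 provides to_metadata(); older releases
            # only have the deprecated tometadata() spelling.
            copy = getattr(table, "to_metadata", None) or table.tometadata
            copy(combined)
    return combined
```

It is then used exactly like the combine_metadata() in the snippet above, with your Flask-SQLAlchemy metadata and PSABase.metadata as arguments.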

Particularize answered 4/2, 2016 at 9:9 Comment(1)
I confirm that this should be the accepted answer in this Q&A. I had been helplessly removing the drop/create statements for all the social auth tables in every upgrade/downgrade migration file. With the above fix, both upgrade and downgrade start out empty. Flawless solution :)Berkey

I use two sets of models, as follows.

One uses db:

from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()
app.config['SQLALCHEMY_DATABASE_URI'] = 'postgresql://postgres:' + POSTGRES_PASSWORD + '@localhost/Flask'
db.init_app(app)

class User(db.Model):
    pass

The other uses Base:

from sqlalchemy import create_engine, MetaData
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()
uri = 'postgresql://postgres:' + POSTGRES_PASSWORD + '@localhost/Flask'
engine = create_engine(uri)
metadata = MetaData(engine)
Session = sessionmaker(bind=engine)
session = Session()

class Address(Base):
    pass

Since User was created with db.Model, you can use Flask-Migrate on User, while Address, which uses Base, handles fetching the pre-existing table from the database.

Portraitist answered 6/5, 2017 at 13:20 Comment(0)
