Certain dialect-specific SQL, such as triggers and stored procedures, is not abstracted away by SQLAlchemy. For those cases SQLAlchemy provides a DDL interface that can be attached to events so that the appropriate dialect-specific code runs conditionally.
ddl = sqlalchemy.DDL(custom_pg_trigger)
sqlalchemy.event.listen(MyTable.__table__, "after_create", ddl.execute_if(dialect="postgresql"))
ddl = sqlalchemy.DDL(custom_sqlite_trigger)
sqlalchemy.event.listen(MyTable.__table__, "after_create", ddl.execute_if(dialect="sqlite"))
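Wired together, the pattern looks like this. This is a minimal sketch: the table, trigger name, and trigger body are invented for illustration, and only the SQLite branch is exercised end to end (a PostgreSQL variant would pair a trigger with a plpgsql function instead).

```python
import sqlalchemy as sa

metadata = sa.MetaData()
accounts = sa.Table(
    "accounts", metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("updated_at", sa.DateTime),
)

# SQLite-flavoured trigger SQL (illustrative, not from the original post)
sqlite_trigger = """
CREATE TRIGGER accounts_touch AFTER UPDATE ON accounts
BEGIN
    UPDATE accounts SET updated_at = CURRENT_TIMESTAMP WHERE id = NEW.id;
END
"""

# Run the DDL after CREATE TABLE, but only on the sqlite dialect
ddl = sa.DDL(sqlite_trigger)
sa.event.listen(accounts, "after_create", ddl.execute_if(dialect="sqlite"))

engine = sa.create_engine("sqlite://")
metadata.create_all(engine)

# Confirm the trigger was actually created
with engine.connect() as conn:
    names = [row[0] for row in conn.execute(
        sa.text("SELECT name FROM sqlite_master WHERE type='trigger'"))]
print(names)  # ['accounts_touch']
```

On any other dialect the `execute_if` condition fails and the listener is a no-op, so the same metadata can carry listeners for several dialects at once.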
Unfortunately, Alembic does not recognize a schema change when event listeners are added this way. The equivalent DDL must be added by hand to the upgrade and downgrade functions of the generated migration:
ddl = sqlalchemy.DDL(custom_pg_trigger)
ddl_callable = ddl.execute_if(dialect="postgresql")
ddl_callable(target=None, bind=op.get_bind())
ddl = sqlalchemy.DDL(custom_sqlite_trigger)
ddl_callable = ddl.execute_if(dialect="sqlite")
ddl_callable(target=None, bind=op.get_bind())
SQLite supports only a subset of the SQL ALTER TABLE constructs. For unsupported actions like dropping a column, the accepted workaround is to
- create a new table with the desired schema
- copy the data from the original table into the new table
- drop the old table
- rename the new table to the original name
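The four steps can be written out directly against the stdlib sqlite3 module (table and column names here are illustrative), which makes it easy to see what Alembic's batch mode does under the hood:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE mytable (id INTEGER PRIMARY KEY, name TEXT, created_at TEXT);
INSERT INTO mytable VALUES (1, 'a', '2020-01-01'), (2, 'b', '2020-01-02');
""")

# 1. create a new table without the column being dropped
conn.execute("CREATE TABLE mytable_new (id INTEGER PRIMARY KEY, name TEXT)")
# 2. copy the surviving columns across
conn.execute("INSERT INTO mytable_new (id, name) SELECT id, name FROM mytable")
# 3. drop the old table
conn.execute("DROP TABLE mytable")
# 4. rename the new table into place
conn.execute("ALTER TABLE mytable_new RENAME TO mytable")

rows = conn.execute("SELECT * FROM mytable").fetchall()
print(rows)  # [(1, 'a'), (2, 'b')]
```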
Because this is a common pattern, Alembic provides a batch construct that detects operations other than ADD COLUMN
(which SQLite does support) against a SQLite database
and performs the copy. Unfortunately, even when the database is SQLite the batch form
is not generated automatically (unless render_as_batch=True is set when configuring the migration context), so once again the migration must be edited by hand.
with op.batch_alter_table('mytable') as batch_op:
    batch_op.drop_column('created_at')
    batch_op.drop_column('updated_at')
Trying to add a non-null column whose default is the current time fails
on SQLite with an error along the lines of "Cannot add a column with non-constant default". Because the operations are all add_column, which is nominally supported by SQLite, batch_alter_table will still attempt to use ALTER TABLE
rather than a table copy.
# Still fails because it uses ALTER TABLE
with op.batch_alter_table('mytable') as batch_op:
    batch_op.add_column(sa.Column('created_at', sa.DateTime(timezone=True), nullable=False, server_default=sa.text('CURRENT_TIMESTAMP')))
    batch_op.add_column(sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True))
The workaround is to force the table copy by calling batch_alter_table
with recreate='always'. Unfortunately, this means the table copy is no longer
dialect-specific (it happens on every backend), but at least it works.
with op.batch_alter_table('mytable', recreate='always') as batch_op:
    batch_op.add_column(sa.Column('created_at', sa.DateTime(timezone=True), nullable=False, server_default=sa.text('CURRENT_TIMESTAMP')))
    batch_op.add_column(sa.Column('updated_at', sa.DateTime(timezone=True), nullable=True))