I ran into an issue today where I had to perform a bulk insert into a Postgres DB. I was already using SQLAlchemy and Flask-SQLAlchemy to manage the connections to the DB, and I didn't want to have to use something like psycopg2 directly.
Note: SQLAlchemy provides an ORM, but it isn't just an ORM. That's important to keep in mind: it means you can choose not to use the ORM layer when you don't want it. The point of an ORM is to track changes to objects, and a case like that is when you'd use it. In a bulk-upload scenario you don't need to track changes to objects; all you care about is that everything gets pushed into the DB.
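For contrast, the ORM route would look roughly like the sketch below (assuming the same user model and get_users_to_insert() helper used later in the post). The session wraps every row in a tracked object, which is exactly the overhead a bulk load doesn't need:
# ORM version, for comparison only: each row becomes a tracked object.
# Assumes user is the mapped model class and each row is a dict of column values.
from xref import db
from xref.models import user
users = get_users_to_insert()
db.session.add_all([user(**row) for row in users])
db.session.commit()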
SQLAlchemy (and Flask-SQLAlchemy) lets us do this by using the engine directly. This is how I did it:
from xref import db
from xref.models import user
users = get_users_to_insert()  # returns a list of dicts, one per row
# Execute a Core insert directly on the engine, bypassing the ORM session
db.engine.execute(user.__table__.insert(), users)
Of course, this assumes that you have a list of dicts with the key names matching the Columns defined in your SQLAlchemy model.
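For illustration, such a list might look like this; the name and email keys are invented here and should match whatever Columns your model actually defines:
# Hypothetical column names; use the ones from your own model
users = [
    {"name": "Alice", "email": "alice@example.com"},
    {"name": "Bob", "email": "bob@example.com"},
]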
Hi Darek,
In your list, the keys are 0, 1 and 2, i.e. positions rather than column names. I think you should transform it so that the keys are the Column names defined on your model instead; IIRC, that should work. Roughly like the sketch below:
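As a sketch, assuming the data is a list of tuples (which is why its keys are the positions 0, 1 and 2) and the model defines Columns named name, email and age (hypothetical names chosen for illustration), the transformation could be:
rows = [("Alice", "alice@example.com", 30), ("Bob", "bob@example.com", 25)]
# Hypothetical Column names; substitute the ones your model actually defines
columns = ("name", "email", "age")
users = [dict(zip(columns, row)) for row in rows]
# users is now a list of dicts keyed by column name, ready for the bulk insert
db.engine.execute(user.__table__.insert(), users)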