Migrating from SQLite (D1) to PostgreSQL (Neon)

As D1, Cloudflare's SQLite-compatible DB, is limited to a maximum of 10GB, I needed to migrate my database to one famously capable of handling a very large number of rows.

To export it I used the wrangler CLI:

npx wrangler d1 export my-db --remote --output="db.sql"

This unfortunately comes with downtime, so it's best done during a quiet period.

Then turn the bunch of SQL statements you get from it into an actual sqlite database:

# sudo apt install sqlite3
$ sqlite3 load.sqlite
sqlite> PRAGMA journal_mode = OFF; PRAGMA cache_size = 7500000; PRAGMA synchronous = 0; PRAGMA temp_store = 2;
sqlite> .read db.sql

At this point, inspect your DB for any missing foreign key references (rows with a column pointing to a non-existent row) and clean them up if necessary. Also, column names should be lowercase, otherwise the next step won't work. I lost quite a lot of time on this (and on casting the columns properly in the command file below).
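For example, still inside the sqlite3 shell (the session table, user_id column and createdAt rename below are only illustrations, adapt them to your own schema):

sqlite> PRAGMA foreign_key_check;  -- lists every row whose foreign key points to a missing row
sqlite> DELETE FROM session WHERE user_id NOT IN (SELECT id FROM user);  -- one possible cleanup for an offending table
sqlite> ALTER TABLE session RENAME COLUMN "createdAt" TO created_at;  -- lowercase a column name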

Once you have a clean sqlite database, you can use pgloader to convert it to Postgres. I did it by connecting directly to a Neon DB, after having created the database with its schema.
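If you keep your schema in a plain SQL file, one way to apply it to Neon beforehand is with psql (schema.sql here stands for whatever DDL dump you have; the connection string is the same one used in the command file below):

psql "postgresql://<connectionparameters>" -f schema.sql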

You need to create a command file whose content looks like this:

LOAD DATABASE  
 FROM sqlite:///home/yourself/load.sqlite
 -- to push to Neon, replace the password slot in the pg connection string with endpoint=[endpoint];password
 INTO postgresql://<connectionparameters>

 WITH data only, prefetch rows = 1000
 -- this line casts a column from a specific table to the desired type; you can do that for each problematic column,
 -- but most of the default casts are fine in my experience:
 CAST column session.expires_at to timestamptz using unix-timestamp-to-timestamptz

 SET PostgreSQL PARAMETERS
     work_mem to '128MB',
     search_path to 'hot',
     maintenance_work_mem to '1GB'; -- there must be exactly one ; at the end of the file, otherwise parsing will fail!
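Assuming you save the command file above as migrate.load (the name doesn't matter), the migration itself is then a single run:

pgloader migrate.load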

I am just grateful that pgloader exists. You can read more about its options in the docs: https://pgloader.readthedocs.io/en/latest/ref/sqlite.html

The whole process took about 25 min for a 10GB D1 DB.

I hope this helps some of you out there.

sqlite3 version: 3.45.1, pgloader version: 3.6.7
