TLDR: create the database, run migrations against it, and set environment variables.
Copy these from sample.env into your .env, with the correct db name, user, and password:
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=sqfour
POSTGRES_USER=csquser
POSTGRES_PASSWORD=souperdouper
POSTGRES_ECHO=true
Make sure csquser has full privileges on the sqfour db (see the create-database steps below).
Make sure postgres is running:
% brew services restart postgresql
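A lighter check, if you don't want to restart: pg_isready reports whether the server accepts connections (host/port here assume the .env above):
% pg_isready -h localhost -p 5432
# expected: localhost:5432 - accepting connections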
Run the migrations:
make db-apply-migrations
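The make target presumably wraps alembic's upgrade command; running it by hand would look roughly like:
# assumption: equivalent to the makefile target
poetry run alembic upgrade head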
To verify, look for the tables in the postgres console:
# login to db
% psql -U csquser -d sqfour
# display tables
sqfour=> \dt
# e.g. result:
 Schema |      Name       | Type  |  Owner
--------+-----------------+-------+---------
 public | alembic_version | table | csquser
 public | answers         | table | csquser
 public | pipeline_status | table | csquser
 public | convo_record    | table | csquser
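You can also confirm which revision is applied; alembic stores it in the alembic_version table:
# show the currently applied revision id
sqfour=> SELECT version_num FROM alembic_version;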
Create the database and db user, and grant the user privileges on the db (use the same db name, user, and password as in your .env):
% psql postgres
Now in the postgres terminal...
CREATE DATABASE sqfour;
CREATE USER csquser WITH ENCRYPTED PASSWORD 'souperdouper';
GRANT ALL PRIVILEGES ON DATABASE sqfour TO csquser;
\q
Exit and log back in as the new user to verify:
% psql -U csquser -d sqfour
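Note: on Postgres 15+ the public schema is no longer writable by every user, so migrations can fail with permission errors even after the GRANT above. If that happens, also grant schema access (as the postgres superuser, connected to sqfour):
\c sqfour
GRANT ALL ON SCHEMA public TO csquser;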
Migration files are generated by running:
make db-new-migration
Only run this if you have changed or added a model (i.e. any change under backend/db/models/*). If you want to title the revision yourself, it may be better to run the underlying command from the makefile directly:
# tag the revision with a descriptive message
poetry run alembic revision --autogenerate -m add_new_mapping_or_table
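Autogenerate writes a new Python file under the migrations versions directory; read it before applying. A hypothetical skeleton (the column name and prior-revision id below are made up; only the 30d6e1137cc0 id and answers table come from this doc):

# e.g. migrations/versions/30d6e1137cc0_add_new_mapping_or_table.py (path is illustrative)
from alembic import op
import sqlalchemy as sa

revision = "30d6e1137cc0"       # this id is what alembic_version stores
down_revision = "abc123def456"  # placeholder: the prior revision in the chain

def upgrade():
    # hypothetical change: add a nullable column to answers
    op.add_column("answers", sa.Column("new_field", sa.String(), nullable=True))

def downgrade():
    op.drop_column("answers", "new_field")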
If you need to re-point the migration history at a different revision, you can hard-code the alembic version. This is not a trivial operation: be comfortable with how revisions are managed and ordered in the migrations directory, and be prepared to lose data if you make a mistake.
# hard-code the alembic version (run inside psql)
UPDATE alembic_version SET version_num = '30d6e1137cc0';
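If the revision chain is intact, a safer route is to let alembic move the pointer itself:
# inspect history and current position
poetry run alembic history
poetry run alembic current
# step back one revision (runs that migration's downgrade())
poetry run alembic downgrade -1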
\dt
# lists all tables
\q
# quits
# get table sizes (check whether a table grows after doing a write)
SELECT
    table_name,
    pg_size_pretty(pg_total_relation_size(table_name::regclass)) AS total_size
FROM
    information_schema.tables
WHERE
    table_schema = 'public';
# get the schema of every table (helpful context for building queries in gpt)
SELECT
    table_name,
    column_name,
    data_type,
    is_nullable,
    column_default
FROM
    information_schema.columns
WHERE
    table_schema = 'public'
ORDER BY
    table_name, ordinal_position;
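These queries can also be run non-interactively, which is handy for quick checks from the shell:
% psql -U csquser -d sqfour -c "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public';"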
What else?
To add:
how to get into the db when running locally
how to log in with local credentials to verify the env vars (see the sketch below)
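A sketch for that credentials check, assuming the .env has been exported into the shell (e.g. set -a; source .env; set +a):
% psql "postgresql://$POSTGRES_USER:$POSTGRES_PASSWORD@$POSTGRES_HOST:$POSTGRES_PORT/$POSTGRES_DB" -c "SELECT current_user, current_database();"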