psycopg2. The database layer is configured in api/startup/Alchemy.py and accessed through the DBSession wrapper in api/helper/DBSession.py.
## How it works
On application startup, `create_app()` in `app.py` calls `init_db()`.
`init_db()` imports all models (so SQLAlchemy knows about every table), optionally drops them when `CLEAR_DB=True`, then creates any missing tables.
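A minimal sketch of what `init_db()` likely does, based on the description above. The module paths, the placeholder model, and the in-memory engine are assumptions for illustration, not the project's actual code:

```python
# Hedged sketch of init_db(); names and paths are assumed from the docs above.
import os

from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()
engine = create_engine("sqlite:///:memory:")  # stand-in for the PostgreSQL engine


class Example(Base):  # placeholder model so create_all() has a table to create
    __tablename__ = "example"
    id = Column(Integer, primary_key=True)


def init_db():
    # Importing the model modules registers their tables on Base.metadata, e.g.:
    # from api.models import Example_model
    if os.environ.get("CLEAR_DB") == "True":
        Base.metadata.drop_all(bind=engine)
    Base.metadata.create_all(bind=engine)
```

`create_all()` only creates tables that do not exist yet, which is why dropping first (via `CLEAR_DB`) is needed to pick up schema changes to existing tables.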
## Engine and session
The engine connects to PostgreSQL using credentials from `Environment.py`.
`db_session` is a thread-local scoped session. It is removed at the end of each request via the `teardown_appcontext` hook in `app.py`.
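The wiring probably looks roughly like the following sketch. The environment-variable names match the table below, but the defaults, port, and the commented-out pieces are assumptions:

```python
# Sketch of the engine/session setup; defaults here are placeholders.
import os

from sqlalchemy.engine import URL

user = os.environ.get("POSTGRES_USER", "app")
password = os.environ.get("POSTGRES_PASSWORD", "secret")
host = os.environ.get("DB_URL", "localhost")
database = os.environ.get("POSTGRES_DB", "appdb")

url = URL.create(
    "postgresql+psycopg2",
    username=user,
    password=password,
    host=host,
    port=5432,
    database=database,
)

# The real module presumably continues with:
#   engine = create_engine(url)
#   db_session = scoped_session(sessionmaker(bind=engine))
#
# and app.py releases the session per request:
#   @app.teardown_appcontext
#   def remove_session(exception=None):
#       db_session.remove()
```

`scoped_session` hands each thread its own `Session`, so request handlers can share the module-level `db_session` safely as long as `remove()` runs at request teardown.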
## DBSession wrapper
All database operations go through the `DBSession` class in `api/helper/DBSession.py`. It wraps the scoped session with error handling and automatic rollback.
The `myDB` singleton is pre-instantiated and injected into every repository.
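A hedged sketch of what such a wrapper looks like; the real class in `api/helper/DBSession.py` may expose more operations, and the in-memory engine and `Item` model here exist only for the example:

```python
# Illustrative DBSession-style wrapper: commit on success, rollback on error.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import declarative_base, scoped_session, sessionmaker

Base = declarative_base()
engine = create_engine("sqlite:///:memory:")  # stand-in for PostgreSQL
db_session = scoped_session(sessionmaker(bind=engine))


class Item(Base):  # placeholder model for the example
    __tablename__ = "item"
    id = Column(Integer, primary_key=True)
    name = Column(String(80))


class DBSession:
    """Wraps the scoped session; rolls back and re-raises on any DB error."""

    def __init__(self, session):
        self.session = session

    def add(self, obj):
        try:
            self.session.add(obj)
            self.session.commit()
        except SQLAlchemyError:
            self.session.rollback()
            raise


myDB = DBSession(db_session)  # pre-instantiated singleton, injected into repositories
Base.metadata.create_all(bind=engine)
```

Centralizing commit/rollback here means repositories never manage transactions themselves, so a failed write cannot leave the scoped session in a dirty state for the next request.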
## Adding a new model
When `ant generate route <name>` runs, it:

- Creates `api/models/<Name>_model.py`
- Adds an import of the model to `init_db()` in `api/startup/Alchemy.py`

The new table is then picked up by `create_all()` on the next startup. No manual migration step is required for new tables in development.
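The generated model file presumably has roughly this shape. The class, columns, and `Base` location below are hypothetical; in the project, `Base` would be imported from the database layer rather than created locally:

```python
# Illustrative shape of a generated api/models/<Name>_model.py file.
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()  # in the project this would come from the DB layer


class Example(Base):
    __tablename__ = "example"

    id = Column(Integer, primary_key=True)
    name = Column(String(80), nullable=False)
```

Simply defining the class registers the `example` table on `Base.metadata`, which is why importing the module inside `init_db()` is enough for `create_all()` to see it.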
## Environment variables
| Variable | Description |
|---|---|
| `POSTGRES_USER` | Database username |
| `POSTGRES_PASSWORD` | Database password |
| `POSTGRES_DB` | Database name |
| `DB_URL` | Host: `postgres` inside Docker, `localhost` for local dev |
| `CLEAR_DB` | Set to `True` to drop and recreate all tables on startup |
## Connecting outside Docker
Set `DB_URL=localhost` in `.env` and ensure PostgreSQL is running locally on port 5432.
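If startup fails with connection errors, a quick way to check whether anything is listening on the expected port is a plain socket probe (this helper is not part of the project):

```python
# Check whether a TCP port is accepting connections (e.g. local PostgreSQL).
import socket


def port_open(host: str = "localhost", port: int = 5432) -> bool:
    """Return True if a TCP connection to host:port succeeds within 1 second."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0
```

`port_open()` returning `False` usually means PostgreSQL is not running locally, or is bound to a different port than 5432.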