```mermaid
flowchart TD
    Client[Client/Browser] --> API[FastAPI Service]
    subgraph Core Services
        API --> PostgreSQL[(PostgreSQL)]
        API --> Redis[(Redis Cache)]
        API --> Kafka[[Apache Kafka]]
    end
    subgraph Supporting Services
        Zookeeper[Zookeeper]
    end
    API -->|async| Worker[Background Worker]
    Kafka --> Worker
    Worker --> PostgreSQL
```
- FastAPI Service - Core banking operations
- PostgreSQL - Primary data storage
- Redis - Simple locking mechanism
- Kafka - Basic event streaming
- Background Worker - Async transaction processing
```mermaid
flowchart LR
    Docker[Docker Environment] --> PostgreSQL
    Docker --> Redis
    Docker --> Kafka
    Docker --> Zookeeper
```
Key Deliverables:
- Local development environment via Docker
- Basic database schema (accounts + transactions)
- Essential services running:
  - PostgreSQL for ACID transactions
  - Redis for simple locks
  - Kafka for event streaming
  - Zookeeper for Kafka coordination
```mermaid
flowchart LR
    API[FastAPI] --> AccountEP[Account Endpoints]
    API --> TransactionEP[Transaction Endpoints]
    AccountEP --> CreateAccount["POST /accounts"]
    AccountEP --> GetBalance["GET /accounts/{id}/balance"]
    TransactionEP --> Credit["POST /transactions/credit"]
    TransactionEP --> Debit["POST /transactions/debit"]
```
Key Features:
- Account creation with initial balance
- Balance inquiry endpoint
- Basic credit/debit transactions
- Simple error handling
- Request validation
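In the prototype itself, Pydantic handles the type checks; as a sketch of the validation rules in isolation, the checks can be written as a plain function (the `validate_transaction` helper and its error messages are hypothetical, not part of the codebase below):

```python
def validate_transaction(payload: dict) -> list[str]:
    """Return a list of validation errors for a transaction request (empty list = valid)."""
    errors = []
    account_id = payload.get("account_id")
    amount = payload.get("amount")
    if not isinstance(account_id, str) or not account_id:
        errors.append("account_id must be a non-empty string")
    # bool is a subclass of int, so exclude it explicitly
    if not isinstance(amount, (int, float)) or isinstance(amount, bool):
        errors.append("amount must be a number")
    elif amount <= 0:
        errors.append("amount must be positive")
    return errors
```

The same positive-amount rule could later be pushed into the Pydantic models themselves via field validators.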
```mermaid
sequenceDiagram
    participant Client
    participant API
    participant Redis
    participant DB
    participant Kafka
    Client->>API: Debit Request
    API->>Redis: Acquire Lock
    Redis-->>API: Lock Granted
    API->>DB: Check Balance
    DB-->>API: Current Balance
    alt Sufficient Funds
        API->>DB: Update Balance
        API->>Kafka: Publish Success Event
    else Insufficient Funds
        API->>Kafka: Publish Failure Event
    end
    API->>Redis: Release Lock
    API-->>Client: Transaction Result
```
Key Mechanisms:
- Redis-based locking for concurrency control
- Balance validation logic
- Atomic database operations
- Basic transaction audit logging
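The balance-validation step at the heart of this flow can be sketched as a pure function, independent of the lock and the database (the `apply_debit` name is illustrative; the service layer performs the equivalent check against PostgreSQL):

```python
from decimal import Decimal

def apply_debit(balance: Decimal, amount: Decimal) -> Decimal:
    """Return the new balance after a debit; raise ValueError if the debit cannot proceed."""
    if amount <= 0:
        raise ValueError("debit amount must be positive")
    if balance < amount:
        raise ValueError("insufficient funds")
    return balance - amount
```

Keeping this decision logic pure makes the "balance calculation accuracy" unit tests trivial to write.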
```mermaid
flowchart LR
    API -->|Transaction Events| Kafka
    Kafka -->|Completed| TopicA[transactions.completed]
    Kafka -->|Failed| TopicB[transactions.failed]
    subgraph Event Consumers
        FraudDetection[Fraud Detection]
        Notifications[User Notifications]
        Analytics[Basic Analytics]
    end
    TopicA --> FraudDetection
    TopicA --> Notifications
    TopicA --> Analytics
```
Key Features:
- Event production for completed/failed transactions
- Basic consumer scaffolding
- Event schema standardization
- Simple fraud detection pattern monitoring
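As a sketch of what "simple fraud detection pattern monitoring" might mean here, a consumer could flag bursts of transactions on a single account within a short window (the class name, threshold, and window are hypothetical; real rules are out of scope for the prototype):

```python
from collections import defaultdict, deque

class VelocityCheck:
    """Flag an account when too many transactions arrive within a sliding time window."""

    def __init__(self, max_events: int = 3, window_seconds: float = 60.0):
        self.max_events = max_events
        self.window = window_seconds
        self.events = defaultdict(deque)  # account_id -> recent event timestamps

    def observe(self, account_id: str, timestamp: float) -> bool:
        """Record one transaction; return True if the account now looks suspicious."""
        q = self.events[account_id]
        q.append(timestamp)
        # Drop timestamps that have aged out of the window
        while q and timestamp - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_events
```

A consumer on `transactions.completed` would call `observe()` per event and route flagged accounts to a review queue.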
```mermaid
flowchart LR
    T[Testing Pyramid] --> UT[Unit Tests]
    T --> IT[Integration Tests]
    T --> E2E[E2E Tests]
    UT --> BusinessLogic[Service Layer]
    IT --> APITests[API Endpoints]
    IT --> DBTests[Database Interactions]
    E2E --> UserJourney[Full Transaction Flow]
```
Validation Scope:
- Account creation workflows
- Balance calculation accuracy
- Concurrent transaction handling
- Event production consistency
- Error scenario handling
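The "concurrent transaction handling" item can be exercised without any infrastructure by simulating the per-account lock with asyncio (a sketch under that assumption; the prototype uses Redis SETNX for the same purpose, and the function names here are illustrative):

```python
import asyncio
from decimal import Decimal

async def debit(balances: dict, locks: dict, account_id: str, amount: Decimal) -> bool:
    """Debit under a per-account lock so concurrent debits cannot overdraw."""
    async with locks[account_id]:
        if balances[account_id] >= amount:
            await asyncio.sleep(0)  # yield control, as a real DB round-trip would
            balances[account_id] -= amount
            return True
        return False

async def main():
    balances = {"acc": Decimal("100")}
    locks = {"acc": asyncio.Lock()}
    # Five concurrent debits of 30 against a balance of 100: only three can succeed
    results = await asyncio.gather(
        *(debit(balances, locks, "acc", Decimal("30")) for _ in range(5))
    )
    return results, balances["acc"]

results, final_balance = asyncio.run(main())
```

Removing the lock (and keeping the `sleep`) makes the overdraft race reproducible, which is exactly the failure mode the Redis lock guards against.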
```mermaid
pie
    title Architecture Simplifications
    "Single Database Instance" : 35
    "Basic Redis Locking" : 25
    "Minimal Kafka Topics" : 20
    "No gRPC Services" : 15
    "Basic Monitoring" : 5
```
Strategic Shortcuts:
- Single PostgreSQL instance vs sharded cluster
- Simplified Redis locking vs distributed locks
- Minimal Kafka topics (2 vs 6+ in full plan)
- No advanced monitoring (Prometheus/Grafana)
- Basic authentication vs full JWT/OAuth
- Simple error handling vs dead letter queues
```mermaid
flowchart TD
    Dev[Local Development] -->|Docker| Stage[Staging]
    Stage -->|Validation| Prod
    subgraph Prod[Production]
        direction TB
        LB[Load Balancer] --> API1[API 1]
        LB --> API2[API 2]
        API1 --> DB[PostgreSQL]
        API2 --> DB
        DB --> Replica[Read Replica]
    end
```
Progression Path:
- Local Docker development
- Shared staging environment
- Basic production setup with:
  - Load balancing
  - Database replication
  - Horizontal API scaling
  - Basic monitoring
This rapid prototype focuses on demonstrating core banking capabilities while maintaining flexibility for future enhancements. Each component can be iteratively upgraded to match the full architecture requirements.
```
banking-api/
├── app/
│   ├── main.py              # FastAPI entry point
│   ├── database.py          # DB connection & models
│   ├── schemas.py           # Pydantic models
│   ├── routes/
│   │   ├── accounts.py      # Account endpoints
│   │   └── transactions.py  # Transaction endpoints
│   ├── services/
│   │   ├── accounts.py      # Account business logic
│   │   └── transactions.py  # Transaction processing
│   └── utils/
│       ├── redis.py         # Redis client
│       └── kafka.py         # Kafka producer/consumer
├── tests/
│   └── test_api.py          # Basic integration tests
├── docker-compose.yml       # Dev environment
└── requirements.txt         # Core dependencies
```
1. docker-compose.yml
```yaml
version: '3.8'
services:
  api:
    build: .
    ports: ["8000:8000"]
    environment:
      - DATABASE_URL=postgresql+asyncpg://user:pass@db:5432/bank
      - REDIS_URL=redis://redis:6379/0
      - KAFKA_BOOTSTRAP_SERVERS=kafka:9092
    depends_on: [db, redis, kafka]
  db:
    image: postgres:15
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=pass
      - POSTGRES_DB=bank
    ports: ["5432:5432"]
    volumes: ["postgres-data:/var/lib/postgresql/data"]
  redis:
    image: redis:7
    ports: ["6379:6379"]
    volumes: ["redis-data:/data"]
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.2
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.3.2
    depends_on: [zookeeper]
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
volumes:
  postgres-data:
  redis-data:
```
2. requirements.txt
```
fastapi>=0.95
uvicorn>=0.22
asyncpg>=0.27
redis>=4.5
kafka-python>=2.2
sqlalchemy>=2.0
pydantic>=1.10
python-multipart>=0.0
```
3. app/database.py
```python
import os

from sqlalchemy import Column, Numeric
from sqlalchemy.dialects.postgresql import UUID
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Account(Base):
    __tablename__ = "accounts"
    id = Column(UUID(as_uuid=False), primary_key=True)
    balance = Column(Numeric(10, 2), default=0)
    # ... other fields

engine = create_async_engine(os.getenv("DATABASE_URL"))
async_session = sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)

async def init_db():
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)
```
1. app/main.py
```python
from fastapi import FastAPI

from .database import init_db
from .routes import accounts, transactions

app = FastAPI()

@app.on_event("startup")
async def startup():
    await init_db()

app.include_router(accounts.router)
app.include_router(transactions.router)
```
2. app/schemas.py
```python
from pydantic import BaseModel

class AccountCreate(BaseModel):
    initial_balance: float

class TransactionRequest(BaseModel):
    account_id: str
    amount: float
```
3. app/routes/accounts.py
```python
from fastapi import APIRouter

from ..services import accounts

router = APIRouter()

@router.post("/accounts")
async def create_account():
    return await accounts.create_account()

@router.get("/accounts/{account_id}/balance")
async def get_balance(account_id: str):
    return await accounts.get_balance(account_id)
```
4. app/routes/transactions.py
```python
from fastapi import APIRouter

from ..schemas import TransactionRequest
from ..services import transactions

router = APIRouter()

@router.post("/transactions/debit")
async def debit_account(request: TransactionRequest):
    return await transactions.process_debit(request)

@router.post("/transactions/credit")
async def credit_account(request: TransactionRequest):
    return await transactions.process_credit(request)
```
1. app/services/accounts.py
```python
from uuid import uuid4

from sqlalchemy import select

from ..database import Account, async_session

async def create_account():
    async with async_session() as session:
        account_id = str(uuid4())
        session.add(Account(id=account_id))
        await session.commit()
        return {"id": account_id}

async def get_balance(account_id: str):
    async with async_session() as session:
        result = await session.execute(
            select(Account.balance).where(Account.id == account_id)
        )
        return {"balance": result.scalar()}
```
2. app/services/transactions.py
```python
from decimal import Decimal

from fastapi import HTTPException

from ..database import Account, async_session
from ..utils.kafka import produce_transaction
from ..utils.redis import get_redis

async def process_debit(request):
    redis = await get_redis()
    # Simple lock implementation: one in-flight transaction per account
    lock = await redis.setnx(f"lock:{request.account_id}", "1")
    if not lock:
        raise HTTPException(409, "Transaction in progress")
    try:
        async with async_session() as session:
            account = await session.get(Account, request.account_id)
            if account is None:
                raise HTTPException(404, "Account not found")
            if account.balance < request.amount:
                await produce_transaction("failed", request)
                raise HTTPException(400, "Insufficient funds")
            account.balance -= Decimal(str(request.amount))
            await session.commit()
        await produce_transaction("completed", request)
        return {"status": "success"}
    finally:
        await redis.delete(f"lock:{request.account_id}")
```
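A note on the `Decimal(str(request.amount))` conversion above: constructing a `Decimal` directly from a float captures the float's binary representation error, while round-tripping through `str` preserves the decimal value the client actually sent. A quick illustration:

```python
from decimal import Decimal

# Constructing from the float directly exposes its binary representation
direct = Decimal(0.1)

# Round-tripping through str preserves the intended decimal value
via_str = Decimal(str(0.1))
```

Here `via_str == Decimal("0.1")` holds, while `direct` is the exact binary expansion of the float and does not equal `Decimal("0.1")`. Storing balances as `Numeric(10, 2)` and doing arithmetic in `Decimal` avoids accumulating these errors.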
3. app/utils/redis.py
```python
import os

from redis.asyncio import Redis

redis_client = None

async def get_redis():
    global redis_client
    if not redis_client:
        redis_client = Redis.from_url(os.getenv("REDIS_URL"))
    return redis_client
```
1. app/utils/kafka.py
```python
import json
import os
from datetime import datetime

from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers=os.getenv("KAFKA_BOOTSTRAP_SERVERS"))

async def produce_transaction(status, request):
    message = {
        "account_id": request.account_id,
        "amount": request.amount,
        "status": status,
        "timestamp": datetime.utcnow().isoformat(),
    }
    producer.send(f"transactions.{status}", value=json.dumps(message).encode())
```
2. Basic Consumer (optional)
```python
import os

from kafka import KafkaConsumer

def start_consumer():
    # KafkaConsumer iteration blocks, so run this in its own process or thread
    consumer = KafkaConsumer(
        "transactions.completed",
        bootstrap_servers=os.getenv("KAFKA_BOOTSTRAP_SERVERS"),
    )
    for msg in consumer:
        print(f"Processed transaction: {msg.value}")
```
1. tests/test_api.py
```python
from fastapi.testclient import TestClient

from app.main import app

client = TestClient(app)

def test_create_account():
    response = client.post("/accounts")
    assert response.status_code == 200
    assert "id" in response.json()

def test_transaction_flow():
    # Create account
    acc = client.post("/accounts").json()
    # Test credit
    credit = client.post("/transactions/credit", json={
        "account_id": acc["id"],
        "amount": 100,
    })
    assert credit.status_code == 200
    # Test debit
    debit = client.post("/transactions/debit", json={
        "account_id": acc["id"],
        "amount": 50,
    })
    assert debit.status_code == 200
    # Check balance
    balance = client.get(f"/accounts/{acc['id']}/balance")
    assert balance.json()["balance"] == 50
```
- Database: Single PostgreSQL instance with basic tables
- Caching: Redis for simple locking, no cache invalidation
- Event Streaming: Basic Kafka producer with 2 topics (completed/failed)
- Transactions: Simple balance updates with Redis locking
- API: Minimal endpoints for core banking operations
- Security: Basic API key auth (to be added later)
- Error Handling: Simple HTTP exceptions with Kafka logging
- Deployment: Local Docker setup only
This prototype can handle basic account operations and transactions while demonstrating the core architecture. Subsequent iterations can add advanced features like gRPC, proper monitoring, and distributed transactions.