Rapid Prototype Architecture Overview

flowchart TD
    Client[Client/Browser] --> API[FastAPI Service]
    
    subgraph Core Services
        API --> PostgreSQL[(PostgreSQL)]
        API --> Redis[(Redis Cache)]
        API --> Kafka[[Apache Kafka]]
    end
    
    subgraph Supporting Services
        Zookeeper[Zookeeper]
    end
    
    API -->|async| Worker[Background Worker]
    Kafka --> Worker
    Worker --> PostgreSQL

Key Components:

  1. FastAPI Service - Core banking operations
  2. PostgreSQL - Primary data storage
  3. Redis - Simple locking mechanism
  4. Kafka - Basic event streaming
  5. Background Worker - Async transaction processing

Phase Breakdown

1. Infrastructure Setup

flowchart LR
    Docker[Docker Environment] --> PostgreSQL
    Docker --> Redis
    Docker --> Kafka
    Docker --> Zookeeper

Key Deliverables:

  • Local development environment via Docker
  • Basic database schema (accounts + transactions)
  • Essential services running:
    • PostgreSQL for ACID transactions
    • Redis for simple locks
    • Kafka for event streaming
    • Zookeeper for Kafka coordination

2. Basic API Implementation

flowchart LR
    API[FastAPI] --> AccountEP[Account Endpoints]
    API --> TransactionEP[Transaction Endpoints]
    
    AccountEP --> CreateAccount[POST /accounts]
    AccountEP --> GetBalance["GET /accounts/{account_id}/balance"]
    
    TransactionEP --> Credit[POST /transactions/credit]
    TransactionEP --> Debit[POST /transactions/debit]

Key Features:

  • Account creation with initial balance
  • Balance inquiry endpoint
  • Basic credit/debit transactions
  • Simple error handling
  • Request validation

3. Core Business Logic

sequenceDiagram
    participant Client
    participant API
    participant Redis
    participant DB
    
    Client->>API: Debit Request
    API->>Redis: Acquire Lock
    Redis-->>API: Lock Granted
    API->>DB: Check Balance
    DB-->>API: Current Balance
    alt Sufficient Funds
        API->>DB: Update Balance
        API->>Kafka: Publish Success Event
    else Insufficient Funds
        API->>Kafka: Publish Failure Event
    end
    API->>Redis: Release Lock
    API-->>Client: Transaction Result

Key Mechanisms:

  • Redis-based locking for concurrency control (a lock-with-expiry sketch follows this list)
  • Balance validation logic
  • Atomic database operations
  • Basic transaction audit logging
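
A minimal sketch of that lock, assuming the redis.asyncio client used later in app/utils/redis.py. Unlike the bare SETNX in the Phase 3 service code, this version sets an expiry so a crashed worker cannot hold the lock forever; the 5-second TTL is an arbitrary illustrative value:

# Sketch of the Redis lock with auto-expiry; the TTL is an assumption, not part of the plan.
from redis.asyncio import Redis

LOCK_TTL_SECONDS = 5  # illustrative; tune to the longest expected transaction

async def acquire_account_lock(redis: Redis, account_id: str) -> bool:
    # SET key value NX EX ttl: succeeds only if the key does not exist, and expires on its own.
    return bool(await redis.set(f"lock:{account_id}", "1", nx=True, ex=LOCK_TTL_SECONDS))

async def release_account_lock(redis: Redis, account_id: str) -> None:
    await redis.delete(f"lock:{account_id}")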

4. Event Streaming Basics

flowchart LR
    API -->|Transaction Events| Kafka
    Kafka -->|Completed| TopicA[transactions.completed]
    Kafka -->|Failed| TopicB[transactions.failed]
    
    subgraph Event Consumers
        FraudDetection[Fraud Detection]
        Notifications[User Notifications]
        Analytics[Basic Analytics]
    end
    
    TopicA --> FraudDetection
    TopicA --> Notifications
    TopicA --> Analytics

Key Features:

  • Event production for completed/failed transactions
  • Basic consumer scaffolding
  • Event schema standardization (an event model sketch follows this list)
  • Simple fraud detection pattern monitoring
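
"Event schema standardization" is only named here; the Phase 4 producer builds a plain dict. A hedged sketch of an explicit event model, assuming Pydantic is reused for events (the model name is illustrative):

# Illustrative event schema; fields mirror the dict produced in app/utils/kafka.py.
from datetime import datetime

from pydantic import BaseModel

class TransactionEvent(BaseModel):
    account_id: str
    amount: float
    status: str        # "completed" or "failed"
    timestamp: datetime

# Producers serialize with .json() (Pydantic v1) or .model_dump_json() (v2);
# consumers parse incoming messages back through the same model.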

5. Testing & Validation

flowchart LR
    T[Testing Pyramid] --> UT[Unit Tests]
    T --> IT[Integration Tests]
    T --> E2E[E2E Tests]
    
    UT --> BusinessLogic[Service Layer]
    IT --> APITests[API Endpoints]
    IT --> DBTests[Database Interactions]
    E2E --> UserJourney[Full Transaction Flow]

Validation Scope:

  • Account creation workflows
  • Balance calculation accuracy
  • Concurrent transaction handling (a test sketch follows this list)
  • Event production consistency
  • Error scenario handling
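
Concurrent handling can be exercised by firing two debits at the same account at once; a sketch using the project's TestClient, assuming the Docker services are running as for the other integration tests (the thread-based approach and the amounts are assumptions):

# Sketch: two parallel debits of 50 against a balance of 60; at most one may succeed
# (the other should see 409 while the lock is held, or 400 for insufficient funds).
from concurrent.futures import ThreadPoolExecutor

from fastapi.testclient import TestClient
from app.main import app

client = TestClient(app)

def test_concurrent_debits():
    acc = client.post("/accounts").json()
    client.post("/transactions/credit", json={"account_id": acc["id"], "amount": 60})

    def debit(_):
        return client.post(
            "/transactions/debit", json={"account_id": acc["id"], "amount": 50}
        )

    with ThreadPoolExecutor(max_workers=2) as pool:
        results = list(pool.map(debit, range(2)))

    assert sum(r.status_code == 200 for r in results) <= 1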

Key Simplifications vs Full Plan

pie
    title Architecture Simplifications
    "Single Database Instance" : 35
    "Basic Redis Locking" : 25
    "Minimal Kafka Topics" : 20
    "No gRPC Services" : 15
    "Basic Monitoring" : 5

Strategic Shortcuts:

  1. Single PostgreSQL instance vs sharded cluster
  2. Simplified Redis locking vs distributed locks
  3. Minimal Kafka topics (2 vs 6+ in full plan)
  4. No advanced monitoring (Prometheus/Grafana)
  5. Basic authentication vs full JWT/OAuth
  6. Simple error handling vs dead letter queues

Deployment Flow

flowchart TD
    Dev[Local Development] -->|Docker| Stage[Staging]
    Stage -->|Validation| Prod

    subgraph Prod[Production]
        direction TB
        LB[Load Balancer] --> API1[API 1]
        LB --> API2[API 2]
        API1 --> DB[PostgreSQL]
        API2 --> DB
        DB --> Replica[Read Replica]
    end

Progression Path:

  1. Local Docker development
  2. Shared staging environment
  3. Basic production setup with:
    • Load balancing
    • Database replication
    • Horizontal API scaling
    • Basic monitoring

This rapid prototype focuses on demonstrating core banking capabilities while maintaining flexibility for future enhancements. Each component can be iteratively upgraded to match the full architecture requirements.

Rapid Prototype Banking API Plan

Simplified Folder Structure

banking-api/
├── app/
│   ├── main.py                  # FastAPI entry point
│   ├── database.py              # DB connection & models
│   ├── schemas.py               # Pydantic models
│   ├── routes/
│   │   ├── accounts.py          # Account endpoints
│   │   └── transactions.py      # Transaction endpoints
│   ├── services/
│   │   ├── accounts.py          # Account business logic
│   │   └── transactions.py      # Transaction processing
│   └── utils/
│       ├── redis.py             # Redis client
│       └── kafka.py             # Kafka producer/consumer
├── tests/
│   └── test_api.py              # Basic integration tests
├── docker-compose.yml           # Dev environment
└── requirements.txt             # Core dependencies

Phase-wise Implementation

Phase 1: Core Infrastructure Setup

1. docker-compose.yml

version: '3.8'

services:
  api:
    build: .
    ports: ["8000:8000"]
    environment:
      - DATABASE_URL=postgresql+asyncpg://user:pass@db:5432/bank
      - REDIS_URL=redis://redis:6379/0
      - KAFKA_BOOTSTRAP_SERVERS=kafka:9092
    depends_on: [db, redis, kafka]

  db:
    image: postgres:15
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=pass
      - POSTGRES_DB=bank
    ports: ["5432:5432"]
    volumes: [postgres-data:/var/lib/postgresql/data]

  redis:
    image: redis:7
    ports: ["6379:6379"]
    volumes: [redis-data:/data]

  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.2
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.3.2
    depends_on: [zookeeper]
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

volumes:
  postgres-data:
  redis-data:
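
The api service uses build: ., which implies a Dockerfile that the folder structure above does not list. A minimal sketch of what it might contain; the base image, port, and commands are assumptions:

# Hypothetical Dockerfile for the api service referenced by "build: ." above.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]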

2. requirements.txt

fastapi>=0.95
uvicorn>=0.22
asyncpg>=0.27
redis>=4.5
kafka-python>=2.2
sqlalchemy>=2.0
pydantic>=1.10
python-multipart>=0.0

3. app/database.py

import os

from sqlalchemy import Column, Numeric
from sqlalchemy.dialects.postgresql import UUID
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Account(Base):
    __tablename__ = "accounts"
    id = Column(UUID, primary_key=True)
    balance = Column(Numeric(10,2), default=0)
    # ... other fields

engine = create_async_engine(os.getenv("DATABASE_URL"))
async_session = sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)

async def init_db():
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)
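
The "accounts + transactions" schema from Phase 1 only shows the accounts table here. A minimal sketch of a companion transactions table in the same module, reusing the same Base and column style (the model name and fields are illustrative, not specified in the plan):

# Hypothetical transactions table to sit alongside Account in app/database.py.
# Field names are assumptions; the plan only states "accounts + transactions".
from datetime import datetime

from sqlalchemy import DateTime, ForeignKey, String

class Transaction(Base):
    __tablename__ = "transactions"
    id = Column(UUID, primary_key=True)
    account_id = Column(UUID, ForeignKey("accounts.id"), nullable=False)
    amount = Column(Numeric(10, 2), nullable=False)
    type = Column(String(10), nullable=False)      # "credit" or "debit"
    status = Column(String(16), nullable=False)    # "completed" or "failed"
    created_at = Column(DateTime, default=datetime.utcnow)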

Phase 2: Basic API Implementation

1. app/main.py

from fastapi import FastAPI
from .database import init_db
from .routes import accounts, transactions

app = FastAPI()

@app.on_event("startup")
async def startup():
    await init_db()

app.include_router(accounts.router)
app.include_router(transactions.router)

2. app/schemas.py

from pydantic import BaseModel

class AccountCreate(BaseModel):
    initial_balance: float

class TransactionRequest(BaseModel):
    account_id: str
    amount: float
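
If stricter request validation is wanted (Phase 2 lists "Request validation" as a key feature), a hedged tweak is to constrain the amount with Pydantic's Field; the positive-amount rule itself is an assumption, not part of the plan:

# Illustrative variant of TransactionRequest with a positivity constraint on amount.
# The gt=0 rule is an assumption; the prototype schema above does not enforce it.
from pydantic import BaseModel, Field

class TransactionRequest(BaseModel):
    account_id: str
    amount: float = Field(gt=0)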

3. app/routes/accounts.py

from fastapi import APIRouter

from ..services import accounts

router = APIRouter()

@router.post("/accounts")
async def create_account():
    return await accounts.create_account()

@router.get("/accounts/{account_id}/balance")
async def get_balance(account_id: str):
    return await accounts.get_balance(account_id)

4. app/routes/transactions.py

from fastapi import APIRouter

from ..schemas import TransactionRequest
from ..services import transactions

router = APIRouter()

@router.post("/transactions/debit")
async def debit_account(request: TransactionRequest):
    return await transactions.process_debit(request)

@router.post("/transactions/credit")
async def credit_account(request: TransactionRequest):
    return await transactions.process_credit(request)

Phase 3: Core Business Logic

1. app/services/accounts.py

from uuid import uuid4

from sqlalchemy import select

from ..database import Account, async_session

async def create_account():
    async with async_session() as session:
        account_id = str(uuid4())
        session.add(Account(id=account_id))
        await session.commit()
        return {"id": account_id}

async def get_balance(account_id: str):
    async with async_session() as session:
        result = await session.execute(select(Account.balance).where(Account.id == account_id))
        return {"balance": result.scalar()}

2. app/services/transactions.py

from decimal import Decimal

from fastapi import HTTPException

from ..database import Account, async_session
from ..utils.kafka import produce_transaction
from ..utils.redis import get_redis

async def process_debit(request):
    async with async_session() as session:
        redis = await get_redis()
        
        # Simple lock implementation
        lock = await redis.setnx(f"lock:{request.account_id}", "1")
        if not lock:
            raise HTTPException(409, "Transaction in progress")
            
        try:
            account = await session.get(Account, request.account_id)
            if account.balance < request.amount:
                await produce_transaction("failed", request)
                raise HTTPException(400, "Insufficient funds")
                
            account.balance -= Decimal(str(request.amount))
            await session.commit()
            await produce_transaction("completed", request)
            return {"status": "success"}
            
        finally:
            await redis.delete(f"lock:{request.account_id}")
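
app/routes/transactions.py also calls transactions.process_credit, which is not shown. A sketch that mirrors process_debit in the same module (so it relies on the same imports); the missing-account check is an added safeguard, not stated in the plan:

# Added alongside process_debit in app/services/transactions.py; uses the same imports.
async def process_credit(request):
    async with async_session() as session:
        redis = await get_redis()

        lock = await redis.setnx(f"lock:{request.account_id}", "1")
        if not lock:
            raise HTTPException(409, "Transaction in progress")

        try:
            account = await session.get(Account, request.account_id)
            if account is None:  # assumed safeguard, not part of the original plan
                await produce_transaction("failed", request)
                raise HTTPException(404, "Account not found")

            account.balance += Decimal(str(request.amount))
            await session.commit()
            await produce_transaction("completed", request)
            return {"status": "success"}

        finally:
            await redis.delete(f"lock:{request.account_id}")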

3. app/utils/redis.py

import os

from redis.asyncio import Redis

redis_client = None

async def get_redis():
    global redis_client
    if not redis_client:
        redis_client = Redis.from_url(os.getenv("REDIS_URL"))
    return redis_client

Phase 4: Event Streaming Basics

1. app/utils/kafka.py

import json
import os
from datetime import datetime

from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers=os.getenv("KAFKA_BOOTSTRAP_SERVERS"))

async def produce_transaction(status, request):
    message = {
        "account_id": request.account_id,
        "amount": request.amount,
        "status": status,
        "timestamp": datetime.utcnow().isoformat()
    }
    producer.send(f"transactions.{status}", value=json.dumps(message).encode())

2. Basic Consumer (optional)

import os

from kafka import KafkaConsumer

def start_consumer():
    # Blocking loop; run it outside the FastAPI event loop (see the sketch below).
    consumer = KafkaConsumer(
        "transactions.completed",
        bootstrap_servers=os.getenv("KAFKA_BOOTSTRAP_SERVERS"),
    )
    for msg in consumer:
        print(f"Processed transaction: {msg.value}")

Phase 5: Testing & Validation

1. tests/test_api.py

from fastapi.testclient import TestClient
from app.main import app

client = TestClient(app)

def test_create_account():
    response = client.post("/accounts")
    assert response.status_code == 200
    assert "id" in response.json()

def test_transaction_flow():
    # Create account
    acc = client.post("/accounts").json()
    
    # Test credit
    credit = client.post("/transactions/credit", json={
        "account_id": acc["id"],
        "amount": 100
    })
    assert credit.status_code == 200
    
    # Test debit
    debit = client.post("/transactions/debit", json={
        "account_id": acc["id"],
        "amount": 50
    })
    assert debit.status_code == 200
    
    # Check balance
    balance = client.get(f"/accounts/{acc['id']}/balance")
    assert balance.json()["balance"] == 50
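
A complementary check for "Error scenario handling": a debit larger than the balance should come back as HTTP 400 from the insufficient-funds branch in process_debit. A sketch using the same client (the amounts are arbitrary):

def test_debit_insufficient_funds():
    # The insufficient-funds branch in process_debit should reject with HTTP 400.
    acc = client.post("/accounts").json()
    client.post("/transactions/credit", json={"account_id": acc["id"], "amount": 10})

    response = client.post("/transactions/debit", json={
        "account_id": acc["id"],
        "amount": 999
    })
    assert response.status_code == 400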

Key Simplifications for Rapid Prototype

  1. Database: Single PostgreSQL instance with basic tables
  2. Caching: Redis for simple locking, no cache invalidation
  3. Event Streaming: Basic Kafka producer with 2 topics (completed/failed)
  4. Transactions: Simple balance updates with Redis locking
  5. API: Minimal endpoints for core banking operations
  6. Security: Basic API key auth (to be added later)
  7. Error Handling: Simple HTTP exceptions with Kafka logging
  8. Deployment: Local Docker setup only

This prototype can handle basic account operations and transactions while demonstrating the core architecture. Subsequent iterations can add advanced features like gRPC, proper monitoring, and distributed transactions.
