
@patx
Last active December 25, 2025 23:34

ASGI Framework Performance Benchmark

Introduction

This benchmark compares the performance of various ASGI web frameworks when serving a simple "Hello, World!" response under high concurrency. Each framework was served with Uvicorn (4 workers) where applicable; Sanic was additionally tested on its own built-in server. Load was generated with wrk.

Test Setup

Each framework was tested using the following command:

wrk -t4 -c1000 -d30s http://127.0.0.1:8000/
  • -t4: 4 worker threads
  • -c1000: 1,000 concurrent connections
  • -d30s: 30-second test duration
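The gist does not show the exact server launch commands. Assuming the module names from the code samples below (micro.py, fast.py, etc.) and that each module exposes an `app` object, the servers might be started like this (a sketch; exact flags may differ from the original runs):

```shell
# Serve each ASGI app with Uvicorn using 4 workers on port 8000,
# then run wrk against it. One server at a time.
uvicorn micro:app --workers 4 --port 8000   # MicroPie
uvicorn fast:app --workers 4 --port 8000    # FastAPI
uvicorn star:app --workers 4 --port 8000    # Starlette

# Sanic's built-in server run, via the Sanic CLI:
sanic san:app --workers 4 --port 8000
```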

Frameworks Tested

  • MicroPie
  • Sanic
  • Muffin
  • FastAPI
  • Starlette
  • Litestar
  • Quart
  • BlackSheep

Results Table

| Framework | Requests/sec | Avg Latency (ms) | Max Latency (ms) | Transfer/sec |
| --- | --- | --- | --- | --- |
| MicroPie (Uvicorn) | 5973.14 | 148.88 | 1980 | 0.93MB |
| Sanic (Uvicorn) | 4212.44 | 223.00 | 2000 | 678.76KB |
| Sanic (Built-in Server) | 8121.22 | 62.43 | 1990 | 0.91MB |
| Muffin (Uvicorn) | 7252.85 | 125.17 | 1980 | 1.00MB |
| FastAPI (Uvicorn) | 2488.27 | 397.71 | 2000 | 345.05KB |
| Starlette (Uvicorn) | 8072.43 | 120.83 | 2000 | 1.13MB |
| Litestar (Uvicorn) | 4743.29 | 191.66 | 1970 | 680.92KB |
| Quart (Uvicorn) | 2721.33 | 332.41 | 1990 | 364.08KB |
| BlackSheep (Uvicorn) | 7271.32 | 130.23 | 1980 | 1.02MB |
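The throughput ranking behind the takeaways below can be recomputed directly from the table. The figures here are copied verbatim from the results; only the sorting and the relative-to-fastest percentage are new:

```python
# Requests/sec for each configuration, copied from the results table.
results = {
    "MicroPie (Uvicorn)": 5973.14,
    "Sanic (Uvicorn)": 4212.44,
    "Sanic (Built-in Server)": 8121.22,
    "Muffin (Uvicorn)": 7252.85,
    "FastAPI (Uvicorn)": 2488.27,
    "Starlette (Uvicorn)": 8072.43,
    "Litestar (Uvicorn)": 4743.29,
    "Quart (Uvicorn)": 2721.33,
    "BlackSheep (Uvicorn)": 7271.32,
}

# Sort by throughput, highest first.
ranking = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
fastest = ranking[0][1]
for name, rps in ranking:
    # Show each result as a fraction of the fastest configuration.
    print(f"{name:<26} {rps:>8.2f} req/s  ({rps / fastest:.0%} of fastest)")
```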

Key Takeaways

  1. Sanic's Built-in Server Performs Best: Running Sanic with its own built-in server significantly outperformed using Uvicorn, showing that its native implementation is highly optimized.
  2. Starlette and Muffin Excel with Uvicorn: Both frameworks achieved high throughput with relatively low latency, making them solid choices for high-performance applications.
  3. FastAPI and Quart Lagged Behind: Extra per-request work (e.g., FastAPI serializing its dict response to JSON and running its validation machinery) translated into higher latency and lower requests per second.
  4. MicroPie Shows Strong Performance: Despite its lightweight design, MicroPie outpaced FastAPI, Litestar, Quart, and Sanic-on-Uvicorn, trailing only the top tier of Muffin, BlackSheep, and Starlette.
  5. BlackSheep is a High-Performer: It demonstrated great efficiency, showing a good balance between latency and throughput.

Code Samples

To replicate these benchmarks, use the following implementations for each framework:

MicroPie (micro.py)

from MicroPie import App
class Root(App):
    async def index(self):
        return "Hello, world!"
app = Root()

Sanic (san.py)

from sanic import Sanic
from sanic.response import text
app = Sanic("MyHelloWorldApp")
@app.get("/")
async def hello_world(request):
    return text("Hello, world.")

Muffin (muf.py)

import muffin
app = muffin.Application()
@app.route('/', '/hello/{name}')
async def hello(request):
    name = request.path_params.get('name', 'world')
    return f'Hello {name.title()}!'

FastAPI (fast.py)

from fastapi import FastAPI
app = FastAPI()
@app.get("/")
def read_root():
    return {"Hello": "World"}

Starlette (star.py)

from starlette.responses import PlainTextResponse
async def app(scope, receive, send):
    assert scope['type'] == 'http'
    response = PlainTextResponse('Hello, world!')
    await response(scope, receive, send)

Litestar (lites.py)

from litestar import Litestar, get
@get("/")
async def hello_world() -> str:
    return "Hello, world!"
app = Litestar([hello_world])

Quart (qurt.py)

from quart import Quart
app = Quart(__name__)
@app.route('/')
async def hello():
    return 'hello'

BlackSheep (black.py)

from blacksheep import Application, get
app = Application()
@get("/")
async def home():
    return "Hello, World!"

Conclusion

These benchmarks provide insight into the raw performance of different ASGI frameworks under high concurrency. MicroPie stands out as a strong performer for its size: it outpaced FastAPI, Litestar, Quart, and Sanic-on-Uvicorn in requests per second, while trailing only Starlette, Muffin, BlackSheep, and Sanic's built-in server. Its lightweight design keeps per-request overhead low. For developers seeking a fast, minimal ASGI framework without the extra machinery of heavier alternatives, MicroPie is a top contender.

@a-wip0

a-wip0 commented Aug 13, 2025

where is the code? the repository to test this

@patx

patx commented Aug 15, 2025

@a-wip0 what do you mean? the code used for each framework is shown here....
