This benchmark compares the performance of various ASGI web frameworks when serving a simple "Hello, World!" response under high concurrency. The test setup includes running each framework with Uvicorn (4 workers) where applicable and using wrk to simulate concurrent requests.
Each framework was tested using the following command:

```
wrk -t4 -c1000 -d30s http://127.0.0.1:8000/
```

- `-t4`: 4 threads
- `-c1000`: 1000 concurrent connections
- `-d30s`: 30-second test duration
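For the Uvicorn-served frameworks, each app was run with 4 workers. A sketch of the launch command, assuming the app lives in a file named `app.py` exposing a variable `app` (the module path is an assumption; adjust per framework):

```shell
# Launch an ASGI app with 4 Uvicorn workers.
# "app:app" (module:variable) is an assumed name; change it to match each framework's file.
uvicorn app:app --workers 4 --host 127.0.0.1 --port 8000
```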
The frameworks tested:

- MicroPie
- Sanic
- Muffin
- FastAPI
- Starlette
- Litestar
- Quart
- BlackSheep
| Framework | Requests/sec | Avg Latency (ms) | Max Latency (ms) | Transfer/sec |
|---|---|---|---|---|
| MicroPie (Uvicorn) | 5973.14 | 148.88 | 1980 | 0.93MB |
| Sanic (Uvicorn) | 4212.44 | 223.00 | 2000 | 678.76KB |
| Sanic (Built-in Server) | 8121.22 | 62.43 | 1990 | 0.91MB |
| Muffin (Uvicorn) | 7252.85 | 125.17 | 1980 | 1.00MB |
| FastAPI (Uvicorn) | 2488.27 | 397.71 | 2000 | 345.05KB |
| Starlette (Uvicorn) | 8072.43 | 120.83 | 2000 | 1.13MB |
| Litestar (Uvicorn) | 4743.29 | 191.66 | 1970 | 680.92KB |
| Quart (Uvicorn) | 2721.33 | 332.41 | 1990 | 364.08KB |
| BlackSheep (Uvicorn) | 7271.32 | 130.23 | 1980 | 1.02MB |
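As a sanity check on the table, Transfer/sec divided by Requests/sec gives the average bytes sent per response. Using MicroPie's row (0.93 MB/s at 5973.14 req/s):

```python
# Approximate bytes on the wire per response, derived from the MicroPie row above.
requests_per_sec = 5973.14
transfer_per_sec = 0.93 * 1024 * 1024  # 0.93 MB/s expressed in bytes

bytes_per_response = transfer_per_sec / requests_per_sec
print(round(bytes_per_response))  # roughly 163 bytes: a short body plus HTTP headers
```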
- Sanic's Built-in Server Performs Best: Running Sanic with its own built-in server significantly outperformed using Uvicorn, showing that its native implementation is highly optimized.
- Starlette and Muffin Excel with Uvicorn: Both frameworks achieved high throughput with relatively low latency, making them solid choices for high-performance applications.
- FastAPI and Quart Lagged Behind: Due to additional processing overhead (e.g., data validation in FastAPI), they showed increased latency and lower requests per second.
- MicroPie Shows Strong Performance: MicroPie performed well, keeping up with frameworks like Muffin and Starlette, despite its lightweight design.
- BlackSheep is a High-Performer: It demonstrated great efficiency, showing a good balance between latency and throughput.
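The first finding can be quantified directly from the table: dividing Sanic's throughput on its built-in server by its throughput under Uvicorn gives the speedup.

```python
# Speedup of Sanic's built-in server over Sanic under Uvicorn (numbers from the table above).
sanic_builtin_rps = 8121.22
sanic_uvicorn_rps = 4212.44

speedup = sanic_builtin_rps / sanic_uvicorn_rps
print(f"{speedup:.2f}x")  # ~1.93x the throughput
```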
To replicate these benchmarks, use the following implementations for each framework:
**MicroPie**

```python
from MicroPie import App

class Root(App):
    async def index(self):
        return "Hello, world!"

app = Root()
```

**Sanic**

```python
from sanic import Sanic
from sanic.response import text

app = Sanic("MyHelloWorldApp")

@app.get("/")
async def hello_world(request):
    return text("Hello, world.")
```

**Muffin**

```python
import muffin

app = muffin.Application()

@app.route('/', '/hello/{name}')
async def hello(request):
    name = request.path_params.get('name', 'world')
    return f'Hello {name.title()}!'
```

**FastAPI**

```python
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"Hello": "World"}
```

**Starlette**

```python
from starlette.responses import PlainTextResponse

async def app(scope, receive, send):
    assert scope['type'] == 'http'
    response = PlainTextResponse('Hello, world!')
    await response(scope, receive, send)
```

**Litestar**

```python
from litestar import Litestar, get

@get("/")
async def hello_world() -> str:
    return "Hello, world!"

app = Litestar([hello_world])
```

**Quart**

```python
from quart import Quart

app = Quart(__name__)

@app.route('/')
async def hello():
    return 'hello'
```

**BlackSheep**

```python
from blacksheep import Application, get

app = Application()

@get("/")
async def home():
    return "Hello, World!"
```

These benchmarks provide insight into the raw performance of different ASGI frameworks under high concurrency. MicroPie stands out as a strong performer, keeping pace with, and in some cases surpassing, more established frameworks in both requests per second and latency. Its lightweight design keeps overhead minimal, making it an excellent choice for developers seeking a high-performance, minimal ASGI framework. Where heavier frameworks trade raw throughput for features such as request validation, MicroPie maintains a balance of simplicity, speed, and efficiency while keeping a minimalistic API.
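For context on what all of these frameworks build on, here is a dependency-free raw ASGI "Hello, World!" app. This was not part of the original benchmark; it is an illustrative baseline showing the bare callable protocol that Uvicorn speaks, driven in-process with stub callables:

```python
import asyncio

# A minimal raw ASGI app: no framework, just the ASGI callable protocol.
async def app(scope, receive, send):
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"Hello, world!"})

# Quick in-process check without a server: drive the app with stub receive/send.
async def main():
    sent = []

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        sent.append(message)

    await app({"type": "http", "method": "GET", "path": "/"}, receive, send)
    return sent

messages = asyncio.run(main())
print(messages[1]["body"])  # b'Hello, world!'
```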