@grahama1970
Created August 1, 2025 13:44
perplexity estimate

Here is a clear, concise estimate, with the math, of how many full-time developers a 16-GPU Nvidia H200 cluster can support running Kimi-K2, based purely on token throughput:

Given:

  • Average tokens per developer per day: ~5,800,000 tokens
  • Seconds per day: 86,400 seconds
  • Per-GPU token throughput (prefill + decode combined) from Kimi-K2 benchmarks: ~4,000 tokens/sec
  • Number of GPUs in cluster: 16

Step 1: Calculate tokens per second per developer

$$ \frac{5,800,000 \text{ tokens}}{86,400 \text{ sec}} \approx 67 \text{ tokens/sec per developer} $$

Step 2: Calculate total cluster token throughput

$$ 16 \text{ GPUs} \times 4,000 \frac{\text{tokens}}{\text{sec} \cdot \text{GPU}} = 64,000 \text{ tokens/sec total cluster throughput} $$

Step 3: Estimate max developers supported by token throughput

$$ \frac{64,000 \text{ tokens/sec total}}{67 \text{ tokens/sec per developer}} \approx 955 \text{ developers} $$
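
A minimal Python sketch of steps 1-3, using the assumed inputs listed above (the per-developer token volume and the ~4,000 tokens/sec per-GPU figure are estimates, not measured values):

```python
# Assumed inputs, taken from the figures above (estimates, not measurements)
TOKENS_PER_DEV_PER_DAY = 5_800_000  # average tokens one developer consumes per day
SECONDS_PER_DAY = 86_400
TOKENS_PER_SEC_PER_GPU = 4_000      # combined prefill + decode throughput per H200
NUM_GPUS = 16

# Step 1: sustained per-developer token rate (~67 tokens/sec)
per_dev_rate = TOKENS_PER_DEV_PER_DAY / SECONDS_PER_DAY

# Step 2: total cluster throughput (64,000 tokens/sec)
cluster_rate = NUM_GPUS * TOKENS_PER_SEC_PER_GPU

# Step 3: raw, unadjusted number of developers the cluster can serve
# (~953, or ~955 if the per-developer rate is rounded to 67 tokens/sec)
raw_developers = cluster_rate / per_dev_rate

print(f"{per_dev_rate:.1f} tok/s per dev, {cluster_rate:,} tok/s cluster, ~{raw_developers:.0f} developers")
```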

Step 4: Practical adjustment (accounting for overhead, latency, concurrency variability)

To be conservative, assume only 60-90% of raw capacity is usable, to account for system overhead, latency, and peak usage patterns:

$$ 955 \times 0.6 \approx 570 \quad \text{to} \quad 955 \times 0.9 \approx 860 \text{ developers} $$
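
Continuing the sketch, step 4 is just a utilization factor applied to the raw count; the 0.6-0.9 band is an assumption about overhead and peak concurrency, not a benchmark result:

```python
# Raw capacity from steps 1-3 (~955 developers when the per-dev rate is rounded to 67 tok/s)
raw_developers = (16 * 4_000) / (5_800_000 / 86_400)

# Step 4: assumed usable fraction of raw capacity (system overhead, bursty usage)
LOW_UTILIZATION, HIGH_UTILIZATION = 0.6, 0.9

conservative = raw_developers * LOW_UTILIZATION   # ~570 developers
optimistic = raw_developers * HIGH_UTILIZATION    # ~860 developers

print(f"Practical range: ~{conservative:.0f} to ~{optimistic:.0f} developers")
```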

Final estimate:

A 16-GPU Nvidia H200 cluster running Kimi-K2 can support roughly 570 to 860 full-time developers' inference workloads concurrently, judged by token-throughput capacity alone.

If your use case involves heavier GPU work beyond inference (such as training or fine-tuning), the number will be considerably lower; for pure inference sized by token throughput, this is a solid estimate.
