Most LLM traffic is handled by the hyperscalers (GCP, Azure, AWS), and we can only guess where they host their models.
Based on strategic investments and agreements, we assume that:
- Anthropic runs mainly on AWS: Anthropic has established AWS as its "primary cloud and training partner" following Amazon's $8 billion investment.
- OpenAI runs mainly on Azure: while Azure was historically "the exclusive cloud provider for all OpenAI workloads", Microsoft has not been OpenAI's exclusive cloud provider since January 2025, though the OpenAI API remains "exclusive to Azure".
Let's take a look at the hyperscalers and their respective offerings.