Provider | Model | Number of parameters |
---|---|---|
Meta with Microsoft | Llama 2 | 7B, 13B, 70B
Meta | LLaMA | 7B, 13B, 32.5B, 65.2B
Technology Innovation Institute of UAE | Falcon LLM | 7B, 40B
Stanford’s CRFM | Alpaca | 7B |
Google | Flan-T5 | 80M, 250M, 780M, 3B, 11B
MosaicML | MPT | 7B, 30B
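
The parameter counts above translate roughly into memory requirements for the model weights. As a minimal back-of-the-envelope sketch (plain Python, no libraries assumed; the bytes-per-parameter value assumes fp16/bf16 storage at 2 bytes, with fp32 at 4 and int8 at 1):

```python
def estimated_weights_gib(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate size in GiB of the model weights alone.
    Activations, KV cache, and optimizer state are extra."""
    return num_params * bytes_per_param / 1024**3


# Example sizes taken from the table above, assuming bf16 (2 bytes/param).
for name, params in [("Llama 2 7B", 7e9), ("Falcon 40B", 40e9), ("Llama 2 70B", 70e9)]:
    print(f"{name}: ~{estimated_weights_gib(params):.1f} GiB in bf16")
```

By this estimate a 7B model needs roughly 13 GiB for weights in bf16, while a 70B model needs around 130 GiB, which is why the smaller checkpoints are the ones that fit on a single consumer GPU.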