Links from LLM presentation for Cartier (week 1).
Summaries and Overviews:
- History of AI: https://sitn.hms.harvard.edu/flash/2017/history-artificial-intelligence/
- GPT and GPT-2: https://cameronrwolfe.substack.com/p/language-models-gpt-and-gpt-2
- Modern LLMs: https://cameronrwolfe.substack.com/p/modern-llms-mt-nlg-chinchilla-gopher
- Scaling Laws and GPT-3: https://cameronrwolfe.substack.com/p/language-model-scaling-laws-and-gpt
- The Illustrated Transformer: http://jalammar.github.io/illustrated-transformer/
- Language Model Mechanics: https://cameronrwolfe.substack.com/i/135273362/the-mechanics-of-a-language-model
- BERT: https://cameronrwolfe.substack.com/p/language-understanding-with-bert
- Transformer Architecture (T5): https://cameronrwolfe.substack.com/p/t5-text-to-text-transformers-part
- Foundation Models: https://crfm.stanford.edu

Papers:
- AlexNet: https://proceedings.neurips.cc/paper_files/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf
- The Transformer: https://arxiv.org/abs/1706.03762
- ULMFiT: https://arxiv.org/abs/1801.06146
- GPT: https://www.cs.ubc.ca/~amuham01/LING530/papers/radford2018improving.pdf
- GPT-2: https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf
- Scaling Laws for Neural Language Models: https://arxiv.org/abs/2001.08361
- GPT-3: https://arxiv.org/abs/2005.14165
- Chinchilla: https://arxiv.org/abs/2203.15556
- MT-NLG: https://arxiv.org/abs/2201.11990
- LaMDA: https://arxiv.org/abs/2201.08239
- Gopher: https://arxiv.org/abs/2112.11446