This is a quick copy-paste-observe guide for setting up Open-WebUI so you can run and access a local Large Language Model (LLM) for personal use.
- Install Ollama using its official install script:

  ```sh
  curl -fsSL https://ollama.com/install.sh | sh
  ```
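One way to sanity-check the install (assuming a typical Linux setup where the script places the binary on your PATH) is:

```shell
# Verify the install: check that the ollama binary is on PATH.
if command -v ollama >/dev/null 2>&1; then
    ollama --version    # prints the installed version
else
    echo "ollama not found on PATH; re-run the install script"
fi
```

If the version prints, the Ollama daemon and CLI are ready, and you can move on to configuring Open-WebUI.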