Running Ollama on Docker Compose

A docker-compose setup for running Ollama together with Open WebUI.

Assuming you are starting from a vanilla Ubuntu install, you first need to set up the NVIDIA container tooling.
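
The container toolkit only wires containers up to a driver that is already installed on the host, so check for the NVIDIA driver first. The lines below are a minimal sketch of one way to do that on Ubuntu; adapt them to however you normally install drivers:

# If this prints your GPU, the driver is already installed
nvidia-smi

# One way to install the recommended driver (needs the ubuntu-drivers-common package)
sudo ubuntu-drivers autoinstall
sudo reboot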

Add the NVIDIA apt repository for the container toolkit:

curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg \
  && curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | \
    sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
    sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list
    

Install the toolkit and configure Docker to use the NVIDIA runtime:

sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
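
To confirm Docker can actually see the GPU before moving on, run a quick sanity check (the ubuntu image here is just an example; the toolkit injects nvidia-smi into the container):

sudo docker run --rm --runtime=nvidia --gpus all ubuntu nvidia-smi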

Create directories for Ollama's model data and Open WebUI's data. Make sure you have at least a few hundred GB free on a fast drive, since models add up quickly:

mkdir -p ollama/ollama-data ollama/open-webui-data
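
To double-check free space on the drive backing those directories, something like:

df -h ollama/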

Copy the docker-compose.yml shown below into the ollama/ directory you just created, then bring the stack up:

cd ollama
docker compose up -d

docker-compose.yml:

services:
  ollama:
    # GPU support: remove this deploy block if the host has no NVIDIA GPU
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
    volumes:
      - ${OLLAMA_DATA_DIR-./ollama-data}:/root/.ollama
    # The ports mapping below exposes the Ollama API outside the container stack; remove it to keep the API internal
    ports:
      - 11434:11434
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama:latest

  open-webui:
    image: ghcr.io/open-webui/open-webui:${WEBUI_DOCKER_TAG-main}
    container_name: open-webui
    volumes:
      - ./open-webui-data:/app/backend/data
    depends_on:
      - ollama
    ports:
      - ${OPEN_WEBUI_PORT-8080}:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
      - 'WEBUI_SECRET_KEY='
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
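
Once the containers are up, Open WebUI should be reachable at http://localhost:8080 (or whatever OPEN_WEBUI_PORT you set), and it talks to Ollama over the stack's internal network. A rough smoke test from the host, using llama3 purely as an example model tag:

# Pull a model inside the running ollama container
docker exec -it ollama ollama pull llama3

# List the models the Ollama API knows about (works only while the 11434 ports mapping is kept)
curl http://localhost:11434/api/tags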