@amjadbouhouch
Forked from JBGruber/docker-compose.yml
Created December 15, 2024 10:24
My compose file to run Ollama and Open WebUI
services:
  # ollama and API
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    pull_policy: missing
    tty: true
    restart: unless-stopped
    # Expose the Ollama API outside the container stack (but only on the same computer;
    # remove 127.0.0.1: to make Ollama available on your network)
    ports:
      - 127.0.0.1:11434:11434
    volumes:
      - ollama:/root/.ollama
    # GPU support (turn off by commenting out with # if you don't have an NVIDIA GPU)
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities:
                - gpu

  # webui, navigate to http://localhost:3000/ to use
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    pull_policy: missing
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      # note: recent Open WebUI releases read OLLAMA_BASE_URL=http://ollama:11434 instead
      - "OLLAMA_API_BASE_URL=http://ollama:11434/api"
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
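
Once the file is saved as docker-compose.yml, the stack can be brought up and a first model pulled roughly like this (a sketch; `llama3.2` is just an example model name, and the `curl` check assumes the default 127.0.0.1 port mapping above):

```shell
# start both containers in the background
docker compose up -d

# pull a model inside the ollama container (pick any model from the Ollama library)
docker exec -it ollama ollama pull llama3.2

# quick sanity check that the API answers on the host
curl http://127.0.0.1:11434/api/tags

# then open http://localhost:3000/ in a browser for the web UI
```

Models land in the named `ollama` volume, so they survive container restarts and `docker compose down` (without `-v`).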