@msterhuj
Created April 9, 2025 08:01
Ollama with NVIDIA GPU and Open WebUI with SSL
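The compose file below assumes the host already has the NVIDIA Container Toolkit installed and registered with Docker, so containers can reserve the GPU. A rough setup sketch for Ubuntu/Debian (assumes NVIDIA's apt repository is already configured; see NVIDIA's install guide for other distros):

```shell
# Install the toolkit and register the NVIDIA runtime with Docker
# (assumes NVIDIA's apt repository has already been added).
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Sanity check: the GPU should be visible from inside a container
# (the toolkit injects nvidia-smi into the container at run time).
docker run --rm --gpus all ubuntu nvidia-smi
```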
services:
  caddy:
    image: lucaslorentz/caddy-docker-proxy:ci-alpine
    ports:
      - 80:80
      - 443:443
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./data/caddy:/data

  openwebui:
    image: ghcr.io/open-webui/open-webui:main
    volumes:
      - ./data/webui:/app/backend/data
    environment:
      # Point Open WebUI at the ollama-gpu service over the compose network;
      # without this it looks for Ollama on localhost inside its own container.
      - OLLAMA_BASE_URL=http://ollama-gpu:11434
    labels:
      caddy: chat.example.com
      caddy.reverse_proxy: "{{upstreams 8080}}"

  ollama-gpu:
    image: docker.io/ollama/ollama:latest
    container_name: ollama-gpu
    tty: true
    volumes:
      - ./data/ollama_gpu:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            # Requires the NVIDIA Container Toolkit on the host.
            - driver: nvidia
              count: 1
              capabilities: [gpu]
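With `chat.example.com` pointed at the host, caddy-docker-proxy obtains a certificate automatically and proxies HTTPS to Open WebUI. A quick smoke test of the stack (the model name is just an example; any model from the Ollama library works):

```shell
# Start the stack in the background.
docker compose up -d

# Confirm Ollama's container can see the GPU.
docker exec ollama-gpu nvidia-smi

# Pull a model and run a test prompt (llama3.2 is an example choice).
docker exec ollama-gpu ollama pull llama3.2
docker exec ollama-gpu ollama run llama3.2 "Say hello"
```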