services:
  #
  # Ollama
  # The engine that manages your LLMs
  #
  ollama:
    image: ollama/ollama:latest
    ports:
      - 11434:11434
    volumes:
      - ollama:/root/.ollama
    container_name: ollama
    tty: true
    restart: unless-stopped
    networks:
      - tunnel
    # (Uncomment below for Nvidia GPU)
    # deploy:
    #   resources:
    #     reservations:
    #       devices:
    #         - driver: nvidia
    #           count: all
    #           capabilities: [gpu]
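
  #
  # Note: the GPU block above assumes the NVIDIA Container Toolkit is installed on the host.
  # Ollama also starts with no models downloaded; a minimal way to pull one once the
  # stack is up (the model name "llama3" below is just an example):
  #
  #   docker compose up -d
  #   docker exec -it ollama ollama pull llama3
  #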

  #
  # Open WebUI
  # This container serves a slick UI for you to manage your LLMs.
  # It does not do any ML work under the hood; it simply exposes the Ollama service via a browser.
  #
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
      - 'WEBUI_SECRET_KEY='
    networks:
      - tunnel
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
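
  #
  # With the port mapping above, the UI is reachable on the host at http://localhost:3000,
  # while OLLAMA_BASE_URL points it at the ollama container over the shared "tunnel" network.
  #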

  #
  # Cloudflared
  # This container will open a tunnel to Cloudflare to expose your site.
  # From the tunnel dashboard you will need to:
  #   - get a token (which you can set in the environment here)
  #   - configure the tunnel to read from http://open-webui:8080 (the other Docker container)
  #
  cloudflared:
    image: cloudflare/cloudflared:latest
    container_name: cloudflared
    command: tunnel --no-autoupdate run
    networks:
      - tunnel
    restart: unless-stopped
    environment:
      - TUNNEL_TOKEN=${TUNNEL_TOKEN} # Don't forget to set your tunnel token
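
  #
  # TUNNEL_TOKEN is read from the host environment or from a .env file next to this
  # compose file. A minimal .env sketch (placeholder value; copy the real token from
  # the Cloudflare tunnel dashboard):
  #
  #   TUNNEL_TOKEN=<your-tunnel-token>
  #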

volumes:
  ollama: {}
  open-webui: {}

networks:
  tunnel: