MyLLM: Launches an LLM and chat interface on my private machine using docker-compose

My Large Language Model

This project provides a local LLM setup using Docker Compose with Ollama and Open WebUI.

Overview

The docker-compose.yaml file defines a complete LLM and chat interface stack with two main services:

Services

  1. Ollama - Local AI model server

    • Runs the ollama/ollama:latest image
    • Automatically downloads and serves the phi4-mini model (1.7GB)
    • Exposes port 11434 for API access
    • Stores models and data in persistent volumes
  2. Open WebUI - Web-based chat interface

    • Runs the ghcr.io/open-webui/open-webui:main image
    • Provides a user-friendly web interface for chatting with the AI
    • Connects to the Ollama service for AI model access
    • Exposes port 8080 for web access
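
Once both containers are up, you can confirm from the host that Ollama's API is reachable on the published port (a minimal check, assuming the default 11434 mapping in the compose file below):

curl http://localhost:11434/api/version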

Key Features

  • Lightweight: Uses the small phi4-mini model (only 1.7GB)
  • Persistent Storage: Data and models are stored in Docker volumes
  • Easy Access: Web interface available at http://localhost:8080
  • Customizable: Easy to change models or ports by modifying the configuration

Usage

  1. Start the services: docker-compose up -d
  2. Open your browser and go to http://localhost:8080
  3. Start chatting with your AI assistant!
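
The first start can take a while because Ollama pulls the phi4-mini model in the background. To follow the download and confirm the model is available (assuming the default container names and ports from the compose file):

docker-compose logs -f ollama
curl http://localhost:11434/api/tags

The /api/tags endpoint lists the models Ollama has downloaded; phi4-mini should appear once the pull finishes.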

Customization

  • To use a different model, change the model name in the ollama service's command (see the sketch below)
  • To change the web UI port, adjust the host side of the port mapping in the open-webui service
  • To prevent logouts after container updates, set the WEBUI_SECRET_KEY environment variable in the open-webui service
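
For example, the following fragment swaps the model, moves the web UI to another host port, and pins a session secret. Only the lines that change are shown; the model name llama3.2, host port 3000, and the secret value are illustrative:

  ollama:
    command: ["ollama serve & sleep 5 && ollama pull llama3.2 && wait"]

  open-webui:
    environment:
      - WEBUI_SECRET_KEY=change-me   # any stable random string
    ports:
      - "3000:8080"                  # web UI now at http://localhost:3000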

docker-compose.yaml

services:
  ollama:
    container_name: ollama
    image: ollama/ollama:latest
    environment:
      - LOG_LEVEL=debug
    volumes:
      - ollama:/root/.ollama
      - models:/models
    ports:
      - "11434:11434"
    networks:
      - ollama-net
    restart: unless-stopped
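    # Custom entrypoint/command: start the server in the background, pull phi4-mini once it is up, then wait on the server process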
    entrypoint: ["/bin/sh", "-c"]
    command: ["ollama serve & sleep 5 && ollama pull phi4-mini && wait"]

  open-webui:
    container_name: open-webui
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - MODEL_DOWNLOAD_DIR=/models
      - OLLAMA_API_BASE_URL=http://ollama:11434
      - OLLAMA_API_URL=http://ollama:11434
      - LOG_LEVEL=debug
      - TMPDIR=/tmp/open-webui-tmp
    volumes:
      - ./webui-tmp:/tmp/open-webui-tmp
      - data:/data
      - models:/models
      - open-webui:/app/backend/data
    ports:
      - "8080:8080"
    depends_on:
      - ollama
    extra_hosts:
      - "host.docker.internal:host-gateway"
    networks:
      - ollama-net
    restart: unless-stopped

volumes:
  models:
  ollama:
  data:
  open-webui:

networks:
  ollama-net:
    driver: bridge

Alternative installation

On a fresh Linux-based machine, the same stack can be set up without Docker by following the steps below:

# mkdir -p /app
# cd /app

Installing ollama (more info: https://www.server-world.info/en/note?os=Debian_12&p=ollama&f=1)

# wget https://ollama.ai/install.sh
# chmod +x install.sh
# ./install.sh
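
On most distributions the install script also registers Ollama as a systemd service. A quick sanity check that the binary and the API are working (assuming curl is installed):

# ollama --version
# curl http://localhost:11434/api/version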

Installing open-webui into a virtual environment at /app/env, which the systemd unit below expects (more info: https://pahautelman.github.io/pahautelman-blog/tutorials/build-your-local-ai/build-your-local-ai/)

# apt install python3.11 python3-pip python3.11-venv
# python3.11 -m venv /app/env
# /app/env/bin/pip install open-webui

Adding a model

# ollama run phi4-mini
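
ollama run pulls the model on first use and drops into an interactive chat. To confirm the model is installed and test it non-interactively, something like the following should work:

# ollama list
# ollama run phi4-mini "Say hello in one sentence."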

Starting open-webui as a service

# touch /etc/systemd/system/open-webui.service
# chmod 664 /etc/systemd/system/open-webui.service
# vim /etc/systemd/system/open-webui.service

open-webui.service content

[Unit]
Description=Open-WebUI
After=network.target

[Service]
Type=simple
User=root
WorkingDirectory=/app
ExecStart=/app/env/bin/open-webui serve
Restart=always

[Install]
WantedBy=multi-user.target
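
To get the same WEBUI_SECRET_KEY behaviour mentioned in the Customization section, an Environment= line can be added to the [Service] block (the value is a placeholder):

Environment=WEBUI_SECRET_KEY=change-me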

Enabling and starting the service

# systemctl daemon-reload
# systemctl enable open-webui.service 
# systemctl start open-webui.service 
# systemctl status open-webui.service 
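
If the service does not come up, the journal shows the open-webui output; the pip-installed server normally listens on port 8080, so a quick reachability check from the same machine:

# journalctl -u open-webui.service -f
# curl -I http://localhost:8080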