@garrett
Created June 25, 2024 18:43
ollama with rocm (AMD GPU) and webui, in podman
# Ollama with ROCm (AMD GPU) support
podman run --pull newer --detach --replace \
  --security-opt label=type:container_runtime_t \
  --device /dev/kfd --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm

# Open WebUI, pointed at the Ollama API on the host network
podman run --pull newer --detach --replace \
  --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  --restart always \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
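Once both containers are up, a quick sanity check (assuming the default port mapping above) is to query the Ollama API, which lists locally pulled models; the Open WebUI frontend should then be reachable in a browser at http://localhost:8080, its default port under host networking:

# should return a JSON list of local models (empty until you pull one)
curl http://127.0.0.1:11434/api/tags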
@anna-hope

Thank you so much! This helped me overcome the weird "Reason: Memory in use." error I got when using the command from the official repo.

I translated your command for ollama into this compose file (it doesn't include the Open Web UI part; see the sketch after the file):

services:
  ollama:
    image: ollama/ollama:rocm
    devices:
      - "/dev/kfd:/dev/kfd"
      - "/dev/dri:/dev/dri"
    ports:
      - "11434:11434"
    security_opt:
      - label=type:container_runtime_t
    volumes:
      - ollama-data:/root/.ollama

volumes:
  ollama-data:
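For completeness, the Open Web UI half of the original podman command could be translated the same way. This is an untested sketch, assuming compose's network_mode: host behaves like podman's --network=host (the volume name open-webui mirrors the one in the original command; ports: is omitted because it is ignored under host networking):

services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    network_mode: host
    restart: always
    environment:
      # same endpoint the original command passes via -e
      - OLLAMA_BASE_URL=http://127.0.0.1:11434
    volumes:
      - open-webui:/app/backend/data

volumes:
  open-webui: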
