ollama with rocm (AMD GPU) and webui, in podman
podman run --pull newer --detach --security-opt label=type:container_runtime_t --replace --device /dev/kfd --device /dev/dri -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama:rocm

podman run --replace --pull newer -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
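Once both containers are up, the Ollama API answers on port 11434, and Open WebUI (which typically listens on port 8080 when run with --network=host) should be reachable in a browser. A quick sanity check might look like this (the model name is just an example):

# The Ollama server replies "Ollama is running" on its root endpoint
curl http://127.0.0.1:11434

# Pull and chat with a model inside the ollama container (llama3 is an example name)
podman exec -it ollama ollama run llama3

Open WebUI should then be available at http://localhost:8080 (the port is an assumption based on its default; adjust if you changed it).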
Thank you so much! This helped me overcome the weird "Reason: Memory in use." error I got when using the command from the official repo.
I translated your command for ollama into this compose file (it doesn't include the Open WebUI part):
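A sketch of what such a translation might look like, assuming podman-compose or docker compose conventions (the service name, volume declaration, and restart policy are illustrative choices, and --pull newer has no direct compose equivalent):

# Sketch of a compose translation of the ollama podman command above;
# names and restart policy are illustrative, not taken from the original.
services:
  ollama:
    image: ollama/ollama:rocm   # a fully qualified name (docker.io/...) may be needed depending on registry config
    container_name: ollama
    security_opt:
      - label=type:container_runtime_t
    devices:
      - /dev/kfd
      - /dev/dri
    volumes:
      - ollama:/root/.ollama
    ports:
      - "11434:11434"
    restart: unless-stopped

volumes:
  ollama: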