
@Kambaa
Last active February 1, 2025 15:34
Web UI configuration for Ollama on Windows

Ollama Web UI (running in Docker on WSL) configuration with Ollama running on Windows:

I have installed Docker on my WSL Linux machine, and I have already been using Ollama on Windows. With this setup in mind, I wrote this gist for future reference.

When you run ollama-webui in Docker on WSL while Ollama is already running on the Windows machine, the web UI cannot connect to Ollama. To make it work, you need to take these steps:

Make Ollama on Windows accessible on network:

By default, Ollama is accessible only via localhost (127.0.0.1). To make it reachable from any device on the network, add an environment variable named OLLAMA_HOST with the value 0.0.0.0 on the Windows machine. Check out the example below:

(screenshot: adding the OLLAMA_HOST environment variable on Windows)
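If you prefer the command line over the environment-variables dialog, the same variable can be set from a PowerShell or cmd prompt (a sketch; setx persists the variable for newly started processes only, so the GUI route above works just as well):

```shell
# Persist OLLAMA_HOST for the current user. Only newly started
# processes see it, so restart Ollama afterwards.
setx OLLAMA_HOST "0.0.0.0"
```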

After that, quit Ollama by right-clicking its tray icon, then re-run it from the Start menu:

(screenshot: quitting Ollama from the system tray)

You should now be able to access Ollama via your LAN IP:

(screenshot: Ollama responding at the LAN IP on port 11434)
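Before wiring up the web UI, you can also verify reachability from inside WSL (assumes curl is installed; replace YOURLANIP with your Windows LAN IP):

```shell
# The root endpoint of a reachable Ollama server replies "Ollama is running".
curl http://YOURLANIP:11434
```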

Configure and run Ollama-WebUI via Docker on WSL:

Create a folder for persisting settings, and mount it into the Docker container by running the commands below on your WSL Linux machine with Docker already installed (change the YOURLANIP value accordingly):

mkdir $HOME/open-webui
sudo docker run -d -p 3000:8080 \
  -v $HOME/open-webui:/app/backend/data \
  -e OLLAMA_API_BASE_URL=http://YOURLANIP:11434/api \
  -e OLLAMA_BASE_URL=http://YOURLANIP:11434 \
  --restart always \
  --name ollama-webui \
  ghcr.io/ollama-webui/ollama-webui:main
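To confirm the container actually came up, the standard Docker CLI commands below help (nothing here is specific to this setup):

```shell
# Show the container's status and the 3000->8080 port mapping
sudo docker ps --filter name=ollama-webui
# Tail the logs if the UI at port 3000 does not respond
sudo docker logs --tail 20 ollama-webui
```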

And lastly, go to http://127.0.0.1:3000 and check that everything works, especially this Ollama API connection setting:

(screenshot: the Ollama API URL setting in the web UI)

Aliases for WSL:

After a restart, Docker does not start the containers automatically on Windows (in my experience), so I wrote these aliases for quickly starting the ollama-webui container. One creates the container, one starts it, and one stops it. Just put them in your $HOME/.bashrc file (again, change YOURLANIP accordingly):

# open your bashrc file: 
$ nano $HOME/.bashrc


# add these aliases
alias aicontainer='mkdir -p $HOME/open-webui && sudo docker run -d -p 3000:8080 -v $HOME/open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://YOURLANIP:11434/api -e OLLAMA_BASE_URL=http://YOURLANIP:11434  --restart always --name ollama-webui ghcr.io/ollama-webui/ollama-webui:main'
alias airun='docker container start ollama-webui'
alias aistop='docker container stop ollama-webui'


# and lastly, reload the file into the current terminal session:
$ source $HOME/.bashrc
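Instead of hard-coding YOURLANIP, on WSL2 the Windows host is reachable at the default-route gateway of the virtual network; a sketch of deriving it automatically (assumes the iproute2 `ip` tool, which ships with common WSL distros):

```shell
# On WSL2, the default gateway is the Windows host's address on the
# virtual switch; extract the third field of the default-route line.
WINHOST=$(ip route show default | awk '{print $3; exit}')
echo "Ollama should be reachable at http://${WINHOST}:11434"
```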

After a restart, connect to your WSL Linux machine and run:

airun
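Note that on WSL distros without systemd, the Docker daemon itself may also need a manual start before the alias works (newer WSL builds with systemd enabled start it automatically):

```shell
# Start the Docker daemon first if it is not running, then the container
sudo service docker start
airun
```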