
@ben-doyle
Last active January 30, 2025 05:55
Running an LLM (AI) locally, and how to use it

Getting the models running on your machine using Ollama

Ollama is a lightweight framework for running large language models (LLMs) locally on your machine. It simplifies the process of downloading, running, and interacting with AI models without requiring extensive setup.

What we are using

How to install it

brew install ollama
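Once installed, start the server with `ollama serve` and pull a model (for example, `ollama pull deepseek-r1:32b`). Everything below talks to Ollama's local HTTP API, which by default listens on port 11434. As a minimal sketch using only the Python standard library (the model name is just an example; use whichever model you pulled, and confirm the endpoint against your installed Ollama version):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API port


def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming POST request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def generate(model: str, prompt: str) -> str:
    """Send the request and return the model's full response text.

    Requires `ollama serve` to be running and the model already pulled.
    """
    req = build_generate_request(model, prompt)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example usage (needs a running Ollama server):
# print(generate("deepseek-r1:32b", "Why is the sky blue?"))
```

Setting `"stream": False` asks for one complete JSON object instead of a stream of partial chunks, which keeps the sketch simple.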

Use-case 1: A chat interface for your LLM (AI)

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners like Ollama and OpenAI-compatible APIs, with built-in inference engine for RAG, making it a powerful AI deployment solution.

Similar to

What we are using

How to install it

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

This spins up an instance of Open WebUI in a local Docker container, available at port 3000. If your local development workflow already relies on port 3000, change the host port (the first number in -p 3000:8080).

Open: http://localhost:3000

  • Change a few settings, like the theme, to your taste.
  • Use ollama.com to find models, and add them using their name.
  • Chat to your local LLM.

(Screenshots omitted.)

Use-case 2: An IDE code co-pilot

Similar to

What we are using

  • Install the Continue extension and open its settings.
  • Add your model/s to the config.json:

    {
      "models": [
        {
          "title": "deepseek-r1:32b",
          "provider": "ollama",
          "model": "deepseek-r1:32b"
        }
      ]
    }

  • Enjoy your new pair-programming buddy.
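If you prefer to script the config change, here is a small sketch that appends an Ollama-backed model entry to Continue's config. The config path is an assumption (Continue typically stores it at ~/.continue/config.json; verify for your setup):

```python
import json
from pathlib import Path

# Assumed default location of Continue's config; check your installation.
CONFIG_PATH = Path.home() / ".continue" / "config.json"


def add_ollama_model(config: dict, name: str) -> dict:
    """Add an Ollama model entry to a Continue config dict, skipping duplicates."""
    models = config.setdefault("models", [])
    if not any(m.get("model") == name for m in models):
        models.append({"title": name, "provider": "ollama", "model": name})
    return config


# Example usage (uncomment to modify your real config):
# config = json.loads(CONFIG_PATH.read_text()) if CONFIG_PATH.exists() else {}
# CONFIG_PATH.write_text(json.dumps(add_ollama_model(config, "deepseek-r1:32b"), indent=2))
```

The duplicate check means the script is safe to re-run after pulling new models.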

Use-case 3: Chatting with your notes in Obsidian

  • Install Obsidian - sharpen your thinking and let it become your favourite note-taking app.
  • Spend an unreasonable amount of time adding extensions and organising your notes into a second brain.
  • Install the https://github.com/logancyang/obsidian-copilot extension.
  • Add your local Ollama model in the Obsidian Copilot extension settings.
  • Chat to your notes! Reference them using the Obsidian markdown link format.