@nyimbi
Created January 28, 2026 00:05
Using kimi-cli with Ollama
# Kimi CLI Configuration - Using Ollama with kimi-k2.5
default_model = "kimi-k2.5-ollama"
default_thinking = false

[providers.ollama]
type = "openai_legacy"
base_url = "http://localhost:11434/v1"
api_key = "ollama"

[models."kimi-k2.5-ollama"]
provider = "ollama"
model = "kimi-k2.5:cloud"
max_context_size = 262144
capabilities = ["thinking", "image_in"]

[loop_control]
max_steps_per_turn = 100
max_retries_per_step = 3
max_ralph_iterations = 0
reserved_context_size = 50000

[services]

[mcp.client]
tool_call_timeout_ms = 60000
nyimbi commented Jan 28, 2026

This normally lives in ~/.kimi/config.toml
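To put the config in place, a minimal shell sketch (the path matches the one stated above; the heredoc here writes only a stub line, not the full config):

```shell
# Create the kimi-cli config directory and a stub config file.
mkdir -p "$HOME/.kimi"

# Write the first setting; replace this heredoc with the full TOML above.
cat > "$HOME/.kimi/config.toml" <<'EOF'
default_model = "kimi-k2.5-ollama"
EOF
```

kimi-cli should then pick the file up on its next start, assuming Ollama is already serving on localhost:11434.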
