@barretts · Created July 28, 2025 17:42
Using Opencode with LiteLLM and LM Studio `~/.config/opencode/opencode.json`
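The LiteLLM proxy config below (the gist does not name the file; LiteLLM is typically launched with `litellm --config <file>`) exposes the LM Studio model served on port 1234 under the alias `gpt-3.5-turbo`: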
```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/devstral-small-2507-mlx@8bit
      api_base: http://localhost:1234/v1
      api_key: dummy-key
      timeout: 600
      stream: true

# General settings
general_settings:
  default_model: gpt-3.5-turbo
  completion_timeout: 600  # 10 minutes timeout

server_settings:
  streaming_supported: true
  proxy_server_timeout: 600

uvicorn_params:
  timeout_keep_alive: 600
```
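A minimal sketch for checking that the proxy is routing correctly, assuming it is started on port 8000 to match the opencode `baseURL` below (for example `litellm --config config.yaml --port 8000`) and that the `openai` Python package is installed:

```python
# Quick check that the LiteLLM proxy routes the "gpt-3.5-turbo" alias to LM Studio.
# Assumes the proxy is running on http://localhost:8000 (see opencode config below)
# and `pip install openai` has been done.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # the LiteLLM proxy, not LM Studio directly
    api_key="dummy-key",                  # placeholder; fine unless the proxy has auth configured
)

# Stream a short completion through the proxy to the local model.
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",  # alias defined in model_list above
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```

The opencode provider config (`~/.config/opencode/opencode.json`) then points at the proxy: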
```json
{
  "provider": {
    "litellm": {
      "name": "LiteLLM (Local)",
      "npm": "@ai-sdk/openai-compatible",
      "models": {
        "gpt-3.5-turbo": {}
      },
      "options": {
        "baseURL": "http://localhost:8000/v1"
      }
    }
  },
  "$schema": "https://opencode.ai/config.json"
}
```
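With both files in place, opencode should list `gpt-3.5-turbo` under the `LiteLLM (Local)` provider, and requests flow from opencode to the LiteLLM proxy (port 8000) and on to LM Studio (port 1234).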