Created July 28, 2025 17:42
Using Opencode with LiteLLM and LM Studio. Two files are shown: the LiteLLM proxy configuration (YAML) and the Opencode configuration at `~/.config/opencode/opencode.json`.
LiteLLM proxy config (YAML):

```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/devstral-small-2507-mlx@8bit
      api_base: http://localhost:1234/v1
      api_key: dummy-key
      timeout: 600
      stream: true

# General settings
general_settings:
  default_model: gpt-3.5-turbo
  completion_timeout: 600 # 10 minutes timeout
server_settings:
  streaming_supported: true
  proxy_server_timeout: 600
  uvicorn_params:
    timeout_keep_alive: 600
```
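With LM Studio serving on port 1234, the proxy is typically started with `litellm --config config.yaml --port 8000` (the filename here is an assumption; LiteLLM reads whatever path you pass to `--config`). Any OpenAI-compatible client can then POST to the proxy. A minimal sketch of the request payload shape, using only the standard library:

```python
import json

# Request body for POST http://localhost:8000/v1/chat/completions.
# "gpt-3.5-turbo" is the alias defined in model_list above; LiteLLM
# routes it to the devstral model that LM Studio is serving.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,  # matches stream: true in litellm_params
}

# Serialize exactly as a client library would before sending.
body = json.dumps(payload).encode("utf-8")
print(body.decode("utf-8"))
```

The model name the client sends is the alias, not the underlying LM Studio model identifier; LiteLLM performs the mapping.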
`~/.config/opencode/opencode.json`:

```json
{
  "provider": {
    "litellm": {
      "name": "LiteLLM (Local)",
      "npm": "@ai-sdk/openai-compatible",
      "models": {
        "gpt-3.5-turbo": {}
      },
      "options": {
        "baseURL": "http://localhost:8000/v1"
      }
    }
  },
  "$schema": "https://opencode.ai/config.json"
}
```
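A quick sanity check on the wiring: Opencode's `baseURL` should point at the LiteLLM proxy (port 8000), not at LM Studio directly (port 1234). The sketch below parses a copy of the config inline (purely illustrative; in practice you would read the file at the path named above):

```python
import json

# Inline copy of the Opencode config shown above, for illustration.
config_text = """
{
  "provider": {
    "litellm": {
      "name": "LiteLLM (Local)",
      "npm": "@ai-sdk/openai-compatible",
      "models": {"gpt-3.5-turbo": {}},
      "options": {"baseURL": "http://localhost:8000/v1"}
    }
  },
  "$schema": "https://opencode.ai/config.json"
}
"""

config = json.loads(config_text)
provider = config["provider"]["litellm"]

# Opencode talks to the LiteLLM proxy, which forwards to LM Studio.
assert provider["options"]["baseURL"] == "http://localhost:8000/v1"
print("models:", list(provider["models"]))
```

If requests time out, note that the 600-second timeouts must be consistent across all three layers: LM Studio, the LiteLLM proxy, and its uvicorn keep-alive setting.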