@johannesprinz
Last active July 31, 2025 08:42
opencode.json connecting to Azure AI Foundry models

Adding Azure AI Foundry Models to opencode

It took me a while to figure out how to pass "api-version": "2025-01-01-preview", and then I kept hitting MAX TOKEN LENGTH issues when using a smaller model.

  1. Configure the provider in your opencode.json file:

    I did this in my WSL instance:

    mkdir -p ~/.config/opencode/
    touch ~/.config/opencode/opencode.json
    code ~/.config/opencode/opencode.json
    {
        "$schema": "https://opencode.ai/config.json",
        "provider": {
            "azure-foundry": {
                "npm": "@ai-sdk/openai-compatible",
                "name": "Azure Foundry",
                "options": {
                    "baseURL": "https://<MY_AI_FOUNDRY_INSTANCE>.cognitiveservices.azure.com/openai/deployments/<MY_DEPLOYMENT_NAME>/",
                    "queryParams": {
                        "api-version": "2025-01-01-preview"
                    }
                },
                "models": {
                    "MY_MODEL": {
                        "name": "<CUSTOM_MODEL_DISPLAY_NAME_IN_OPENCODE>"
                    }
                }
            }
        }
    }
  2. To use this configuration in my devcontainer.json, I add a mount that maps this file:
    {
       ....
       ,
       "mounts": [
          "source=${localEnv:HOME}/.config/opencode,target=/home/node/.config/opencode,type=bind",
          ......
       ],
       ......
    }
  3. Assuming you have opencode installed (npm install -g opencode-ai) in your devcontainer, run opencode auth login:
    • Arrow up to Other
    • Enter azure-foundry
    • Enter YOUR_SECURE_API_TOKEN
    • Run /models in opencode to select your Azure AI Foundry model
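Before logging in through opencode, it can help to confirm the deployment actually answers at that URL with the same api-version. A minimal sketch, assuming hypothetical resource and deployment names (my-foundry-instance, gpt-4o-mini) and that your key is exported as AZURE_API_KEY:

```shell
# Hypothetical values — substitute your own Foundry resource and deployment.
RESOURCE="my-foundry-instance"
DEPLOYMENT="gpt-4o-mini"
BASE_URL="https://${RESOURCE}.cognitiveservices.azure.com/openai/deployments/${DEPLOYMENT}"

# Same endpoint and api-version query parameter opencode will use.
curl -s "${BASE_URL}/chat/completions?api-version=2025-01-01-preview" \
  -H "Content-Type: application/json" \
  -H "api-key: ${AZURE_API_KEY}" \
  -d '{"messages": [{"role": "user", "content": "ping"}], "max_tokens": 16}'
```

If this returns a chat completion rather than a 404 or auth error, the baseURL and queryParams in opencode.json should work as-is.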

🎉 Happy coding!

@johannesprinz (Author)
If anyone can figure out how to limit the max tokens sent to a model in opencode please let me know.
I want to play with some of the smaller models locally.

@kartik-ramachandran

Possibly: after you download Ollama on your machine, get an MCP client (Claude seems to be a good one to start with, and you don't have to buy a subscription), add the following, and then you have something to try your models on locally:

"ollama": {
    "command": "uv",
    "args": [
        "--directory",
        "C:/Source/MCP/PythonServer/weather",
        "run",
        "ollama.py"
    ]
},

