| name | description |
|---|---|
| llm-provider | Implement, upgrade, or refactor JS/TS AI projects to use configurable LLM providers and models via composable `llm://` URI strings — unifying scattered ENV vars, hardcoded model slugs, DB options, credentials, inference options (temperature, maxTokens, etc.), and AI SDK calls into a single resolution chain that produces an AI SDK model instance from any source (ENV, database, URL slug, job config, org connection). Use when setting up LLM connections, migrating from env-only config, adding org-scoped encrypted credentials, or replacing multi-variable AI config with LLM strings. |
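To make the `llm://` shape concrete, here is a minimal sketch of parsing such a URI into provider, model, and inference options. The exact grammar (provider as host, model as path, options as query parameters) is an assumption for illustration, not this skill's specification:

```typescript
// Hypothetical llm:// URI grammar: llm://<provider>/<model>?<option>=<value>&...
// This sketch only parses the string; a real resolution chain would also
// merge ENV vars, DB options, and org credentials before building a model.
interface LlmConfig {
  provider: string;
  model: string;
  options: Record<string, string>;
}

function parseLlmUri(uri: string): LlmConfig {
  const url = new URL(uri); // WHATWG URL handles non-special schemes with "//"
  if (url.protocol !== "llm:") {
    throw new Error(`expected an llm:// URI, got: ${uri}`);
  }
  return {
    provider: url.hostname,                      // e.g. "openai"
    model: url.pathname.replace(/^\//, ""),      // e.g. "gpt-4o"
    options: Object.fromEntries(url.searchParams.entries()), // e.g. { temperature: "0.2" }
  };
}
```

A caller could then map `provider`/`model` onto an AI SDK factory and spread `options` into the inference call; how that mapping works is left to the skill's resolution chain.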