
@YuriyGuts
Last active January 11, 2026 13:34
Expose Ollama models to LM Studio by symlinking Ollama's model files. Just run `python3 link-ollama-models-to-lm-studio.py`. On Windows, run it as admin.
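The script itself isn't shown on this page, but the symlinking approach it describes can be sketched roughly as follows. This is a hypothetical minimal sketch, not the author's code: the function name, the `.gguf` renaming, and the default `~/.ollama/models` / `~/.lmstudio/models` locations are all assumptions.

```python
import os
from pathlib import Path

def link_ollama_blobs(ollama_models: Path, lmstudio_models: Path) -> list[Path]:
    """Symlink every Ollama blob file into an 'ollama' folder under the
    LM Studio models directory, so LM Studio can discover the weights
    without duplicating them on disk. Returns the created link paths.
    (Illustrative sketch only; paths and naming are assumptions.)"""
    dest = lmstudio_models / "ollama"
    dest.mkdir(parents=True, exist_ok=True)
    created = []
    blobs = ollama_models / "blobs"
    for blob in sorted(blobs.glob("sha256-*")):
        # Assumed: LM Studio looks for a .gguf extension when scanning models.
        link = dest / (blob.name + ".gguf")
        if not link.exists():
            os.symlink(blob, link)  # on Windows this requires admin rights
            created.append(link)
    return created
```

Creating symlinks on Windows requires administrator privileges, which is presumably why the description says to run the script as admin there.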
@dromanov

If you store LM Studio models in a different location (e.g., another hard drive with free space), you will have to move the ~/.lmstudio/models/ollama folder to the new location manually.

Anyway, kudos to the author - it works nicely and helps a lot, since gollama dropped support for this functionality.
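The manual move described above can be sketched roughly like this: relocate the folder, then leave a symlink at the old path so anything still referencing `~/.lmstudio/models/ollama` keeps working. A hypothetical sketch only; the helper name is invented and this is not verified against LM Studio's behavior.

```python
import shutil
from pathlib import Path

def relocate_models_folder(current: Path, new_location: Path) -> Path:
    """Move a models folder to a new location (e.g. another drive),
    then leave a symlink at the old path so existing references
    keep resolving. Illustrative sketch, not verified behavior."""
    new_location.parent.mkdir(parents=True, exist_ok=True)
    # shutil.move handles cross-filesystem moves by copying and deleting.
    moved = Path(shutil.move(str(current), str(new_location)))
    current.symlink_to(moved, target_is_directory=True)
    return moved
```

On Windows, creating the directory symlink at the old path would again require admin rights.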

@inorixu

inorixu commented Jan 11, 2026

Thanks! It works. The Qwen3-VL model doesn't work because LM Studio doesn't support it, but other models function normally.
