# Hindsight + Osaurus local LLM setup
This note shows how to run Hindsight in Docker while using an Osaurus-hosted local OpenAI-compatible model for LLM calls.
- Osaurus is running on the host machine.
- Osaurus exposes an OpenAI-compatible API at http://127.0.0.1:1337.
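One wrinkle: inside a Docker container, 127.0.0.1 refers to the container itself, not the host, so the container cannot reach Osaurus at that address. The usual workaround is the `host.docker.internal` hostname. A minimal sketch of the wiring, assuming Hindsight reads an OpenAI-style base URL and API key from environment variables — the variable names (`OPENAI_BASE_URL`, `OPENAI_API_KEY`) and the image name (`hindsight`) are illustrative assumptions, not confirmed by this note; substitute whatever Hindsight actually expects:

```shell
# host.docker.internal resolves automatically on Docker Desktop (macOS/Windows);
# on Linux, --add-host maps it to the host's gateway address.
docker run \
  --add-host=host.docker.internal:host-gateway \
  -e OPENAI_BASE_URL=http://host.docker.internal:1337/v1 \
  -e OPENAI_API_KEY=local-unused \
  hindsight

# Sanity-check that the container can see Osaurus before debugging Hindsight itself
# (assumes the image has curl; /v1/models is a standard OpenAI-compatible endpoint):
docker run --rm --add-host=host.docker.internal:host-gateway hindsight \
  curl -s http://host.docker.internal:1337/v1/models
```

Local OpenAI-compatible servers typically ignore the API key, but many OpenAI client libraries refuse to start without one, so a placeholder value is usually still required.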