A guide on using Ollama as the OpenAI API provider for inline completions in iTerm2.
- API URL: `http://127.0.0.1:11434/v1/completions`
- Model: `mistral`
- Tokens: `4000`
- Use legacy completions API: `true`
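
Before pointing iTerm2 at it, you can exercise the endpoint directly to confirm Ollama is serving the OpenAI-compatible legacy completions API. This is a minimal sketch, assuming Ollama is running locally on the default port and the `mistral` model has already been pulled; the prompt string is just an illustrative example, and the response is read in the OpenAI legacy completions shape (`choices[0].text`):

```python
# Minimal sketch: verify the Ollama legacy completions endpoint responds.
# Assumes Ollama is running on 127.0.0.1:11434 and `mistral` is pulled.
import json
import urllib.request

payload = {
    "model": "mistral",
    "prompt": "List files in the current directory, including hidden ones.",
    "max_tokens": 4000,
}

req = urllib.request.Request(
    "http://127.0.0.1:11434/v1/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# Legacy completions return the generated text in choices[0].text.
print(body["choices"][0]["text"].strip())
```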
Set the following as the AI prompt (iTerm2 expands \(shell), \(uname), and \(ai.prompt) before sending the request):

Return commands suitable for copy/pasting into \(shell) on \(uname). The script should do this: \(ai.prompt)

IMPORTANT: Do NOT include any instructions, bullet points, commentary, or Markdown triple-backtick code blocks, as your whole response will be copied into my terminal automatically. ONLY return the command without enclosing it in a Markdown code block.
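
As a rough illustration of what the model actually receives, here is a sketch of the template after substitution. The placeholder values below (`zsh`, `Darwin`, and the sample request) are hypothetical examples; the real substitution happens inside iTerm2:

```python
# Sketch of the AI prompt after iTerm2-style placeholder substitution.
# The substituted values are hypothetical examples, not taken from iTerm2.
template = (
    "Return commands suitable for copy/pasting into {shell} on {uname}. "
    "The script should do this: {ai_prompt}\n"
    "IMPORTANT: Do NOT include any instructions, bullet points, commentary, "
    "or Markdown triple-backtick code blocks, as your whole response will be "
    "copied into my terminal automatically. ONLY return the command without "
    "enclosing it in a Markdown code block."
)

prompt = template.format(
    shell="zsh",                    # stands in for \(shell)
    uname="Darwin",                 # stands in for \(uname)
    ai_prompt="find large files",   # stands in for \(ai.prompt)
)

print(prompt)
```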
@bsahane tried your config but used `mistral:latest` for the model. Works like a charm!