# First, start the model in a terminal: ollama run llama3.2
# Uses the "base" Python interpreter
import ollama
response = ollama.chat(model='llama3.2', messages=[
    {
        'role': 'user',