A command-line tool (Simon Willison's `llm`) for talking to an LLM from the terminal.
The default model is OpenAI's gpt-3.5-turbo.
llm 'basic query'
# ask about a file: pipe it in and set a system prompt with -s
cat mycode.py | llm -s "Explain this code"
# set non-default model
llm 'Ten names for cheesecakes' -m 4o
# disable streaming
llm 'Ten names for cheesecakes' --no-stream
# continue the previous conversation; -c resends prior context, so it can cost more
llm 'More names' -c
# or just start a chat:
llm chat -m chatgpt
llm chat -m 4o
# maybe with a template or system prompt; see templates below
llm chat -t my_template
# ask about a blog post; curl -s = silent
curl -s 'https://simonwillison.net/2023/May/15/per-interpreter-gils/' | \
llm -s 'Suggest topics for this post as a JSON array'
# prompt templates https://llm.datasette.io/en/stable/templates.html#prompt-templates
# create system prompts to reuse
llm -s 'write pytest tests for this code' --save pytest
# templates can also record which model to use:
llm -s 'write pytest tests for this code' -m 4o --save pytest4o
# re-use the template:
cat llm/utils.py | llm -t pytest
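Saved templates are YAML files on disk (`llm templates path` prints the directory). As a sketch, the `pytest4o` template saved above would look roughly like this:

```yaml
# ~/.../templates/pytest4o.yaml (path varies by platform)
system: write pytest tests for this code
model: gpt-4o
```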
# advanced: use llm-cmd to auto-suggest commands
# https://github.com/simonw/llm-cmd
llm cmd undo last git commit
# install a local model, e.g.
llm install llm-gpt4all
# list available models
llm models
# ask a one-shot question
llm -m mistral-7b-instruct-v0 'difference between a pelican and a walrus'
# start a chat
llm chat -m mistral-7b-instruct-v0
# initial setup
pip install llm
llm keys set openai  # paste your OpenAI API key when prompted
llm 'Ten names for cheesecakes'
# generate a key and make sure the account has a positive balance: https://platform.openai.com/settings/organization/billing/overview
- topic:: [[Topic-Tool]]
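As an alternative to `llm keys set`, the tool also falls back to the standard OpenAI environment variable (the key value below is a placeholder, not a real key):

```shell
# llm reads OPENAI_API_KEY from the environment if no stored key is found
export OPENAI_API_KEY="sk-placeholder"
```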