
@bukowa
Last active October 8, 2025 17:35
llm inject prompt
llmx() {
    # If no query was provided, show usage and exit.
    if [ -z "$*" ]; then
        echo "Usage: llmx <your query for a command>"
        return 1
    fi

    # --- HISTORY MANAGEMENT ---
    # 1. Add the command you just typed to the shell history.
    #    $FUNCNAME is used instead of $0, which names the shell, not this function.
    history -s "$FUNCNAME $*"

    # --- AI CALL ---
    echo "Asking AI for command: '$*'..." >&2
    local cmd
    cmd="$(llm -s "you are a bash expert. return ONLY the command. no explanations, no formatting, just the raw command text." "$*")"

    # --- OUTPUT AND FINAL HISTORY ---
    if [ -n "$cmd" ]; then
        echo
        echo "Suggested command:"
        printf "\033[0;32m%s\033[0m\n" "$cmd"
        echo
        # 2. Add the AI-generated command to the history.
        history -s "$cmd"
        echo "--> Press [UP ARROW] for the AI command." >&2
        echo "--> Press [UP ARROW] twice for your original query." >&2
    else
        echo "Error: AI did not return a command." >&2
    fi
}
pip install llm
llm install llm-gemini
llm models | grep gemini
llm keys set gemini
llm models default gemini/gemini-flash-latest
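With a model configured, `llmx find big files` will print the suggestion in green and stage it in history. To see the flow end to end without an API key, the sketch below shadows `llm` with a stub whose canned reply is invented, and uses a trimmed copy of the function:

```shell
# Hypothetical stub standing in for the real `llm` CLI (no network call).
llm() {
    # Ignores the -s system prompt and the query; returns a canned command.
    echo "df -h"
}

# Trimmed copy of llmx, enough to show the flow.
llmx() {
    [ -z "$*" ] && { echo "Usage: llmx <your query for a command>"; return 1; }
    local cmd
    cmd="$(llm -s "you are a bash expert. return ONLY the command." "$*")"
    if [ -n "$cmd" ]; then
        echo "Suggested command:"
        printf "%s\n" "$cmd"
        history -s "$cmd" 2>/dev/null   # silently a no-op outside interactive shells
    else
        echo "Error: AI did not return a command." >&2
    fi
}

llmx show free disk space
```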