
@ahyatt
Created September 15, 2024 15:54
Fill-in-the-middle with LLM function calling
(require 'llm)
(require 'llm-prompt)

(defvar llm-provider nil
  "The LLM provider to use for misc functionality")

(llm-defprompt llm-context
  "The user is currently in Emacs. The following is part of the buffer they are looking at, which is a {{mode}} buffer. Their cursor is marked with '{{cursor}}':
{{context}}")

(defun llm-get-context (cursor)
  "Get the current context for a LLM prompt.
If CURSOR is non-nil, the value will be inserted at the current
position."
  (let ((mode major-mode)
        (text (buffer-string))
        (pos (point)))
    (with-temp-buffer
      (insert text)
      (funcall mode)
      (goto-char pos)
      (when cursor
        (insert cursor))
      (let ((visible (buffer-substring-no-properties (window-start)
                                                     (window-end)))
            (alt (if (eq major-mode 'org-mode)
                     (let ((context (org-list-context)))
                       (buffer-substring-no-properties
                        (car context) (cadr context)))
                   "")))
        (if (> (length visible)
               (length alt))
            visible alt)))))

(defun llm-fill-in ()
  "Insert text based on the current context.
Text is inserted at the current cursor location."
  (interactive)
  (llm-chat llm-provider
            (llm-make-chat-prompt
             "Please fill in the missing text at the cursor. Ensure that your addition is complete and coherent, seamlessly integrating with the existing content. Avoid introducing new ideas or making assumptions that the original author may not have intended. Your goal is to maintain the original author’s style and intent without making unintended decisions."
             :context (llm-prompt-fill 'llm-context
                                       llm-provider
                                       :mode (format "%s" major-mode)
                                       :cursor "CURSOR"
                                       :context (llm-get-context "CURSOR"))
             :functions (list
                         (make-llm-function-call
                          :function (lambda (text) (insert text))
                          :name "insert_text"
                          :description "Insert the specified text into the current document."
                          :args (list (make-llm-function-arg
                                       :name "text"
                                       :description "The text to insert."
                                       :type 'string
                                       :required t)))))))
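
To try this out, set llm-provider to a provider whose model supports tool calling (see the discussion below). Here is a minimal setup sketch, assuming the llm package's Ollama backend (llm-ollama) and a locally running Ollama server; the model name is only an example taken from the comments:

  (require 'llm-ollama)
  ;; Any Ollama model that supports tool calling should work;
  ;; qwen2.5-coder:32b is the one reported working in the comments below.
  (setq llm-provider (make-llm-ollama :chat-model "qwen2.5-coder:32b"))

Then place point where the missing text should go and run M-x llm-fill-in; the model is expected to respond by calling the insert_text tool, whose handler inserts the generated text at point.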
@eval-exec

Hello, what llm-provider are you using that lets llm-fill-in work?
I tried:

  (setq llm-provider
        (make-llm-ollama
         :chat-model "codestral"))

Then I ran M-x llm-fill-in, but it does not work and shows the message below:

llm-request-plz-sync-raw-output: LLM request failed with code 400: Bad Request (additional information: ((error . codestral does not support tools)))

@ahyatt (Author) commented Dec 20, 2024

Right, for tool use (what we call function calling here), you need an Ollama model that supports tool calling. Ollama has a way for you to see which ones will work, so try one of those.

@eval-exec

Thank you, I tried:

  (setq llm-provider
        (make-llm-ollama
         :chat-model "qwen2.5-coder:32b"))

It works.
