John Wiegley (jwiegley)

diff --git i/lisp/eshell/esh-cmd.el w/lisp/eshell/esh-cmd.el
index f0c3a6a3e14..2801eacb231 100644
--- i/lisp/eshell/esh-cmd.el
+++ w/lisp/eshell/esh-cmd.el
@@ -1485,7 +1485,13 @@ eshell-exec-lisp
                (if form-p
                    (eval func-or-form)
                  (apply func-or-form args)))))
-        (and result (funcall printer result))
+        ;; As a special case, a Lisp function or form may evaluate to an
*Prompt*: This is great. Can you show me any pull requests I have open that are still waiting to be merged?
*Response*:
{
  "tool_calls": [],
  "content": ""
}
To check your open pull requests, I need to know which specific repository you're interested in. Could you please specify the repository name (e.g., "haskell-project" or "emacs-config") you'd like me to check for open PRs? This is required because the GitHub API needs the exact repository to query.
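For context, a hypothetical sketch of how such a repository-scoped lookup could be exposed to the model as a gptel tool. It assumes gptel's =gptel-make-tool= interface and the =ghub= client with a GitHub token already configured; the tool name, the helper =my/list-open-prs=, and the argument wording are invented for illustration, and the exact =:args= format varies between gptel versions.

#+begin_src emacs-lisp
;; Hypothetical sketch: expose a "list open PRs" helper as a gptel tool.
(require 'ghub)

(defun my/list-open-prs (repo)
  "Summarize open pull requests for REPO, given as \"owner/name\"."
  (mapconcat (lambda (pr)
               (format "#%s %s"
                       (alist-get 'number pr)
                       (alist-get 'title pr)))
             (ghub-get (format "/repos/%s/pulls" repo) '((state . "open")))
             "\n"))

(gptel-make-tool
 :name "list_open_prs"
 :function #'my/list-open-prs
 :description "List the open pull requests in a GitHub repository."
 :args '((:name "repo"
          :type string
          :description "Repository in owner/name form."))
 :category "github")

;; The tool still has to be enabled for a given chat, e.g. from gptel's
;; transient menu or by adding it to `gptel-tools'.
#+end_src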
May 21 22:33:39 vulcan systemd[1]: Started LLM Gateway to provide model access, fallbacks and spend tracking across 100+ LLMs..
May 21 22:33:42 vulcan litellm[163219]: Traceback (most recent call last):
May 21 22:33:42 vulcan litellm[163219]: File "/nix/store/kjvgj2n3yn70hmjifg6y0bk9m4rf7jba-python3-3.12.10/lib/python3.12/pathlib.py", line 1311, in mkdir
May 21 22:33:42 vulcan litellm[163219]: os.mkdir(self, mode)
May 21 22:33:42 vulcan litellm[163219]: FileNotFoundError: [Errno 2] No such file or directory: '/.cache/prisma-python/binaries/5.17.0/393aa359c9ad4a4bb28630fb5613f9c281cde053'
May 21 22:33:42 vulcan litellm[163219]: During handling of the above exception, another exception occurred:
May 21 22:33:42 vulcan litellm[163219]: Traceback (most recent call last):
May 21 22:33:42 vulcan litellm[163219]: File "/nix/store/kjvgj2n3yn70hmjifg6y0bk9m4rf7jba-python3-3.12.10/lib/python3.12/pathlib.py", line 1311, in mkdir
May 21 22:33:42 vulcan litellm[163219]: os.mkdir(self, mode)
May 21 22:33:42 vul
{ config, lib, pkgs, ... }:
let
  portal = pkgs.stdenv.mkDerivation {
    name = "nginx-portal";
    src = ./nginx-portal;
    installPhase = ''
      mkdir -p $out
      cp -r $src/* $out/
    '';
  };

NOTE Pool thoughts

John: In the pool now, no Element

Will I be able to put an @ cookie at the beginning of a system prompt in gptel-directives, and use that to set the model for when that directive is used? Like if I need to use a specific local LLM for tool-related prompts…
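As far as I can tell, =gptel-directives= only maps a symbol to a system prompt, so an =@= cookie in the prompt text would not by itself switch models. A hedged sketch of one workaround, pairing a directive with an explicit buffer-local backend and model; the backend name, model symbol, and function names below are placeholders, not an existing gptel feature.

#+begin_src emacs-lisp
;; Placeholder backend reserved for tool-related prompts (names invented).
(defvar my/gptel-tool-backend
  (gptel-make-ollama "Ollama-tools"
    :host "localhost:11434"
    :stream t
    :models '(qwen2.5-coder:32b))
  "Local backend used for tool-capable chats (hypothetical).")

(defun my/gptel-use-tool-model ()
  "Point the current buffer at the local tool-capable model."
  (interactive)
  ;; The directive itself is still picked from gptel's menu as usual;
  ;; this only pins which backend and model go with it in this buffer.
  (setq-local gptel-backend my/gptel-tool-backend
              gptel-model 'qwen2.5-coder:32b))
#+end_src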

* New chat
*Prompt*: Hello
*Response*:
#+begin_reasoning
<think>
Okay, the user started a new chat and just said "Hello". I should respond in a friendly and helpful manner. Let me keep it concise as instructed. Maybe greet them back and offer assistance. Let's see, "Hello! How can I assist you today?" That sounds good. Short and to the point.
</think>
#+end_reasoning
* New chat
*Prompt*: Write a function in Emacs Lisp that computes the difference in time
between my current location and Zurich, Switzerland.
*Response*:
Okay, the user has asked me to write an Emacs Lisp function that computes the time difference between their current location and Zurich, Switzerland.
First, I need to figure out how to get the current time in the user's time zone. Emacs Lisp has the =current-time= function which returns the current time as a time structure, but I think that uses the system's local time. Alternatively, there's =gmt-time= which gives the time in GMT.
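The transcript breaks off here. For reference, a minimal sketch of where that reasoning could land; note that Emacs has no =gmt-time= function, and =current-time-zone= with a ZONE argument (which accepts IANA names such as "Europe/Zurich" on most systems) is enough on its own.

#+begin_src emacs-lisp
(defun my/zurich-time-difference ()
  "Return the offset of Europe/Zurich from local time, in hours."
  (let* ((now (current-time))
         ;; `current-time-zone' returns (OFFSET NAME), with OFFSET in
         ;; seconds east of UTC, optionally for an explicit time zone.
         (local-offset  (car (current-time-zone now)))
         (zurich-offset (car (current-time-zone now "Europe/Zurich"))))
    (/ (- zurich-offset local-offset) 3600.0)))

;; (my/zurich-time-difference) => 9.0 when called from US/Pacific in summer.
#+end_src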
+ set -eo pipefail
+ pihole=/nix/store/7rhbnw44y2bfr4imgx2692qq9b1wam9y-pihole-6.0.6/bin/pihole
+ jq=/nix/store/2yi7k5ay6xj73rbfqf170gms922rjm2d-jq-1.7.1-bin/bin/jq
+ '[' '!' -f /var/lib/pihole/gravity.db ']'
+ source /nix/store/7rhbnw44y2bfr4imgx2692qq9b1wam9y-pihole-6.0.6/usr/share/pihole/advanced/Scripts/api.sh
+ source /nix/store/7rhbnw44y2bfr4imgx2692qq9b1wam9y-pihole-6.0.6/usr/share/pihole/advanced/Scripts/utils.sh
+ any_failed=0
+ for i in 1 2 3
+ TestAPIAvailability
+ local chaos_api_list authResponse authStatus authData
Debugger entered--Lisp error: (error "Selecting deleted buffer")
apply(debug (error (error "Selecting deleted buffer")))
transient--exit-and-debug(error (error "Selecting deleted buffer"))
completing-read-default("Model: " (("Ollama:HammerAI/midnight-m..." #s(gptel-ollama :name "Ollama" :host "localhost:11434" :header nil :protocol "http" :stream t :endpoint "/api/chat" :key nil :models ... :url "http://localhost:11434/api..." :request-params ... :curl-args nil) HammerAI/midnight-miqu-70b-v1.5:latest) ("Ollama:codellama:70b" #s(gptel-ollama :name "Ollama" :host "localhost:11434" :header nil :protocol "http" :stream t :endpoint "/api/chat" :key nil :models ... :url "http://localhost:11434/api..." :request-params ... :curl-args nil) codellama:70b) ("Ollama:deepseek-coder:33b" #s(gptel-ollama :name "Ollama" :host "localhost:11434" :header nil :protocol "http" :stream t :endpoint "/api/chat" :key nil :models ... :url "http://localhost:11434/api..." :request-params ... :curl-args nil) deepseek-coder:33b) ("O