@ex3ndr
Created February 14, 2026 10:53
Improve user-facing messages in daycare
From a7bac5cbd7bbffe332fbe0033bcfa15c88d0ae5f Mon Sep 17 00:00:00 2001
From: Steve Korshakov <[email protected]>
Date: Sat, 14 Feb 2026 02:52:53 -0800
Subject: [PATCH] Improve user-facing messages to be more human-friendly
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

- Replace cold system messages with warmer, conversational tone
- Add token count to context overflow message (~Xk tokens)
- Messages updated:
* Context limit reached -> Our conversation got too long (~Xk tokens)
* Session reset. -> Got it! Starting fresh.
* Compacting session context -> One sec — tidying up our conversation history...
* Unexpected error -> Oops! Something went wrong. Please try again.
* Inference failed. -> Something went wrong with the AI. Trying again might help!
* No inference provider available. -> Hmm, no AI provider is available right now.
---
.../daycare/sources/engine/agents/agent.ts | 18 +++++++++++++-----
.../sources/engine/agents/ops/agentLoopRun.ts | 6 +++---
2 files changed, 16 insertions(+), 8 deletions(-)
diff --git a/packages/daycare/sources/engine/agents/agent.ts b/packages/daycare/sources/engine/agents/agent.ts
index fd5e637..0004e5f 100644
--- a/packages/daycare/sources/engine/agents/agent.ts
+++ b/packages/daycare/sources/engine/agents/agent.ts
@@ -280,7 +280,7 @@ export class Agent {
}
try {
await connector.sendMessage(this.descriptor.channelId, {
- text: "Unexpected error",
+ text: "Oops! Something went wrong. Please try again.",
replyToMessageId: item.context.messageId
});
} catch (error) {
@@ -451,7 +451,7 @@ export class Agent {
const targetId = target?.targetId ?? null;
if (agentKind === "foreground" && connector?.capabilities.sendText && targetId) {
await connector.sendMessage(targetId, {
- text: "Compacting session context. I'll continue shortly.",
+ text: "One sec — tidying up our conversation history...",
replyToMessageId: entry.context.messageId
});
}
@@ -706,7 +706,7 @@ export class Agent {
try {
await connector.sendMessage(this.descriptor.channelId, {
- text: "Session reset.",
+ text: "Got it! Starting fresh.",
replyToMessageId: item.context?.messageId
});
} catch (error) {
@@ -723,9 +723,13 @@ export class Agent {
entry: AgentMessage,
source: string
): Promise<void> {
+ // Capture token info before reset clears it
+ const tokenInfo = this.state.tokens;
+ const tokenCount = tokenInfo?.size?.total ?? null;
+
const reset: AgentInboxReset = {
type: "reset",
- message: "Emergency reset: context overflow detected. Previous session context was cleared."
+ message: "Context overflow: session was automatically reset."
};
await this.handleReset(reset);
@@ -744,9 +748,13 @@ export class Agent {
return;
}
+ // Build a friendly message with token count if available
+ const tokenPart = tokenCount ? ` (~${Math.round(tokenCount / 1000)}k tokens)` : "";
+ const message = `Our conversation got too long${tokenPart}. I've reset to start fresh — what were you asking?`;
+
try {
await connector.sendMessage(targetId, {
- text: "Context limit reached. Session reset. Please resend your last request.",
+ text: message,
replyToMessageId: entry.context.messageId
});
} catch (error) {
diff --git a/packages/daycare/sources/engine/agents/ops/agentLoopRun.ts b/packages/daycare/sources/engine/agents/ops/agentLoopRun.ts
index 478cb76..4b6922c 100644
--- a/packages/daycare/sources/engine/agents/ops/agentLoopRun.ts
+++ b/packages/daycare/sources/engine/agents/ops/agentLoopRun.ts
@@ -306,8 +306,8 @@ export async function agentLoopRun(options: AgentLoopRunOptions): Promise<AgentL
logger.warn({ connector: source, error }, "error: Inference failed");
const message =
error instanceof Error && error.message === "No inference provider available"
- ? "No inference provider available."
- : "Inference failed.";
+ ? "Hmm, no AI provider is available right now. Please check your settings."
+ : "Something went wrong with the AI. Trying again might help!";
logger.debug(`error: Sending error message to user message=${message}`);
await notifySubagentFailure("Inference failed", error);
if (connector && targetId) {
@@ -341,7 +341,7 @@ export async function agentLoopRun(options: AgentLoopRunOptions): Promise<AgentL
);
return { responseText: finalResponseText, historyRecords, contextOverflow: true, tokenStatsUpdates };
}
- const message = "Inference failed.";
+ const message = "Something went wrong with the AI. Trying again might help!";
const errorDetail =
response.message.errorMessage && response.message.errorMessage.length > 0
? response.message.errorMessage
--
2.43.0
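The heart of the overflow change in `agent.ts` is a small formatting rule: round the saved token total to the nearest thousand, and drop the parenthetical entirely when no count was captured before the reset. A standalone sketch of that logic, extracted from the patch for illustration (the function name `formatOverflowMessage` is hypothetical and not part of the codebase):

```typescript
// Hypothetical helper mirroring the patch's message-building logic.
// A falsy tokenCount (null or 0) omits the "(~Xk tokens)" parenthetical,
// matching the truthiness check on tokenCount in the diff.
function formatOverflowMessage(tokenCount: number | null): string {
    const tokenPart = tokenCount
        ? ` (~${Math.round(tokenCount / 1000)}k tokens)`
        : "";
    return `Our conversation got too long${tokenPart}. I've reset to start fresh — what were you asking?`;
}

// e.g. a 128,500-token session yields "... too long (~129k tokens). ..."
```

Note that `Math.round` rounds halves up, so 128,500 tokens reads as "~129k"; if truncation were preferred, `Math.floor` would be the swap. The patch also deliberately captures `this.state.tokens` *before* calling `handleReset`, since the reset clears that state.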