Prompt Template for creating a new OpenWebUI tool

Below is an example of a tool that can be used with OpenWebUI. Study how the code below works, then create a new, well-documented tool that implements the desired functionality described in the new user requirement. Key conventions:

- Functions in the Tools class are exposed to the LLM for tool selection. Each function in Tools is a separately callable tool, and there can be many of them.
- Functions in the Utils class are not exposed to the LLM; they are development helper functions.
- The EventEmitter class emits status events to the OpenWebUI frontend.
- The Valves class defines the user configuration parameters for the tool.
- The some_functionality_exposed_to_LLM function is an example of a function exposed to the LLM; some_helper_function_not_exposed_to_LLM is an example of one that is not.
- The main function shows how to run the tool locally; it is not exposed to the LLM.

The new feature request is to create a tool that does the following: {{CLIPBOARD}}

Boilerplate OpenWebUI LLM Tool Example:

"""
title: Boilerplate OpenWebUI LLM Tool
author: KK
funding_url: https://github.com/bearlike
version: 0.1.0
license: MIT
"""

from datetime import timedelta, datetime, timezone
from typing import Callable, Any, Optional, Tuple
import asyncio
import json
from pydantic import BaseModel, Field


class EventEmitter:
    # This class is used to emit events to the OpenWebUI. It is not required for the tool to work.
    # It is useful to provide feedback to the user about the intermediate progress of the tool.
    def __init__(self, event_emitter: Optional[Callable[[dict], Any]] = None):
        self.event_emitter = event_emitter

    async def emit(self, description="Unknown State", status="in_progress", done=False):
        if self.event_emitter:
            await self.event_emitter(
                {
                    "type": "status",
                    "data": {
                        "status": status,
                        "description": description,
                        "done": done,
                    },
                }
            )
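
# Hedged usage sketch (not part of the original boilerplate): OpenWebUI passes
# its emitter callback into each tool call as `__event_emitter__`, so a tool
# can surface intermediate progress roughly like this:
#
#   emitter = EventEmitter(__event_emitter__)
#   await emitter.emit(description="Fetching data...")
#   await emitter.emit(status="complete", description="Done", done=True)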


class Utils:
    # Functions in the Utils class are not exposed to the LLM via OpenWebUI.
    def __init__(self):
        pass

    @staticmethod
    def unix_to_human_ts(timestamp: int) -> str:
        # datetime.utcfromtimestamp() is deprecated since Python 3.12; build
        # an aware UTC datetime instead (the formatted output is unchanged).
        return datetime.fromtimestamp(timestamp, tz=timezone.utc).strftime(
            "%Y-%b-%d %H:%M:%S"
        )
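    # e.g. Utils.unix_to_human_ts(1726906985) -> "2024-Sep-21 08:23:05"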

    @staticmethod
    def unix_to_human_delta(timestamp) -> str:
        if isinstance(timestamp, str):
            # Example: 2024-09-21T06:11:05.256103146Z. strptime's %f accepts
            # at most 6 fractional digits, so trim the nanosecond tail, then
            # (assumed intent) convert the parsed time into seconds elapsed
            # since that moment.
            trimmed = timestamp.rstrip("Z")
            if "." in trimmed:
                head, frac = trimmed.split(".", 1)
                trimmed = f"{head}.{frac[:6]}"
            parsed = datetime.strptime(trimmed, "%Y-%m-%dT%H:%M:%S.%f")
            timestamp = int((datetime.now(timezone.utc).replace(tzinfo=None) - parsed).total_seconds())
        if isinstance(timestamp, int) and (timestamp <= 0 or timestamp >= 8640000):
            return "N/A"
        return str(timedelta(seconds=timestamp)).replace(", 0:00:00", "")
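    # e.g. Utils.unix_to_human_delta(3661) -> "1:01:01"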

    @staticmethod
    def bytes_to_human_size(size: int) -> str:
        suffixes = ["B", "KB", "MB", "GB", "TB"]
        index = 0
        while size >= 1024 and index < len(suffixes) - 1:
            size /= 1024
            index += 1
        return f"{size:.2f} {suffixes[index]}"

    @staticmethod
    def some_helper_function_not_exposed_to_LLM(argument) -> Tuple[dict, set]:
        # Do something here
        return (dict(), set())


class Tools:
    # Functions in the Tools class are exposed as Tools to the LLM via OpenWebUI.
    class Valves(BaseModel):
        # This class is used to define the user configuration parameters for the tool.
        DAEMON_URL: str = Field(
            default="unix://var/run/docker.sock",
            description="Docker Daemon URL.",
        )
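        # Note (assumption): OpenWebUI surfaces these Valves fields as
        # user-editable settings for the tool, so the defaults above can be
        # overridden without code changes.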

    def __init__(self):
        self.valves = self.Valves()

    async def some_functionality_exposed_to_LLM(
        self,
        argument_with_defaults: Optional[str] = None,
        __event_emitter__: Optional[Callable[[dict], Any]] = None,
    ) -> str:
        """
        All exposed functions must be documented in this way. This docstring will be part of the tool selection LLM prompt.
        :param argument_with_defaults: Optional Describe the argument and behavior.
        :return: Returns usually a string such as JSON dump.
        """
        # Functionality to be exposed to the LLM
        # This function needs to be modified to provide the desired functionality.
        # This function must always be safe.
        emitter = EventEmitter(__event_emitter__)

        try:
            await emitter.emit("Helper function is called")
            # Call the helper function
            results_json, _ = Utils.some_helper_function_not_exposed_to_LLM("argument")

            await emitter.emit(
                status="complete",
                description=f"Retrieved {len(results_json)} stuff",
                done=True,
            )

        except Exception as err:
            await emitter.emit(
                status="error",
                description=f"An error occurred while fetching containers: {str(err)}",
                done=True,
            )
            results_json = [{"error": str(err)}]

        return json.dumps(results_json, ensure_ascii=False)
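
    # A second exposed function, included as a hedged illustration that Tools
    # may expose many functions for LLM function calling. The name and
    # behavior here are hypothetical, not part of the original boilerplate.
    async def get_current_utc_time(
        self, __event_emitter__: Optional[Callable[[dict], Any]] = None
    ) -> str:
        """
        Get the current date and time in UTC.
        :return: JSON string containing the current UTC timestamp.
        """
        emitter = EventEmitter(__event_emitter__)
        await emitter.emit(
            status="complete", description="Fetched current UTC time", done=True
        )
        return json.dumps(
            {"utc_now": datetime.now(timezone.utc).isoformat()}, ensure_ascii=False
        )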


if __name__ == "__main__":
    from pprint import pprint

    async def main():
        tools = Tools()
        answers = await tools.some_functionality_exposed_to_LLM()

        answers_json = json.loads(answers)
        for answer in answers_json:
            pprint(answer)
            print()

    # Run the async main function
    asyncio.run(main())