
Loom Design Document: OpenAI API Modification for Local LLMs

1. Overview

This document outlines the changes needed to extend Loom's OpenAI API implementation to support local large language models (LLMs) that expose an OpenAI-compatible API. With this modification, users can interact with local LLMs through the same interface as the official OpenAI API.

2. Design Goals

  • Allow users to specify a custom base URL for OpenAI API calls
  • Maintain compatibility with the existing OpenAI implementation
  • Provide a seamless way to switch between official OpenAI models and local LLMs
  • Minimize changes to the existing codebase

3. Implementation Steps

3.1. Update the ModelPreset Interface

File: common.ts

Add a new baseUrl field to the ModelPreset interface:

export interface ModelPreset<P extends Provider> {
  name: string;
  provider: P;
  model: string;
  contextLength: number;
  apiKey: string;
  baseUrl?: string; // New field for custom base URL
  // ... other existing fields
}

3.2. Modify the completeOpenAI Method

File: main.ts

Update the completeOpenAI method to accept and use a custom base URL:

import { OpenAIApi, Configuration } from 'openai';

// ... existing imports and code

export default class LoomPlugin extends Plugin {
  // ... existing code

  async completeOpenAI(prompt: string): Promise<CompletionResult> {
    const preset = getPreset(this.settings) as ModelPreset<"openai">;
    const openai = new OpenAIApi(new Configuration({
      apiKey: preset.apiKey,
      basePath: preset.baseUrl || undefined, // Use custom base URL if provided
    }));

    try {
      const response = await openai.createCompletion({
        model: preset.model,
        prompt: prompt,
        max_tokens: this.settings.maxTokens,
        temperature: this.settings.temperature,
        top_p: this.settings.topP,
        frequency_penalty: this.settings.frequencyPenalty,
        presence_penalty: this.settings.presencePenalty,
      });

      return {
        ok: true,
        completions: response.data.choices.map(choice => choice.text || ""),
      };
    } catch (error: any) { // typed as any so we can read the Axios error shape used by the openai v3 client
      console.error("OpenAI API error:", error);
      return {
        ok: false,
        status: error.response?.status || 500,
        message: error.message,
      };
    }
  }

  // ... rest of the class
}
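
Note that some OpenAI-compatible local servers only implement the chat completions endpoint rather than the legacy text completions one. If your server falls in that category, a chat-based variant is straightforward. The sketch below is an assumption-laden illustration, not part of the original design: it uses the same openai v3 client as above, and the method name completeOpenAIChat is an illustrative choice. It would live alongside completeOpenAI in the LoomPlugin class:

async completeOpenAIChat(prompt: string): Promise<CompletionResult> {
  const preset = getPreset(this.settings) as ModelPreset<"openai">;
  const openai = new OpenAIApi(new Configuration({
    apiKey: preset.apiKey,
    basePath: preset.baseUrl || undefined,
  }));

  try {
    // Wrap the raw prompt in a single user message for the chat endpoint.
    const response = await openai.createChatCompletion({
      model: preset.model,
      messages: [{ role: "user", content: prompt }],
      max_tokens: this.settings.maxTokens,
      temperature: this.settings.temperature,
      top_p: this.settings.topP,
    });

    return {
      ok: true,
      completions: response.data.choices.map(choice => choice.message?.content || ""),
    };
  } catch (error: any) {
    return {
      ok: false,
      status: error.response?.status || 500,
      message: error.message,
    };
  }
}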

3.3. Update the Settings Tab

File: settings.ts

Add a new setting for the custom base URL in the OpenAI preset section:

import { App, PluginSettingTab, Setting } from 'obsidian';
import LoomPlugin from './main';

export class LoomSettingTab extends PluginSettingTab {
  plugin: LoomPlugin;

  constructor(app: App, plugin: LoomPlugin) {
    super(app, plugin);
    this.plugin = plugin;
  }

  display(): void {
    const { containerEl } = this;
    containerEl.empty();

    // ... existing settings code

    const currentPreset = this.plugin.settings.modelPresets[this.plugin.settings.modelPreset];
    if (currentPreset.provider === "openai") {
      new Setting(containerEl)
        .setName("OpenAI Base URL")
        .setDesc("Enter a custom base URL for local LLMs (optional)")
        .addText(text => text
          .setPlaceholder("https://api.openai.com/v1")
          .setValue(currentPreset.baseUrl || "")
          .onChange(async (value) => {
            currentPreset.baseUrl = value || undefined;
            await this.plugin.saveSettings();
          }));
    }

    // ... rest of the settings
  }
}
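
Optionally, you can warn the user when the entered URL is malformed before saving it. A minimal sketch, assuming the standard URL constructor available in Obsidian's Electron environment; the helper name isValidBaseUrl is hypothetical:

// Returns true when the string parses as an http(s) URL.
function isValidBaseUrl(value: string): boolean {
  try {
    const url = new URL(value);
    return url.protocol === "http:" || url.protocol === "https:";
  } catch {
    return false;
  }
}

Inside the onChange handler above, you could then skip saving (or show a notice) when the value is non-empty but isValidBaseUrl(value) returns false.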

3.4. Add a New Preset for Local LLMs

File: main.ts

In the onload method of the LoomPlugin class, register a preset for local LLMs, guarding against adding a duplicate each time the plugin loads:

async onload() {
  // ... existing onload code

  // Only add the preset once; onload runs every time the plugin loads,
  // so an unguarded push would duplicate the preset on each launch.
  const localPresetName = "Local LLM (OpenAI-compatible)";
  if (!this.settings.modelPresets.some((p) => p.name === localPresetName)) {
    const localLLMPreset: ModelPreset<"openai"> = {
      name: localPresetName,
      provider: "openai",
      model: "local-model", // should match the model name your local server expects
      contextLength: 4096, // adjust to the model's actual context window
      apiKey: "dummy-key", // many local servers accept any non-empty key
      baseUrl: "http://localhost:8000/v1", // default local URL; editable in settings
    };
    this.settings.modelPresets.push(localLLMPreset);
    await this.saveSettings();
  }

  // ... rest of onload
}
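
Because a stopped local server is the most common failure mode, it can be worth checking connectivity up front. Here is a sketch using Obsidian's built-in requestUrl helper against the /models endpoint defined by the OpenAI spec; the function name checkLocalServer is illustrative, not part of the original design:

import { requestUrl } from 'obsidian';

// Pings the OpenAI-compatible /models endpoint so a friendly error can be
// shown before a completion is attempted against an unreachable server.
async function checkLocalServer(baseUrl: string, apiKey: string): Promise<boolean> {
  try {
    const response = await requestUrl({
      url: `${baseUrl}/models`,
      method: "GET",
      headers: { Authorization: `Bearer ${apiKey}` },
    });
    return response.status === 200;
  } catch {
    // requestUrl throws on network failures and error statuses.
    return false;
  }
}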

3.5. Verify the generate Method

File: main.ts

The generate method needs no modification; its dispatch logic is shown here for reference:

async generate(file: TFile, rootNode: string | null) {
  // ... existing preparation logic

  const completionMethods: Record<Provider, (prompt: string) => Promise<CompletionResult>> = {
    openai: (prompt) => this.completeOpenAI(prompt),
    // ... other providers
  };

  const preset = getPreset(this.settings);
  const result = await completionMethods[preset.provider](prompt);

  // ... handle result and create nodes
}

Neither the generate signature nor the completionMethods object needs to change, because the baseUrl travels with the preset and is consumed inside completeOpenAI.

4. Testing

To ensure the modification works correctly, perform the following tests:

  1. Unit Tests:

    • Create unit tests for the completeOpenAI method, mocking the OpenAI API responses (see the sketch after this list).
    • Test with both default and custom base URLs.
  2. Integration Tests:

    • Set up a local LLM server that implements the OpenAI API spec (e.g., using LocalAI).
    • Create an integration test that uses the local LLM preset and verifies the correct base URL is used.
  3. Manual Testing:

    • Install the modified plugin in Obsidian.
    • Configure a preset with a custom base URL pointing to a local LLM.
    • Perform completions and verify that requests are sent to the correct URL.
    • Switch between official OpenAI and local LLM presets to ensure smooth transitions.
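
A minimal unit-test sketch for the first item, assuming Jest (the plugin does not prescribe a test runner) and mocking the openai v3 client. makeTestPlugin is a hypothetical factory that returns a LoomPlugin with the given preset active:

import { Configuration, OpenAIApi } from 'openai';

// Mock the openai module so no network calls are made; createCompletion
// resolves with a canned choice.
jest.mock('openai', () => {
  const createCompletion = jest.fn().mockResolvedValue({
    data: { choices: [{ text: "hello" }] },
  });
  return {
    Configuration: jest.fn(),
    OpenAIApi: jest.fn(() => ({ createCompletion })),
  };
});

test("completeOpenAI passes the custom base URL to the client", async () => {
  // Hypothetical test factory; wire it to your settings fixture.
  const plugin = makeTestPlugin({
    provider: "openai",
    model: "local-model",
    apiKey: "dummy-key",
    baseUrl: "http://localhost:8000/v1",
  });

  const result = await plugin.completeOpenAI("Once upon a time");

  // The custom base URL should be forwarded as the client's basePath.
  expect(Configuration).toHaveBeenCalledWith(
    expect.objectContaining({ basePath: "http://localhost:8000/v1" })
  );
  expect(result).toEqual({ ok: true, completions: ["hello"] });
});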

5. Deployment

  1. Update the plugin version in manifest.json.
  2. Create a new release on the GitHub repository.
  3. Update the plugin in the Obsidian Community Plugins directory.

6. Documentation

Update the plugin's documentation to include:

  • Instructions on how to set up and use local LLMs with the plugin.
  • Explanation of the new "OpenAI Base URL" setting.
  • Any known limitations or considerations when using local LLMs.

7. Future Considerations

  • Implement a way to easily share and import custom LLM configurations
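
For example, since presets are plain objects they could round-trip through JSON. A sketch of what that might look like (not part of the current design; exportPreset and importPreset are hypothetical names):

// Serializes a preset for sharing, dropping the apiKey field so secrets
// are never shared.
function exportPreset(preset: ModelPreset<"openai">): string {
  const { apiKey, ...shareable } = preset;
  return JSON.stringify(shareable, null, 2);
}

// Rebuilds a preset from shared JSON; the user supplies their own key.
function importPreset(json: string): ModelPreset<"openai"> {
  const parsed = JSON.parse(json) as Omit<ModelPreset<"openai">, "apiKey">;
  return { ...parsed, apiKey: "" };
}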