Using LM Studio with localpilot

Notes from trying out LM Studio as a local backend for localpilot:

  • The default Server Port is 1234. If yours differs, change 'domain': 'http://localhost:1234' in config.py.
  • Run python3 app.py --setup; with the patch below, LM Studio becomes the default model.
  • /v1/chat/completions cannot be used; requests go to /v1/completions instead (see the sketch after this list).
  • In LM Studio's Local Inference Server, watching the CPU usage and the Server Log confirms that requests are actually being handled.
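
Before wiring it into localpilot, the endpoint can be checked directly. A minimal sketch using Python's requests library, assuming a model is already loaded in LM Studio's Local Inference Server; the prompt and sampling parameters below are placeholders:

import requests

# Direct request to LM Studio's OpenAI-compatible completions endpoint.
# Note that /v1/completions is used here, not /v1/chat/completions.
resp = requests.post(
    "http://localhost:1234/v1/completions",
    json={
        "prompt": "def fizzbuzz(n):",  # placeholder prompt
        "max_tokens": 64,              # placeholder sampling parameters
        "temperature": 0.2,
    },
    timeout=60,
)
print(resp.json()["choices"][0]["text"])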
diff --git a/app.py b/app.py
index 9d06a3a..186680a 100644
--- a/app.py
+++ b/app.py
@@ -69,7 +69,7 @@ class ModelPickerApp(rumps.App):
                 self.menu_items[item].state = False
 
     def run_server(self):
-        subprocess.run(['python', 'proxy.py'])
+        subprocess.run(['python3', 'proxy.py'])
 
 
 if __name__ == '__main__':
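
The change above matters on systems (macOS in particular) where no plain python executable is on PATH. An alternative, not part of this patch, is to reuse whichever interpreter is already running app.py:

import subprocess
import sys

# sys.executable is the absolute path of the running interpreter,
# so this works whether the app was launched as python or python3.
subprocess.run([sys.executable, 'proxy.py'])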
diff --git a/config.py b/config.py
index 9b11232..cc09c0a 100644
--- a/config.py
+++ b/config.py
@@ -20,7 +20,11 @@ models = {
         'type': 'local',
         'filename': 'codellama-34b-instruct.Q4_K_M.gguf',
     },
-    'default': 'GitHub',
+    'LM Studio' : {
+        'domain': 'http://localhost:1234',
+        'type': 'lmstudio',
+    },
+    'default': 'LM Studio',
 }
 
 model_folder = os.path.expanduser('~/models')
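
If LM Studio's Server Port is not 1234 (see the first note above), only the domain line of the new entry needs to change; port 1235 here is just an example:

    'LM Studio' : {
        'domain': 'http://localhost:1235',  # match the Server Port shown in LM Studio
        'type': 'lmstudio',
    },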
diff --git a/proxy.py b/proxy.py
index 09d488c..94af6f6 100644
--- a/proxy.py
+++ b/proxy.py
@@ -53,6 +53,8 @@ async def proxy(request: Request):
         url = f"{state['domain']}{path}"
     elif state['type'] == 'local':
         url = f"http://localhost:8000{path}"
+    elif state['type'] == 'lmstudio':
+        url = f"{state['domain']}/v1/completions"
 
     data = await request.body()
     headers = dict(request.headers)
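
The new lmstudio branch deliberately drops the incoming path and always forwards to /v1/completions, which is why /v1/chat/completions never needs to work. A restatement of the routing in isolation; the condition guarding the first url assignment sits outside the hunk, so it appears here as a placeholder:

def build_upstream_url(state: dict, path: str) -> str:
    """Illustrative restatement of proxy.py's URL routing after the patch."""
    if state['type'] == 'remote':  # placeholder: this branch's condition is not shown in the diff
        return f"{state['domain']}{path}"
    elif state['type'] == 'local':
        return f"http://localhost:8000{path}"
    elif state['type'] == 'lmstudio':
        # The request path is ignored: everything is rewritten to
        # LM Studio's /v1/completions endpoint.
        return f"{state['domain']}/v1/completions"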