
@awni
Last active August 16, 2024 05:14
MLX LM with the OpenAI Python Package

1. Install

Install MLX LM and openai:

pip install mlx-lm openai

2. Run the MLX LM server

Run the MLX LM server with:

mlx_lm.server
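By default the server listens on localhost:8080 and loads models on demand. If you want to pin a specific model or change the port, the server takes flags for both (flag names below are an assumption about mlx-lm's CLI; run mlx_lm.server --help to confirm on your version):

```shell
# Serve a specific model on a chosen port (flag names assumed from
# mlx-lm's CLI; verify with `mlx_lm.server --help`).
mlx_lm.server --model mlx-community/Meta-Llama-3-8B-Instruct-4bit --port 8080
```

If you change the port here, update the base_url in the Python script below to match.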

3. Make the HTTP request

Create a Python script (e.g. test.py) containing the following:

import openai

# Point the OpenAI client at the local MLX LM server. The server does not
# check the API key, but the client requires a non-empty value.
openai_client = openai.OpenAI(
    api_key="placeholder-api", base_url="http://localhost:8080"
)

response = openai_client.chat.completions.create(
    model="mlx-community/Meta-Llama-3-8B-Instruct-4bit",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say this is a test!"},
    ],
)

# Print the generated reply.
print(response.choices[0].message.content)

Run the script with python test.py.
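The object returned by chat.completions.create follows the OpenAI chat-completions schema, so the generated text lives at choices[0].message.content. As a minimal sketch, here is the same extraction against an equivalent plain-dict payload (the content string is a made-up placeholder, not real model output):

```python
# A chat-completions response as a plain dict (shape per the OpenAI
# chat API; the content value is a made-up placeholder).
payload = {
    "model": "mlx-community/Meta-Llama-3-8B-Instruct-4bit",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "This is a test!"},
            "finish_reason": "stop",
        }
    ],
}

# The generated text is at choices[0].message.content.
text = payload["choices"][0]["message"]["content"]
print(text)  # -> This is a test!
```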

@awni (Author) commented Jul 8, 2024

Try updating MLX LM with pip install -U mlx-lm. It should work.
