Install MLX LM and openai:
pip install mlx-lm openai
Run the MLX LM server with:
mlx_lm.server
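By default the server listens on localhost:8080. If you want to preload a specific model or use a different port, flags along the lines of --model and --port should work (run mlx_lm.server --help to confirm the options in your version; the model name here is just the one used later in this walkthrough):
mlx_lm.server --model mlx-community/Meta-Llama-3-8B-Instruct-4bit --port 8080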
Make a Python script (like test.py) and include the following:
import openai

# The MLX LM server exposes an OpenAI-compatible HTTP API, so the standard
# client works against it. The API key is only a placeholder, and the /v1
# prefix on the base_url matches the server's chat completions route.
openai_client = openai.OpenAI(
    api_key="placeholder-api", base_url="http://localhost:8080/v1"
)

response = openai_client.chat.completions.create(
    model="mlx-community/Meta-Llama-3-8B-Instruct-4bit",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say this is a test!"},
    ],
)

# Process the response; the generated text is in the usual OpenAI schema.
print(response.choices[0].message.content)
Run the script with:
python test.py
4. Curl
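The same endpoint can be called from the command line. Assuming the server is still running on the default port, a request along these lines should work (the body mirrors the Python example above):
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mlx-community/Meta-Llama-3-8B-Instruct-4bit",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Say this is a test!"}
    ]
  }'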
Response: