Mistral 0.3 (May '24) with Ollama for Function Calling (ish)
# Here's the raw curl from their example, in copy/paste form, to see the [TOOL_CALLS] response
curl -X POST http://localhost:11434/api/generate -H "Content-Type: application/json" -d '{
  "model": "mistral",
  "prompt": "[AVAILABLE_TOOLS] [{\"type\": \"function\", \"function\": {\"name\": \"get_current_weather\", \"description\": \"Get the current weather\", \"parameters\": {\"type\": \"object\", \"properties\": {\"location\": {\"type\": \"string\", \"description\": \"The city and state, e.g. San Francisco, CA\"}, \"format\": {\"type\": \"string\", \"enum\": [\"celsius\", \"fahrenheit\"], \"description\": \"The temperature unit to use. Infer this from the users location.\"}}, \"required\": [\"location\", \"format\"]}}}][/AVAILABLE_TOOLS][INST] What is the weather like today in San Francisco [/INST]",
  "raw": true,
  "stream": false
}'
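
For convenience, here is a minimal Python sketch of the same request. It assumes a local Ollama server on port 11434 and the `requests` package, and simply rebuilds the [AVAILABLE_TOOLS]/[INST] prompt from the tool schema in the curl above.

# Minimal sketch: send Mistral 0.3's raw tool-calling prompt through Ollama's
# /api/generate endpoint. Assumes Ollama is running locally on port 11434.
import json
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string",
                             "description": "The city and state, e.g. San Francisco, CA"},
                "format": {"type": "string", "enum": ["celsius", "fahrenheit"],
                           "description": "The temperature unit to use. Infer this from the users location."},
            },
            "required": ["location", "format"],
        },
    },
}]

question = "What is the weather like today in San Francisco"

# Build the raw Mistral 0.3 prompt: tool schemas inside [AVAILABLE_TOOLS],
# the user turn inside [INST] ... [/INST].
prompt = (
    "[AVAILABLE_TOOLS] " + json.dumps(tools) + "[/AVAILABLE_TOOLS]"
    "[INST] " + question + " [/INST]"
)

resp = requests.post(OLLAMA_URL, json={
    "model": "mistral",
    "prompt": prompt,
    "raw": True,      # bypass Ollama's prompt template so the tags pass through untouched
    "stream": False,
})
resp.raise_for_status()
print(resp.json()["response"])
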
@gravitymonkey (Author):
Keep in mind that function calling here is not compatible with the OpenAI API; it works by sending the raw prompt through Ollama ("raw": true). Here's the response I got from the curl above:

{
  "model": "mistral",
  "created_at": "2024-05-23T16:09:19.29958Z",
  "response": "[TOOL_CALLS] [{\"name\": \"get_current_weather\", \"arguments\": {\"location\": \"San Francisco, CA\", \"format\": \"celsius\"}}]",
  "done": true,
  "done_reason": "stop",
  "total_duration": 8080074416,
  "load_duration": 6185662708,
  "prompt_eval_count": 136,
  "prompt_eval_duration": 715246000,
  "eval_count": 36,
  "eval_duration": 1159088000
}
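
A rough sketch of how you might parse that [TOOL_CALLS] payload and dispatch it follows. The get_current_weather implementation here is a hypothetical stand-in: the model only proposes the call, your own code has to execute it.

# Sketch: parse the [TOOL_CALLS] payload from the "response" field and dispatch it.
# get_current_weather is a placeholder; swap in a real weather lookup.
import json

def get_current_weather(location: str, format: str) -> str:
    # Placeholder implementation, not a real weather API call.
    return f"18 degrees {format} in {location}"

TOOLS = {"get_current_weather": get_current_weather}

def dispatch_tool_calls(response_text: str):
    prefix = "[TOOL_CALLS]"
    text = response_text.strip()
    if not text.startswith(prefix):
        return None  # plain text answer, no tool call requested
    calls = json.loads(text[len(prefix):])
    return [TOOLS[call["name"]](**call["arguments"]) for call in calls]

# Example with the response from the curl above:
raw = '[TOOL_CALLS] [{"name": "get_current_weather", "arguments": {"location": "San Francisco, CA", "format": "celsius"}}]'
print(dispatch_tool_calls(raw))   # -> ['18 degrees celsius in San Francisco, CA']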
