If you use LiteLLM to proxy requests to Ollama.ai in corporate environments, you may encounter the following error in your Python application:
httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1006)
The error is understandable: corporate networks often run TLS interception (a man-in-the-middle setup) to monitor inbound and outbound traffic, so the certificate chain your client sees is signed by the corporate proxy's own CA rather than a publicly trusted one, and Python's default verification rejects it.
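To see what "disabling verification" actually means at the Python level, here is a minimal stdlib sketch of the SSL context that `verify=False` amounts to under the hood:

```python
import ssl

# What "verify=False" does under the hood: an SSL context with both
# hostname checking and certificate-chain validation turned off.
ctx = ssl.create_default_context()
ctx.check_hostname = False       # skip hostname matching
ctx.verify_mode = ssl.CERT_NONE  # skip chain validation entirely
```

With such a context, the self-signed corporate certificate is simply never checked, which is why the error disappears.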
The quick workaround is to disable certificate verification. If the other solutions you have found haven't helped, this snippet may work:
import httpx
import openai

# LiteLLM proxy endpoint (adjust host/port to your setup)
api_base = "http://0.0.0.0:8000"

# verify=False disables TLS certificate verification -- use only when you
# cannot install the corporate root CA.
client = openai.OpenAI(
    api_key="anything",  # LiteLLM does not check this key by default
    base_url=api_base,
    http_client=httpx.Client(verify=False),
)

response = client.chat.completions.create(
    model="ollama/codellama",
    messages=[{"role": "user", "content": "why is the sky blue?"}],
    stream=True,
)

for chunk in response:
    print(chunk.choices[0].delta.content or "", end="")
Note: with openai v1 and later, module-level settings such as openai.http_client are ignored; the custom http_client must be passed directly to the OpenAI() constructor.
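Disabling verification trades away transport security. A safer alternative, where possible, is to trust the corporate root CA instead: httpx honors the standard SSL_CERT_FILE environment variable (and httpx.Client also accepts a path via its verify parameter). The path below is an assumption; use the CA bundle your IT team provides:

```python
import os

# Safer alternative (sketch): point the TLS stack at the corporate root CA
# bundle instead of disabling verification. httpx reads SSL_CERT_FILE.
# The path is a placeholder -- substitute the bundle from your IT team.
os.environ["SSL_CERT_FILE"] = "/etc/ssl/certs/corp-root-ca.pem"
```

Set this before creating the httpx.Client (or export it in your shell) so the client verifies the intercepting proxy's certificate instead of rejecting it.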