Ollama and the OpenAI Agents SDK: a starter
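This snippet wires the OpenAI Agents SDK to a locally running Ollama server through Ollama's OpenAI-compatible endpoint (http://localhost:11434/v1) and streams a reply from the qwen3:8b model. It assumes Ollama is already running on the default port and that the model has been pulled locally (for example with "ollama pull qwen3:8b").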
from openai import AsyncOpenAI
from agents import Agent, Runner, OpenAIChatCompletionsModel
from agents import set_tracing_disabled
from openai.types.responses import ResponseTextDeltaEvent
import asyncio

# Point the async OpenAI client at Ollama's OpenAI-compatible endpoint.
external_client = AsyncOpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # required by the client, but unused by Ollama
)

# Tracing would try to upload runs to the OpenAI platform; disable it
# since no real OpenAI API key is configured.
set_tracing_disabled(True)

# Expose the local model to the Agents SDK via the Chat Completions API.
model = OpenAIChatCompletionsModel(
    model="qwen3:8b",
    openai_client=external_client,
)

async def main():
    agent = Agent(
        name="Joker",
        instructions="You are a helpful assistant.",
        model=model,
    )

    # Stream the run and print text deltas as they arrive.
    result = Runner.run_streamed(agent, input="Please tell me 5 jokes.")
    async for event in result.stream_events():
        if event.type == "raw_response_event" and isinstance(event.data, ResponseTextDeltaEvent):
            print(event.data.delta, end="", flush=True)

if __name__ == "__main__":
    asyncio.run(main())
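If streaming isn't needed, the same agent can be run to completion and the full reply read from the result. A minimal sketch, assuming the same external_client and model objects from above; Runner.run and result.final_output follow the Agents SDK's documented interface, but treat the details here as illustrative rather than canonical.

# Non-streaming variant (sketch): run the agent to completion and print
# the final text instead of printing deltas as they stream in.
async def main_blocking():
    agent = Agent(
        name="Joker",
        instructions="You are a helpful assistant.",
        model=model,  # reuses the Ollama-backed model defined above
    )
    result = await Runner.run(agent, input="Please tell me 5 jokes.")
    print(result.final_output)

# asyncio.run(main_blocking())

To try either variant, save the script and run it with python while the Ollama server is up.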