@peterc
Last active August 8, 2025 16:49
Calling GPT-5 from a plain Ruby script using the official OpenAI library
require "openai"

client = OpenAI::Client.new # assumes a valid key in OPENAI_API_KEY

begin
  resp = client.chat.completions.create(
    model: :"gpt-5-nano",
    reasoning_effort: :high,
    verbosity: :low,
    messages: [{ role: "user", content: "Tell me a joke" }]
  )
  puts resp.choices[0].message.content
  puts "#{resp.usage.completion_tokens} (of which #{resp.usage.completion_tokens_details.reasoning_tokens} were reasoning tokens)"
rescue => e
  puts e.body[:error][:message]
end

# reasoning_effort can be :minimal, :low, :medium, or :high
# verbosity can be :low, :medium, or :high
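Since I keep forgetting which symbols are valid, here's a little guard helper of my own (gpt5_params is my name for it, not anything from the openai gem) that fails fast on a typo instead of waiting for an API error:

```ruby
# Allowed values per the comments above.
REASONING_EFFORTS = %i[minimal low medium high].freeze
VERBOSITIES      = %i[low medium high].freeze

# Build the model-specific params for a chat.completions.create call,
# validating the two enum-like options first.
def gpt5_params(reasoning_effort: :medium, verbosity: :medium)
  raise ArgumentError, "bad reasoning_effort: #{reasoning_effort.inspect}" unless REASONING_EFFORTS.include?(reasoning_effort)
  raise ArgumentError, "bad verbosity: #{verbosity.inspect}" unless VERBOSITIES.include?(verbosity)

  { model: :"gpt-5-nano", reasoning_effort: reasoning_effort, verbosity: verbosity }
end
```

Then the call above becomes `client.chat.completions.create(messages: [...], **gpt5_params(reasoning_effort: :high, verbosity: :low))`.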

peterc commented Aug 7, 2025

There are better options, such as RubyLLM, but I like using the bare OpenAI library for many things, and its documentation is so weird that it doesn't even mention the choices[0].message.content access, which I always forget. So this code is mostly here for me to copy and paste into experiments.


peterc commented Aug 7, 2025

Besides, RubyLLM doesn't easily support gpt-5 just yet. This code will work:

require 'ruby_llm'

RubyLLM.configure do |config|
  config.openai_api_key = ENV.fetch("OPENAI_API_KEY")
end

chat = RubyLLM.chat(provider: :openai,
                    model: "gpt-5-nano",
                    assume_model_exists: true)
              .with_temperature(1.0)
              .with_params(reasoning_effort: :high, verbosity: :high)
              
response = chat.ask("Tell me a joke")
puts response.content

But the chat call is a bit verbose because RubyLLM doesn't have built-in support for gpt-5 yet (once that's added, it will set the temperature appropriately and won't need the provider: and assume_model_exists: arguments), and it also raises a warning.


peterc commented Aug 8, 2025

IMPORTANT UPDATE! RubyLLM does support GPT-5; I'm just an idiot who didn't read all the documentation. RubyLLM keeps a registry of models that is maintained separately from the gem version you have installed, and you can update it! More info at https://rubyllm.com/models/#refreshing-the-registry https://x.com/paolino/status/1953591077480665217

Thanks to @crmne for bringing this to my attention. My little "hack" above will force the issue, however, if for some reason you don't/can't/won't update the registry.
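For reference, refreshing the registry (per the docs linked above; I'm sketching this from them, not from exhaustive testing) is a one-liner, after which the provider:/assume_model_exists: workaround shouldn't be needed:

```ruby
require 'ruby_llm'

RubyLLM.configure do |config|
  config.openai_api_key = ENV.fetch("OPENAI_API_KEY")
end

# Fetch the latest model list from the providers so newly released
# models like gpt-5-nano are present in the local registry.
RubyLLM.models.refresh!

chat = RubyLLM.chat(model: "gpt-5-nano")
puts chat.ask("Tell me a joke").content
```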
