@limcheekin
Created May 13, 2025 08:57
Gemini 2.5 OpenAI-compatible implicit caching support
#!/bin/bash
# --- Configuration ---
API_ENDPOINT="https://generativelanguage.googleapis.com/v1beta/openai/chat/completions" # OpenAI-compatible Gemini endpoint (or a third-party provider's)
API_KEY="Your Gemini API Key"               # Replace with your actual API key
FILE_PATH="report.md"                       # Path to your large text file
MODEL_NAME="gemini-2.5-flash-preview-04-17" # Gemini 2.5 model name supported by your endpoint

# --- Construct the JSON payload with jq and pipe it directly to curl ---
# --rawfile reads the file verbatim and embeds it as a JSON string.
jq -n \
  --arg model "$MODEL_NAME" \
  --arg user "$1" \
  --rawfile file_content "$FILE_PATH" \
  '{
    "model": $model,
    "messages": [
      {
        "role": "user",
        "content": $file_content
      },
      {
        "role": "user",
        "content": $user
      }
    ],
    "max_tokens": 4096
  }' | curl -X POST "$API_ENDPOINT" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $API_KEY" \
    -d @-
# Note: the pipe sends jq's JSON output to curl's stdin,
# and -d @- tells curl to read the POST data from stdin.
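Implicit caching works best when the large, stable content (the file) comes first in the prompt and only the short trailing question changes between calls, which is exactly how the script above orders its two messages. To see whether a cache hit occurred, you can inspect the `usage` object of the response. The sketch below parses a saved example response; the `prompt_tokens_details.cached_tokens` field follows the OpenAI-style schema and is an assumption about how this endpoint reports implicit cache hits — check your actual responses for the exact field name.

```shell
#!/bin/bash
# Sketch: inspect a chat-completion response for implicit-cache statistics.
# "$response" stands in for the JSON that curl printed; in practice you
# would capture it, e.g.  response=$(./gemini_cache.sh "Summarize the report")
response='{"choices":[{"message":{"content":"Summary of the report."}}],"usage":{"prompt_tokens":5000,"completion_tokens":42,"prompt_tokens_details":{"cached_tokens":4096}}}'

# The model's answer:
echo "$response" | jq -r '.choices[0].message.content'
# → Summary of the report.

# Tokens served from the implicit cache (0 if the field is absent):
echo "$response" | jq '.usage.prompt_tokens_details.cached_tokens // 0'
# → 4096
```

On the first call for a given file the cached count is typically 0; repeat calls with the same leading file content should show a large cached-token count, which is billed at a reduced rate.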