@garyblankenship
Created March 16, 2025 15:57
# Prism - Laravel LLM Integration
Prism is a powerful Laravel package for integrating Large Language Models (LLMs) into your applications. It provides a unified interface to work with various AI providers, enabling you to focus on building innovative AI features without getting bogged down in provider-specific implementation details.
## Core Concepts
### Installation and Configuration
```bash
# Installation
composer require prism-php/prism

# Publish the config file
php artisan vendor:publish --tag=prism-config
```
Configure in `config/prism.php`:
```php
return [
    'prism_server' => [
        'middleware' => [],
        'enabled' => env('PRISM_SERVER_ENABLED', true),
    ],
    'providers' => [
        'anthropic' => [
            'api_key' => env('ANTHROPIC_API_KEY', ''),
            'version' => env('ANTHROPIC_API_VERSION', '2023-06-01'),
            'default_thinking_budget' => env('ANTHROPIC_DEFAULT_THINKING_BUDGET', 1024),
            'anthropic_beta' => env('ANTHROPIC_BETA', null),
        ],
        'openai' => [
            'url' => env('OPENAI_URL', 'https://api.openai.com/v1'),
            'api_key' => env('OPENAI_API_KEY', ''),
            'organization' => env('OPENAI_ORGANIZATION', null),
        ],
        // Other providers...
    ],
];
```
### Text Generation
Basic text generation:
```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;

$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
    ->withPrompt('Tell me a short story about a brave knight.')
    ->asText();

echo $response->text;
```
With system prompts:
```php
$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
    ->withSystemPrompt('You are an expert mathematician who explains concepts simply.')
    ->withPrompt('Explain the Pythagorean theorem.')
    ->asText();
```
Using Laravel views for complex prompts:
```php
$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
    ->withSystemPrompt(view('prompts.math-tutor'))
    ->withPrompt('What is calculus?')
    ->asText();
```
Message chains for conversations:
```php
use Prism\Prism\ValueObjects\Messages\UserMessage;
use Prism\Prism\ValueObjects\Messages\AssistantMessage;

$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
    ->withMessages([
        new UserMessage('What is JSON?'),
        new AssistantMessage('JSON is a lightweight data format...'),
        new UserMessage('Can you show me an example?'),
    ])
    ->asText();
```
### Tools & Function Calling
Tools extend AI capabilities by providing access to specific functions:
```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\Facades\Tool;

$weatherTool = Tool::as('weather')
    ->for('Get current weather conditions')
    ->withStringParameter('city', 'The city to get weather for')
    ->using(function (string $city): string {
        // Your weather API logic here
        return "The weather in {$city} is sunny and 72°F.";
    });

$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-latest')
    ->withMaxSteps(2)
    ->withPrompt('What is the weather like in Paris?')
    ->withTools([$weatherTool])
    ->asText();
```
Tool parameter types:
- `withStringParameter` - For text inputs
- `withNumberParameter` - For numeric values
- `withBooleanParameter` - For true/false flags
- `withArrayParameter` - For lists of items
- `withEnumParameter` - For restricted value sets
- `withObjectParameter` - For complex objects
- `withParameter` - For schema-based parameters
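As an illustrative sketch, a single tool can mix several of these parameter types; the `book_table` tool below is hypothetical, and the exact `withEnumParameter` argument order is assumed from the pattern of the other builders:

```php
use Prism\Prism\Facades\Tool;

// Hypothetical booking tool combining several parameter types
$bookingTool = Tool::as('book_table')
    ->for('Book a restaurant table')
    ->withStringParameter('name', 'Name for the reservation')
    ->withNumberParameter('party_size', 'Number of guests')
    ->withBooleanParameter('outdoor', 'Whether outdoor seating is preferred')
    ->withEnumParameter('time_slot', 'Preferred seating time', ['18:00', '19:00', '20:00'])
    ->using(function (string $name, int $party_size, bool $outdoor, string $time_slot): string {
        // Your booking logic here
        return "Booked a table for {$party_size} under {$name} at {$time_slot}.";
    });
```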
Tool choice options:
```php
use Prism\Prism\Enums\ToolChoice;
// Let the AI decide whether to use tools
->withToolChoice(ToolChoice::Auto)
// Force the AI to use a tool
->withToolChoice(ToolChoice::Any)
// Force the AI to use a specific tool
->withToolChoice('weather');
```
Complex tool implementation:
```php
namespace App\Tools;

use Prism\Prism\Tool;
use Illuminate\Support\Facades\Http;

class SearchTool extends Tool
{
    public function __construct()
    {
        $this
            ->as('search')
            ->for('useful when you need to search for current events')
            ->withStringParameter('query', 'Detailed search query.')
            ->using($this);
    }

    public function __invoke(string $query): string
    {
        // Implementation...
    }
}
```
### Structured Output
Define structured responses with schemas:
```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\Schema\ObjectSchema;
use Prism\Prism\Schema\StringSchema;

$schema = new ObjectSchema(
    name: 'movie_review',
    description: 'A structured movie review',
    properties: [
        new StringSchema('title', 'The movie title'),
        new StringSchema('rating', 'Rating out of 5 stars'),
        new StringSchema('summary', 'Brief review summary'),
    ],
    requiredFields: ['title', 'rating', 'summary']
);

$response = Prism::structured()
    ->using(Provider::OpenAI, 'gpt-4o')
    ->withSchema($schema)
    ->withPrompt('Review the movie Inception')
    ->asStructured();

// Access your structured data
$review = $response->structured;
echo $review['title'];   // "Inception"
echo $review['rating'];  // "5 stars"
echo $review['summary']; // "A mind-bending..."
```
Available Schema Types:
- `StringSchema` - For text values
- `NumberSchema` - For integers and floating-point numbers
- `BooleanSchema` - For true/false values
- `ArraySchema` - For lists of items
- `EnumSchema` - For restricted value sets
- `ObjectSchema` - For complex, nested data structures
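These types compose, so a nested structure is just schemas inside schemas. A sketch of an `ArraySchema` of strings nested in an `ObjectSchema` (the positional item-schema argument to `ArraySchema` is an assumption; check the schema docs for the exact constructor):

```php
use Prism\Prism\Schema\ObjectSchema;
use Prism\Prism\Schema\StringSchema;
use Prism\Prism\Schema\NumberSchema;
use Prism\Prism\Schema\ArraySchema;

// Hypothetical nested schema: a review with a numeric score and a list of tags
$schema = new ObjectSchema(
    name: 'book_review',
    description: 'A structured book review',
    properties: [
        new StringSchema('title', 'The book title'),
        new NumberSchema('score', 'Score from 1 to 10'),
        new ArraySchema('tags', 'Genre tags', new StringSchema('tag', 'A single genre tag')),
    ],
    requiredFields: ['title', 'score', 'tags']
);
```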
### Streaming Output
Stream responses in real-time:
```php
use Prism\Prism\Prism;

$response = Prism::stream()
    ->using('openai', 'gpt-4')
    ->withPrompt('Tell me a story about a brave knight.')
    ->asStream();

// Process each chunk as it arrives
foreach ($response as $chunk) {
    echo $chunk->text;

    // Flush the output buffer to send text to the browser immediately
    ob_flush();
    flush();
}
```
In a Laravel controller:
```php
use Prism\Prism\Prism;

public function streamResponse()
{
    return response()->stream(function () {
        $stream = Prism::stream()
            ->using('openai', 'gpt-4')
            ->withPrompt('Explain quantum computing step by step.')
            ->asStream();

        foreach ($stream as $chunk) {
            echo $chunk->text;
            ob_flush();
            flush();
        }
    }, 200, [
        'Cache-Control' => 'no-cache',
        'Content-Type' => 'text/event-stream',
        'X-Accel-Buffering' => 'no', // Prevents Nginx from buffering
    ]);
}
```
With Laravel 12 Event Streams:
```php
Route::get('/chat', function () {
    return response()->eventStream(function () {
        $stream = Prism::stream()
            ->using('openai', 'gpt-4')
            ->withPrompt('Explain quantum computing step by step.')
            ->asStream();

        foreach ($stream as $response) {
            yield $response->text;
        }
    });
});
```
### Embeddings
Generate vector representations of text:
```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;

$response = Prism::embeddings()
    ->using(Provider::OpenAI, 'text-embedding-3-large')
    ->fromInput('Your text goes here')
    ->asEmbeddings();

// Get your embeddings vector
$embeddings = $response->embeddings[0]->embedding;

// Check token usage
echo $response->usage->tokens;
```
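Embedding vectors are typically compared with cosine similarity. A minimal, framework-free helper (plain PHP, nothing Prism-specific) could look like:

```php
// Cosine similarity between two equal-length embedding vectors
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

echo cosineSimilarity([1.0, 0.0], [1.0, 0.0]); // 1 (identical direction)
echo cosineSimilarity([1.0, 0.0], [0.0, 1.0]); // 0 (orthogonal)
```

You would feed it the `->embedding` arrays returned by Prism, e.g. to rank stored documents against a query vector.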
Generate multiple embeddings:
```php
$response = Prism::embeddings()
    ->using(Provider::OpenAI, 'text-embedding-3-large')
    // First embedding
    ->fromInput('Your text goes here')
    // Second embedding
    ->fromInput('Your second text goes here')
    // Third and fourth embeddings
    ->fromArray([
        'Third',
        'Fourth',
    ])
    ->asEmbeddings();

/** @var Embedding $embedding */
foreach ($response->embeddings as $embedding) {
    // Do something with your embeddings
    $embedding->embedding;
}
```
From file:
```php
$response = Prism::embeddings()
    ->using(Provider::OpenAI, 'text-embedding-3-large')
    ->fromFile('/path/to/your/document.txt')
    ->asEmbeddings();
```
### Input Modalities
#### Images
```php
use Prism\Prism\ValueObjects\Messages\UserMessage;
use Prism\Prism\ValueObjects\Messages\Support\Image;

// From a local file
$message = new UserMessage(
    "What's in this image?",
    [Image::fromPath('/path/to/image.jpg')]
);

// From a URL
$message = new UserMessage(
    'Analyze this diagram:',
    [Image::fromUrl('https://example.com/diagram.png')]
);

// From a base64-encoded string
$image = base64_encode(file_get_contents('/path/to/image.jpg'));
$message = new UserMessage(
    'Analyze this diagram:',
    [Image::fromBase64($image)]
);

$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
    ->withMessages([$message])
    ->generate();
```
#### Documents
```php
use Prism\Prism\ValueObjects\Messages\UserMessage;
use Prism\Prism\ValueObjects\Messages\Support\Document;

Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
    ->withMessages([
        // From base64
        new UserMessage('Here is the document from base64', [
            Document::fromBase64(base64_encode(file_get_contents('tests/Fixtures/test-pdf.pdf')), 'application/pdf'),
        ]),
        // Or from a path
        new UserMessage('Here is the document from a local path', [
            Document::fromPath('tests/Fixtures/test-pdf.pdf'),
        ]),
        // Or from a text string
        new UserMessage('Here is the document from a text string (e.g. from your database)', [
            Document::fromText('Hello world!'),
        ]),
    ])
    ->generate();
```
### Testing
Using the fake implementation:
```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\ValueObjects\Usage;
use Prism\Prism\ValueObjects\Meta;
use Prism\Prism\Enums\FinishReason;
use Prism\Prism\Text\Response as TextResponse;

it('can generate text', function () {
    // Create a fake text response
    $fakeResponse = new TextResponse(
        text: 'Hello, I am Claude!',
        steps: collect([]),
        responseMessages: collect([]),
        toolCalls: [],
        toolResults: [],
        usage: new Usage(10, 20),
        finishReason: FinishReason::Stop,
        meta: new Meta('fake-1', 'fake-model'),
        messages: collect([]),
        additionalContent: []
    );

    // Set up the fake
    $fake = Prism::fake([$fakeResponse]);

    // Run your code
    $response = Prism::text()
        ->using(Provider::Anthropic, 'claude-3-5-sonnet-latest')
        ->withPrompt('Who are you?')
        ->asText();

    // Make assertions
    expect($response->text)->toBe('Hello, I am Claude!');
});
```
Using ResponseBuilder:
```php
use Prism\Prism\Text\ResponseBuilder;
use Prism\Prism\Text\Step;
use Prism\Prism\Enums\FinishReason;
use Prism\Prism\ValueObjects\Usage;
use Prism\Prism\ValueObjects\Meta;
use Prism\Prism\ValueObjects\ToolCall;

Prism::fake([
    (new ResponseBuilder)
        ->addStep(new Step(
            text: "Step 1 response text",
            finishReason: FinishReason::Stop,
            toolCalls: [/** tool calls */],
            toolResults: [/** tool results */],
            usage: new Usage(1000, 750),
            meta: new Meta(id: 123, model: 'test-model'),
            messages: [/** messages */],
            systemPrompts: [/** system prompts */],
            additionalContent: ['test' => 'additional']
        ))
        ->toResponse()
]);
```
Testing tools:
```php
it('can use weather tool', function () {
    // Define the expected tool call and response sequence
    $responses = [
        // First response: AI decides to use the weather tool
        new TextResponse(
            text: '', // Empty text since the AI is using a tool
            steps: collect([]),
            responseMessages: collect([]),
            toolCalls: [
                new ToolCall(
                    id: 'call_123',
                    name: 'weather',
                    arguments: ['city' => 'Paris']
                ),
            ],
            toolResults: [],
            usage: new Usage(15, 25),
            finishReason: FinishReason::ToolCalls,
            meta: new Meta('fake-1', 'fake-model'),
            messages: collect([]),
            additionalContent: []
        ),
        // Second response: AI uses the tool result to form a response
        new TextResponse(
            text: 'Based on current conditions, the weather in Paris is sunny with a temperature of 72°F.',
            steps: collect([]),
            responseMessages: collect([]),
            toolCalls: [],
            toolResults: [],
            usage: new Usage(20, 30),
            finishReason: FinishReason::Stop,
            meta: new Meta('fake-2', 'fake-model'),
            messages: collect([]),
            additionalContent: []
        ),
    ];

    // Set up the fake
    $fake = Prism::fake($responses);

    // Create the weather tool
    $weatherTool = Tool::as('weather')
        ->for('Get weather information')
        ->withStringParameter('city', 'City name')
        ->using(fn (string $city) => "The weather in {$city} is sunny with a temperature of 72°F");

    // Run the actual test
    $response = Prism::text()
        ->using(Provider::Anthropic, 'claude-3-5-sonnet-latest')
        ->withPrompt('What\'s the weather in Paris?')
        ->withTools([$weatherTool])
        ->asText();

    // Assert the correct number of API calls were made
    $fake->assertCallCount(2);

    // Assert tool calls were made correctly
    expect($response->steps[0]->toolCalls)->toHaveCount(1);
    expect($response->steps[0]->toolCalls[0]->name)->toBe('weather');
    expect($response->steps[0]->toolCalls[0]->arguments())->toBe(['city' => 'Paris']);

    // Assert tool results were processed
    expect($response->toolResults)->toHaveCount(1);
    expect($response->toolResults[0]->result)
        ->toBe('The weather in Paris is sunny with a temperature of 72°F');

    // Assert final response
    expect($response->text)
        ->toBe('Based on current conditions, the weather in Paris is sunny with a temperature of 72°F.');
});
```
### Prism Server
Prism Server exposes your Prism-powered models through an OpenAI-compatible API.
Enable it in `config/prism.php`:
```php
'prism_server' => [
    'middleware' => [],
    'enabled' => env('PRISM_SERVER_ENABLED', true),
],
```
Register models in a service provider:
```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\Facades\PrismServer;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        PrismServer::register(
            'my-custom-model',
            fn () => Prism::text()
                ->using(Provider::Anthropic, 'claude-3-5-sonnet-latest')
                ->withSystemPrompt('You are a helpful assistant.')
        );
    }
}
```
Endpoints:
- Chat Completions: `http://your-app.com/prism/openai/v1/chat/completions`
- List Models: `http://your-app.com/prism/openai/v1/models`
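Because these endpoints follow the OpenAI chat-completions shape, any OpenAI-style client can call them. A minimal sketch using Laravel's HTTP client (the host and model name are placeholders for whatever you registered):

```php
use Illuminate\Support\Facades\Http;

// Call the Prism Server endpoint like any OpenAI-compatible API
$response = Http::post('http://your-app.com/prism/openai/v1/chat/completions', [
    'model' => 'my-custom-model',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello!'],
    ],
]);

// Standard OpenAI-style response shape
$reply = $response->json('choices.0.message.content');
```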
### Error Handling
Basic error handling:
```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\Exceptions\PrismException;
use Illuminate\Support\Facades\Log;
use Throwable;

try {
    $response = Prism::text()
        ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
        ->withPrompt('Generate text...')
        ->asText();
} catch (PrismException $e) {
    Log::error('Text generation failed:', ['error' => $e->getMessage()]);
} catch (Throwable $e) {
    Log::error('Generic error:', ['error' => $e->getMessage()]);
}
```
Provider-specific exceptions:
- `PrismStructuredDecodingException` - Invalid JSON for structured requests
- `PrismRateLimitedException` - Rate limit or quota hit
- `PrismProviderOverloadedException` - Provider capacity issues
- `PrismRequestTooLargeException` - Request too large
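A sketch of catching these exceptions individually so each failure mode gets its own handling (the handler bodies are placeholders, not prescribed behavior):

```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\Exceptions\PrismRateLimitedException;
use Prism\Prism\Exceptions\PrismProviderOverloadedException;
use Prism\Prism\Exceptions\PrismStructuredDecodingException;

try {
    $response = Prism::text()
        ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
        ->withPrompt('Generate text...')
        ->asText();
} catch (PrismRateLimitedException $e) {
    // Back off and retry later
} catch (PrismProviderOverloadedException $e) {
    // Fall back to another provider
} catch (PrismStructuredDecodingException $e) {
    // Re-request, or repair the malformed JSON
}
```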
## Advanced Features
### Rate Limit Handling
Catching rate limit exceptions:
```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\ValueObjects\ProviderRateLimit;
use Prism\Prism\Exceptions\PrismRateLimitedException;

try {
    Prism::text()
        ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
        ->withPrompt('Hello world!')
        ->generate();
} catch (PrismRateLimitedException $e) {
    /** @var ProviderRateLimit $rate_limit */
    foreach ($e->rateLimits as $rate_limit) {
        // Loop through rate limits...
    }

    // Log, fail gracefully, etc.
}
```
Finding which rate limit you've hit:
```php
use Prism\Prism\ValueObjects\ProviderRateLimit;
use Prism\Prism\Exceptions\PrismRateLimitedException;
use Illuminate\Support\Arr;

try {
    // Your request
} catch (PrismRateLimitedException $e) {
    $hit_limit = Arr::first(
        $e->rateLimits,
        fn (ProviderRateLimit $rate_limit) => $rate_limit->remaining === 0
    );
}
```
### Custom Providers
Creating a custom provider:
```php
namespace App\Prism\Providers;

use Prism\Prism\Contracts\Provider;
use Prism\Prism\Embeddings\Request as EmbeddingsRequest;
use Prism\Prism\Embeddings\Response as EmbeddingsResponse;
use Prism\Prism\Structured\Request as StructuredRequest;
use Prism\Prism\Structured\Response as StructuredResponse;
use Prism\Prism\Text\Request as TextRequest;
use Prism\Prism\Text\Response as TextResponse;

class MyCustomProvider implements Provider
{
    public function text(TextRequest $request): TextResponse
    {
        // Implementation...
    }

    public function structured(StructuredRequest $request): StructuredResponse
    {
        // Implementation...
    }

    public function embeddings(EmbeddingsRequest $request): EmbeddingsResponse
    {
        // Implementation...
    }
}
```
Registering a custom provider:
```php
namespace App\Providers;

use App\Prism\Providers\MyCustomProvider;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        $this->app['prism-manager']->extend('my-custom-provider', function ($app, $config) {
            return new MyCustomProvider(
                apiKey: $config['api_key'] ?? '',
            );
        });
    }
}
```
Add provider configuration:
```php
// config/prism.php
return [
    'providers' => [
        // ... other providers ...
        'my-custom-provider' => [
            'api_key' => env('MY_CUSTOM_PROVIDER_API_KEY'),
        ],
    ],
];
```
## Provider-Specific Features
### Anthropic
#### Prompt Caching
```php
use Prism\Prism\Enums\Provider;
use Prism\Prism\Prism;
use Prism\Prism\Tool;
use Prism\Prism\ValueObjects\Messages\UserMessage;
use Prism\Prism\ValueObjects\Messages\SystemMessage;

Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
    ->withMessages([
        (new SystemMessage('I am a long re-usable system message.'))
            ->withProviderMeta(Provider::Anthropic, ['cacheType' => 'ephemeral']),
        (new UserMessage('I am a long re-usable user message.'))
            ->withProviderMeta(Provider::Anthropic, ['cacheType' => 'ephemeral']),
    ])
    ->withTools([
        Tool::as('cache me')
            ->withProviderMeta(Provider::Anthropic, ['cacheType' => 'ephemeral']),
    ])
    ->asText();
```
#### Extended Thinking
```php
use Prism\Prism\Enums\Provider;
use Prism\Prism\Prism;

Prism::text()
    ->using('anthropic', 'claude-3-7-sonnet-latest')
    ->withPrompt('What is the meaning of life?')
    // Enable thinking
    ->withProviderMeta(Provider::Anthropic, ['thinking' => ['enabled' => true]])
    ->asText();

// With a custom thinking budget
Prism::text()
    ->using('anthropic', 'claude-3-7-sonnet-latest')
    ->withPrompt('What is the meaning of life?')
    ->withProviderMeta(Provider::Anthropic, [
        'thinking' => [
            'enabled' => true,
            'budgetTokens' => 2048,
        ],
    ])
    ->asText();
```
Accessing thinking output:
```php
$response->additionalContent['thinking'];
// Or on a specific step
$response->steps->first()->additionalContent->thinking;
```
#### Citations
```php
use Prism\Prism\Enums\Provider;
use Prism\Prism\Prism;
use Prism\Prism\ValueObjects\Messages\UserMessage;
use Prism\Prism\ValueObjects\Messages\Support\Document;

$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
    ->withMessages([
        new UserMessage(
            content: "Is the grass green and the sky blue?",
            additionalContent: [
                Document::fromChunks(
                    chunks: ["The grass is green.", "Flamingos are pink.", "The sky is blue."],
                    title: 'The colours of nature',
                    context: 'The go-to textbook on the colours found in nature!'
                ),
            ]
        ),
    ])
    ->withProviderMeta(Provider::Anthropic, ['citations' => true])
    ->asText();
```
Accessing citations:
```php
use Prism\Prism\Providers\Anthropic\ValueObjects\MessagePartWithCitations;
use Prism\Prism\Providers\Anthropic\ValueObjects\Citation;

$messageChunks = $response->additionalContent['messagePartsWithCitations'];

$text = '';
$footnotes = [];
$footnoteId = 1;

/** @var MessagePartWithCitations $messageChunk */
foreach ($messageChunks as $messageChunk) {
    $text .= $messageChunk->text;

    /** @var Citation $citation */
    foreach ($messageChunk->citations as $citation) {
        $footnotes[] = [
            'id' => $footnoteId,
            'document_title' => $citation->documentTitle,
            'reference_start' => $citation->startIndex,
            'reference_end' => $citation->endIndex,
        ];

        $text .= '<sup><a href="#footnote-'.$footnoteId.'">'.$footnoteId.'</a></sup>';
        $footnoteId++;
    }
}
```
### OpenAI
#### Strict Tool Schemas
```php
Tool::as('search')
    ->for('Searching the web')
    ->withStringParameter('query', 'the detailed search query')
    ->using(fn (): string => '[Search results]')
    ->withProviderMeta(Provider::OpenAI, [
        'strict' => true,
    ]);
```
#### Strict Structured Output Schemas
```php
$response = Prism::structured()
    ->withProviderMeta(Provider::OpenAI, [
        'schema' => [
            'strict' => true,
        ],
    ]);
```
### VoyageAI
#### Input Type
```php
use Prism\Prism\Enums\Provider;
use Prism\Prism\Prism;

// For search / querying
Prism::embeddings()
    ->using(Provider::VoyageAI, 'voyage-3-lite')
    ->fromInput('The food was delicious and the waiter...')
    ->withProviderMeta(Provider::VoyageAI, ['inputType' => 'query'])
    ->generate();

// For document retrieval
Prism::embeddings()
    ->using(Provider::VoyageAI, 'voyage-3-lite')
    ->fromInput('The food was delicious and the waiter...')
    ->withProviderMeta(Provider::VoyageAI, ['inputType' => 'document'])
    ->generate();
```
## Common Configuration Parameters
### Generation Parameters
- `withMaxTokens` - Maximum number of tokens to generate
- `usingTemperature` - Controls randomness (0 = deterministic, higher = more random)
- `usingTopP` - Nucleus sampling alternative to temperature
- `withClientOptions` - HTTP client options (e.g., timeouts)
- `withClientRetry` - Automatic retry configuration
- `usingProviderConfig` - Override provider configuration
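Several of these can be combined on one request. A sketch with illustrative values (the `withClientRetry` arguments are assumed to mirror Laravel's `Http::retry(times, sleepMilliseconds)` convention; check the docs for the exact signature):

```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;

$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-20241022')
    ->withPrompt('Summarize the plot of Hamlet in two sentences.')
    ->withMaxTokens(200)                    // Cap the response length
    ->usingTemperature(0.2)                 // Keep the output mostly deterministic
    ->withClientOptions(['timeout' => 30])  // HTTP client options
    ->withClientRetry(3, 100)               // Assumed: 3 attempts, 100 ms between them
    ->asText();
```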
### Response Properties
- `$response->text` - Generated text content
- `$response->finishReason` - Why generation stopped (Stop, Length, ContentFilter, ToolCalls, Error, Other, Unknown)
- `$response->usage` - Token usage statistics
- `$response->steps` - For multi-step generations
- `$response->responseMessages` - Message history
- `$response->toolCalls` - Tool calls made by the AI
- `$response->toolResults` - Results from tool executions
- `$response->structured` - Parsed structured data (for structured output)
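A sketch of inspecting a few of these after a call (assumes `$response` came from an earlier `Prism::text()->...->asText()` request; the `promptTokens`/`completionTokens` property names follow the two-argument `Usage` constructor used in the testing examples above):

```php
use Prism\Prism\Enums\FinishReason;

// Detect truncated output
if ($response->finishReason === FinishReason::Length) {
    // The response hit the token cap; consider raising withMaxTokens
}

// Token accounting
echo $response->usage->promptTokens;
echo $response->usage->completionTokens;
```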
## Supported Providers
- Anthropic (Claude)
- DeepSeek
- Gemini
- Groq
- Mistral
- Ollama
- OpenAI
- VoyageAI
- xAI
Each provider has specific features, limitations, and configuration options. Check the provider-specific documentation for details.