Customize parameters of LLM driver
Overview
You can provide your own LLM configuration instance to Instructor. This is useful when you want to initialize the underlying client with custom values, e.g. to call other LLM providers that support the OpenAI API. The example below also shows how to inject a fully customized HTTP client alongside the custom LLM configuration.
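In its simplest form, the override does not require a custom HTTP client at all. Here is a minimal sketch that overrides only the LLM connection parameters, built from the same calls used in the full example below; the endpoint and model values are illustrative and can point at any OpenAI-compatible API:

<?php
require 'examples/boot.php';

use Cognesy\Config\Env;
use Cognesy\Instructor\StructuredOutput;
use Cognesy\Polyglot\Inference\Config\LLMConfig;

class User {
    public int $age;
    public string $name;
}

// Minimal sketch: override only the LLM connection parameters and keep
// the default HTTP client. Values are illustrative assumptions - adjust
// them to your provider.
$llmConfig = new LLMConfig(
    apiUrl: 'https://api.deepseek.com',
    apiKey: Env::get('DEEPSEEK_API_KEY'),
    endpoint: '/chat/completions',
    defaultModel: 'deepseek-chat',
    driver: 'openai-compatible',
);

$user = (new StructuredOutput)
    ->withLLMConfig($llmConfig)
    ->with("Our user Jason is 25 years old.")
    ->withResponseClass(User::class)
    ->get();
?>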
Example
<?php
require 'examples/boot.php';

use Cognesy\Config\Env;
use Cognesy\Events\Dispatchers\EventDispatcher;
use Cognesy\Http\Config\HttpClientConfig;
use Cognesy\Http\Drivers\Symfony\SymfonyDriver;
use Cognesy\Http\HttpClientBuilder;
use Cognesy\Instructor\StructuredOutput;
use Cognesy\Polyglot\Inference\Config\LLMConfig;
use Cognesy\Polyglot\Inference\Enums\OutputMode;
use Symfony\Component\HttpClient\HttpClient as SymfonyHttpClient;

class User {
    public int $age;
    public string $name;
}

$events = new EventDispatcher();

// Build a fully customized HTTP client
$httpConfig = new HttpClientConfig(
    connectTimeout: 30,
    requestTimeout: 60,
    idleTimeout: -1,
    maxConcurrent: 5,
    poolTimeout: 60,
    failOnError: true,
);

$yourClientInstance = SymfonyHttpClient::create(['http_version' => '2.0']);

$customClient = (new HttpClientBuilder)
    ->withEventBus($events)
    ->withDriver(new SymfonyDriver(
        config: $httpConfig,
        clientInstance: $yourClientInstance,
        events: $events,
    ))
    ->create();

// Create an LLM connection preset initialized with custom parameters
$llmConfig = new LLMConfig(
    apiUrl: 'https://api.deepseek.com',
    apiKey: Env::get('DEEPSEEK_API_KEY'),
    endpoint: '/chat/completions',
    defaultModel: 'deepseek-chat',
    defaultMaxTokens: 128,
    driver: 'openai-compatible',
);

// Get Instructor with the default client component overridden with your own
$structuredOutput = (new StructuredOutput)
    ->withEventHandler($events)
    ->withLLMConfig($llmConfig)
    ->withHttpClient($customClient);

// Call with custom model and execution mode
$user = $structuredOutput
    ->wiretap(fn($e) => $e->print())
    ->with("Our user Jason is 25 years old.")
    ->withResponseClass(User::class)
    ->withOutputMode(OutputMode::Tools)
    ->withStreaming()
    ->get();

dump($user);
assert(isset($user->name));
assert(isset($user->age));
?>
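When run, this example prints every emitted event via the `wiretap()` callback, streams partial updates (enabled with `withStreaming()`), and finally dumps the extracted `User` object; the closing assertions verify that both `name` and `age` were populated.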