Cookbook \ Instructor \ LLM API Support
Hugging Face
Overview
You can use Instructor to get structured output from LLMs served via the Hugging Face API. This example demonstrates how to extract user data from free-form text into a structured model, guided by a JSON Schema embedded in the prompt.
Example
<?php
require 'examples/boot.php';

use Cognesy\Instructor\StructuredOutput;
use Cognesy\Polyglot\Inference\Enums\OutputMode;

enum UserType : string {
    case Guest = 'guest';
    case User = 'user';
    case Admin = 'admin';
}

class User {
    public string $firstName;
    public UserType $role;
    /** @var string[] */
    public array $hobbies;
    public string $username;
    public ?int $age;
}

// Get Instructor with specified LLM client connection
// See: /config/llm.php to check or change LLM client connection configuration details
$structuredOutput = (new StructuredOutput)->using('huggingface');

$user = $structuredOutput
    ->with(
        messages: "Jason (@jxnlco) is 25 years old. He is the admin of this project. He likes playing football and reading books.",
        responseModel: User::class,
        prompt: 'Parse the user data to JSON, respond using following JSON Schema: <|json_schema|>',
        examples: [[
            'input' => 'Ive got email Frank - their developer, who\'s 30. He asked to come back to him frank@hk.ch. Btw, he plays on drums!',
            'output' => ['firstName' => 'Frank', 'age' => 30, 'username' => 'frank@hk.ch', 'role' => 'user', 'hobbies' => ['playing drums']],
        ],[
            'input' => 'We have a meeting with John, our new admin who likes surfing. He is 19 years old - check his profile: @jx90.',
            'output' => ['firstName' => 'John', 'role' => 'admin', 'hobbies' => ['surfing'], 'username' => 'jx90', 'age' => 19],
        ]],
        //model: 'deepseek-ai/DeepSeek-R1-0528-Qwen3-8B',
        maxRetries: 2,
        options: ['temperature' => 0.5],
        mode: OutputMode::Json,
    )->get();

print("Completed response model:\n\n");
dump($user);

assert(isset($user->firstName));
assert(isset($user->role));
assert(isset($user->age));
assert(isset($user->hobbies));
assert(isset($user->username));
assert(is_array($user->hobbies));
assert(count($user->hobbies) > 0);
assert($user->role === UserType::Admin);
assert($user->age === 25);
assert($user->firstName === 'Jason');
assert(in_array($user->username, ['jxnlco', '@jxnlco']));
?>
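The `'huggingface'` string passed to `->using()` refers to a connection preset defined in `/config/llm.php`. The exact key names and defaults depend on the Instructor version you have installed, so treat the following as a hypothetical sketch of such a preset — the URL, the `HUGGINGFACE_API_KEY` environment variable name, and the model id below are assumptions for illustration, not verified values:

```php
// Hypothetical sketch of a connection preset entry in /config/llm.php.
// Key names and values are assumptions - check the config file shipped
// with your installed version of Instructor for the actual structure.
'huggingface' => [
    'apiUrl'    => 'https://api-inference.huggingface.co', // assumed API base URL
    'apiKey'    => Env::get('HUGGINGFACE_API_KEY', ''),    // assumed env var name
    'model'     => 'deepseek-ai/DeepSeek-R1-0528-Qwen3-8B', // example model id
    'maxTokens' => 1024,
],
```

Arguments passed explicitly at call time (such as the commented-out `model:` parameter or `options:` in the example above) would take precedence over the preset defaults for that request.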