Cookbook
Cookbook \ Instructor \ Basics
- Basic use
- Basic use via mixin
- Handling errors with `Maybe` helper class
- Modes
- Making some fields optional
- Private vs public object field
- Automatic correction based on validation results
- Using attributes
- Using LLM API connections from config file
- Validation
- Custom validation using Symfony Validator
- Validation across multiple fields
- Validation with LLM
Cookbook \ Instructor \ Advanced
- Context caching (structured output)
- Customize parameters of LLM driver
- Custom prompts
- Customize parameters via DSN
- Using structured data as an input
- Extracting arguments of function or method
- Streaming partial updates during inference
- Providing example inputs and outputs
- Extracting scalar values
- Extracting sequences of objects
- Streaming
- Structures
Cookbook \ Instructor \ Troubleshooting
Cookbook \ Instructor \ LLM API Support
Cookbook \ Instructor \ Extras
- Extraction of complex objects
- Extraction of complex objects (Anthropic)
- Extraction of complex objects (Cohere)
- Extraction of complex objects (Gemini)
- Image processing - car damage detection
- Image to data (OpenAI)
- Image to data (Anthropic)
- Image to data (Gemini)
- Generating JSON Schema from PHP classes
- Generating JSON Schema dynamically
- Create tasks from meeting transcription
- Translating UI text fields
- Web page to PHP objects
Cookbook \ Polyglot \ LLM Basics
- Working directly with LLMs
- Working directly with LLMs and JSON - JSON mode
- Working directly with LLMs and JSON - JSON Schema mode
- Working directly with LLMs and JSON - MdJSON mode
- Working directly with LLMs and JSON - Tools mode
- Generating JSON Schema from PHP classes
Cookbook \ Polyglot \ LLM Advanced
Cookbook \ Polyglot \ LLM Troubleshooting
Cookbook \ Polyglot \ LLM API Support
Cookbook \ Polyglot \ LLM Extras
Cookbook \ Prompting \ Zero-Shot Prompting
Cookbook \ Prompting \ Few-Shot Prompting
Cookbook \ Prompting \ Thought Generation
Cookbook \ Prompting \ Miscellaneous
- Arbitrary properties
- Consistent values of arbitrary properties
- Chain of Summaries
- Chain of Thought
- Single label classification
- Multiclass classification
- Entity relationship extraction
- Handling errors
- Limiting the length of lists
- Reflection Prompting
- Restating instructions
- Ask LLM to rewrite instructions
- Expanding search queries
- Summary with Keywords
- Reusing components
- Using CoT to improve interpretation of component data
Cookbook \ Instructor \ Basics
Modes
Overview
Instructor supports several ways to extract data from the response:
- `OutputMode::Tools` - uses OpenAI-style tool calls to get the language model to generate JSON following the schema,
- `OutputMode::JsonSchema` - guarantees output matching the JSON Schema via a context-free grammar; does not support optional properties,
- `OutputMode::Json` - uses the provider's JSON mode; the response follows the provided JSON Schema,
- `OutputMode::MdJson` - uses prompting to get the language model to generate JSON following the schema.
Note: not all modes are supported by all models or providers.
The mode can be set via the `mode` parameter of the `Instructor::respond()` or `Instructor::request()` methods.
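As a minimal sketch of the request-based call style (assuming `request()` accepts the same named arguments as `respond()` and is executed with `get()`, and reusing the `User` class defined in the example below):

<?php
use Cognesy\Instructor\Instructor;
use Cognesy\Polyglot\LLM\Enums\OutputMode;

// Build a deferred request with an explicit output mode,
// then execute it and get the extracted User object.
$user = (new Instructor)->request(
    messages: "Jason is 25 years old and works as an engineer.",
    responseModel: User::class,
    mode: OutputMode::JsonSchema,
)->get();
?>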
Example
<?php
require 'examples/boot.php';

use Cognesy\Instructor\Instructor;
use Cognesy\Polyglot\LLM\Enums\OutputMode;

class User {
    public int $age;
    public string $name;
}

$text = "Jason is 25 years old and works as an engineer.";
print("Input text:\n");
print($text . "\n\n");

$instructor = new Instructor;

// CASE 1 - OutputMode::Tools
print("\n1. Extracting structured data using LLM - OutputMode::Tools\n");
$user = $instructor->respond(
    messages: $text,
    responseModel: User::class,
    mode: OutputMode::Tools,
);
check($user);
dump($user);

// CASE 2 - OutputMode::JsonSchema
print("\n2. Extracting structured data using LLM - OutputMode::JsonSchema\n");
$user = $instructor->respond(
    messages: $text,
    responseModel: User::class,
    mode: OutputMode::JsonSchema,
);
check($user);
dump($user);

// CASE 3 - OutputMode::Json
print("\n3. Extracting structured data using LLM - OutputMode::Json\n");
$user = $instructor->respond(
    messages: $text,
    responseModel: User::class,
    mode: OutputMode::Json,
);
check($user);
dump($user);

// CASE 4 - OutputMode::MdJson
print("\n4. Extracting structured data using LLM - OutputMode::MdJson\n");
$user = $instructor->respond(
    messages: $text,
    responseModel: User::class,
    mode: OutputMode::MdJson,
);
check($user);
dump($user);

function check(User $user) {
    assert(isset($user->name));
    assert(isset($user->age));
    assert($user->name === 'Jason');
    assert($user->age === 25);
}
?>