Troubleshooting
Debugging
Overview
The Instructor class has a withDebug() method that can be used to debug the request and response. It displays detailed information about the request being sent to the LLM API and the response received from it, including:
- request headers, URI, method and body,
- response status, headers, and body.
This is useful for diagnosing problems when you are not getting the expected results.
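In its simplest form, you just chain withDebug() before the call you want to inspect. A minimal sketch (assuming an 'openai' connection is configured; the City class and the input text are made up for illustration):

```php
<?php
require 'vendor/autoload.php';

use Cognesy\Instructor\Instructor;

class City {
    public string $name;
    public int $population;
}

// withDebug() enables dumping of the raw HTTP request and response
// details for calls made through this Instructor instance.
$city = (new Instructor)
    ->withConnection('openai') // assumes an 'openai' connection in your config
    ->withDebug()
    ->respond(
        messages: "Paris has a population of about 2100000 people.",
        responseModel: City::class,
    );
?>
```

The full example below shows the same mechanism for synchronous, streaming, and failing requests.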
Example
<?php
$loader = require 'vendor/autoload.php';
$loader->add('Cognesy\\Instructor\\', __DIR__ . '/../../src/');
use Cognesy\Instructor\Features\LLM\Data\LLMConfig;
use Cognesy\Instructor\Features\LLM\Drivers\OpenAIDriver;
use Cognesy\Instructor\Instructor;
class User {
    public int $age;
    public string $name;
}
// CASE 1 - normal flow
$instructor = (new Instructor)->withConnection('openai');
echo "\n### CASE 1.1 - Debugging sync request\n\n";
$user = $instructor->withDebug()->respond(
    messages: "Jason is 25 years old.",
    responseModel: User::class,
    options: [ 'stream' => false ]
);
echo "\nResult:\n";
dump($user);
echo "\n### CASE 1.2 - Debugging streaming request\n\n";
$user2 = $instructor->withDebug()->respond(
    messages: "Anna is 21 years old.",
    responseModel: User::class,
    options: [ 'stream' => true ]
);
echo "\nResult:\n";
dump($user2);
assert(isset($user->name));
assert(isset($user->age));
assert($user->name === 'Jason');
assert($user->age === 25);
assert(isset($user2->name));
assert(isset($user2->age));
assert($user2->name === 'Anna');
assert($user2->age === 21);
// CASE 2 - forcing API error via empty LLM config
$driver = new OpenAIDriver(new LLMConfig());
$instructor = (new Instructor)->withDriver($driver);
echo "\n### CASE 2 - Debugging exception\n\n";
try {
    $user = $instructor->withDebug()->respond(
        messages: "Jason is 25 years old.",
        responseModel: User::class,
        options: [ 'stream' => true ]
    );
} catch (Exception $e) {
    echo "\nCaught it:\n";
    dump($e);
}
?>