Polyglot - LLM Advanced
Customize parameters of LLM driver
Overview
You can provide your own LLM configuration instance to the `Inference` object. This is useful when you want to initialize the LLM client with custom values.
Example
<?php
require 'examples/boot.php';

use Cognesy\Polyglot\LLM\Data\LLMConfig;
use Cognesy\Polyglot\LLM\Enums\LLMProviderType;
use Cognesy\Polyglot\LLM\Inference;
use Cognesy\Utils\Env;
use Cognesy\Utils\Str;

// Create an instance of the LLM client initialized with custom parameters
$config = new LLMConfig(
    apiUrl: 'https://api.deepseek.com',
    apiKey: Env::get('DEEPSEEK_API_KEY'),
    endpoint: '/chat/completions',
    model: 'deepseek-chat',
    maxTokens: 128,
    httpClient: 'guzzle',
    providerType: LLMProviderType::OpenAICompatible->value,
);

$answer = (new Inference)
    ->withConfig($config)
    ->create(
        messages: [['role' => 'user', 'content' => 'What is the capital of France']],
        options: ['max_tokens' => 64]
    )
    ->toText();

echo "USER: What is the capital of France\n";
echo "ASSISTANT: $answer\n";

assert(Str::contains($answer, 'Paris'));
?>
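The same pattern works for any OpenAI-compatible provider: only the URL, API key, and model name change. The sketch below builds a `LLMConfig` for Groq's OpenAI-compatible endpoint; the base URL, model name, and the `GROQ_API_KEY` environment variable are illustrative assumptions, not values defined by this library.

```php
<?php
require 'examples/boot.php';

use Cognesy\Polyglot\LLM\Data\LLMConfig;
use Cognesy\Polyglot\LLM\Enums\LLMProviderType;
use Cognesy\Utils\Env;

// Hypothetical configuration for another OpenAI-compatible provider (Groq).
// The apiUrl, model, and GROQ_API_KEY env variable are assumptions for
// illustration - substitute the values your provider documents.
$config = new LLMConfig(
    apiUrl: 'https://api.groq.com/openai/v1',
    apiKey: Env::get('GROQ_API_KEY'),
    endpoint: '/chat/completions',
    model: 'llama-3.3-70b-versatile',
    maxTokens: 128,
    httpClient: 'guzzle',
    providerType: LLMProviderType::OpenAICompatible->value,
);
?>
```

Once constructed, the config is passed to `Inference` via `withConfig()` exactly as in the example above.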