Instructor for PHP
Cookbook \ Instructor \ Basics
Basic
Basic constructor
Basic get set
Basic private vs public fields
Basic via mixin
Fluent API
Handling errors with `Maybe` helper class
Messages
Mixed Type Property
Modes
Making some fields optional
Automatic correction based on validation results
Using attributes
Using LLM API connection presets from config file
Validation
Validation custom
Validation mixin
Validation with LLM
Cookbook \ Instructor \ Advanced
Use custom configuration providers
Context caching
Context caching (OpenAI)
Custom client parameters
Use custom HTTP client instance
Use custom HTTP client instance - Laravel
Custom prompts
DSN
Function arguments
Manual Schema Building
Partial updates
Providing examples
Extracting scalar values
Extracting sequences of objects
Streaming
Structures
Cookbook \ Instructor \ Troubleshooting
Debugging
Laravel Logging Integration
Monolog Logging with Functional Pipeline
PSR-3 Logging with Functional Pipeline
Symfony Logging Integration
Receive specific internal event with onEvent()
Modifying Settings Path
Token usage
Receive all internal events with wiretap()
Cookbook \ Instructor \ LLM API Support
AI21
Anthropic
Azure OpenAI
Cerebras
Cohere2
DeepSeek
Fireworks AI
Gemini
Gemini (OpenAI-compatible)
Groq
Hugging Face
Inception
Meta
Minimaxi
Mistral
Moonshot
Local / Ollama
OpenAI
OpenRouter
Perplexity
SambaNova
Together AI
xAI
Cookbook \ Instructor \ Extras
Extraction of complex objects
Extraction of complex objects (Anthropic)
Extraction of complex objects (Cohere)
Extraction of complex objects (Gemini)
Custom Content Extractors
Data inputs
Image processing - car damage detection
Image to data (OpenAI)
Image to data (Anthropic)
Image to data (Gemini)
JSON Schema
Return extracted data as array
Use different class for schema and output
Streaming with array output format
Pure Array Processing (No Classes)
Generating JSON Schema from PHP classes
Generating JSON Schema dynamically
Create tasks from meeting transcription
Translating UI text fields
Web page to PHP objects
Cookbook \ Polyglot \ LLM Basics
LLM
Working directly with LLMs and JSON - JSON mode
Working directly with LLMs and JSON - JSON Schema mode
Working directly with LLMs and JSON - MdJSON mode
Working directly with LLMs and JSON - Tools mode
LLM with schema helper
LLM with tools helper
Cookbook \ Polyglot \ LLM Advanced
Customize configuration providers of LLM driver
Context caching (text inference)
Context caching (OpenAI)
Custom client parameters
Custom embeddings driver
Using custom LLM driver
DSN
Embeddings utils
Embeddings
Work directly with HTTP client facade
Parallel Calls
Reasoning Content Access
Cookbook \ Polyglot \ LLM Troubleshooting
Debugging HTTP Calls
Logging laravel embeddings
Logging laravel inference
Logging monolog
Logging symfony
Cookbook \ Polyglot \ LLM API Support
AI21
Anthropic
Azure OpenAI
Cerebras
Cohere
DeepSeek
Fireworks AI
Gemini
Gemini (OpenAI-compatible)
Groq
Inception
Meta
Minimaxi
Mistral
Moonshot
Local / Ollama
OpenAI
OpenRouter
Perplexity
SambaNova
Together AI
xAI
Cookbook \ Polyglot \ LLM Extras
Basic Agent Usage
Basic Agent Control Usage
Agent Control Events & Monitoring
Agent Control Streaming
Agent Control Runtime Switching
Agent file system
Agent-Driven Codebase Search
Agent self-critic
Agent with Structured Output Extraction
Agent Subagent Orchestration
Multi-Participant AI Chat Panel Discussion
Chat with summary
Claude Code
Claude Code CLI - Agentic Search
OpenAI Codex CLI - Basic
OpenAI Codex CLI - Streaming
Image
Metrics collection
OpenCode CLI - Basic
OpenCode CLI - Streaming
Prompt template
Summary llm
Inference and tool use
Tool use (ReAct)
Cookbook \ Prompting \ Zero-Shot Prompting
Assign a Role
Auto-Refine The Prompt
Clarify Ambiguous Information
Define Style
Emotional language
Generate Follow-Up Questions
Ask Model to Repeat the Query
Simulate a Perspective
Cookbook \ Prompting \ Few-Shot Prompting
Consistency based examples
Example ordering
Generate In-Context Examples
Select effective samples
Cookbook \ Prompting \ Thought Generation
Analogical Prompting
Automate Example Selection
Prioritize Complex Examples
Examine The Context
Higher level context
Include Incorrect Examples
Use Majority Voting
Generate Prompt Variations
Structure The Reasoning
Uncertain examples
Cookbook \ Prompting \ Ensembling
Combine Multiple Reasoning Chains
Use LLMs to Combine Different Responses
Combine Specialized LLMs
Prioritize Consistent Examples
Use Distinct Example Subsets
Use Ensembles To Test Prompts
Generate Multiple Candidate Responses
Use Task Specific Evaluation Metrics
Use Translation for Paraphrasing
Verify Responses over Majority Voting
Cookbook \ Prompting \ Self-Criticism
Break Down Reasoning Into Multiple Steps
Determine Uncertainty of Reasoning Chain
Improve With Feedback
Reconstruct Prompt from Reasoning Steps
Self-Verify Responses
Independently Verify Responses
Cookbook \ Prompting \ Decomposition
Break Down Complex Tasks
Ditch Vanilla Chain Of Thought
Generate Code for Intermediate Steps
Generate in Parallel
Solve Simpler Subproblems
Leverage Task Specific Systems
Cookbook \ Prompting \ Miscellaneous
Arbitrary props
Arbitrary props consistency
Chain of Summaries
Chain of Thought
Single label classification
Multiclass classification
Entity relationship extraction
Handling errors
HTTP client
HTTP client streaming
HTTP Middleware (Hooks + Conditional Decoration)
HTTP Middleware (Stream)
HTTP Middleware (Sync)
HTTP pool
Limiting length of lists
Reflection Prompting
Restating instructions
Rewriting instructions
Search criteria
Summary
Time range
Time range with CoT
404 - Page Not Found
We couldn't find the page.