- Install the `cognesy/instructor-struct` package
- Provide LLM provider credentials
## Installation
Instructor requires PHP 8.3 or later.
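Assuming you use Composer (the standard PHP package manager), installation is a single command, using the package name given above:

```shell
composer require cognesy/instructor-struct
```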
## Providing API Keys
Instructor reads provider credentials from environment variables. The simplest approach is to set them in your shell or a `.env` file at the root of your project:
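For example, a minimal `.env` for an OpenAI-backed setup might look like this; `OPENAI_API_KEY` is the conventional variable name, but check the preset for your provider to confirm which variables it reads:

```ini
# .env — never commit this file
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=...
```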
Never commit API keys to version control. Add `.env` to your `.gitignore` file.
## Preset-Based Setup
Presets are the fastest way to get started. A preset name maps to a provider configuration that reads credentials from the environment.

## Explicit Provider Configuration
When you need full control over the driver, model, API base URL, or other connection parameters, use `LLMConfig` directly:
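A sketch of explicit configuration, with namespace and parameter names inferred from the connection details listed in this guide rather than taken from a verified API; check them against your installed version:

```php
<?php

use Cognesy\Polyglot\Inference\Config\LLMConfig; // namespace is an assumption

// Hypothetical construction covering the connection details named in
// this guide: driver, model, API key, base URL, and max tokens.
$config = new LLMConfig(
    apiUrl:    'https://api.openai.com/v1',
    apiKey:    getenv('OPENAI_API_KEY'),
    model:     'gpt-4o-mini',
    maxTokens: 1024,
    driver:    'openai', // exact parameter name may differ
);
```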
You can also build an `LLMConfig` from an array for more detailed configuration:
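An array-based equivalent might look like the following; the `fromArray` factory name is an assumption, so confirm the exact method in your version:

```php
<?php

use Cognesy\Polyglot\Inference\Config\LLMConfig; // namespace is an assumption

$config = LLMConfig::fromArray([      // hypothetical factory method
    'apiUrl'    => 'https://api.openai.com/v1',
    'apiKey'    => getenv('OPENAI_API_KEY'),
    'model'     => 'gpt-4o-mini',
    'maxTokens' => 1024,
]);
```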
## Runtime Configuration
StructuredOutput handles single requests. When you need to configure behavior that
applies across multiple requests — retries, output mode, event listeners, or custom
pipeline extensions — use StructuredOutputRuntime:
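As a sketch of how this could look, with fluent method names that are illustrative of the responsibilities described here rather than a verified API:

```php
<?php

use Cognesy\Instructor\StructuredOutputRuntime; // namespace is an assumption

// Hypothetical fluent setters mirroring the runtime responsibilities:
// retries and event listeners that apply across multiple requests.
$runtime = (new StructuredOutputRuntime())
    ->withMaxRetries(3)
    ->wiretap(fn ($event) => error_log((string) $event)); // observe all events
```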
## What Belongs Where

Understanding the separation of concerns helps you structure your application:

| Layer | Responsibility | Examples |
|---|---|---|
| `LLMConfig` | Provider connection details | Driver, model, API key, base URL, max tokens |
| `StructuredOutputConfig` | Extraction behavior | Output mode, retry prompt template, schema naming |
| `StructuredOutputRuntime` | Runtime behavior | Max retries, event listeners, custom validators/transformers |
| `StructuredOutput` | Single request | Messages, response model, system prompt, examples |
## Output Modes
Instructor supports multiple strategies for getting structured output from the LLM. The default mode (Tools) uses the provider’s function/tool calling API. You can switch
modes via the runtime:
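For instance, requesting JSON mode could look like the following; the `OutputMode` cases are listed below, while the `withOutputMode` setter and the enum's namespace are assumed names:

```php
<?php

use Cognesy\Polyglot\Inference\Enums\OutputMode; // namespace is an assumption

// Assuming $runtime is an already configured runtime instance.
$runtime = $runtime->withOutputMode(OutputMode::Json); // hypothetical setter
```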
| Mode | Description |
|---|---|
| `OutputMode::Tools` | Uses the provider’s tool/function calling API (default) |
| `OutputMode::Json` | Requests JSON output via the provider’s JSON mode |
| `OutputMode::JsonSchema` | Sends a JSON Schema and requests strict conformance |
| `OutputMode::MdJson` | Asks the LLM to return JSON inside a Markdown code block |
| `OutputMode::Text` | Extracts JSON from unstructured text responses |
| `OutputMode::Unrestricted` | No output constraints; extraction is best-effort |
## Event Listeners
The runtime exposes a full event system for monitoring and debugging.

## Using a Local Model with Ollama
Instructor works with local models through Ollama. Install Ollama, pull a model, and point Instructor at the local endpoint.

## Framework Integration
Instructor is a standalone library that works in any PHP application. It does not require published config files, service providers, or framework-specific bindings. For Laravel-specific installation, configuration, facades, events, and testing, use the dedicated Laravel package docs.

## Next Steps
- Quickstart — run your first extraction
- Usage — the full request-building API
- Configuration — advanced configuration options
- Modes — output mode details and trade-offs
- LLM Providers — supported providers and driver options