Instructor is provider-agnostic. Provider selection is handled through LLMConfig and the Polyglot runtime that powers the underlying inference calls. Once the configuration is resolved, your application code stays the same regardless of which provider you use.

Selecting a Provider

There are two common ways to select a provider:
use Cognesy\Instructor\StructuredOutput;
use Cognesy\Polyglot\Inference\Config\LLMConfig;

// Use a named preset
$so = StructuredOutput::using('anthropic');

// Build from a config object
$so = StructuredOutput::fromConfig(
    LLMConfig::fromPreset('openai')
);
// @doctest id="72d2"
Presets are YAML files stored in the config/llm/presets directory. Each file defines the API URL, driver, default model, and other provider-specific settings.
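As an illustration, a minimal preset file might look like the following. The field names here mirror the LLMConfig constructor shown later on this page; the exact schema and key names of the shipped preset files may differ, so treat this as a sketch rather than a canonical preset.

```yaml
# config/llm/presets/my-provider.yaml -- hypothetical preset.
# Field names mirror the LLMConfig constructor; check a preset
# that ships with the library for the authoritative schema.
apiUrl: 'https://my-provider.example.com/v1'
apiKey: '%env(MY_PROVIDER_KEY)%'
model: 'my-model'
driver: 'openai-compatible'
maxTokens: 2048
```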

Supported Providers

The following providers have built-in presets:
Provider                        Preset name
A21                             a21
Anthropic                       anthropic
AWS Bedrock                     aws-bedrock
Azure OpenAI                    azure
Cerebras                        cerebras
Cohere                          cohere
DeepSeek                        deepseek
DeepSeek (Reasoning)            deepseek-r
Fireworks                       fireworks
Google Gemini                   gemini
Gemini (OpenAI-compatible)      gemini-oai
GLM                             glm
Groq                            groq
Hugging Face                    huggingface
Inception                       inception
Meta                            meta
MiniMaxi                        minimaxi
MiniMaxi (OpenAI-compatible)    minimaxi-oai
Mistral                         mistral
Moonshot / Kimi                 moonshot-kimi
Ollama                          ollama
OpenAI                          openai
OpenAI Responses                openai-responses
OpenRouter                      openrouter
Perplexity                      perplexity
Qwen                            qwen
SambaNova                       sambanova
Together                        together
xAI                             xai
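Any preset name from the table can be passed to StructuredOutput::using(), and the surrounding extraction code does not change when you switch providers. A minimal sketch, assuming a hypothetical Person response model (only the calls shown elsewhere on this page are used):

```php
use Cognesy\Instructor\StructuredOutput;

// Hypothetical response model -- a plain class with typed
// public properties serving as the extraction schema.
class Person {
    public string $name;
    public int $age;
}

// Swapping 'anthropic' for any other preset name in the table
// leaves the rest of this call unchanged.
$person = StructuredOutput::using('anthropic')
    ->with(
        messages: 'Jason is 28 years old.',
        responseModel: Person::class,
    )
    ->get();
```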

Custom Providers

Any OpenAI-compatible API can be used by building an LLMConfig manually:
use Cognesy\Polyglot\Inference\Config\LLMConfig;
use Cognesy\Instructor\StructuredOutput;

$config = new LLMConfig(
    apiUrl: 'https://my-provider.example.com/v1',
    apiKey: $_ENV['MY_PROVIDER_KEY'],
    model: 'my-model',
    driver: 'openai-compatible',
    maxTokens: 2048,
);

$result = StructuredOutput::fromConfig($config)
    ->with(
        messages: 'Extract the data.',
        responseModel: MyModel::class,
    )
    ->get();
// @doctest id="9c9a"