Provider objects sit between configuration and runtime assembly. They resolve config values from presets, arrays, or explicit objects, and optionally carry an explicit driver instance. Runtimes use providers to determine which driver to build and how to configure it.

LLMProvider

LLMProvider is a builder that wraps an LLMConfig and an optional explicit driver. It implements CanResolveLLMConfig and HasExplicitInferenceDriver, which the runtime uses during assembly. Namespace: Cognesy\Polyglot\Inference\LLMProvider

Creating a Provider

use Cognesy\Polyglot\Inference\LLMProvider;
use Cognesy\Polyglot\Inference\Config\LLMConfig;

// From a named preset
$provider = LLMProvider::using('openai');

// With a custom base path for presets
$provider = LLMProvider::using('openai', basePath: '/path/to/presets');

// From an explicit config
$provider = LLMProvider::fromLLMConfig($config);

// From an array
$provider = LLMProvider::fromArray([
    'driver' => 'anthropic',
    'apiUrl' => 'https://api.anthropic.com/v1',
    'apiKey' => getenv('ANTHROPIC_API_KEY'),
    'endpoint' => '/messages',
    'model' => 'claude-sonnet-4-20250514',
]);

// Default (OpenAI with gpt-4.1-nano)
$provider = LLMProvider::new();
// @doctest id="0c0d"

Customizing a Provider

All mutators return a new immutable instance:
// Override specific config values
$provider = LLMProvider::using('openai')
    ->withModel('gpt-4.1')
    ->withConfigOverrides(['maxTokens' => 4096]);

// Replace the entire config
$provider = $provider->withLLMConfig($newConfig);

// Inject an explicit driver (bypasses the driver factory)
$provider = $provider->withDriver($customDriver);
// @doctest id="0575"
When an explicit driver is set, the runtime uses it directly instead of building one from the config. This is useful for testing or for providers that need custom initialization.
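As a testing sketch, a stubbed driver can be injected so the runtime never touches HTTP. The wiring below is illustrative: the contract namespace is assumed, and `createStub` presumes a PHPUnit test context.

```php
use Cognesy\Polyglot\Inference\Contracts\CanProcessInferenceRequest; // namespace assumed
use Cognesy\Polyglot\Inference\LLMProvider;

// Inside a PHPUnit test case: stub the driver contract so no real API is called.
$stubDriver = $this->createStub(CanProcessInferenceRequest::class);

// The runtime uses $stubDriver directly and bypasses the driver registry entirely.
$provider = LLMProvider::new()->withDriver($stubDriver);
$runtime  = InferenceRuntime::fromProvider($provider);
```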

How the Runtime Uses It

When you call InferenceRuntime::fromProvider($provider), the runtime:
  1. Calls $provider->resolveConfig() to get the LLMConfig
  2. Checks if $provider->explicitInferenceDriver() returns a driver
  3. If an explicit driver exists, uses it directly
  4. Otherwise, looks up the driver name from the config and creates one via the InferenceDriverRegistry
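The assembly order above can be sketched in plain PHP. All names here are simplified stand-ins, not the library's actual classes; only the resolution logic mirrors the four steps.

```php
<?php
interface Driver {}

final class FakeRegistry {
    // Stand-in for InferenceDriverRegistry: builds a driver from its name.
    public function makeDriver(string $name): Driver {
        return new class implements Driver {};
    }
}

final class Provider {
    public function __construct(
        private array $config,            // resolved config values
        private ?Driver $explicit = null  // optional pre-built driver
    ) {}
    public function resolveConfig(): array { return $this->config; }
    public function explicitDriver(): ?Driver { return $this->explicit; }
}

// Mirrors steps 1-4: an explicit driver wins; otherwise the registry builds one.
function assemble(Provider $provider, FakeRegistry $registry): Driver {
    $config   = $provider->resolveConfig();          // step 1
    $explicit = $provider->explicitDriver();         // step 2
    if ($explicit !== null) {                        // step 3
        return $explicit;
    }
    return $registry->makeDriver($config['driver']); // step 4
}

$stub     = new class implements Driver {};
$registry = new FakeRegistry();

// With an explicit driver, the registry is bypassed; without one, it is used.
$usesExplicit = assemble(new Provider(['driver' => 'openai'], $stub), $registry) === $stub;
$usesRegistry = assemble(new Provider(['driver' => 'openai']), $registry) instanceof Driver;
echo ($usesExplicit && $usesRegistry) ? "ok\n" : "fail\n";
```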

EmbeddingsProvider

EmbeddingsProvider serves the same role for embeddings. It wraps an EmbeddingsConfig and an optional explicit driver. Namespace: Cognesy\Polyglot\Embeddings\EmbeddingsProvider

Creating a Provider

use Cognesy\Polyglot\Embeddings\EmbeddingsProvider;
use Cognesy\Polyglot\Embeddings\Config\EmbeddingsConfig;

// Default (empty config)
$provider = EmbeddingsProvider::new();

// From an explicit config
$provider = EmbeddingsProvider::fromEmbeddingsConfig($config);

// From an array
$provider = EmbeddingsProvider::fromArray([
    'driver' => 'openai',
    'apiUrl' => 'https://api.openai.com/v1',
    'apiKey' => getenv('OPENAI_API_KEY'),
    'endpoint' => '/embeddings',
    'model' => 'text-embedding-3-small',
]);
// @doctest id="0322"
Unlike LLMProvider, EmbeddingsProvider does not have a using(...) shortcut for presets. Use Embeddings::using(...) or construct the config explicitly.

Customizing a Provider

$provider = EmbeddingsProvider::fromArray([...])
    ->withConfigOverrides(['dimensions' => 256])
    ->withDriver($customDriver);
// @doctest id="a777"

Driver Factories

Inference Driver Registry

The InferenceDriverRegistry manages the mapping between driver names and their factory callables. Polyglot ships with a default set of bundled drivers via BundledInferenceDrivers::registry(). Supported inference drivers include:
Driver Name | Class | Notes
----------- | ----- | -----
a21 | A21Driver | A21 Labs
anthropic | AnthropicDriver | Anthropic Messages API
azure | AzureDriver | Azure OpenAI
bedrock-openai | BedrockOpenAIDriver | AWS Bedrock (OpenAI-compatible)
cerebras | CerebrasDriver | Cerebras
cohere | CohereV2Driver | Cohere v2
deepseek | DeepseekDriver | DeepSeek
fireworks | FireworksDriver | Fireworks AI
gemini | GeminiDriver | Google Gemini native API
gemini-oai | GeminiOAIDriver | Gemini via OpenAI-compatible endpoint
glm | GlmDriver | GLM
groq | GroqDriver | Groq
huggingface | HuggingFaceDriver | Hugging Face
inception | InceptionDriver | Inception
meta | MetaDriver | Meta Llama API
minimaxi | MinimaxiDriver | Minimaxi
mistral | MistralDriver | Mistral
openai | OpenAIDriver | OpenAI Chat Completions API
openai-responses | OpenAIResponsesDriver | OpenAI Responses API
openresponses | OpenResponsesDriver | Open Responses API
openrouter | OpenRouterDriver | OpenRouter
perplexity | PerplexityDriver | Perplexity
qwen | QwenDriver | Alibaba Qwen
sambanova | SambaNovaDriver | SambaNova
xai | XAiDriver | xAI (Grok)
moonshot | OpenAICompatibleDriver | Moonshot (via OpenAI-compatible)
ollama | OpenAICompatibleDriver | Ollama (via OpenAI-compatible)
openai-compatible | OpenAICompatibleDriver | Generic OpenAI-compatible APIs
together | OpenAICompatibleDriver | Together AI (via OpenAI-compatible)
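The generic openai-compatible driver is what lets you point at any OpenAI-compatible server that has no dedicated driver, such as a local Ollama instance. A sketch, where the URL, endpoint, and model values are illustrative rather than required defaults:

```php
use Cognesy\Polyglot\Inference\LLMProvider;

// Route a local Ollama server through the generic OpenAI-compatible driver.
$provider = LLMProvider::fromArray([
    'driver'   => 'openai-compatible',
    'apiUrl'   => 'http://localhost:11434/v1',
    'apiKey'   => 'ollama', // placeholder; Ollama ignores the key
    'endpoint' => '/chat/completions',
    'model'    => 'llama3.2',
]);
```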
You can extend the registry with custom drivers:
use Cognesy\Polyglot\Inference\Creation\InferenceDriverRegistry;
use Cognesy\Polyglot\Inference\Creation\BundledInferenceDrivers;

$registry = BundledInferenceDrivers::registry()
    ->withDriver('my-provider', MyCustomDriver::class);

$runtime = InferenceRuntime::fromConfig($config, drivers: $registry);
// @doctest id="e79b"
A custom driver can be registered as a class name (must accept LLMConfig, CanSendHttpRequests, and CanHandleEvents in its constructor) or as a callable factory:
$registry = $registry->withDriver('my-provider', function ($config, $httpClient, $events) {
    return new MyCustomDriver($config, $httpClient, $events);
});
// @doctest id="863f"
You can also remove drivers from the registry:
$registry = $registry->withoutDriver('openai-compatible');
// @doctest id="9de4"

Embeddings Driver Registry

The EmbeddingsDriverRegistry follows the same immutable instance-based pattern as InferenceDriverRegistry. Bundled embeddings drivers are provided via BundledEmbeddingsDrivers::registry() and include: openai, azure, cohere, gemini, jina, mistral, and ollama. Custom embeddings drivers can be registered through the registry:
use Cognesy\Polyglot\Embeddings\Creation\BundledEmbeddingsDrivers;

$registry = BundledEmbeddingsDrivers::registry()
    ->withDriver('my-provider', MyEmbeddingsDriver::class);

$runtime = EmbeddingsRuntime::fromConfig($config, drivers: $registry);
// @doctest id="a033"
Or with a factory callable:
$registry = $registry->withDriver('my-provider', function ($config, $httpClient, $events) {
    return new MyEmbeddingsDriver($config, $httpClient, $events);
});
// @doctest id="e535"
Both InferenceDriverRegistry and EmbeddingsDriverRegistry use immutable instance-based registration, so driver registrations can vary per runtime.
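Because `withDriver(...)` returns a new registry instance, registrations never leak between runtimes. A sketch, where `MyCustomDriver` and the two `$config` values are assumed placeholders:

```php
use Cognesy\Polyglot\Inference\Creation\BundledInferenceDrivers;

// Two runtimes with different driver sets; the custom registration
// exists only in $customRegistry, not in $defaultRegistry.
$defaultRegistry = BundledInferenceDrivers::registry();
$customRegistry  = $defaultRegistry->withDriver('my-provider', MyCustomDriver::class);

$runtimeA = InferenceRuntime::fromConfig($configA, drivers: $defaultRegistry);
$runtimeB = InferenceRuntime::fromConfig($configB, drivers: $customRegistry);
```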

Key Contracts

The provider system is built on a small set of interfaces:

Provider Contracts

Interface | Purpose
--------- | -------
CanResolveLLMConfig | Returns an LLMConfig from a provider
HasExplicitInferenceDriver | Optionally returns a pre-built inference driver
CanAcceptLLMConfig | Allows setting an LLMConfig on a provider
CanResolveEmbeddingsConfig | Returns an EmbeddingsConfig from a provider
HasExplicitEmbeddingsDriver | Optionally returns a pre-built embeddings driver

Driver Contracts

Interface | Purpose
--------- | -------
CanProcessInferenceRequest | Main inference driver contract (make responses, stream deltas, report capabilities)
CanHandleVectorization | Main embeddings driver contract (handle requests, parse responses)
CanProvideInferenceDrivers | Registry that creates inference drivers by name

Adapter Contracts

Interface | Purpose
--------- | -------
CanTranslateInferenceRequest | Converts InferenceRequest to HttpRequest
CanTranslateInferenceResponse | Converts HttpResponse to InferenceResponse or stream deltas
CanMapMessages | Maps typed Messages to provider format
CanMapRequestBody | Assembles the request body
CanMapUsage | Extracts token usage from response data
The driver contract CanProcessInferenceRequest also includes a capabilities() method that reports what features a driver supports (e.g., streaming, tool calls, structured output). This can be used to make runtime decisions about which features to use with a given provider:
$driver->capabilities()->supportsStreaming;
$driver->capabilities('deepseek-reasoner')->supportsToolCalls;
// @doctest id="162b"
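Such a runtime decision might look like the branch below. The capability flags come from the source above; the surrounding fallback logic is an illustrative sketch, not library code:

```php
// Prefer streaming, but fall back to a single complete response
// when the driver reports that it cannot stream.
if ($driver->capabilities()->supportsStreaming) {
    // ...consume stream deltas as they arrive
} else {
    // ...request one complete InferenceResponse instead
}
```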