Learn how to configure different LLM providers and models in Polyglot.
Polyglot uses two configuration files:

- `config/llm.php`: Contains configurations for LLM providers (chat/completion)
- `config/embed.php`: Contains configurations for embedding providers

The `llm.php` configuration file has the following structure:
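For illustration, a minimal `llm.php` might look like the sketch below. The per-preset option names come from the list that follows; the top-level layout (`defaultPreset`, `presets`), the preset name, and all values are assumptions, not the library's guaranteed format:

```php
<?php
// config/llm.php — illustrative sketch; top-level layout and values are assumptions
return [
    'defaultPreset' => 'openai',        // preset used when none is specified (assumed key)
    'presets' => [
        'openai' => [
            'providerType'    => 'openai',                    // driver implementation to use
            'apiUrl'          => 'https://api.openai.com/v1', // base URL of the provider API
            'apiKey'          => getenv('OPENAI_API_KEY') ?: '',
            'endpoint'        => '/chat/completions',         // chat completion endpoint
            'metadata'        => [],                          // provider-specific extras
            'model'           => 'gpt-4o-mini',               // default model
            'maxTokens'       => 1024,                        // default max response tokens
            'contextLength'   => 128_000,                     // model context window
            'maxOutputLength' => 16_384,                      // model max output length
        ],
    ],
];
```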
The `embed.php` configuration file follows a similar pattern:
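A matching `embed.php` sketch, under the same assumptions (layout, preset name, and values are illustrative):

```php
<?php
// config/embed.php — illustrative sketch; layout and values are assumptions
return [
    'defaultPreset' => 'openai',
    'presets' => [
        'openai' => [
            'providerType'      => 'openai',
            'apiUrl'            => 'https://api.openai.com/v1',
            'apiKey'            => getenv('OPENAI_API_KEY') ?: '',
            'endpoint'          => '/embeddings',              // embeddings endpoint
            'metadata'          => [],
            'model'             => 'text-embedding-3-small',
            'defaultDimensions' => 1536,  // default embedding vector size
            'maxInputs'         => 16,    // max inputs per request
        ],
    ],
];
```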
Each `llm.php` preset supports the following options:

- `providerType`: The type of provider (OpenAI, Anthropic, etc.)
- `apiUrl`: The base URL for the provider's API
- `apiKey`: The API key for authentication
- `endpoint`: The specific API endpoint for chat completions
- `metadata`: Additional provider-specific settings
- `model`: The default model to use
- `maxTokens`: Default maximum tokens for responses
- `contextLength`: Maximum context length supported by the model
- `maxOutputLength`: Maximum output length supported by the model
- `httpClient`: (Optional) Custom HTTP client to use

Each `embed.php` preset supports:

- `providerType`: The type of provider (OpenAI, Anthropic, etc.)
- `apiUrl`: The base URL for the provider's API
- `apiKey`: The API key for authentication
- `endpoint`: The specific API endpoint for embeddings
- `metadata`: Additional provider-specific settings
- `model`: The default model to use
- `defaultDimensions`: The default dimensions of embedding vectors
- `maxInputs`: Maximum number of inputs that can be processed in a single request

Note that `llm.php` contains a list of connection presets whose default names may resemble provider type names, but the two are separate concepts.
A provider type name refers to one of the supported LLM API providers and its underlying driver implementation, either provider-specific or generic, for example OpenAI-compatible (`openai-compatible`). A connection preset name refers to an LLM API endpoint configuration that specifies a provider type along with the URL, credentials, default model name, and default model parameter values.
Set your API keys in a `.env` file in your project root:
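For example (the key names shown are typical provider conventions, not a fixed list; add only the providers you use):

```shell
# .env — example entries; values are placeholders
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```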
Then load it with `vlucas/phpdotenv`:
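A minimal loading snippet, assuming phpdotenv v5 and that the `.env` file sits next to this script:

```php
<?php
// Load variables from .env into the environment (phpdotenv v5 API)
use Dotenv\Dotenv;

require __DIR__ . '/vendor/autoload.php';

$dotenv = Dotenv::createImmutable(__DIR__);
$dotenv->load();
```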
Depending on your phpdotenv version, you may need the `create()` method instead of `createImmutable()`.
To change the defaults, you can edit the `config/llm.php` and `config/embed.php` files directly:
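For example, to point a preset at a different default model (preset name and top-level layout are assumed, as in the sketches above):

```php
<?php
// config/llm.php — illustrative edit: switch the default model for one preset
return [
    'defaultPreset' => 'openai',
    'presets' => [
        'openai' => [
            'providerType' => 'openai',
            'apiUrl'       => 'https://api.openai.com/v1',
            'apiKey'       => getenv('OPENAI_API_KEY') ?: '',
            'endpoint'     => '/chat/completions',
            'model'        => 'gpt-4o',  // changed from 'gpt-4o-mini'
            'maxTokens'    => 2048,      // raised default response budget
        ],
    ],
];
```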
You can also build a configuration programmatically with the `LLMConfig` class:
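A hedged sketch of programmatic configuration; the namespace and constructor signature are assumptions based on the option names above and may differ between library versions, so check your installed release:

```php
<?php
// Illustrative only — verify the exact namespace and signature in your version
use Cognesy\Polyglot\LLM\Data\LLMConfig;

$config = new LLMConfig(
    apiUrl:          'https://api.openai.com/v1',
    apiKey:          getenv('OPENAI_API_KEY') ?: '',
    endpoint:        '/chat/completions',
    model:           'gpt-4o-mini',
    maxTokens:       1024,
    contextLength:   128_000,
    maxOutputLength: 16_384,
    providerType:    'openai',
);
```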
Custom inference drivers must implement the `CanHandleInference` interface.