Each LLM provider has its own quirks. This section covers common provider-specific issues and how to resolve them.

OpenAI

  1. Organization IDs: If your API key belongs to more than one organization, set the organization ID explicitly
// In config/llm.php
'metadata' => [
    'organization' => 'org-your-organization-id',
],
// @doctest id="4a1b"
  2. API Versions: Watch for breaking changes in the OpenAI API
// Updates to the OpenAI API may require changes to your code
// Monitor OpenAI's changelog and deprecation notices
// @doctest id="9424"
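One way to reduce exposure to API changes is to pin a dated model snapshot instead of a rolling alias. This is a sketch: the `model` key name is an assumption and may differ in your Polyglot version.

```php
<?php
// config/llm.php (sketch — the 'model' key name is an assumption; adapt to your setup)
return [
    // Prefer a dated snapshot such as 'gpt-4o-2024-08-06' over the rolling
    // 'gpt-4o' alias, so model behavior doesn't change underneath you.
    'model' => 'gpt-4o-2024-08-06',
];
```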

Anthropic

  1. Message Format: Anthropic's message format differs from OpenAI's (for example, the system prompt is a top-level parameter rather than a message)
// Polyglot handles this automatically, but be aware of it when debugging raw requests
// @doctest id="0b5e"
  2. Tool Support: Tool use has provider-specific requirements
// When using tools with Anthropic, check their latest documentation
// for supported features and limitations
// @doctest id="3a8e"
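For reference when inspecting raw requests, Anthropic's Messages API puts the system prompt in a top-level `system` field, requires `max_tokens`, and only allows `user` and `assistant` roles in `messages`. A minimal request body looks roughly like this (model name shown for illustration):

```json
{
  "model": "claude-3-5-sonnet-20241022",
  "max_tokens": 1024,
  "system": "You are a helpful assistant.",
  "messages": [
    {"role": "user", "content": "Hello"}
  ]
}
```

Polyglot translates between this shape and OpenAI-style message arrays for you, so this matters mainly when reading logged payloads.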

Mistral

  1. Rate Limits: Mistral enforces strict rate limits on its free tier
// Implement more aggressive client-side rate limiting for Mistral
// @doctest id="2ff1"
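A minimal client-side throttle might look like the sketch below. It is not part of Polyglot's API: `throttle` is a hypothetical helper, and the one-request-per-second default is an assumption — check Mistral's published limits for your tier.

```php
<?php
// Sketch: space out requests client-side so free-tier limits aren't exceeded.
// The 1 req/sec default is an assumption — verify against Mistral's docs.
function throttle(callable $request, float $minIntervalSec = 1.0)
{
    static $lastCall = 0.0;

    $elapsed = microtime(true) - $lastCall;
    if ($elapsed < $minIntervalSec) {
        // Sleep just long enough to respect the minimum interval.
        usleep((int) round(($minIntervalSec - $elapsed) * 1000000));
    }

    $result = $request();
    $lastCall = microtime(true);
    return $result;
}

// Usage: wrap whatever call you make through Polyglot.
// $response = throttle(fn () => $inference->create(...));
```

For production use, prefer honoring `Retry-After` headers on 429 responses over a fixed interval.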

Ollama

  1. Local Setup: Ensure Ollama is properly installed and running
# Check if Ollama is running
curl http://localhost:11434/api/version
# @doctest id="94a1"
  2. Model Availability: Download models before using them
# Pull a model before using it
ollama pull llama2
# @doctest id="bee8"