Each LLM provider has its own quirks. This section covers common provider-specific issues and how to resolve them.

OpenAI

  1. Organization IDs: Set the organization ID if you are using a shared account.

```php
// In config/llm.php
'metadata' => [
    'organization' => 'org-your-organization-id',
],
```
  2. API Versions: Pay attention to API version changes. Updates to the OpenAI API may require changes to your code, so monitor OpenAI's release notes.
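
One way to reduce exposure to API changes is to pin a dated model snapshot in your configuration instead of a floating alias. This is a sketch, not Polyglot's documented config; the snapshot name below is illustrative, so check OpenAI's model list for current versions.

```php
// In config/llm.php -- pinning a dated snapshot (illustrative name) means
// behavior does not change silently when the floating alias is updated.
'model' => 'gpt-4o-2024-08-06',
```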

Anthropic

  1. Message Format: Anthropic uses a different message format than OpenAI. Polyglot handles the conversion automatically, but be aware of the difference when debugging raw requests and responses.
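
For reference, Anthropic's Messages API takes the system prompt as a top-level `system` parameter rather than as a message with a `system` role, requires `max_tokens`, and expects the messages array to alternate `user` and `assistant` roles. A rough sketch of the two request shapes (model names are illustrative):

```php
// OpenAI-style request body: the system prompt travels as a message.
$openai = [
    'model' => 'gpt-4o',
    'messages' => [
        ['role' => 'system', 'content' => 'You are helpful.'],
        ['role' => 'user', 'content' => 'Hello'],
    ],
];

// Anthropic-style request body: the system prompt is a top-level field,
// and 'max_tokens' is required.
$anthropic = [
    'model' => 'claude-3-5-sonnet-20240620',
    'max_tokens' => 1024,
    'system' => 'You are helpful.',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello'],
    ],
];
```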
  2. Tool Support: Tool use with Anthropic has specific requirements. When using tools with Anthropic, check their latest documentation for supported features and limitations.
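
One concrete difference to watch for when debugging: Anthropic tool definitions put the JSON Schema under an `input_schema` field, where OpenAI uses `parameters`. An illustrative definition:

```php
// Anthropic tool definition: the JSON Schema goes under 'input_schema'
// (OpenAI's equivalent field is named 'parameters').
$tool = [
    'name' => 'get_weather',
    'description' => 'Get the current weather for a city',
    'input_schema' => [
        'type' => 'object',
        'properties' => [
            'city' => ['type' => 'string'],
        ],
        'required' => ['city'],
    ],
];
```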

Mistral

  1. Rate Limits: Mistral enforces strict rate limits, especially on the free tier. Implement more aggressive rate limiting and retry behavior for Mistral than for other providers.
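
The advice above can be sketched as a generic retry wrapper with exponential backoff; `retry_with_backoff` is a hypothetical helper shown for illustration, not part of Polyglot or Mistral's tooling:

```bash
# Hypothetical helper: run a command, retrying with exponential backoff
# on failure (e.g. when the Mistral API returns HTTP 429).
retry_with_backoff() {
    local max_attempts=5
    local delay=1
    local attempt=1
    while [ "$attempt" -le "$max_attempts" ]; do
        if "$@"; then
            return 0
        fi
        echo "Attempt $attempt failed; retrying in ${delay}s" >&2
        sleep "$delay"
        delay=$((delay * 2))
        attempt=$((attempt + 1))
    done
    return 1
}

# Usage: retry_with_backoff curl -sf https://api.mistral.ai/v1/models
```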

Ollama

  1. Local Setup: Ensure Ollama is properly installed and running.

```bash
# Check if Ollama is running
curl http://localhost:11434/api/version
```
  2. Model Availability: Download models before using them.

```bash
# Pull a model before using it
ollama pull llama2
```