Debugging LLM interactions is essential for troubleshooting and optimizing your applications. Polyglot's debug mode enables HTTP-level debugging, letting you inspect the requests sent to the LLM provider and the responses received, so you can identify issues and improve performance.

Polyglot provides a simple way to enable HTTP debug mode:
```php
<?php

use Cognesy\Http\Creation\HttpClientBuilder;
use Cognesy\Polyglot\Inference\Inference;
use Cognesy\Polyglot\Inference\InferenceRuntime;

// Enable HTTP debug middleware in the HTTP client used by the runtime
$http = (new HttpClientBuilder())
    ->withHttpDebugPreset('on')
    ->create();

$inference = Inference::fromRuntime(InferenceRuntime::using(
    preset: 'openai',
    httpClient: $http,
));

// Make a request - debug output will show the request and response details
$response = $inference
    ->with(messages: 'What is the capital of France?')
    ->get();
// @doctest id="505a"
```