Changing the LLM model and options
You can specify the model and other options that will be passed to the OpenAI / LLM endpoint.
<?php
use Cognesy\Instructor\Instructor;

$person = (new Instructor)->respond(
    messages: [['role' => 'user', 'content' => $text]],
    responseModel: Person::class,
    model: 'gpt-3.5-turbo',
    // options are passed through to the LLM API request
    options: [
        'temperature' => 0.0,
    ],
);
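The examples on this page reference a Person class and a $text variable that are assumed to be defined elsewhere. A minimal sketch of what they might look like:

<?php
// Hypothetical response model: Instructor populates
// public typed properties from the LLM output.
class Person
{
    public string $name;
    public int $age;
}

$text = 'Jason is 28 years old.';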
Providing a custom client
You can pass a custom-configured client instance to Instructor. This allows you to specify your own API key, base URI, or organization.
<?php
use Cognesy\Instructor\Features\LLM\Data\LLMConfig;
use Cognesy\Instructor\Features\LLM\Drivers\OpenAIDriver;
use Cognesy\Instructor\Instructor;

// Configure the driver with your own endpoint, credentials, and defaults.
$driver = new OpenAIDriver(new LLMConfig(
    apiUrl: 'https://api.openai.com/v1',
    apiKey: $yourApiKey,
    endpoint: '/chat/completions',
    metadata: ['organization' => ''],
    model: 'gpt-4o-mini',
    maxTokens: 128,
));
$instructor = (new Instructor)->withDriver($driver);

$person = $instructor->respond(
    messages: [['role' => 'user', 'content' => $text]],
    responseModel: Person::class,
    model: 'gpt-3.5-turbo', // model specified per request
    options: ['temperature' => 0.0],
);
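Because the base URI is configurable, the same approach can point Instructor at any OpenAI-compatible endpoint. A sketch, assuming a hypothetical local server exposing the OpenAI API at http://localhost:8080/v1 (the URL, key, and model name below are placeholders, not values from this library):

<?php
use Cognesy\Instructor\Features\LLM\Data\LLMConfig;
use Cognesy\Instructor\Features\LLM\Drivers\OpenAIDriver;
use Cognesy\Instructor\Instructor;

// Assumed local OpenAI-compatible server; adjust to your setup.
$driver = new OpenAIDriver(new LLMConfig(
    apiUrl: 'http://localhost:8080/v1',
    apiKey: 'not-needed-locally',
    endpoint: '/chat/completions',
    metadata: ['organization' => ''],
    model: 'llama-3.1-8b-instruct',
    maxTokens: 256,
));

$instructor = (new Instructor)->withDriver($driver);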