In your project directory, create a new PHP file test-instructor.php:
```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Cognesy\Instructor\StructuredOutput;

// Set up the OpenAI API key.
$apiKey = 'your-openai-api-key';
putenv("OPENAI_API_KEY=" . $apiKey);
// WARNING: In a real project, set the API key in your .env file instead.

// Step 1: Define the target data structure(s).
class City {
    public string $name;
    public string $country;
    public int $population;
}

// Step 2: Use Instructor to run LLM inference.
$city = (new StructuredOutput)
    ->using('openai')
    ->withResponseClass(City::class)
    ->withMessages('What is the capital of France?')
    ->get();

var_dump($city);
```
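If the call succeeds, `get()` returns a hydrated `City` instance, so its fields are available as ordinary typed properties. A minimal sketch of reading the result (the `echo` line is illustrative, not part of Instructor's API):

```php
// Run with: php test-instructor.php
// $city is a plain City object populated from the model's structured response.
echo $city->name . ', ' . $city->country
    . ' (population: ' . $city->population . ')' . PHP_EOL;
```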
Never put API keys directly in your project code; if the code is shared or leaks, the keys are compromised with it. Set them in your .env file instead.
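For example, a common way to do this in plain PHP projects is the vlucas/phpdotenv package. The sketch below assumes phpdotenv, which is not part of Instructor; any loader that populates the environment works just as well:

```php
<?php
// Sketch: load the key from .env instead of hard-coding it.
// Assumes vlucas/phpdotenv (composer require vlucas/phpdotenv).
require __DIR__ . '/vendor/autoload.php';

// .env (kept out of version control) contains:
// OPENAI_API_KEY=sk-...
Dotenv\Dotenv::createImmutable(__DIR__)->load();

// createImmutable() fills $_ENV / $_SERVER but not getenv(),
// so re-export the value if your code reads it via getenv():
putenv('OPENAI_API_KEY=' . $_ENV['OPENAI_API_KEY']);
```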
You can start using Instructor in your project right away after installation. It is recommended, however, to publish the configuration files and prompt templates to your project directory, so you can customize the library's behavior and use your own prompt templates. You should also set up your LLM provider API keys in your .env file instead of putting them directly in your code. See Setup Instructions for more details.
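As a rough illustration of why the published configuration matters: once the relevant key is in your .env, switching providers comes down to passing a different preset name to `using()`. The `'anthropic'` preset name and the `ANTHROPIC_API_KEY` variable below are assumptions; check the published configuration files for the presets actually available in your installation.

```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Cognesy\Instructor\StructuredOutput;

class City {
    public string $name;
    public string $country;
    public int $population;
}

// Assumed preset name; expects ANTHROPIC_API_KEY to be set in your .env.
$city = (new StructuredOutput)
    ->using('anthropic')
    ->withResponseClass(City::class)
    ->withMessages('What is the capital of France?')
    ->get();
```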