Basic usage
This is a simple example demonstrating how Instructor retrieves structured information from the provided text (or a sequence of chat messages). The response model is a plain PHP class with type hints specifying the types of the object's fields.

NOTE: By default, Instructor looks for the OPENAI_API_KEY environment variable to get your API key. You can also provide the API key explicitly when creating the Instructor instance.
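A minimal sketch of what this looks like. The with(), get() and response-model usage follow the API described in this document; the exact import path and the Person class are illustrative assumptions and may differ in your version:

```php
<?php
use Cognesy\Instructor\StructuredOutput; // import path assumed

// Illustrative response model: a plain PHP class with typed fields.
class Person {
    public string $name;
    public int $age;
}

// Assumes OPENAI_API_KEY is set in the environment.
$person = (new StructuredOutput)->with(
    messages: 'Jason is 28 years old and works as an engineer.',
    responseModel: Person::class,
)->get();

// $person is a validated Person instance, e.g. name: "Jason", age: 28
```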
Instructor also provides helper classes you can use as response models: Scalar for simple values, Maybe for optional data, Sequence for arrays, and Structure for dynamically defined schemas.
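For instance, a sketch of extracting a list of objects with Sequence. The Sequence::of() constructor and the import path are assumptions and may differ in your version:

```php
<?php
use Cognesy\Instructor\StructuredOutput;           // import path assumed
use Cognesy\Instructor\Extras\Sequence\Sequence;   // import path assumed

class Person {
    public string $name;
    public int $age;
}

// Extract a list of Person objects from free-form text.
$people = (new StructuredOutput)->with(
    messages: 'Jason is 28, Anna is 31, and Tom is 44.',
    responseModel: Sequence::of(Person::class), // assumed helper constructor
)->get();

foreach ($people as $person) {
    echo "{$person->name}: {$person->age}\n";
}
```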
Fluent API Methods
StructuredOutput provides a comprehensive fluent API for configuring requests. The methods fall into several groups (see the sketch after this list):

- Request configuration
- Response model configuration
- Configuration and behavior
- LLM provider configuration
- Processing overrides
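A hedged sketch of what fluent configuration might look like. The individual with*() method names below are illustrative assumptions based on the groups above, not confirmed API; only create() and get() are taken from this document:

```php
<?php
use Cognesy\Instructor\StructuredOutput;

// Uses the Person class from the basic usage example.
// The with*() method names below are illustrative assumptions.
$pending = (new StructuredOutput)
    ->withMessages('Jason is 28 years old.')    // request configuration (assumed)
    ->withResponseModel(Person::class)          // response model configuration (assumed)
    ->withMaxRetries(2)                         // behavior configuration (assumed)
    ->withOptions(['temperature' => 0])         // LLM provider options (assumed)
    ->create();                                 // returns PendingStructuredOutput

$person = $pending->get();
```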
Request Execution Methods
After configuring your StructuredOutput instance, you have several ways to execute the request and access different types of responses:
Direct Execution Methods
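A sketch of the direct execution methods, using the configuration style from the earlier examples. All three calls are shown on the same configured instance purely for illustration; in practice you would typically pick one:

```php
<?php
use Cognesy\Instructor\StructuredOutput;

// Uses the Person class from the basic usage example.
$request = (new StructuredOutput)->with(
    messages: 'Jason is 28 years old.',
    responseModel: Person::class,
);

$person      = $request->get();       // parsed and validated Person object
$llmResponse = $request->response();  // raw LLM response with metadata
$stream      = $request->stream();    // StructuredOutputStream of partial updates
```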
Pending Execution with create()
The create() method returns a PendingStructuredOutput instance, which acts as an execution handler that provides the same access methods:
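For example (a sketch under the same assumptions as the earlier examples; each accessor is shown for illustration):

```php
<?php
use Cognesy\Instructor\StructuredOutput;

// create() returns a PendingStructuredOutput without consuming the result yet.
$pending = (new StructuredOutput)->with(
    messages: 'Jason is 28 years old.',
    responseModel: Person::class, // Person as defined in the basic usage example
)->create();

// Then choose how to access the response:
$person      = $pending->get();       // parsed and validated result
$llmResponse = $pending->response();  // raw LLM response with metadata
$stream      = $pending->stream();    // StructuredOutputStream for streaming
```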
Response Types Explained
- get(): Returns the parsed and validated structured result (e.g., a Person object)
- response(): Returns the raw LLM response object with metadata like tokens, model info, etc.
- stream(): Returns StructuredOutputStream for real-time processing of streaming responses
The PendingStructuredOutput class serves as a flexible execution interface that lets you choose how to process the LLM response based on your specific needs.
String as Input
You can provide a string instead of an array of messages. This is useful when you want to extract data from a single block of text and keep your code simple.
Structured-to-structured data processing

Instructor also offers a way to use structured data as input. This is useful when you want to use object data as input and get another object with the result of LLM inference. The input parameter of Instructor's with() method accepts an object, but also an array or just a string.
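A sketch of structured-to-structured processing. The Email and EmailTranslation classes are illustrative only, and the prompt parameter is an assumption:

```php
<?php
use Cognesy\Instructor\StructuredOutput;

// Illustrative input and output models -- not part of the library.
class Email {
    public function __construct(
        public string $subject = '',
        public string $body = '',
    ) {}
}

class EmailTranslation {
    public string $subject;
    public string $body;
}

$email = new Email(
    subject: 'Meeting tomorrow',
    body: "Let's meet at 10am to discuss the roadmap.",
);

// Pass an object as input and get another object back.
$translated = (new StructuredOutput)->with(
    input: $email,
    responseModel: EmailTranslation::class,
    prompt: 'Translate the email into French.', // 'prompt' parameter assumed
)->get();
```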
Streaming support
Instructor supports streaming of partial results, allowing you to start processing the data as soon as it is available.
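A sketch of consuming a stream of partial results. The iteration method on StructuredOutputStream (here partials()) and the streaming option are assumptions and may differ in your version:

```php
<?php
use Cognesy\Instructor\StructuredOutput;

$stream = (new StructuredOutput)->with(
    messages: 'Jason is 28 years old and works as an engineer.',
    responseModel: Person::class,   // Person as defined in the basic usage example
    options: ['stream' => true],    // enabling streaming here is an assumption
)->stream();

// Process partially populated Person objects as tokens arrive.
// partials() is an assumed iteration method on StructuredOutputStream.
foreach ($stream->partials() as $partialPerson) {
    echo json_encode($partialPerson) . "\n";
}
```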
Scalar responses

See Scalar responses for more information on how to generate scalar responses with the Scalar adapter class.
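For example, a sketch using the Scalar adapter; the Scalar::integer() constructor and import path are assumptions:

```php
<?php
use Cognesy\Instructor\StructuredOutput;       // import path assumed
use Cognesy\Instructor\Extras\Scalar\Scalar;   // import path assumed

// Extract a single integer value instead of a whole object.
$age = (new StructuredOutput)->with(
    messages: 'Jason is 28 years old.',
    responseModel: Scalar::integer('age'), // assumed helper constructor
)->get();

var_dump($age); // int(28)
```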
Partial responses and streaming
See Streaming and partial updates for more information on how to work with partial updates and streaming.

Extracting arguments for function call
See the FunctionCall helper class for more information on how to extract arguments for callable objects.
Execution Methods Summary

Once configured, you can execute your request using different methods depending on your needs:

- get(): Returns the parsed and validated structured result
- response(): Returns the raw LLM response with metadata
- stream(): Returns StructuredOutputStream for real-time processing
- create(): Returns PendingStructuredOutput for flexible execution control