The Inference Class
The Inference class is the main entry point for LLM interactions. It encapsulates the complexities of different providers behind a unified interface.
The Inference class follows a fluent interface pattern for request building, while the underlying infrastructure is assembled in InferenceRuntime.
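The split described above can be sketched as follows. This is a minimal, hypothetical illustration of the pattern, not the library's actual API: the method names (`model`, `prompt`, `temperature`, `complete`, `send`) and the `provider` field are assumptions made for the example.

```python
from dataclasses import dataclass


@dataclass
class InferenceRuntime:
    """Holds provider infrastructure, assembled separately from requests."""
    provider: str = "stub"  # assumed field; real code would hold clients, config, etc.

    def send(self, request: dict) -> str:
        # A real runtime would dispatch to the configured provider here.
        return f"[{self.provider}] {request['prompt']}"


class Inference:
    """Fluent request builder: each setter returns self so calls chain."""

    def __init__(self, runtime: InferenceRuntime):
        self._runtime = runtime
        self._request: dict = {}

    def model(self, name: str) -> "Inference":
        self._request["model"] = name
        return self

    def prompt(self, text: str) -> "Inference":
        self._request["prompt"] = text
        return self

    def temperature(self, value: float) -> "Inference":
        self._request["temperature"] = value
        return self

    def complete(self) -> str:
        # Hand the assembled request to the runtime for execution.
        return self._runtime.send(self._request)


runtime = InferenceRuntime(provider="stub")
reply = Inference(runtime).model("gpt-4o").prompt("hello").temperature(0.2).complete()
print(reply)  # [stub] hello
```

Keeping request-building (Inference) separate from infrastructure (InferenceRuntime) lets one runtime be reused across many requests and makes the builder trivial to test with a stub runtime.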
The Embeddings Class
Similarly, the Embeddings class provides a unified interface for generating embeddings:
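A minimal sketch of what such a facade could look like; the constructor parameters and the `embed` signature are assumptions for illustration, and the character-hashing backend is a stand-in for a real provider call.

```python
from typing import List


class Embeddings:
    """Unified facade over provider-specific embedding backends (sketch)."""

    def __init__(self, provider: str = "stub", dimensions: int = 4):
        self._provider = provider
        self._dimensions = dimensions

    def embed(self, texts: List[str]) -> List[List[float]]:
        # A real backend would call the provider's embedding API; this stub
        # folds character codes into a fixed-size vector purely for demo.
        vectors = []
        for text in texts:
            vec = [0.0] * self._dimensions
            for i, ch in enumerate(text):
                vec[i % self._dimensions] += ord(ch) / 1000.0
            vectors.append(vec)
        return vectors


emb = Embeddings()
vectors = emb.embed(["hello", "world"])
print(len(vectors), len(vectors[0]))  # 2 4
```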
Lower-level embedding operations are handled by EmbedUtils, not by the Embeddings facade:
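As a sketch of that division of labor, low-level vector helpers might live in a utility class like this. The specific functions shown (`cosine_similarity`, `normalize`) are assumptions; the source does not say what EmbedUtils contains.

```python
import math
from typing import List


class EmbedUtils:
    """Low-level vector helpers, kept out of the Embeddings facade (sketch)."""

    @staticmethod
    def cosine_similarity(a: List[float], b: List[float]) -> float:
        # Dot product over the product of magnitudes; 0.0 for zero vectors.
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    @staticmethod
    def normalize(vec: List[float]) -> List[float]:
        # Scale to unit length; return unchanged if the vector is all zeros.
        norm = math.sqrt(sum(x * x for x in vec))
        return [x / norm for x in vec] if norm else vec


sim = EmbedUtils.cosine_similarity([1.0, 0.0], [1.0, 0.0])
print(sim)  # 1.0
```

Keeping stateless math helpers in a separate utility class leaves the facade focused on provider dispatch.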