Local / Ollama
Overview
You can use Instructor with a local Ollama instance; see the sketch after the list of supported modes below.
Please note that, at least currently, open source models do not perform on par with OpenAI models (GPT-3.5 or GPT-4) on complex data schemas.
Supported modes:
- Mode::MdJson - fallback mode, works with any capable model
- Mode::Json - recommended
- Mode::Tools - supported for selected models (check the Ollama docs for which models support tool calling)
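
Below is a minimal sketch of extracting a simple object from text via a model served by Ollama. It assumes Ollama is running locally on its default port (11434), that a connection preset named `ollama` is configured for your Instructor version, and that a model such as `llama3` has already been pulled. The connection setup call (`withConnection()`), option names, and namespace paths are assumptions that may differ between releases, so verify them against the docs for your installed version.

```php
<?php
use Cognesy\Instructor\Enums\Mode;
use Cognesy\Instructor\Instructor;

// Target structure to extract from unstructured text.
class User
{
    public string $name;
    public int $age;
}

// Assumption: a connection preset named 'ollama' points at the local
// Ollama server (default: http://localhost:11434) - adjust to your setup.
$instructor = (new Instructor)->withConnection('ollama');

$user = $instructor->respond(
    messages: "Jason is 25 years old and works as an engineer.",
    responseModel: User::class,
    model: 'llama3',               // any model pulled into your local Ollama
    mode: Mode::Json,              // recommended mode for Ollama
    options: ['temperature' => 0], // assumption: passed through to the API call
);

var_dump($user); // User { name: "Jason", age: 25 }
```

If extraction quality is poor with a given model, switching to Mode::MdJson is a reasonable fallback, at the cost of less strict output formatting.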