Overview
Mistral.ai is a company that builds open-source language models and also offers a platform hosting those models. You can use Instructor with the Mistral API by configuring the client as demonstrated below. Note that the larger Mistral models support a native JSON object response_format, which is much more reliable than Instructor's markdown-JSON fallback.

Inference feature compatibility:
- tool calling - supported (Mistral-Small / Mistral-Medium / Mistral-Large)
- native JSON object response_format - recommended (Mistral-Small / Mistral-Medium / Mistral-Large)
- Instructor markdown-JSON fallback - fallback mode (Mistral 7B / Mixtral 8x7B)