Overview
Groq is an LLM provider offering very fast inference thanks to their custom hardware. They provide several models, including Llama2, Mixtral, and Gemma. Supported modes depend on the specific model, but generally include:
- Instructor markdown-JSON fallback - fallback mode
- native JSON object response_format - recommended
- tool calling - supported
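The markdown-JSON fallback mode relies on the model emitting JSON inside a markdown code fence, which is then extracted and parsed. Below is a minimal sketch of that extraction step; the function name and regex are illustrative, not Instructor's actual implementation:

```python
import json
import re

def extract_json_from_markdown(reply: str) -> dict:
    """Pull the first JSON object out of a markdown code fence,
    falling back to parsing the whole reply as raw JSON."""
    match = re.search(r"```(?:json)?\s*(\{.*?\})\s*```", reply, re.DOTALL)
    payload = match.group(1) if match else reply
    return json.loads(payload)

# A hypothetical model reply wrapping JSON in a fenced block
reply = 'Here is the result:\n```json\n{"model": "Mixtral", "fast": true}\n```'
print(extract_json_from_markdown(reply))
```

In contrast, the native JSON object `response_format` asks the API to return strict JSON directly, so no fence extraction is needed, which is why it is the recommended mode when the model supports it.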