Overview
MistralLLMService provides access to Mistral’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and vision with Mistral-specific optimizations for tool use and message handling.
Mistral LLM API Reference
Pipecat’s API methods for Mistral integration
Example Implementation
Complete example with function calling
Mistral Documentation
Official Mistral API documentation and features
Mistral Console
Access models and manage API keys
Installation
To use Mistral services, install the required dependency:
Prerequisites
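The dependency mentioned under Installation is typically installed with pip. The exact extra name below is an assumption based on Pipecat's usual optional-extra pattern; check the Pipecat installation docs for the exact package spec:

```shell
# Install Pipecat with the Mistral extra (extra name assumed)
pip install "pipecat-ai[mistral]"
```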
Mistral Account Setup
Before using Mistral LLM services, you need:
- Mistral Account: Sign up at Mistral Console
- API Key: Generate an API key from your console dashboard
- Model Selection: Choose from available models (Mistral Small, Mistral Large, etc.)
Required Environment Variables
MISTRAL_API_KEY: Your Mistral API key for authentication
Configuration
Mistral API key for authentication.
Base URL for Mistral API endpoint.
Model identifier to use. Deprecated in v0.0.105. Use settings=MistralLLMService.Settings(model=...) instead.
Settings
Runtime-configurable settings passed via the settings constructor argument using MistralLLMService.Settings(...). These can be updated mid-conversation with LLMUpdateSettingsFrame. See Service Settings for details.
This service uses the same settings as OpenAILLMService. See OpenAI LLM Settings for the full parameter reference.
Usage
Basic Setup
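A minimal sketch of constructing the service. The import path and keyword names (api_key, model) are assumptions based on MistralLLMService inheriting from OpenAILLMService; verify them against your Pipecat version:

```python
import os

# Import path is an assumption; check your Pipecat version's module layout.
from pipecat.services.mistral.llm import MistralLLMService

# Constructor keywords mirror OpenAILLMService (assumed).
llm = MistralLLMService(
    api_key=os.getenv("MISTRAL_API_KEY"),
    model="mistral-small-latest",  # also the default when omitted
)
```

The resulting service is then placed in a Pipeline like any other Pipecat LLM service.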
With Custom Settings
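Settings can be supplied at construction time via MistralLLMService.Settings. Since this service reuses OpenAILLMService's settings, the field names below (model, temperature) are assumptions drawn from that shared parameter set:

```python
import os

from pipecat.services.mistral.llm import MistralLLMService  # path assumed

# Settings fields mirror OpenAI LLM Settings (assumed); see that
# reference for the full list of supported parameters.
llm = MistralLLMService(
    api_key=os.getenv("MISTRAL_API_KEY"),
    settings=MistralLLMService.Settings(
        model="mistral-large-latest",
        temperature=0.7,
    ),
)
```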
Updating Settings at Runtime
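Settings can also be changed mid-conversation by queueing an LLMUpdateSettingsFrame into a running pipeline. The import path and the frame's settings argument shape are assumptions; `task` stands for your existing PipelineTask:

```python
from pipecat.frames.frames import LLMUpdateSettingsFrame  # path assumed

async def lower_temperature(task):
    # `task` is a running PipelineTask; the queued frame updates the
    # service's settings for subsequent inference turns (keys assumed
    # to match the Settings field names).
    await task.queue_frame(
        LLMUpdateSettingsFrame(settings={"temperature": 0.3})
    )
```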
Notes
- Function calling: Mistral supports tool/function calling. The service includes deduplication logic to prevent repeated execution of the same function calls across conversation turns.
- Mistral API constraints: The service automatically handles Mistral-specific requirements, such as ensuring tool result messages are followed by an assistant message and that system messages appear only at the start of the conversation.
- Vision: Supports image inputs via base64-encoded JPEG content.
- Default model: mistral-small-latest is used when no model is specified.
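Since vision inputs use base64-encoded JPEG content, the encoding step can be sketched as below. How the encoded image reaches the service (e.g. via a vision frame or message content) depends on your pipeline; the helper name and data-URL framing here are illustrative assumptions:

```python
import base64

def jpeg_to_data_url(jpeg_bytes: bytes) -> str:
    """Encode raw JPEG bytes as a base64 data URL for an image input."""
    b64 = base64.b64encode(jpeg_bytes).decode("utf-8")
    return f"data:image/jpeg;base64,{b64}"
```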