Mistral AI
Overview
The MistralLLMService integrates Mistral AI's chat models. It is optimized for low-latency inference and handles Mistral's specific API requirements, including system-message placement and assistant-message prefixing.
Installation
To use Mistral AI, install the required dependencies:
pip install "piopiy-ai[mistral]"
Prerequisites
- A Mistral AI account and API key (available from the Mistral console).
- Set your API key in your environment:
export MISTRAL_API_KEY="your_api_key_here"
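If you prefer to fail fast at startup when the key is missing, a small helper like the one below works; it is an illustrative convenience only, not part of piopiy-ai:

```python
import os


def require_api_key(env_var: str = "MISTRAL_API_KEY") -> str:
    """Return the API key from the environment, raising a clear error if unset.

    Illustrative helper only (not part of piopiy-ai).
    """
    key = os.getenv(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; export it before starting the service")
    return key
```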
Configuration
MistralLLMService Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `api_key` | `str` | Required | Your Mistral API key. |
| `model` | `str` | `"mistral-small-latest"` | Mistral model identifier. |
| `base_url` | `str` | `"https://api.mistral.ai/v1"` | API endpoint. |
Usage
Basic Setup
import os
from piopiy.services.mistral.llm import MistralLLMService
llm = MistralLLMService(
api_key=os.getenv("MISTRAL_API_KEY"),
model="mistral-large-latest"
)
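Under the hood, requests go to Mistral's chat completions endpoint at the `base_url` shown in the table. The sketch below assembles the request pieces by hand to show the wire format; it is illustrative only and not how piopiy-ai is implemented internally:

```python
import json


def build_chat_request(model, messages, api_key, base_url="https://api.mistral.ai/v1"):
    """Build the URL, headers, and JSON body for a Mistral chat completion call.

    Illustrative sketch of the wire format (model, messages, bearer auth);
    the service handles all of this for you.
    """
    url = f"{base_url}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body
```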
Notes
- Parameter Mapping: The service automatically maps standard parameters like `seed` to Mistral's `random_seed`.
- Message Fixups: Mistral has specific rules for message roles (e.g., system messages must come first). `MistralLLMService` automatically applies these transformations for you.
- Function Calling: Supports Mistral's tool-calling interface for extensible agent capabilities.
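The parameter mapping and message fixups described above can be sketched in plain Python. These are illustrative approximations of the documented behavior, not the library's actual implementation:

```python
def fix_messages(messages):
    """Move system messages to the front, preserving the relative order
    of the remaining turns (sketch of the documented message fixups)."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest


def map_params(params):
    """Rename the standard `seed` key to Mistral's `random_seed`
    (sketch of the documented parameter mapping)."""
    mapped = dict(params)
    if "seed" in mapped:
        mapped["random_seed"] = mapped.pop("seed")
    return mapped
```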