Mistral AI

Overview

The MistralLLMService provides access to Mistral AI's language models. It is optimized for low-latency inference and handles Mistral's specific API requirements, including system message placement and assistant message prefixing.

Installation

To use Mistral AI, install the required dependencies:

pip install "piopiy-ai[mistral]"

Prerequisites

  • A Mistral AI account and API key (available from the Mistral AI console).
  • Set your API key in your environment:
    export MISTRAL_API_KEY="your_api_key_here"

Configuration

MistralLLMService Parameters

| Parameter  | Type  | Default                      | Description              |
|------------|-------|------------------------------|--------------------------|
| `api_key`  | `str` | Required                     | Your Mistral API key.    |
| `model`    | `str` | `"mistral-small-latest"`     | Mistral model identifier.|
| `base_url` | `str` | `"https://api.mistral.ai/v1"`| API endpoint.            |

Usage

Basic Setup

import os
from piopiy.services.mistral.llm import MistralLLMService

llm = MistralLLMService(
    api_key=os.getenv("MISTRAL_API_KEY"),
    model="mistral-large-latest",
)
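Because `api_key` is required, a missing environment variable is easiest to catch before constructing the service. A minimal sketch of that check (the `require_api_key` helper is illustrative, not part of the piopiy API):

```python
import os

def require_api_key(name: str = "MISTRAL_API_KEY") -> str:
    """Return the API key from the environment, failing fast if it is unset."""
    key = os.getenv(name)
    if not key:
        raise RuntimeError(f"{name} is not set; export it before starting the service")
    return key
```

Passing `api_key=require_api_key()` to the constructor then surfaces a clear error at startup instead of a failed request later.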

Notes

  • Parameter Mapping: The service automatically maps standard parameters like seed to Mistral's random_seed.
  • Message Fixups: Mistral has specific rules for message roles (e.g., system messages must be at the start). MistralLLMService automatically handles these transformations for you.
  • Function Calling: Supports Mistral's tool-calling interface for extensible agent capabilities.
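The parameter mapping and message fixups above can be sketched as plain transformations. This is an illustrative sketch of the kind of normalization the service performs, not its actual internals:

```python
def to_mistral_params(params: dict) -> dict:
    """Rename generic sampling keys to Mistral's names, e.g. `seed` -> `random_seed`."""
    mapped = dict(params)
    if "seed" in mapped:
        mapped["random_seed"] = mapped.pop("seed")
    return mapped

def reorder_system_messages(messages: list[dict]) -> list[dict]:
    """Move system messages to the front, keeping the relative order of the rest."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest
```

With transformations like these applied automatically, callers can pass standard parameter names and message lists without worrying about Mistral's ordering rules.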