Grok (xAI)
Overview
The GrokLLMService provides access to xAI's Grok models. It uses an OpenAI-compatible interface and includes specialized token usage tracking to handle Grok's incremental reporting style.
Installation
To use Grok, install the base SDK:
pip install piopiy-ai
Prerequisites
- An xAI account and API key (available from the xAI console).
- Set your API key in your environment:
export GROK_API_KEY="your_api_key_here"
Configuration
GrokLLMService Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| api_key | str | Required | Your Grok API key. |
| model | str | "grok-3-beta" | Model identifier. |
| base_url | str | "https://api.x.ai/v1" | API endpoint. |
Usage
Basic Setup
import os
from piopiy.services.grok.llm import GrokLLMService
llm = GrokLLMService(
api_key=os.getenv("GROK_API_KEY"),
model="grok-3-beta"
)
Notes
- Token Tracking: The service accurately accumulates prompt, completion, and reasoning tokens for precise metrics reporting.
- Compatibility: Supports standard OpenAI features like streaming and internal context aggregation.
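To illustrate the incremental reporting style mentioned in the token-tracking note, here is a minimal sketch of one plausible accumulation scheme. The names and the assumption that each streamed chunk carries cumulative counts are hypothetical, not the service's actual internals:

```python
from dataclasses import dataclass


@dataclass
class TokenTally:
    """Hypothetical accumulator: assumes each streamed chunk reports
    cumulative usage, so we keep the highest value seen per category
    rather than summing chunk values together."""
    prompt_tokens: int = 0
    completion_tokens: int = 0
    reasoning_tokens: int = 0

    def update(self, usage: dict) -> None:
        self.prompt_tokens = max(self.prompt_tokens, usage.get("prompt_tokens", 0))
        self.completion_tokens = max(self.completion_tokens, usage.get("completion_tokens", 0))
        self.reasoning_tokens = max(self.reasoning_tokens, usage.get("reasoning_tokens", 0))


# Simulated per-chunk usage reports from a streamed response:
tally = TokenTally()
for chunk_usage in [
    {"prompt_tokens": 12, "completion_tokens": 3},
    {"prompt_tokens": 12, "completion_tokens": 9, "reasoning_tokens": 4},
]:
    tally.update(chunk_usage)

print(tally)  # final totals: prompt=12, completion=9, reasoning=4
```

The point of taking a maximum instead of a sum is that cumulative per-chunk counts would otherwise be double-counted across a stream.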