OpenRouter
Overview
The OpenRouterLLMService provides a unified interface to access any model available on OpenRouter. It is fully OpenAI-compatible and serves as a gateway to dozens of different LLM providers through a single API key.
Installation
```shell
pip install piopiy-ai
```
Prerequisites
- An OpenRouter API key (available from openrouter.ai).
- Set your API key in your environment:
```shell
export OPENROUTER_API_KEY="your_api_key_here"
```
Configuration
OpenRouterLLMService Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| api_key | str | None | Your OpenRouter API key (defaults to the OPENROUTER_API_KEY environment variable). |
| model | str | "openai/gpt-4o-2024-11-20" | Model identifier (e.g., anthropic/claude-3.5-sonnet). |
| base_url | str | "https://openrouter.ai/api/v1" | API endpoint. |
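Because api_key falls back to an environment variable, a service wrapper typically resolves its configuration along these lines. This is a minimal sketch, and resolve_openrouter_config is a hypothetical helper for illustration, not part of piopiy:

```python
import os


def resolve_openrouter_config(api_key=None,
                              base_url="https://openrouter.ai/api/v1"):
    # Hypothetical helper: prefer an explicit key, otherwise fall back
    # to the OPENROUTER_API_KEY environment variable.
    key = api_key or os.environ.get("OPENROUTER_API_KEY")
    if not key:
        raise ValueError(
            "No API key provided and OPENROUTER_API_KEY is not set"
        )
    return {"api_key": key, "base_url": base_url}
```

An explicit api_key argument always takes precedence over the environment variable.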
Usage
Basic Setup
```python
import os

from piopiy.services.openrouter.llm import OpenRouterLLMService

llm = OpenRouterLLMService(
    api_key=os.getenv("OPENROUTER_API_KEY"),
    model="anthropic/claude-3.5-sonnet",
)
```
Notes
- Model Selection: You can use any model string supported by OpenRouter (e.g., google/gemini-pro-1.5, meta-llama/llama-3-70b-instruct).
- Transparency: OpenRouter provides detailed information about which underlying provider is being used for each request.
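Model strings follow a provider/model-name convention, which is handy when logging or routing by provider. A tiny illustrative helper (not part of the library) to split one:

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split an OpenRouter model identifier into (provider, model) parts."""
    provider, _, name = model_id.partition("/")
    return provider, name
```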