OpenRouter

Overview

The OpenRouterLLMService provides a unified interface to access any model available on OpenRouter. It is fully OpenAI-compatible and serves as a gateway to dozens of different LLM providers through a single API key.

Installation

pip install piopiy-ai

Prerequisites

  • An OpenRouter API key (available from openrouter.ai).
  • Set your API key in your environment:
    export OPENROUTER_API_KEY="your_api_key_here"
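A fail-fast check at startup makes a missing key obvious before any request is sent. This is a minimal sketch; the `require_api_key` helper is not part of piopiy-ai:

```python
import os

def require_api_key(var: str = "OPENROUTER_API_KEY") -> str:
    """Return the API key from the environment, raising early if it is unset."""
    key = os.environ.get(var, "").strip()
    if not key:
        raise RuntimeError(f"{var} is not set; export it before starting the app")
    return key
```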

Configuration

OpenRouterLLMService Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `api_key` | `str` | `None` | Your OpenRouter API key (defaults to the `OPENROUTER_API_KEY` environment variable). |
| `model` | `str` | `"openai/gpt-4o-2024-11-20"` | Model identifier (e.g., `anthropic/claude-3.5-sonnet`). |
| `base_url` | `str` | `"https://openrouter.ai/api/v1"` | API endpoint. |

Usage

Basic Setup

import os
from piopiy.services.openrouter.llm import OpenRouterLLMService

llm = OpenRouterLLMService(
    api_key=os.getenv("OPENROUTER_API_KEY"),
    model="anthropic/claude-3.5-sonnet",
)
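Because OpenRouter exposes an OpenAI-compatible API, every request the service makes is an ordinary OpenAI-style chat-completions call against the `base_url` above. The helper below is a hypothetical sketch of such a request (not piopiy-ai's actual internals); the body can be sent with any HTTP client:

```python
import json

OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(api_key: str, model: str, messages: list) -> tuple:
    """Build the URL, headers, and JSON body for an OpenAI-style chat completion."""
    url = f"{OPENROUTER_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return url, headers, body
```

Posting `body` with those headers (e.g., via `urllib.request` or `httpx`) returns a standard OpenAI-format completion response.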

Notes

  • Model Selection: You can use any model string supported by OpenRouter (e.g., google/gemini-pro-1.5, meta-llama/llama-3-70b-instruct).
  • Transparency: OpenRouter provides detailed information about which underlying provider is being used for each request.
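Since model strings are free-form `provider/model` identifiers, a small alias table can keep application code readable while still accepting any OpenRouter model. The alias names below are invented for illustration; only the values are real OpenRouter model IDs:

```python
# Hypothetical app-local aliases mapped to real OpenRouter model IDs.
MODEL_ALIASES = {
    "claude": "anthropic/claude-3.5-sonnet",
    "gemini": "google/gemini-pro-1.5",
    "llama": "meta-llama/llama-3-70b-instruct",
}

def resolve_model(name: str) -> str:
    """Map a friendly alias to an OpenRouter model ID; pass full IDs through unchanged."""
    if "/" in name:  # already a provider/model identifier
        return name
    try:
        return MODEL_ALIASES[name]
    except KeyError:
        raise ValueError(f"unknown model alias: {name}") from None
```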