DeepSeek
Overview
The DeepSeekLLMService provides access to DeepSeek's high-efficiency models via an OpenAI-compatible interface.
Installation
pip install piopiy-ai
Prerequisites
- A DeepSeek API key (available from the DeepSeek platform).
- Set your API key in your environment:
export DEEPSEEK_API_KEY="your_api_key_here"
Configuration
DeepSeekLLMService Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| api_key | str | Required | Your DeepSeek API key. |
| model | str | "deepseek-chat" | Model identifier. |
| base_url | str | "https://api.deepseek.com/v1" | API endpoint. |
Usage
Basic Setup
import os
from piopiy.services.deepseek.llm import DeepSeekLLMService
llm = DeepSeekLLMService(
    api_key=os.getenv("DEEPSEEK_API_KEY"),
    model="deepseek-chat",
)
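Note that `os.getenv` silently returns `None` when the variable is unset, which tends to surface later as a confusing authentication error. A small guard (a sketch; the helper name is my own and not part of the library) fails fast instead:

```python
import os

def require_api_key(var: str = "DEEPSEEK_API_KEY") -> str:
    """Return the API key from the environment, or fail with a clear message."""
    key = os.getenv(var)
    if not key:
        raise RuntimeError(f"{var} is not set; export it before constructing the service.")
    return key
```

Calling `require_api_key()` in place of `os.getenv(...)` turns a missing key into an immediate, descriptive error at construction time.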
Notes
- Efficiency: DeepSeek models deliver strong performance at a comparatively low cost per token.
- OpenAI Compatible: the service drops into any pipeline that expects an OpenAI-style chat interface.
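Because the endpoint is OpenAI-compatible, the wire format is the standard chat-completions JSON body sent to `POST {base_url}/chat/completions`. The sketch below builds such a request with only the standard library to illustrate the shape the service speaks under the hood; the network call itself is guarded behind the key being set, and the helper name is illustrative, not part of any library:

```python
import json
import os
import urllib.request

BASE_URL = "https://api.deepseek.com/v1"

def build_chat_request(prompt: str, model: str = "deepseek-chat") -> dict:
    # Standard OpenAI-style chat-completions payload.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("Hello!")

api_key = os.getenv("DEEPSEEK_API_KEY")
if api_key:  # only hit the network when a key is actually configured
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

This is the same request/response contract the OpenAI Python client uses, which is why swapping `base_url` and the key is enough to retarget an OpenAI-style pipeline at DeepSeek.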