AWS Bedrock

Overview

The AWSBedrockLLMService provides access to high-performance foundation models from Amazon, Anthropic, and other leading AI companies through AWS Bedrock. It supports streaming responses, function calling, and multimodal inputs (vision).

Installation

To use AWS Bedrock, install the required dependencies:

pip install "piopiy-ai[aws]"

Prerequisites

  • An AWS account with Bedrock model access enabled.
  • AWS credentials (an Access Key ID and Secret Access Key).
  • Configure your environment:
    export AWS_ACCESS_KEY_ID="your_access_key"
    export AWS_SECRET_ACCESS_KEY="your_secret_key"
    export AWS_REGION="us-east-1"

Configuration

AWSBedrockLLMService Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| model | str | Required | Bedrock model ID (e.g., amazon.nova-pro-v1:0). |
| aws_access_key | str | None | AWS Access Key ID (defaults to the AWS_ACCESS_KEY_ID environment variable). |
| aws_secret_key | str | None | AWS Secret Access Key (defaults to the AWS_SECRET_ACCESS_KEY environment variable). |
| aws_region | str | None | AWS Region (e.g., us-east-1). |
| params | InputParams | InputParams() | Generation settings. |
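The credential fallback described in the table can be pictured with a small sketch. The helper name and exact resolution order are illustrative, not the library's actual implementation; only the environment variable names come from the Prerequisites section above.

```python
import os

def resolve_aws_credentials(aws_access_key=None, aws_secret_key=None, aws_region=None):
    """Hypothetical helper: fall back to the environment variables from the
    Prerequisites section when explicit values are not supplied."""
    return {
        "aws_access_key": aws_access_key or os.environ.get("AWS_ACCESS_KEY_ID"),
        "aws_secret_key": aws_secret_key or os.environ.get("AWS_SECRET_ACCESS_KEY"),
        "aws_region": aws_region or os.environ.get("AWS_REGION"),
    }
```

Explicit constructor arguments win over the environment, so the same code can run locally (env vars) and in production (injected secrets) without changes.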

InputParams

| Parameter | Type | Default | Description |
|---|---|---|---|
| max_tokens | int | None | Maximum number of tokens to generate. |
| temperature | float | None | Sampling temperature (0.0 to 1.0). |
| top_p | float | None | Nucleus sampling parameter. |
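As a rough mental model, InputParams is a small settings object whose fields all default to None, meaning "use the model's defaults." The sketch below is illustrative (the class name and the validation are assumptions); only the field names and the 0.0-1.0 temperature range come from the table above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InputParamsSketch:
    """Illustrative stand-in for InputParams; fields match the table above."""
    max_tokens: Optional[int] = None
    temperature: Optional[float] = None
    top_p: Optional[float] = None

    def __post_init__(self):
        # The 0.0-1.0 range comes from the temperature row in the table.
        if self.temperature is not None and not (0.0 <= self.temperature <= 1.0):
            raise ValueError("temperature must be between 0.0 and 1.0")
```

Leaving a field as None lets Bedrock apply the model's own default for that setting, which differs between model families.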

Usage

Basic Setup

from piopiy.services.aws.llm import AWSBedrockLLMService

llm = AWSBedrockLLMService(
    model="amazon.nova-pro-v1:0",
    aws_region="us-east-1",
)

Notes

  • Model IDs: Ensure you use the exact model ID from the AWS Bedrock console. Common models include anthropic.claude-3-5-sonnet-20240620-v1:0 and amazon.nova-lite-v1:0.
  • Latency: For real-time voice, models like amazon.nova-lite or anthropic.claude-3-haiku are recommended for lower latency.
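Because an exact model ID is required, a quick sanity check can catch typos before a request is sent. The pattern below is an assumption inferred from the IDs shown in these notes (provider, dot, model name, then a -v&lt;major&gt;:&lt;minor&gt; suffix); it is not an official AWS format specification.

```python
import re

# Assumed ID shape, inferred from examples like amazon.nova-pro-v1:0 and
# anthropic.claude-3-5-sonnet-20240620-v1:0 (not an AWS specification).
MODEL_ID_RE = re.compile(r"^[a-z0-9-]+\.[a-z0-9.-]+-v\d+:\d+$")

def looks_like_bedrock_model_id(model_id: str) -> bool:
    """Cheap pre-flight check; the Bedrock console remains the source of truth."""
    return MODEL_ID_RE.fullmatch(model_id) is not None
```

A failed check is only a hint to re-copy the ID from the console; a passing check does not guarantee the model exists or that your account has access to it.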