
LiteLLM

LiteLLM is an open-source LLM gateway and Python SDK that lets developers call more than 100 commercial and open-source models through a single OpenAI-compatible interface, complete with cost tracking, rate-limiting, load-balancing and guardrails.

Introduction

Overview

LiteLLM provides a drop-in replacement for the OpenAI client plus an HTTP proxy server, abstracting away provider-specific differences so platform teams can ship multi-LLM products faster.
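As an illustration of the proxy setup, here is a minimal configuration sketch in LiteLLM's documented `model_list` format; the model names and environment-variable references are illustrative, not prescriptive:

```yaml
model_list:
  - model_name: gpt-4o                 # alias that clients request
    litellm_params:
      model: openai/gpt-4o             # provider/model the proxy routes to
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude                 # a second provider behind the same /v1 schema
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```

Clients then point a standard OpenAI SDK at the proxy's base URL instead of api.openai.com, and the proxy handles provider-specific translation.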

Key Capabilities
  • OpenAI-compatible SDK & proxy – call 100+ LLMs (OpenAI, Azure, Anthropic, Bedrock, Google Gemini, Hugging Face, etc.) through the OpenAI /v1 API schema
  • Cost & usage analytics – real-time spend tracking tagged by key / user / team with S3 & GCS logging support
  • Rate limiting & budgets – set RPM/TPM limits and per-project budgets to control usage
  • Automatic fallbacks & retries – route requests across multiple providers to improve reliability
  • Extensible guardrails – plug in custom validation, redaction and audit hooks to meet compliance needs
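The fallback-and-retry idea above can be sketched as follows. This is not LiteLLM's actual implementation, just a self-contained illustration of the routing pattern; the provider functions are hypothetical stand-ins for real provider clients:

```python
class ProviderError(Exception):
    """Raised when a provider fails to serve a request (e.g. rate limit, outage)."""

def flaky_provider(prompt: str) -> str:
    # Stand-in for a primary provider that is currently failing.
    raise ProviderError("rate limited")

def backup_provider(prompt: str) -> str:
    # Stand-in for a healthy fallback provider.
    return f"echo: {prompt}"

def complete_with_fallbacks(prompt, providers, retries_per_provider=2):
    """Try each provider in order, retrying a few times before falling back."""
    last_error = None
    for provider in providers:
        for _ in range(retries_per_provider):
            try:
                return provider(prompt)
            except ProviderError as exc:
                last_error = exc  # remember the failure, move on
    raise last_error  # every provider exhausted its retries

print(complete_with_fallbacks("hello", [flaky_provider, backup_provider]))
```

Here the primary provider fails both retries, so the request is transparently served by the backup; a gateway applies the same pattern across real provider endpoints.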
