AI Open Source · SDKs & Developer Tools

BerriAI/litellm

A Python SDK and proxy gateway for calling 100+ LLM APIs through one interface, normalizing OpenAI, Anthropic, Bedrock, Azure, VertexAI, and other providers into an OpenAI-compatible format. It ships with cost tracking, rate limiting, load balancing, and logging, which saves a lot of glue code when running cross-model comparison experiments or switching between providers in deployment.

Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]

Stars: 47k
Language: Python
License: NOASSERTION
Last push: today
Created: 2023-07-27
Topics: ai-gateway, anthropic, azure-openai, bedrock, gateway, langchain

README

<h1 align="center"> 🚅 LiteLLM </h1> <p align="center"> <p align="center">LiteLLM AI Gateway </p> <p align="center">Open Source AI Gateway for 100+ LLMs. Self-hosted. Enterprise-ready. Call any LLM in OpenAI format.</p> <p align="center"> <a href="https://render.com/deploy?repo=https://github.com/BerriAI/litellm" target="_blank" rel="nofollow"><img src="https://render.com/images/deploy-to-render-button.svg" alt="Deploy to Render"></a> <a href="https://railway.com/deploy/RhvhdC?referralCode=7mRv9K&utm_medium=integration&utm_source=template&utm_campaign=generic"> <img src="https://railway.com/button.svg" alt="Deploy on Railway"> </a> </p> </p> <h4 align="center"><a href="https://docs.litellm.ai/docs/simple_proxy" target="_blank">LiteLLM Proxy Server (AI Gateway)</a> | <a href="https://docs.litellm.ai/docs/enterprise#hosted-litellm-proxy" target="_blank"> Hosted Proxy</a> | <a href="https://litellm.ai/enterprise" target="_blank">Enterprise Tier</a> | <a href="https://www.litellm.ai/ai-gateway" target="_blank">Website</a></h4> <h4 align="center"> <a href="https://pypi.org/project/litellm/" target="_blank"> <img src="https://img.shields.io/pypi/v/litellm.svg" alt="PyPI Version"> </a> <a href="https://github.com/BerriAI/litellm" target="_blank"> <img src="https://img.shields.io/github/stars/BerriAI/litellm.svg?style=social" alt="GitHub Stars"> </a> <a href="https://www.ycombinator.com/companies/berriai"> <img src="https://img.shields.io/badge/Y%20Combinator-W23-orange?style=flat-square" alt="Y Combinator W23"> </a> <a href="https://wa.link/huol9n"> <img src="https://img.shields.io/static/v1?label=Chat%20on&message=WhatsApp&color=success&logo=WhatsApp&style=flat-square" alt="Whatsapp"> </a> <a href="https://discord.gg/wuPM9dRgDw"> <img src="https://img.shields.io/static/v1?label=Chat%20on&message=Discord&color=blue&logo=Discord&style=flat-square" alt="Discord"> </a> <a href="https://www.litellm.ai/support"> <img 
src="https://img.shields.io/static/v1?label=Chat%20on&message=Slack&color=black&logo=Slack&style=flat-square" alt="Slack"> </a> <a href="https://codspeed.io/BerriAI/litellm?utm_source=badge"> <img src="https://img.shields.io/endpoint?url=https://codspeed.io/badge.json" alt="CodSpeed"/> </a> </h4> <img width="2688" height="1600" alt="Group 7154 (1)" src="https://github.com/user-attachments/assets/c5ee0412-6fb5-4fb6-ab5b-bafae4209ca6" />

What is LiteLLM

LiteLLM is an open source AI Gateway that gives you a single, unified interface to call 100+ LLM providers — OpenAI, Anthropic, Gemini, Bedrock, Azure, and more — using the OpenAI format.

Use it as a Python SDK for direct library integration, or deploy the AI Gateway (Proxy Server) as a centralized service for your team or organization.

Jump to LiteLLM Proxy (LLM Gateway) Docs <br> Jump to Supported LLM Providers
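When deployed as a centralized gateway, the proxy is typically driven by a `config.yaml` that maps the model names clients request to provider credentials. A minimal sketch (the model names and env-var references here are illustrative; see the proxy docs for the full schema):

```yaml
model_list:
  - model_name: gpt-4o                  # name clients send in requests
    litellm_params:
      model: openai/gpt-4o              # actual provider/model routed to
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
```

Start the gateway with `litellm --config config.yaml`, then point any OpenAI-compatible client at the proxy's URL.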


Why LiteLLM

Managing LLM calls across providers gets complicated fast — different SDKs, auth patterns, request formats, and error types for every model. LiteLLM removes that friction:

  • Unified API — one interface for 100+ LLMs, no provider-specific SDK juggling
  • Drop-in OpenAI compatibility — swap providers without rewriting your code
  • Production-ready gateway — virtual keys, spend tracking, guardrails, load balancing, and an admin dashboard out of the box
  • 8ms P95 latency at 1k RPS (benchmarks)
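The drop-in compatibility above comes down to a single request shape shared by every provider: only the `model` string, prefixed with a provider name, changes between calls. A minimal sketch of that idea, using a hypothetical `build_request` helper (not LiteLLM internals) that mirrors the argument shape of `litellm.completion()`:

```python
# Hypothetical helper mirroring litellm.completion()'s argument shape;
# switching providers means changing only the model string.
def build_request(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same call shape, two different providers:
openai_req = build_request("openai/gpt-4o", "Summarize this PR.")
claude_req = build_request("anthropic/claude-3-5-sonnet-20241022", "Summarize this PR.")
```

Everything except the `model` field is identical, which is why provider swaps don't require rewriting call sites.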

OSS Adopters

<table> <tr> <td><img height="60" alt="Stripe" src="https://github.com/user-attachments/assets/f7296d4f-9fbd-460d-9d05-e4df31697c4b" /></td> <td><img height="60" alt="image" src="https://github.com/user-attachments/assets/436fca71-988b-40bb-b5fe-8450c80fdbd0" /></td> <td><img height="60" alt="Google ADK" src="https://github.com/user-attachments/assets/caf270a2-5aee-45c4-8222-41a2070c4f19" /></td> <td><img height="60" alt="Greptile" src="https://github.com/user-attachments/assets/3db0ae72-0843-4005-a56d-bba1dde2193d" /></td> <td><img height="60" alt="OpenHands" src="https://github.com/user-attachments/assets/a6150c4c-149e-4cae-888b-8b92be6e003f" /></td> <td><h2>Netflix</h2></td> <td><img height="60" alt="OpenAI Agents SDK" src="https://github.com/user-attachments/assets/c02f7be0-8c2e-4d27-aea7-7c024bfaebc0" /></td> </tr> </table>

Features

<details open> <summary><b>LLMs</b> - Call 100+ LLMs (Python SDK + AI Gateway)</summary>

All Supported Endpoints - /chat/completions, /responses, /embeddings, /images, /audio, /batches, /rerank, /a2a, /messages and more.

Python SDK

```shell
uv add litellm
```

```python
import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "sk-..."  # your provider key

response = completion(
    model="openai/gpt-4o",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
```
