BerriAI/litellm

Tags: AI, LLM, API Gateway, Python, Proxy
42,098 stars (+340)

// summary

LiteLLM provides a unified interface to call over 100 different LLMs using the standard OpenAI format. It offers both a Python SDK for direct code integration and an AI Gateway proxy server for centralized management. The platform supports advanced features like cost tracking, authentication, load balancing, and integration with agents and MCP tools.
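The unified interface described above can be sketched as follows. This is a hedged illustration, not verbatim LiteLLM usage: the helper `build_request` and the model strings are assumptions for demonstration, and the actual `litellm.completion` call is left commented out because it requires an installed package, a provider API key, and network access.

```python
# Sketch of the unified OpenAI-format call shape.
# Assumption: litellm is installed and a provider key (e.g. OPENAI_API_KEY)
# is set in the environment before uncommenting the call below.
# from litellm import completion


def build_request(model: str, prompt: str) -> dict:
    """Build the OpenAI-format payload; only the model string changes
    per provider (hypothetical helper for illustration)."""
    return {
        "model": model,  # e.g. "gpt-4o", "ollama/llama3" (example strings)
        "messages": [{"role": "user", "content": prompt}],
    }


req = build_request("gpt-4o", "Say hello")
# The same payload shape would be passed for any supported provider:
# response = completion(**req)
# print(response.choices[0].message.content)
```

The design point is that callers standardize on one request/response schema and swap providers by changing a single model string.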

// use cases

01
Call 100+ LLMs using a consistent OpenAI-compatible API format.
02
Deploy an AI Gateway for centralized authentication, cost tracking, and guardrails.
03
Connect MCP servers to LLMs or invoke A2A agents via a unified interface.
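For use case 02, a gateway deployment is typically consumed through an OpenAI-compatible client pointed at the proxy. The sketch below is an assumption-laden illustration: the local URL, port, and the placeholder virtual key are invented for the example, and the client call stays commented out since it needs a running gateway.

```python
# Hedged sketch: routing requests through an AI Gateway instead of a
# provider directly. The base URL and key below are placeholders.


def gateway_client_config(base_url: str, virtual_key: str) -> dict:
    """Arguments an OpenAI-compatible client would need to target the
    gateway (hypothetical helper for illustration)."""
    return {"base_url": base_url, "api_key": virtual_key}


cfg = gateway_client_config("http://localhost:4000", "sk-placeholder")
# from openai import OpenAI
# client = OpenAI(**cfg)
# client.chat.completions.create(
#     model="gpt-4o",
#     messages=[{"role": "user", "content": "Hello"}],
# )
```

Centralizing calls this way is what enables per-key cost tracking, authentication, and guardrails at a single choke point rather than in every application.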