
litellm

A Python SDK and Proxy Server (LLM Gateway) for calling 100+ LLM APIs in the OpenAI format.

Introduction

LiteLLM

LiteLLM is a Python SDK and Proxy Server (LLM Gateway) that lets you call over 100 LLM APIs using the familiar OpenAI input/output format. Supported providers include Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, SageMaker, HuggingFace, Replicate, and Groq.
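The unified interface can be sketched as follows. This is a minimal illustration, assuming the `litellm` package is installed and provider API keys are set in the environment; `build_request` is a hypothetical helper used here only to show that switching providers changes nothing but the model string.

```python
# Hypothetical helper: assembles the OpenAI-format request dict that
# litellm.completion() accepts (same shape for every provider).
def build_request(model: str, messages: list) -> dict:
    return {"model": model, "messages": messages}

messages = [{"role": "user", "content": "Hello, how are you?"}]

# Swapping providers only changes the model identifier:
openai_call = build_request("gpt-4o", messages)
anthropic_call = build_request("anthropic/claude-3-sonnet-20240229", messages)

# With litellm installed and the matching API key exported, the actual
# call would be (not executed here, since it needs network access):
#   from litellm import completion
#   response = completion(**openai_call)
#   print(response.choices[0].message.content)
```

Because both requests share the same message format, application code stays unchanged when a provider is swapped out.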

Key Features:
  • Multi-Provider Support: Call APIs from multiple LLM providers using a unified interface.
  • OpenAI Format Compatibility: Interact with LLMs using the familiar OpenAI API format.
  • Proxy Server Functionality: Run LiteLLM as a central gateway that routes requests to multiple providers and tracks usage.
  • Asynchronous and Streaming Support: Handle requests asynchronously and stream responses for real-time applications.
  • Logging and Observability: Integrate with various logging tools for better observability and tracking.
  • Rate Limiting and Budget Management: Set budgets and rate limits for API usage across projects.
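The streaming support noted above returns the response as an iterable of chunks in the OpenAI streaming shape. A minimal sketch of consuming such a stream, using stub chunks as hypothetical stand-ins for a real `stream=True` response so it runs without network access:

```python
from types import SimpleNamespace

def stream_text(chunks) -> str:
    """Accumulate streamed delta content into a single string.

    Works on any iterable whose chunks mirror the OpenAI/litellm
    streaming shape: chunk.choices[0].delta.content (may be None).
    """
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:
            parts.append(delta)
    return "".join(parts)

# Stub chunks standing in for a real streamed response
# (a real one would come from completion(..., stream=True)):
fake = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=c))])
    for c in ["Hel", "lo", None, "!"]
]
print(stream_text(fake))  # → Hello!
```

The `None` delta models keep-alive or finish chunks that carry no text, which real streams also emit.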

Benefits:
  • Simplified API Management: Reduce complexity in managing multiple LLM APIs.
  • Enhanced Performance: Optimized for speed and efficiency in API calls.
  • Community Support: Open-source with a growing community of contributors.

Highlights:
  • Stable releases with rigorous testing.
  • Comprehensive documentation for easy setup and usage.
  • Active development and feature requests encouraged.
