
LiteLLM

Open-source, high-performance LLM proxy for OpenAI, Anthropic, Cohere, and more.

Platform: Zeabur
Deployed: 255 times
Publisher: Zeabur
Created: 2024-07-27
Tags: AI, LLM, Proxy

LiteLLM

LiteLLM is an open-source, high-performance proxy for large language models. It provides a unified OpenAI-compatible API for 100+ LLM providers including OpenAI, Anthropic, Google Gemini, Azure, AWS Bedrock, Cohere, and more.

Features

  • Unified API — Call 100+ LLMs using the OpenAI API format
  • Load balancing — Route requests across multiple models and providers
  • Cost tracking — Monitor spend per key, user, and team
  • Rate limiting — Set RPM/TPM limits per API key
  • Admin UI — Built-in web dashboard for managing models, keys, and usage

Getting Started

  1. After deployment, visit https://<your-domain>/ui to access the LiteLLM Admin UI.
  2. Log in with the username admin and the Master Key shown in the service instructions.
  3. Go to the Models tab to add your LLM provider API keys (e.g. OpenAI, Anthropic).
  4. Use https://<your-domain> as the OpenAI-compatible API base URL in your applications.
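Because the proxy exposes an OpenAI-compatible API, any OpenAI client can talk to it by swapping the base URL. The sketch below builds such a request with only the Python standard library; the domain, key, and model name ("gpt-4o") are placeholders you would replace with your own deployment's values.

```python
import json
import urllib.request

# Placeholder values — substitute your deployment's domain and a
# LiteLLM virtual key (or the Master Key).
BASE_URL = "https://<your-domain>"
API_KEY = "sk-your-master-or-virtual-key"

def chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-format /chat/completions request for the proxy."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = chat_request("gpt-4o", "Hello")
# urllib.request.urlopen(req)  # send once the proxy is deployed
```

Because the request shape is the standard OpenAI format, switching providers is just a matter of changing the `model` string to another model you have configured in the Admin UI.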

Notes

  • First startup takes 1–2 minutes while database migrations run.
  • The PostgreSQL database is included and configured automatically.

For more information, visit LiteLLM's official documentation.