LiteLLM Proxy Server - LLM Gateway
BerriAI
LLM Gateway to call 100+ LLM APIs using the OpenAI format (Bedrock, Vertex AI, Azure OpenAI, and more)
With LiteLLM Proxy Server, you get a single proxy to call 100+ LLMs through one unified OpenAI-compatible interface, with spend tracking and budgets per virtual key and per user.
You can set budgets and rate limits per project, API key, and model.
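As a sketch of what a budget-limited virtual key request could look like, the JSON body below uses field names modeled on LiteLLM's key-generation endpoint (`max_budget`, `rpm_limit`, `models`); treat the exact field names, model aliases, and limits as illustrative assumptions and check them against your proxy version's docs.

```python
import json

# Hypothetical request body for the proxy's key-generation endpoint.
# Field names follow LiteLLM's documented conventions but are assumptions here.
key_request = {
    "models": ["gpt-4o", "bedrock-claude"],  # models this virtual key may call
    "max_budget": 25.0,                      # spend cap (USD) for this key
    "rpm_limit": 60,                         # requests per minute
}

# Serialize as the POST body you would send to the proxy.
body = json.dumps(key_request)
```

Keeping limits on the key (rather than in client code) means every caller sharing that key is capped centrally by the proxy.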
The proxy also translates inputs to each provider's completion, embedding, and image_generation endpoints, and handles retry/fallback logic across multiple LLM deployments.
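The unified interface can be sketched as follows: the same OpenAI-format `/chat/completions` request body works regardless of which provider the proxy routes to, with only the model name changing. The model aliases below are illustrative assumptions, not real deployment names.

```python
def chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-format chat completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same request shape whether the proxy routes to Azure OpenAI, Bedrock,
# or Vertex AI; only the "model" field differs (aliases are hypothetical).
azure_req = chat_request("azure-gpt-4o", "Hello!")
bedrock_req = chat_request("bedrock-claude", "Hello!")

assert azure_req["messages"] == bedrock_req["messages"]
```

Because the payload shape is constant, client code written against the OpenAI SDK can switch providers by changing a single string.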