LiteLLM is a universal LLM API gateway that provides OpenAI-compatible APIs for 100+ LLM providers including Bedrock, Azure, OpenAI, Anthropic, Vertex AI, and more.
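To make the gateway idea concrete, here is a minimal sketch of a LiteLLM proxy `config.yaml` that routes several providers behind one OpenAI-compatible endpoint. The model names, aliases, and environment-variable names below are illustrative assumptions, not taken from any specific deployment:

```yaml
model_list:
  - model_name: gpt-4o                 # alias clients request
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet          # same OpenAI-format calls, different backend
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: bedrock-titan          # AWS Bedrock example
    litellm_params:
      model: bedrock/amazon.titan-text-express-v1
      aws_region_name: us-east-1
```

Starting the proxy with this file (`litellm --config config.yaml`) exposes all three backends through one endpoint, so any OpenAI-SDK client can switch providers by changing only the `model` field in its request.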
Related listings and projects that mention LiteLLM:

- hub.docker.com (Python): Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
- github.com (TypeScript): 🪢 Open-source LLM engineering platform: LLM observability, metrics, evals, prompt management, playground, datasets. Integrates with OpenTelemetry, Langchain, OpenAI SDK, LiteLLM, and more. 🍊 YC W23
- github.com (Python): A model-driven approach to building AI agents in just a few lines of code.
- github.com (Python): The most accurate document search and store for building AI apps.
- github.com (Go): Fast enterprise AI gateway (claims 50x faster than LiteLLM) with adaptive load balancer, cluster mode, guardrails, and support for 1000+ models.
- github.com (TypeScript): Create multiple isolated Claude Code variants with custom providers (Z.ai, MiniMax, OpenRouter, LiteLLM).
- github.com (Python): Claude Code settings, commands, and agents for vibe coding.
- github.com: Lightweight CLI tool for evaluating AI skills (SKILL.md) with control-vs-treatment testing using LiteLLM.
- gitlab.com: Proxy server to call all LLM APIs using the OpenAI format. Bedrock, Azure, OpenAI (100+ LLMs).