LiteLLM MCP Server

Overview

The LiteLLM MCP Server provides a unified proxy for calling multiple LLM providers. It normalizes the API across OpenAI, Anthropic, Cohere, and many others with automatic fallback and load balancing.
Tools & Capabilities

- completion: Generate text via any supported provider
- list_models: List the models configured on the proxy
- get_spend: Retrieve API spend tracking data
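On the wire, these tools are invoked with standard MCP JSON-RPC `tools/call` requests. Below is a minimal sketch of the request an MCP client would send to the `completion` tool; the `model` and `messages` argument names are assumptions about this server's tool schema, so consult the schema returned by `tools/list` before relying on them.

```python
import json

# Sketch of an MCP tools/call request for the "completion" tool.
# The argument names ("model", "messages") are assumptions; check the
# tool schema advertised by the server via tools/list.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "completion",
        "arguments": {
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": "Hello"}],
        },
    },
}
print(json.dumps(request))
```

In practice an MCP client library builds and sends this message for you; the sketch only shows what crosses the stdio transport.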

Installation

```bash
pip install mcp-server-litellm
```
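A quick way to confirm the install succeeded is to check that the package is importable, without starting the server:

```python
import importlib.util

# Look up the mcp_server_litellm package without importing it
spec = importlib.util.find_spec("mcp_server_litellm")
print("installed" if spec is not None else "not installed")
```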
Example Usage

```json
{
  "mcpServers": {
    "litellm": {
      "command": "python",
      "args": ["-m", "mcp_server_litellm"]
    }
  }
}
```
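Provider credentials are typically supplied as environment variables (LiteLLM reads standard keys such as `OPENAI_API_KEY` and `ANTHROPIC_API_KEY`), and many MCP clients accept an `env` field in the server entry. A sketch, assuming your client supports `env` and you use OpenAI; the key value is a placeholder:

```json
{
  "mcpServers": {
    "litellm": {
      "command": "python",
      "args": ["-m", "mcp_server_litellm"],
      "env": {
        "OPENAI_API_KEY": "<your-openai-key>"
      }
    }
  }
}
```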

Quick Info

- Author: berriai
- Language: Python
- Status: Stable
- Stars: 120
- Last Updated: Feb 12, 2026

Need a Custom MCP Server?

Our team builds custom MCP servers tailored to your workflow.

Get in Touch
