Overview
The Ollama MCP Server enables AI agents to use locally-running language models through Ollama. It provides tools for listing available models, generating text, creating embeddings, and pulling new models. Ideal for privacy-sensitive applications that require on-premises AI inference.
Tools & Capabilities
- generate: Generate text using a local model
- list_models: List available local models
- embed: Generate embeddings from text
- pull_model: Pull a new model from the Ollama registry
Setup
Installation
```bash
npx -y mcp-server-ollama
```

Example Usage
```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "mcp-server-ollama"],
      "env": {
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```

Quick Info
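The server presumably forwards requests to Ollama's HTTP API at `OLLAMA_URL`. Before wiring it into a client config, it can help to confirm Ollama is actually reachable; a minimal sketch using Ollama's `GET /api/tags` endpoint, which lists locally installed models:

```typescript
// Health check: confirm Ollama is listening at OLLAMA_URL before starting
// the MCP server. GET /api/tags is Ollama's endpoint for listing the
// models installed locally.
const base = process.env.OLLAMA_URL ?? "http://localhost:11434";

async function checkOllama(): Promise<number> {
  const res = await fetch(`${base}/api/tags`);
  if (!res.ok) throw new Error(`Ollama responded with HTTP ${res.status}`);
  const { models } = (await res.json()) as { models: unknown[] };
  return models.length; // number of locally installed models
}

// e.g. checkOllama().then((n) => console.log(`${n} model(s) installed`));
```

If the check fails, start Ollama (`ollama serve`) or point `OLLAMA_URL` at the correct host and port.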
Author: community
Language: TypeScript
Status: Stable
Stars: ★ 240
Last Updated: Feb 12, 2026