Ollama MCP
by rawveg
Complete Ollama SDK integration for local LLM management, chat, and embeddings within MCP clients.
What it does
Ollama MCP connects your AI assistant to a local Ollama instance, providing full programmatic control over local LLMs. It allows the AI to manage models, generate completions, and create embeddings without leaving the chat interface.
Tools
- ollama_list: List available local models.
- ollama_show: Get detailed model info.
- ollama_pull: Download models from the Ollama library.
- ollama_push: Push models to the library.
- ollama_copy: Duplicate existing models.
- ollama_delete: Remove models from storage.
- ollama_create: Create custom models via Modelfile.
- ollama_ps: List currently running models.
- ollama_generate: Generate text completions.
- ollama_chat: Interactive chat with local models.
- ollama_embed: Generate text embeddings.
- ollama_web_search: Search the web via Ollama Cloud (requires API key).
- ollama_web_fetch: Fetch web page content via Ollama Cloud (requires API key).
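Under the hood, an MCP host invokes these tools with JSON-RPC `tools/call` requests. A minimal sketch of what such a request might look like for `ollama_chat` — the argument names (`model`, `messages`) mirror Ollama's chat API and are an assumption about this server's exact tool schema:

```python
import json

# Hypothetical MCP "tools/call" request for the ollama_chat tool.
# "llama3.2" and the argument shape are illustrative assumptions,
# not confirmed details of this server's schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ollama_chat",
        "arguments": {
            "model": "llama3.2",
            "messages": [{"role": "user", "content": "Hello!"}],
        },
    },
}

# The host serializes this and writes it to the server's stdin.
print(json.dumps(request, indent=2))
```

In practice the MCP host builds and sends these requests for you; the sketch only shows the wire shape.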
Installation
Add to claude_desktop_config.json:
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"]
    }
  }
}
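The config above assumes Ollama is running at its default address (localhost:11434). If your instance listens elsewhere, the standard `OLLAMA_HOST` environment variable — honored by Ollama's own CLI and SDKs, and presumably by this server, though that is an assumption — can likely be passed through the MCP config:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"],
      "env": {
        "OLLAMA_HOST": "http://192.168.1.50:11434"
      }
    }
  }
}
```

The host address shown is a placeholder; substitute your own.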
Supported hosts
- Claude Desktop
- Windsurf
- Cursor
- Cline
Quick install
npx -y ollama-mcp
Information
- Pricing: free
- Published: 4/17/2026