
Ollama MCP Bridge
By jonigl
Transparent API proxy adding MCP server tool capabilities to Ollama LLMs.
What it does
Ollama MCP Bridge provides a seamless API layer in front of the Ollama API, allowing local LLMs to access tools from multiple MCP servers transparently. It intercepts /api/chat requests and manages the multi-round tool execution loop between the model and various MCP transports.
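Because the bridge mirrors Ollama's /api/chat endpoint, an ordinary HTTP request works unchanged; only the base URL differs. A minimal sketch using only the standard library (the bridge address and port are assumptions; substitute your own deployment):

```python
import json
from urllib import request

# Assumed bridge address; substitute your actual host/port.
BRIDGE_URL = "http://localhost:8000/api/chat"

def build_chat_payload(model: str, prompt: str) -> dict:
    """Build the same request body Ollama's /api/chat expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single JSON response
    }

def chat_via_bridge(model: str, prompt: str) -> dict:
    """POST to the bridge; it runs the MCP tool loop and returns the final reply."""
    data = json.dumps(build_chat_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        BRIDGE_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

From the client's point of view this is an ordinary Ollama chat call; the tool-execution rounds happen inside the bridge before the final response comes back.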
Tools
This is a bridge/proxy and does not provide its own tools; instead, it exposes all tools from any MCP servers configured in mcp-config.json (e.g., filesystem, weather, or custom servers).
Installation
Add the bridge to your local environment and configure your MCP servers in mcp-config.json:
{
  "mcpServers": {
    "example-server": {
      "command": "uvx",
      "args": ["some-mcp-server"]
    }
  }
}
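For instance, a config wiring up two servers might look like the following sketch (the filesystem server package is the widely used @modelcontextprotocol/server-filesystem; the weather server name is a placeholder):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "weather": {
      "command": "uvx",
      "args": ["some-weather-mcp-server"]
    }
  }
}
```

Every tool exposed by each configured server becomes available to the model through the bridge.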
Run via Docker, or install with pip: pip install ollama-mcp-bridge
Supported hosts
Confirmed for local Ollama deployments and any client that can point to an Ollama-compatible API endpoint.
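Since the bridge speaks the same API as Ollama, switching an existing client over amounts to swapping the host. A small illustrative helper (the bridge port 8000 is an assumption; Ollama's own default is 11434):

```python
from urllib.parse import urlsplit, urlunsplit

def point_at_bridge(ollama_url: str, bridge_host: str = "localhost:8000") -> str:
    """Rewrite an Ollama API URL so the same client talks to the bridge instead.

    bridge_host is an assumed default; use whatever host:port you deployed on.
    """
    parts = urlsplit(ollama_url)
    return urlunsplit((parts.scheme, bridge_host, parts.path, parts.query, parts.fragment))

# Ollama's default chat endpoint becomes the bridge's endpoint:
print(point_at_bridge("http://localhost:11434/api/chat"))
```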
Quick install
pip install ollama-mcp-bridge
Information
- Pricing: free
- Published: 5/11/2026