
Dolphin MCP
by QuixiAI
Flexible Python library and CLI to interact with MCP servers using any LLM provider.
What it does
Dolphin MCP acts as a universal bridge between various Large Language Model (LLM) providers and MCP servers. It allows users to query and interact with multiple MCP servers simultaneously through a conversational interface, making the tools exposed by those servers available to models from OpenAI, Anthropic, Ollama, and LMStudio.
Tools
- Tool Discovery: Automatically identifies and lists available tools from connected MCP servers.
- Multi-Provider Routing: Routes natural language queries to a selected LLM provider and executes the resulting tool calls against the MCP servers.
- Conversational Interface: Maintains context across multiple turns of interaction via a CLI or programmatic library.
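The discovery-and-routing flow above can be sketched in plain Python. This is an illustrative model of the bridge pattern, not Dolphin MCP's actual API: the registry, function names, and the example "add" tool are all hypothetical stand-ins for tools advertised by connected MCP servers.

```python
# Illustrative sketch (NOT Dolphin MCP's real API) of the bridge pattern:
# the LLM provider emits a tool call, and the client dispatches it to the
# MCP server that advertised that tool during discovery.
from typing import Any, Callable, Dict

# Hypothetical registry populated during tool discovery; each connected
# MCP server contributes the tools it exposes.
TOOL_REGISTRY: Dict[str, Callable[..., Any]] = {}

def register_tool(name: str, fn: Callable[..., Any]) -> None:
    """Record a tool discovered on some MCP server."""
    TOOL_REGISTRY[name] = fn

def dispatch_tool_call(name: str, arguments: Dict[str, Any]) -> Any:
    """Execute a tool call produced by the LLM against the owning tool."""
    if name not in TOOL_REGISTRY:
        raise KeyError(f"unknown tool: {name}")
    return TOOL_REGISTRY[name](**arguments)

# Example: a server exposing a single 'add' tool (placeholder).
register_tool("add", lambda a, b: a + b)
print(dispatch_tool_call("add", {"a": 2, "b": 3}))  # prints 5
```

In the real library this dispatch happens behind the conversational loop: the selected provider decides which tool to call, and results are fed back into the conversation.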
Installation
To use Dolphin MCP as a server/client bridge, install via pip:
pip install dolphin-mcp
Then configure your servers in mcp_config.json:
{
  "mcpServers": {
    "example-server": {
      "command": "command-to-start-server",
      "args": ["arg1"]
    }
  }
}
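Because the config is plain JSON, its shape can be checked with the standard library alone. The sketch below embeds the template config as a string so it is self-contained; the server name and command are the placeholder values from above, and the "each entry needs a command" rule is an assumption based on the template rather than a documented schema.

```python
# Minimal sketch: sanity-check the shape of an mcp_config.json using only
# the standard library. Values are the placeholders from the template above.
import json

config_text = """
{
  "mcpServers": {
    "example-server": {
      "command": "command-to-start-server",
      "args": ["arg1"]
    }
  }
}
"""

config = json.loads(config_text)
for name, server in config["mcpServers"].items():
    # Assumed rule: every server entry needs a launch command; args optional.
    assert "command" in server, f"{name} is missing a command"
    print(name, "->", server["command"])
```

Running a check like this before starting the CLI catches malformed config files early, with a clearer error than a failed server launch.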
Supported hosts
- claude
- cursor
- codex
Quick install
pip install dolphin-mcp
Information
- Pricing: free
- Published: 5/7/2026