
MCP Manager
by dmaxter
Bridge LLMs and MCP servers with flexible workspace configurations and multi-model support.
What it does
MCP Manager is a middleware bridge between Large Language Models (LLMs) and Model Context Protocol (MCP) servers. It exposes a standardized HTTP API for interacting with local and remote MCP servers through natural-language prompts, and it lets you define distinct workspaces, each pairing its own LLM endpoint with its own set of MCP servers.
Tools
- Workspace Management: Configure separate environments with unique LLM and MCP server pairings via YAML.
- Model Routing: Support for Gemini and Azure OpenAI (with Claude and OpenAI planned).
- HTTP Gateway: Exposes an OpenAI-compatible API for sending prompts to configured MCP environments.
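The project's YAML schema is not documented on this page; as a hedged sketch of what a workspace configuration could look like, assuming illustrative key names (workspaces, llm, provider, endpoint, mcp_servers) that are not taken from the project:

```yaml
# Hypothetical config.yaml sketch -- all key names are illustrative
# assumptions, not the project's documented schema.
workspaces:
  research:
    llm:
      provider: gemini            # Gemini is a confirmed supported host
      endpoint: https://generativelanguage.googleapis.com
      model: gemini-1.5-pro
    mcp_servers:
      - name: filesystem
        command: mcp-server-filesystem
        args: ["--root", "/data"]
  drafting:
    llm:
      provider: azure-openai      # Azure OpenAI is a confirmed supported host
      endpoint: https://example.openai.azure.com
      model: gpt-4o
    mcp_servers:
      - name: web-search
        command: mcp-server-search
```

Each workspace here pairs one LLM endpoint with its own set of MCP servers, matching the "unique LLM and MCP server pairings" described above.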
Installation
Download the binary from the GitLab releases page and configure the config.yaml file. To register the binary with an MCP client, add an entry such as:
{
  "mcpServers": {
    "mcp-manager": {
      "command": "mcp-manager",
      "args": []
    }
  }
}
Supported hosts
Confirmed support for Gemini and Azure OpenAI endpoints.
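Because the gateway is OpenAI-compatible, a prompt can be sent as a standard chat-completions request body. A minimal sketch of building such a payload, assuming (this is not documented here) that the workspace name goes in the `model` field to select which LLM/MCP-server pairing handles the prompt:

```python
import json

def build_prompt_request(workspace: str, prompt: str) -> str:
    """Serialize an OpenAI-style chat-completions request body.

    Using the workspace name in the "model" slot is an assumption
    about how MCP Manager routes requests, not a documented contract.
    """
    body = {
        "model": workspace,
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body)

# Example payload targeting a hypothetical "research" workspace.
print(build_prompt_request("research", "Summarize the latest report"))
```

The resulting JSON can be POSTed to the gateway's chat-completions endpoint with any OpenAI-compatible client; the exact host, port, and path depend on your deployment.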
Quick install
curl -L https://gitlab.com/DMaxter/mcp-manager/-/releases/permalink/latest -o mcp-manager && chmod +x mcp-manager
Information
- Pricing: free
- Published: 4/30/2026