MCP Bridge
by secretiveshell
Middleware providing an OpenAI-compatible endpoint to call MCP tools from any OpenAI-compatible client.
What it does
MCP Bridge acts as a translation layer between the OpenAI API and the Model Context Protocol (MCP). It allows developers to use any LLM client that supports the OpenAI API to interact with MCP tools, effectively bypassing the need for native MCP support in the client application.
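Because the bridge speaks the standard OpenAI API, a client needs no MCP-specific changes. As a sketch, the request below is an ordinary chat completion payload; the base URL and model name are placeholders, not values taken from the project:

```python
import json

# Placeholder address for a running MCP Bridge instance; the bridge exposes
# the standard OpenAI chat-completions route, so any OpenAI client can target it.
BASE_URL = "http://localhost:8000/v1"  # assumed host/port, not from the repo

# A plain chat completion payload -- no MCP-specific fields are needed;
# the bridge injects the configured MCP tools on the server side.
payload = {
    "model": "gpt-4o",  # placeholder; passed through to the configured inference server
    "messages": [
        {"role": "user", "content": "Fetch https://example.com and summarise it."}
    ],
}

# This is the body an OpenAI-compatible client would POST to BASE_URL + "/chat/completions".
request_body = json.dumps(payload)
```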
Tools
As a bridge, it dynamically exposes the tools of the MCP servers configured in its `config.json`. Key functionality includes:
- OpenAI API Compatibility: Provides standard chat completion endpoints that trigger MCP tool calls.
- SSE Bridge: Offers an SSE endpoint allowing MCP-native clients to use MCP Bridge as a server.
- Dynamic Tool Routing: Forwards tool requests from the OpenAI API to the appropriate underlying MCP server.
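For reference, a `config.json` of the shape implied by the inline example in the Docker instructions below might look like this (the `fetch` server entry mirrors that example; other fields may exist in the real schema):

```json
{
  "inference_server": {
    "base_url": "http://example.com/v1",
    "api_key": "None"
  },
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```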
Installation
Docker
- Clone the repo and create a `compose.yml`.
- Add configuration via environment variables or a volume mount, for example:

      environment:
        - MCP_BRIDGE__CONFIG__JSON='{"inference_server":{"base_url":"http://example.com/v1","api_key":"None"},"mcp_servers":{"fetch":{"command":"uvx","args":["mcp-server-fetch"]}}}'

- Run `docker-compose up --build -d`.
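Putting the steps together, a minimal `compose.yml` along these lines could work; the service name and port mapping are assumptions, not taken from the repository:

```yaml
services:
  mcp-bridge:
    build: .            # build from the cloned repo
    ports:
      - "8000:8000"     # assumed port; check the repository for the actual default
    environment:
      - MCP_BRIDGE__CONFIG__JSON={"inference_server":{"base_url":"http://example.com/v1","api_key":"None"},"mcp_servers":{"fetch":{"command":"uvx","args":["mcp-server-fetch"]}}}
```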
Manual
- Run `uv sync`.
- Create `config.json` in the project root.
- Run `uv run mcp_bridge/main.py`.
Supported hosts
Works with any OpenAI-compatible client, including Open WebUI.
Quick install
`docker-compose up --build -d`

Information
- Pricing: free