MCP Rubber Duck
Supports UI · by nesquikm
Bridge to multiple OpenAI-compatible LLMs with interactive compare, vote, and debate UIs. Rubber duck debugging with AI panels that actually think.
What it does
MCP Rubber Duck connects your AI client to multiple LLM providers simultaneously — OpenAI, Gemini, Groq, Claude, Codex, and more. Inspired by rubber duck debugging, it lets you explain a problem to a "panel of ducks" and get diverse AI perspectives at once. Four tools render rich interactive HTML panels inside MCP Apps-compatible clients (Claude Desktop, Cursor, VS Code Copilot) for side-by-side comparison, voting, debates, and usage analytics.
Key features
- Compare Ducks — Side-by-side model responses with latency, token counts, and error states in a live interactive panel
- Duck Vote — Multi-LLM voting with confidence bars, consensus badge, and collapsible reasoning
- Duck Debate — Structured Oxford, Socratic, or adversarial debates rendered round-by-round
- Usage Stats — Provider breakdown with estimated costs and token distribution bars
- MCP Bridge — Route duck queries through external MCP servers for extended tool access
- CLI Agent Support — Use Claude Code, Codex, Gemini CLI, Grok, or Aider as ducks
- Guardrails — Rate limiting, token limits, pattern blocking, and PII redaction
Installation
Claude Desktop — add to claude_desktop_config.json:
{
  "mcpServers": {
    "rubber-duck": {
      "command": "npx",
      "args": ["mcp-rubber-duck"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
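Since the server bridges multiple providers at once, a multi-provider setup can be sketched by adding more keys to the same `env` block. The exact variable names for the Gemini and Groq providers (`GEMINI_API_KEY`, `GROQ_API_KEY` below) are assumptions — check the project's README for the names it actually reads:

```
{
  "mcpServers": {
    "rubber-duck": {
      "command": "npx",
      "args": ["mcp-rubber-duck"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "GEMINI_API_KEY": "...",
        "GROQ_API_KEY": "..."
      }
    }
  }
}
```

With more than one key configured, tools like Compare Ducks and Duck Vote have several models to fan a question out to.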
VS Code Copilot — add to .vscode/settings.json:
{
  "github.copilot.chat.mcp.servers": {
    "rubber-duck": {
      "command": "npx",
      "args": ["mcp-rubber-duck"],
      "env": { "OPENAI_API_KEY": "sk-..." }
    }
  }
}
Cursor — add to ~/.cursor/mcp.json:
{
  "mcpServers": {
    "rubber-duck": {
      "command": "npx",
      "args": ["mcp-rubber-duck"],
      "env": { "OPENAI_API_KEY": "sk-..." }
    }
  }
}
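Windsurf is listed under supported hosts but has no snippet here. Assuming it follows the same `mcpServers` convention as the other hosts (the config path `~/.codeium/windsurf/mcp_config.json` is an assumption based on Windsurf's usual layout — verify against Windsurf's docs), a matching entry would be:

```
{
  "mcpServers": {
    "rubber-duck": {
      "command": "npx",
      "args": ["mcp-rubber-duck"],
      "env": { "OPENAI_API_KEY": "sk-..." }
    }
  }
}
```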
Supported hosts
Claude Desktop, VS Code Copilot, Cursor, and Windsurf.
Quick install
npx mcp-rubber-duck
Information
- Pricing: free
- Published: 4/6/2026