
OpenClaude — Multi-LLM Shim
from trending-skills22
Routes Claude Code tool calls through an OpenAI-compatible shim so Claude Code can use GPT-4o, Gemini, Ollama, and other OpenAI-style models while preserving tool support.
What it does
OpenClaude provides an OpenAI-compatible shim for Claude Code, translating Anthropic-style tool and stream interfaces into OpenAI Chat Completions calls. This lets Claude Code continue to use its full tool system (bash, file ops, webfetch, agents, notebooks) while running on a wide range of models: GPT-4o, DeepSeek, Gemini (via OpenRouter), Ollama, Groq, Mistral, Azure OpenAI, and local LM hosts.
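As a sketch of the translation target, a Claude Code tool such as `bash` ends up expressed as an entry in an OpenAI Chat Completions `tools` array. The payload below is illustrative only — the schema shape (`tools` → `function` → name/description/parameters) is standard OpenAI function calling, but the field values are assumptions, not the shim's actual output:

```shell
# Illustrative OpenAI-style request body the shim might emit when
# Claude Code exposes its "bash" tool. The tools schema shape is
# standard OpenAI Chat Completions; the exact values are examples.
payload='{
  "model": "gpt-4o",
  "messages": [{"role": "user", "content": "List the files here"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "bash",
      "description": "Run a shell command and return its output",
      "parameters": {
        "type": "object",
        "properties": {"command": {"type": "string"}},
        "required": ["command"]
      }
    }
  }]
}'

# A request like this could then be sent to any OpenAI-compatible
# endpoint, e.g.:
#   curl -s "$OPENAI_BASE_URL/chat/completions" \
#     -H "Authorization: Bearer $OPENAI_API_KEY" \
#     -H "Content-Type: application/json" -d "$payload"
# Sanity-check that the payload is well-formed JSON:
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload ok"
```

Because every provider in the list above speaks this same request shape, the shim only has to perform the Anthropic-to-OpenAI mapping once.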
When to use it
Use this skill when you need Claude Code's agent/tooling surface but prefer, or are required, to run on a non-Anthropic provider (OpenAI, Ollama, OpenRouter-backed Gemini, or another OpenAI-compatible endpoint). It's useful for teams migrating away from Anthropic, testing alternative models, or consolidating tooling around OpenAI-compatible APIs.
What's included
- Scripts: none bundled with this SKILL.md (has_scripts=false)
- References: none in repo root (has_references=false)
- Instructions: The SKILL.md documents installation options (npm or from-source), required environment variables (CLAUDE_CODE_USE_OPENAI, OPENAI_API_KEY/BASE_URL/MODEL), provider examples (OpenAI, DeepSeek, Gemini/OpenRouter, Ollama, Groq, Mistral, Azure, Codex), and developer workflow commands for running and diagnosing the shim.
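For example, pointing Claude Code at the stock OpenAI API might look like the following. The environment variable names come from the SKILL.md described above; the specific values (flag value, model names, base URLs) are illustrative, not prescribed by the shim:

```shell
# Route Claude Code through the shim to OpenAI.
# Variable names are from the SKILL.md; values are examples — check the
# shim's docs for the exact value CLAUDE_CODE_USE_OPENAI expects.
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY="sk-..."                        # your provider key
export OPENAI_BASE_URL="https://api.openai.com/v1"
export OPENAI_MODEL="gpt-4o"

# Local Ollama variant: same variables, different endpoint. Ollama's
# OpenAI-compatible API accepts any non-empty API key.
# export OPENAI_BASE_URL="http://localhost:11434/v1"
# export OPENAI_API_KEY="ollama"
# export OPENAI_MODEL="llama3.1"
```

Switching providers is then just a matter of swapping `OPENAI_BASE_URL`, `OPENAI_API_KEY`, and `OPENAI_MODEL`; no Claude Code configuration changes beyond these variables should be needed.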
Compatible agents
Primarily targets Claude Code (Anthropic-based agent stacks) but enables compatibility with any agent runtime that can call OpenAI-style chat completions (OpenAI, OpenRouter-backed Gemini, Ollama local runtimes, Codex).
