
Entroly
by juyterman1000
Cut AI token costs by 70-95% with a self-evolving codebase compression daemon for LLM context windows.
What it does
Entroly is a high-performance codebase compression engine that addresses the 'blind spot' problem in AI coding tools: a model can only see whatever fits in its context window. Entroly compresses entire repositories into a compact representation, allowing models like Claude and GPT to see the full project structure without burning through token budgets.
Tools
- compress: Compresses any content (code, JSON, logs) to a specific token budget.
- compress_messages: Optimizes full LLM conversation histories, retaining critical context while removing noise.
- wrap: Wraps existing coding agents (Claude Code, Cursor, Codex) to automatically optimize their context delivery.
- proxy: Provides an HTTP proxy (localhost:9377) to inject compression into any OpenAI/Anthropic-compatible app.
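As a sketch of how a host might invoke the compress tool over MCP, the request below builds a standard JSON-RPC tools/call payload. The argument names (content, token_budget) are assumptions inferred from the tool description above, not confirmed parameter names:

```python
import json

# Hypothetical MCP `tools/call` request for Entroly's `compress` tool.
# Argument names (`content`, `token_budget`) are assumed from the docs,
# not taken from a published schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "compress",
        "arguments": {
            "content": "def add(a, b):\n    return a + b\n",
            "token_budget": 500,
        },
    },
}
print(json.dumps(request))
```

In practice the MCP host (e.g. Claude Desktop) constructs this call for you; the payload is shown only to illustrate the tool's shape.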
Installation
Install the package first (pip install entroly), then add to claude_desktop_config.json:
{
  "mcpServers": {
    "entroly": {
      "command": "entroly",
      "args": ["go"]
    }
  }
}
Supported hosts
Claude, Cursor, Codex, Aider, GitHub Copilot, Windsurf, Cline, Cody
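For apps not listed above, the proxy mode is the integration point: point any OpenAI-compatible request at localhost:9377 instead of the provider's API. The sketch below builds (but does not send) such a request; the /v1/chat/completions path is an assumption based on the "OpenAI/Anthropic compatible" description:

```python
import json
import urllib.request

# Hypothetical: aim an OpenAI-style request at the Entroly proxy on
# localhost:9377. The /v1/chat/completions path is an assumption based
# on the "OpenAI-compatible" wording; the request is built, not sent.
body = json.dumps({
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Summarize src/"}],
}).encode()

req = urllib.request.Request(
    "http://localhost:9377/v1/chat/completions",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.full_url)
```

The proxy would then compress the context in transit before forwarding the request upstream.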
Quick install
pip install entroly && entroly go
Information
- Pricing: free
- Published: 5/3/2026
