# ✨ Winx - High-Performance Rust MCP Server ✨

🚀 A 1:1, performance-optimized Rust implementation of WCGW (What Could Go Wrong) 🚀
Winx is a specialized Model Context Protocol (MCP) server that provides high-performance tools for LLM code agents. It implements the core functionality of WCGW in pure Rust for maximum efficiency and stability.
## ⚡ Performance

Benchmarks on an i9-13900K + RTX 4090 under WSL2:
| Metric | Winx (Rust) | Python (WCGW) | Improvement |
|---|---|---|---|
| Startup Time | < 5ms | ~200ms | 🚀 40x Faster |
| Shell Command Latency | < 1ms | ~15ms | 🚀 15x Lower |
| File Read (1MB) | 0.4ms | ~40ms | 🚀 100x Faster |
| Memory Footprint | ~5MB | ~65MB | 📉 13x Smaller |
*Benchmarks performed using `hyperfine` and memory profiling tools on standard workloads.*
## 🛠️ MCP Tools

| Tool | Description |
|---|---|
| `Initialize` | **Required.** Sets up the workspace environment and shell mode (Restricted/Full). |
| `BashCommand` | Executes shell commands with full PTY support (interactive, stateful). |
| `ReadFiles` | Efficient zero-copy file reading with line-range support. |
| `FileWriteOrEdit` | Robust file modification using exact SEARCH/REPLACE blocks. |
| `ContextSave` | Snapshots the current project context (files + description) for later resumption. |
| `ReadImage` | Optimized base64 image reading for multimodal agent contexts. |
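As an illustration, a `FileWriteOrEdit` call might carry an aider-style SEARCH/REPLACE block such as the one below. This is a sketch of the general marker convention, not the tool's exact schema; consult the tool's parameter documentation for the precise syntax Winx expects:

```
<<<<<<< SEARCH
fn greet() {
    println!("hello");
}
=======
fn greet() {
    println!("hello, world");
}
>>>>>>> REPLACE
```

The SEARCH half must match the file content exactly (including whitespace) for the edit to apply, which is what makes the format robust against ambiguous patches.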
## 🚀 Quick Start

### Prerequisites

- Rust 1.75+
- Linux / macOS / WSL2

### Installation

```shell
git clone https://github.com/gabrielmaialva33/winx-code-agent.git
cd winx-code-agent
cargo build --release
```
### Integration with Claude Desktop

Add the following to `~/.config/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "winx": {
      "command": "/path/to/winx-code-agent/target/release/winx-code-agent",
      "args": ["serve"],
      "env": { "RUST_LOG": "info" }
    }
  }
}
```
## 🏗️ Architecture
- PTY Shell: Full pseudo-terminal support for interactive commands.
- Zero-Copy I/O: Uses memory-mapped files for blazing fast reads.
- Strict Typing: Powered by Rust's safety and performance guarantees.
- WCGW Parity: Designed to be a drop-in replacement for Python-based toolsets.
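To make the line-range reading described above concrete, here is a minimal, std-only Rust sketch of inclusive 1-based line-range reads. It is illustrative only: the actual server uses memory-mapped I/O for zero-copy reads, and the function and file names here are hypothetical.

```rust
use std::fs::File;
use std::io::{BufRead, BufReader, Write};

/// Read an inclusive, 1-based line range from a file.
/// Sketch of the ReadFiles line-range concept; not the server's
/// actual (memory-mapped) implementation.
fn read_line_range(path: &str, start: usize, end: usize) -> std::io::Result<Vec<String>> {
    let reader = BufReader::new(File::open(path)?);
    reader
        .lines()
        .skip(start.saturating_sub(1))
        .take(end.saturating_sub(start) + 1)
        .collect()
}

fn main() -> std::io::Result<()> {
    // Create a small sample file to read from.
    let path = "/tmp/winx_demo.txt";
    let mut f = File::create(path)?;
    for i in 1..=5 {
        writeln!(f, "line {}", i)?;
    }

    // Read lines 2 through 4 (inclusive).
    let lines = read_line_range(path, 2, 4)?;
    assert_eq!(lines, vec!["line 2", "line 3", "line 4"]);
    println!("{:?}", lines);
    Ok(())
}
```

A buffered line iterator keeps the sketch simple; a memory-mapped approach avoids copying file contents into userspace buffers, which is where the benchmark gains come from.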
## 📜 License
MIT - Gabriel Maia (@gabrielmaialva33)
✨ Optimized for the next generation of AI Agents ✨