User Prompt MCP
A Model Context Protocol (MCP) server for Cursor that enables requesting user input during generation. This is mostly AI-generated code.
Overview
This project implements an MCP server that allows Cursor (or any MCP-compatible client) to request additional input from users during model generation without ending the generation process. It serves as a bridge between the AI model and the user, creating a more interactive experience.
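To make the flow concrete: when the model wants to ask the user something, it issues a normal MCP tools/call request over stdio, the server pops up a dialog, and the user's answer comes back as the tool result. The exchange might look roughly like the sketch below (the tool name input_user_prompt is the one referenced in the rules snippet later in this README; the prompt argument name and the exact result shape are illustrative assumptions, not guaranteed by this project):

```jsonc
// Request sent by Cursor over stdio (JSON-RPC 2.0): the model asks the user a question
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "input_user_prompt",
    "arguments": { "prompt": "Should I also migrate the tests to the new API?" }
  }
}

// Response returned once the user submits the dialog: the answer becomes the tool result
{
  "jsonrpc": "2.0",
  "id": 7,
  "result": {
    "content": [ { "type": "text", "text": "Yes, migrate the tests too." } ]
  }
}
```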
Features
- User Input Prompting: Allows the AI to ask for more information during generation
- Simple GUI: Presents input prompts in a dialog box with text wrapping
- Cross-Platform: Should work on both Linux (tested) and macOS
- Stdio Transport: Integrates with Cursor over stdio
Installation
Prerequisites
- For GUI functionality:
  - Linux: zenity
  - macOS: osascript (built-in)
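These tools are what the server drives to show the input dialog. Conceptually the invocations look something like the sketch below (the exact flags and wording the server uses may differ; this is only to illustrate what the dependencies are for):

```bash
# Linux: zenity shows a text-entry dialog and prints the typed answer to stdout
zenity --entry --title "User prompt" --text "The assistant asks: which database should I target?"

# macOS: osascript achieves the same with a short AppleScript snippet
osascript -e 'display dialog "Which database should I target?" default answer ""' \
          -e 'text returned of result'
```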
Quick Install (Recommended)
The easiest way to install user-prompt-mcp is using our installation script, which automatically downloads the appropriate binary for your system:
# Install the latest version
curl -sSL https://raw.githubusercontent.com/nazar256/user-prompt-mcp/main/install.sh | bash
# Or install a specific version
curl -sSL https://raw.githubusercontent.com/nazar256/user-prompt-mcp/main/install.sh | bash -s -- -v v1.0.0
The installation script will:
- Detect your OS and architecture (Linux or macOS, amd64 or arm64)
- Download the appropriate pre-compiled binary
- Install it to a suitable directory in your PATH
- Make it executable
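Afterwards you can check that the binary is reachable from your shell before wiring it into Cursor:

```bash
# Prints the path the install script chose; no output means it is not on your PATH
command -v user-prompt-mcp
```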
Install from source
If you prefer to build from source (requires Go 1.18+):
go install github.com/nazar256/user-prompt-mcp/cmd/user-prompt-mcp@latest
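Note that go install places the binary in $(go env GOPATH)/bin (or $GOBIN if set). If Cursor later cannot find the user-prompt-mcp command, add that directory to your PATH, for example:

```bash
# e.g. in ~/.bashrc or ~/.zshrc
export PATH="$PATH:$(go env GOPATH)/bin"
```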
Manual Installation
You can also download pre-compiled binaries directly from the Releases page.
Usage with Cursor
- Open Cursor and go to Settings
- Navigate to the MCP Servers section and add a new server by editing JSON:
{
  "mcpServers": {
    "prompt": {
      "command": "user-prompt-mcp"
    }
  }
}
- Save and refresh
- Add something like this to your global rules: "Important - do not finish responding until explicitly told to do so. Instead, always use the MCP input_user_prompt tool and wait for the response in all cases: whether you feel stuck, have a question, or have finished work on a prompt, always communicate with the user through this MCP."
Configuration
The server can be configured using command-line flags or environment variables:
Timeout Configuration
By default, the server will wait 20 minutes for user input before timing out. You can customize this timeout using:
- Command line flag: --timeout <seconds>
  user-prompt-mcp --timeout 600    # Set timeout to 10 minutes
- Environment variable: USER_PROMPT_TIMEOUT=<seconds>
  export USER_PROMPT_TIMEOUT=1800  # Set timeout to 30 minutes
  user-prompt-mcp
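Because Cursor launches the server itself, you can also bake the timeout into the MCP configuration. A standard mcpServers entry accepts an args array (and an env map), so a sketch like the following should work, though it is not taken from this project's docs:

```json
{
  "mcpServers": {
    "prompt": {
      "command": "user-prompt-mcp",
      "args": ["--timeout", "1800"]
    }
  }
}
```

Setting "env": { "USER_PROMPT_TIMEOUT": "1800" } on the same entry is an equivalent alternative.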
Now when using Cursor, the AI can request additional input from you without ending its generation.
License
MIT