# User Prompt MCP

A Model Context Protocol (MCP) server for Cursor that enables requesting user input during generation. This is mostly AI-generated code.
## Overview
This project implements an MCP server that allows Cursor (or any MCP-compatible client) to request additional input from users during model generation without ending the generation process. It serves as a bridge between the AI model and the user, creating a more interactive experience.
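Under the hood, such a request is an ordinary MCP tool call over stdio. A minimal sketch of what a client might send (the `input_user_prompt` tool name comes from the rules snippet later in this README; the argument schema shown here is an assumption for illustration only):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "input_user_prompt",
    "arguments": { "prompt": "Which database should I target?" }
  }
}
```

The server then shows a dialog to the user and returns the typed answer as the tool result, so the model can continue generating instead of ending its turn.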
## Features
- User Input Prompting: Allows the AI to ask for more information during generation
- Simple GUI: Presents input prompts in a dialog box with text wrapping
- Cross-Platform: Should work on both Linux (tested) and macOS
- Stdio Transport: Integration with Cursor via stdio
## Installation

### Prerequisites

- For GUI functionality:
  - Linux: `zenity`
  - macOS: `osascript` (built-in)
### Quick Install (Recommended)

The easiest way to install `user-prompt-mcp` is with our installation script, which automatically downloads the appropriate binary for your system:

```bash
# Install the latest version
curl -sSL https://raw.githubusercontent.com/nazar256/user-prompt-mcp/main/install.sh | bash

# Or install a specific version
curl -sSL https://raw.githubusercontent.com/nazar256/user-prompt-mcp/main/install.sh | bash -s -- -v v1.0.0
```
The installation script will:
- Detect your OS and architecture (Linux or macOS, amd64 or arm64)
- Download the appropriate pre-compiled binary
- Install it to a suitable directory in your PATH
- Make it executable
### Install from source

If you prefer to build from source (requires Go 1.18+):

```bash
go install github.com/nazar256/user-prompt-mcp/cmd/user-prompt-mcp@latest
```
### Manual Installation
You can also download pre-compiled binaries directly from the Releases page.
## Usage with Cursor

1. Open Cursor and go to Settings.
2. Navigate to the MCP Servers section and add a new server by editing the JSON:

   ```json
   {
     "mcpServers": {
       "prompt": {
         "command": "user-prompt-mcp"
       }
     }
   }
   ```

3. Save and refresh.
4. Add something like the following to your global rules: "Important - do not finish responding until explicitly told to do so. Instead, always use the MCP `input_user_prompt` and wait for a response in all cases - whether you feel stuck, have a question, or have finished work on a prompt, always communicate with the user through this MCP."
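Cursor's MCP JSON also accepts an `args` array, so server flags such as the timeout documented in the Configuration section can be set directly in the server entry (a sketch; adjust the value to taste):

```json
{
  "mcpServers": {
    "prompt": {
      "command": "user-prompt-mcp",
      "args": ["--timeout", "600"]
    }
  }
}
```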
## Configuration

The server can be configured using command-line flags or environment variables.

### Timeout Configuration

By default, the server waits 20 minutes for user input before timing out. You can customize this timeout using:

- Command-line flag: `--timeout <seconds>`

  ```bash
  user-prompt-mcp --timeout 600  # Set timeout to 10 minutes
  ```

- Environment variable: `USER_PROMPT_TIMEOUT=<seconds>`

  ```bash
  export USER_PROMPT_TIMEOUT=1800  # Set timeout to 30 minutes
  user-prompt-mcp
  ```
Now when using Cursor, the AI can request additional input from you without ending its generation.
## License
MIT