Langflow-DOC-QA-SERVER
A Model Context Protocol server for document Q&A powered by Langflow
This is a TypeScript-based MCP server that implements a document Q&A system. It demonstrates core MCP concepts by providing a simple interface to query documents through a Langflow backend.
Prerequisites
1. Create Langflow Document Q&A Flow
- Open Langflow and create a new flow from the "Document Q&A" template
- Configure your flow with necessary components (ChatInput, File Upload, LLM, etc.)
- Save your flow
2. Get Flow API Endpoint
- Click the "API" button in the top right corner of Langflow
- Copy the API endpoint URL from the cURL command
Example:
http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false
- Save this URL; it is needed for the API_ENDPOINT configuration (see the sketch below for a quick way to test it)
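To confirm the endpoint responds before wiring it into the MCP server, you can send a test request directly from Node. This is a minimal sketch, assuming the typical Langflow run payload fields (input_value, input_type, output_type); the exact fields depend on how your flow is configured, and the flow ID here is a placeholder.

```typescript
// Sketch: send a test query to the Langflow run endpoint (Node 18+, global fetch).
// The payload shape is an assumption based on the standard Langflow run API;
// adjust it to match the components in your flow.
const API_ENDPOINT =
  "http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false"; // replace <flow-id>

async function testFlow(query: string): Promise<void> {
  const response = await fetch(API_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      input_value: query,
      input_type: "chat",
      output_type: "chat",
    }),
  });
  if (!response.ok) {
    throw new Error(`Langflow request failed: ${response.status}`);
  }
  // Print the raw flow response so you can see where the answer text lives.
  console.log(JSON.stringify(await response.json(), null, 2));
}

testFlow("What is this document about?").catch(console.error);
```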
Features
Tools
query_docs - Query the document Q&A system
- Takes a query string as input
- Returns responses from the Langflow backend (see the example call below)
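For reference, a client-side call to query_docs might look like the following sketch. It assumes the @modelcontextprotocol/sdk client API; the server path, flow ID, and query are placeholders, and the snippet is illustrative rather than part of this repository.

```typescript
// Sketch: invoking the query_docs tool over stdio with the MCP TypeScript SDK.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the built server as a child process, passing the Langflow endpoint.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/doc-qa-server/build/index.js"],
  env: { API_ENDPOINT: "http://127.0.0.1:7860/api/v1/run/<flow-id>" },
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Call the tool with a query string and print whatever content the server returns.
const result = await client.callTool({
  name: "query_docs",
  arguments: { query: "Summarize the uploaded document." },
});
console.log(result.content);

await client.close();
```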
Development
Install dependencies:
npm install
Build the server:
npm run build
For development with auto-rebuild:
npm run watch
Installation
To use with Claude Desktop, add the server config to your claude_desktop_config.json file:
- On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- On Windows: %APPDATA%/Claude/claude_desktop_config.json
```json
{
  "mcpServers": {
    "langflow-doc-qa-server": {
      "command": "node",
      "args": [
        "/path/to/doc-qa-server/build/index.js"
      ],
      "env": {
        "API_ENDPOINT": "http://127.0.0.1:7860/api/v1/run/480ec7b3-29d2-4caa-b03b-e74118f35fac"
      }
    }
  }
}
```
Installing via Smithery
To install Document Q&A Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @GongRzhe/Langflow-DOC-QA-SERVER --client claude
Environment Variables
The server supports the following environment variables for configuration:
API_ENDPOINT: The endpoint URL for the Langflow API service. Defaults to http://127.0.0.1:7860/api/v1/run/480ec7b3-29d2-4caa-b03b-e74118f35fac if not specified.
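Conceptually, the server resolves the endpoint as in the sketch below (illustrative only; the actual code in this repository may differ):

```typescript
// Sketch: prefer the API_ENDPOINT environment variable, fall back to the default flow URL.
const API_ENDPOINT =
  process.env.API_ENDPOINT ??
  "http://127.0.0.1:7860/api/v1/run/480ec7b3-29d2-4caa-b03b-e74118f35fac";
```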
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
📜 License
This project is licensed under the MIT License.