> [!IMPORTANT]
> This repository has been merged into the Memgraph AI Toolkit monorepo to avoid duplicating tools.
> It will be deleted in one month. Please follow the MCP integration there for all future development, and feel free to open issues or PRs in that repo.
# 🚀 Memgraph MCP Server
Memgraph MCP Server is a lightweight server implementation of the Model Context Protocol (MCP) designed to connect Memgraph with LLMs.

## ⚡ Quick start

### 1. Run Memgraph MCP Server
- Install `uv` and create a `venv` with `uv venv`. Activate the virtual environment with `.venv\Scripts\activate` on Windows or `source .venv/bin/activate` on MacOS/Linux.
- Install dependencies: `uv add "mcp[cli]" httpx`
- Run the Memgraph MCP server: `uv run server.py`
### 2. Run MCP Client
- Install Claude for Desktop.
- Add the Memgraph server to the Claude config:

**MacOS/Linux**

```bash
code ~/Library/Application\ Support/Claude/claude_desktop_config.json
```

**Windows**

```powershell
code $env:AppData\Claude\claude_desktop_config.json
```
Example config:

```json
{
   "mcpServers": {
      "mpc-memgraph": {
         "command": "/Users/katelatte/.local/bin/uv",
         "args": [
            "--directory",
            "/Users/katelatte/projects/mcp-memgraph",
            "run",
            "server.py"
         ]
      }
   }
}
```
> [!NOTE]
> You may need to put the full path to the `uv` executable in the `command` field. You can get this by running `which uv` on MacOS/Linux or `where uv` on Windows. Make sure you pass in the absolute path to your server.
### 3. Chat with the database
- Run Memgraph MAGE:

   ```bash
   docker run -p 7687:7687 memgraph/memgraph-mage --schema-info-enabled=True
   ```

   The `--schema-info-enabled` configuration setting is set to `True` to allow the LLM to run the `SHOW SCHEMA INFO` query.
- Open Claude Desktop and see the Memgraph tools and resources listed. Try it out! (You can load dummy data from Memgraph Lab Datasets, or seed a few nodes yourself as in the sketch below.)
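If you'd rather not pull a full dataset, the snippet below seeds a tiny graph over Bolt so there is something to ask about. The connection settings (no authentication, default port 7687) and the use of the `neo4j` Python driver are assumptions about a default local setup, not part of this repository.

```python
# Sketch: seed a tiny graph in a local Memgraph instance so the LLM has data to query.
# Assumes `uv add neo4j` and Memgraph running without authentication on the default Bolt port.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("", ""))
with driver.session() as session:
    session.run(
        "CREATE (:Person {name: 'Alice'})-[:KNOWS]->(:Person {name: 'Bob'})"
    )
driver.close()
```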
## 🔧 Tools

### run_query()

Run a Cypher query against Memgraph.
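For orientation, here is a minimal sketch of how a tool like `run_query()` can be wired up with the MCP Python SDK. The connection URI, the empty credentials, and the choice of the `neo4j` Bolt driver are illustrative assumptions, not the actual contents of `server.py`.

```python
# Minimal sketch (not the actual server.py): an MCP tool that forwards a Cypher
# query to a local Memgraph instance over Bolt. Assumes `uv add neo4j` and a
# Memgraph instance without authentication on bolt://localhost:7687.
from mcp.server.fastmcp import FastMCP
from neo4j import GraphDatabase

mcp = FastMCP("mcp-memgraph")
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("", ""))

@mcp.tool()
def run_query(query: str) -> list[dict]:
    """Run a Cypher query against Memgraph and return the records as dicts."""
    with driver.session() as session:
        return [record.data() for record in session.run(query)]

if __name__ == "__main__":
    mcp.run(transport="stdio")
```

Claude calls the tool with the Cypher it generates, and the matching records come back as plain dictionaries it can reason over.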
## 🗃️ Resources

### get_schema()

Get Memgraph schema information (prerequisite: `--schema-info-enabled=True`).
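Along the same lines, a schema resource might look roughly like the following, reusing the `mcp` instance and `driver` from the tool sketch above; the `memgraph://schema` URI and the text formatting are made-up examples rather than the repository's actual implementation.

```python
# Sketch (assumed URI and formatting): expose the output of SHOW SCHEMA INFO as an
# MCP resource. Requires Memgraph to be started with --schema-info-enabled=True,
# otherwise the query fails. Reuses `mcp` and `driver` from the previous sketch.
@mcp.resource("memgraph://schema")
def get_schema() -> str:
    """Return Memgraph schema information as text."""
    with driver.session() as session:
        result = session.run("SHOW SCHEMA INFO")
        return "\n".join(str(record.data()) for record in result)
```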
## 🗺️ Roadmap
The Memgraph MCP Server is still in its early stages. We're actively working on expanding its capabilities and making it even easier to integrate Memgraph into modern AI workflows. In the near future, we'll release a TypeScript version of the server to better support JavaScript-based environments. Additionally, we plan to migrate this project into our central AI Toolkit repository, where it will live alongside other tools and integrations for LangChain, LlamaIndex, and MCP. Our goal is to provide a unified, open-source toolkit that makes it seamless to build graph-powered applications and intelligent agents with Memgraph at the core.