# Memgraph AI Toolkit
Build powerful AI applications with graph-powered RAG using Memgraph. This toolkit provides everything you need to integrate knowledge graphs into your GenAI workflows.
## 🚀 Quick Setup

### Start Memgraph

```bash
docker run -p 7687:7687 \
  --name memgraph \
  memgraph/memgraph-mage:latest \
  --schema-info-enabled=true
```
### Install Packages

```bash
# Core toolbox
pip install memgraph-toolbox

# LangChain integration
pip install langchain-memgraph

# MCP server
pip install mcp-memgraph

# Unstructured to Graph
pip install unstructured2graph
```
## 📚 Usage Examples
### `unstructured2graph` - Build Knowledge Graphs from Documents
Transform PDFs, URLs, and documents into queryable knowledge graphs:
```python
import asyncio

from memgraph_toolbox.api.memgraph import Memgraph
from lightrag_memgraph import MemgraphLightRAGWrapper
from unstructured2graph import from_unstructured, create_index


async def main():
    memgraph = Memgraph()
    create_index(memgraph, "Chunk", "hash")

    lightrag = MemgraphLightRAGWrapper()
    await lightrag.initialize(working_dir="./lightrag_storage")

    # Ingest documents from URLs or local files
    await from_unstructured(
        sources=["https://example.com/doc.pdf", "./local_file.md"],
        memgraph=memgraph,
        lightrag_wrapper=lightrag,
        link_chunks=True,
    )

    await lightrag.afinalize()


asyncio.run(main())
```
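The `create_index(memgraph, "Chunk", "hash")` call above indexes chunks by a `hash` property. The actual hashing scheme is internal to `unstructured2graph`, but the idea is that a stable content hash lets re-ingestion deduplicate chunks instead of creating copies. A hypothetical sketch of such a hash (the function name and scheme here are illustrative, not the library's real implementation):

```python
import hashlib


def chunk_hash(text: str) -> str:
    """Derive a stable ID from chunk content (illustrative only;
    the real scheme is internal to unstructured2graph)."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()


# Identical content always maps to the same hash, so re-ingesting
# the same document would match existing Chunk nodes via the index
# rather than creating duplicates.
a = chunk_hash("Memgraph is a graph database.")
b = chunk_hash("Memgraph is a graph database.")
assert a == b
```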
👉 Full Documentation | Examples
### `langchain-memgraph` - LangChain Integration

#### Natural Language Queries with MemgraphQAChain
```python
from langchain_memgraph.graphs.memgraph import MemgraphLangChain
from langchain_memgraph.chains.graph_qa import MemgraphQAChain
from langchain_openai import ChatOpenAI

graph = MemgraphLangChain(url="bolt://localhost:7687")

chain = MemgraphQAChain.from_llm(
    ChatOpenAI(temperature=0),
    graph=graph,
    model_name="gpt-4-turbo",
    allow_dangerous_requests=True,
)

response = chain.invoke("Who are the main characters in the dataset?")
print(response["result"])
```
#### Build Agents with MemgraphToolkit
```python
from langchain.chat_models import init_chat_model
from langchain_memgraph import MemgraphToolkit
from langchain_memgraph.graphs.memgraph import MemgraphLangChain
from langgraph.prebuilt import create_react_agent

llm = init_chat_model("gpt-4o-mini", model_provider="openai")
db = MemgraphLangChain(url="bolt://localhost:7687")

toolkit = MemgraphToolkit(db=db, llm=llm)
agent = create_react_agent(llm, toolkit.get_tools())

events = agent.stream({"messages": [("user", "Find all Person nodes")]})
```
### `mcp-memgraph` - Model Context Protocol Server
Expose Memgraph to LLMs via MCP. Run with Docker:
```bash
# HTTP mode (recommended)
docker run --rm -p 8000:8000 mcp-memgraph:latest

# Stdio mode for MCP clients
docker run --rm -i -e MCP_TRANSPORT=stdio mcp-memgraph:latest
```
**Available Tools:**

| Tool | Description |
|---|---|
| `run_query` | Execute Cypher queries |
| `get_schema` | Fetch graph schema |
| `get_page_rank` | Compute PageRank scores |
| `get_node_neighborhood` | Find nodes within a given distance |
| `search_node_vectors` | Vector similarity search |
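To make the analytics tools concrete: `get_page_rank` ranks nodes by the graph's link structure, while `search_node_vectors` ranks them by embedding similarity. A toy, dependency-free sketch of both ideas (not the server's actual implementation, which runs inside Memgraph):

```python
import math


def pagerank(adj, damping=0.85, iters=50):
    """Power iteration over an adjacency dict {node: [neighbors]}."""
    nodes = list(adj)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for n, out in adj.items():
            if not out:  # dangling node: spread its rank evenly
                for m in nodes:
                    new[m] += damping * rank[n] / len(nodes)
            else:
                for m in out:
                    new[m] += damping * rank[n] / len(out)
        rank = new
    return rank


def cosine(u, v):
    """Similarity used by vector search; 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)


# A symmetric 3-cycle: every node ends up with equal rank.
ranks = pagerank({"a": ["b"], "b": ["c"], "c": ["a"]})
```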
### `sql2graph` Agent - Automated Database Migration
Migrate from MySQL/PostgreSQL to Memgraph with AI assistance:
```bash
cd agents/sql2graph
uv run main.py
```
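Conceptually, a relational-to-graph migration maps each table row to a labeled node and each foreign key to a relationship. A simplified, hypothetical illustration of the row-to-node step (table and column names are invented for the example; the agent's real pipeline is more involved):

```python
def row_to_cypher(table: str, row: dict) -> str:
    """Render one relational row as a Cypher CREATE statement:
    label = table name, properties = column values (illustrative only)."""
    props = ", ".join(f"{k}: {v!r}" for k, v in row.items())
    return f"CREATE (:{table} {{{props}}})"


stmt = row_to_cypher("Person", {"id": 1, "name": "Ada"})
# -> CREATE (:Person {id: 1, name: 'Ada'})
```

Foreign keys would then become `MATCH ... CREATE (a)-[:REL]->(b)` statements joining the nodes created this way.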
## 🛠️ Packages Overview
| Package | Description | Install |
|---|---|---|
| `memgraph-toolbox` | Core utilities for Memgraph | `pip install memgraph-toolbox` |
| `langchain-memgraph` | LangChain tools and chains | `pip install langchain-memgraph` |
| `mcp-memgraph` | MCP server for LLMs | `pip install mcp-memgraph` |
| `unstructured2graph` | Document to graph conversion | `pip install unstructured2graph` |
| `sql2graph` | Database migration agent | See docs |
## ❓ FAQ

**Which databases are supported?**

Memgraph is the primary target. The `sql2graph` agent supports MySQL and PostgreSQL as source databases.

**Do I need an LLM API key?**

Yes, for features like entity extraction (`unstructured2graph`) and natural language queries (`langchain-memgraph`).

**Can I use local LLMs?**

Yes! The LangChain integration supports any LangChain-compatible model, including Ollama.
## 🤝 Community
⭐ If you find this toolkit helpful, please star the repository!
## 🧪 Developing Locally

You can build and test each package directly from the repository.
### Core tests

```bash
uv pip install -e memgraph-toolbox[test]
pytest -s memgraph-toolbox/src/memgraph_toolbox/tests
```
### LangChain integration tests

Create a `.env` file with your `OPENAI_API_KEY`, since the tests make LLM calls:

```bash
uv pip install -e integrations/langchain-memgraph[test]
pytest -s integrations/langchain-memgraph/tests
```
### MCP integration tests

```bash
uv pip install -e integrations/mcp-memgraph[test]
pytest -s integrations/mcp-memgraph/tests
```
### Agent integration tests

```bash
uv pip install -e integrations/agents[test]
pytest -s integrations/agents/tests
```
To run a complete migration workflow with the agent:

```bash
cd integrations/agents
uv run main.py
```
**Note:** The agent requires both MySQL and Memgraph connections. Set up your environment variables in `.env` based on `.env.example`.
On macOS with zsh, quote the extras specifier so the brackets are not expanded by the shell:

```bash
uv pip install -e memgraph-toolbox"[test]"
```