
cognee-mcp
==========

Run cognee’s memory engine as a Model Context Protocol server.

Demo · Learn more · Join Discord · Join r/AIMemory

Build memory for Agents and query it from any client that speaks MCP, in your terminal or IDE.

✨ Features
-----------

*   **Multiple transports** – choose Streamable HTTP `--transport http` (recommended for web deployments), SSE `--transport sse` (real-time streaming), or stdio (classic pipe, default)
*   **Integrated logging** – all actions are written to a rotating file (see `get_log_file_location()`) and mirrored to the console in dev
*   **Local file ingestion** – feed `.md` files, source files, Cursor rule-sets, etc. straight from disk
*   **Background pipelines** – long-running `cognify` & `codify` jobs spawn off-thread; check progress with the status tools
*   **Developer rules bootstrap** – one call indexes `.cursorrules`, `.cursor/rules`, `AGENT.md`, and friends into the `developer_rules` nodeset
*   **Prune & reset** – wipe memory clean with a single `prune` call when you want to start fresh

Please refer to our [documentation](https://docs.cognee.ai/) for further information.

🚀 Quick Start
--------------

1.   Clone the cognee repo:

     ```bash
     git clone https://github.com/topoteretes/cognee.git
     ```

2.   Navigate to the cognee-mcp subdirectory:

     ```bash
     cd cognee/cognee-mcp
     ```

3.   Install uv if you don't have it:

     ```bash
     pip install uv
     ```

4.   Install all the dependencies you need for the cognee MCP server with uv:

     ```bash
     uv sync --dev --all-extras --reinstall
     ```

5.   Activate the virtual environment in the cognee-mcp directory:

     ```bash
     source .venv/bin/activate
     ```

6.   Set up your OpenAI API key in `.env` for a quick setup with the default cognee configuration:

     ```bash
     LLM_API_KEY="YOUR_OPENAI_API_KEY"
     ```

7.   Run the cognee MCP server:

     ```bash
     # stdio transport (default)
     python src/server.py

     # SSE transport (real-time streaming)
     python src/server.py --transport sse

     # Streamable HTTP transport (recommended for web deployments)
     python src/server.py --transport http --host 127.0.0.1 --port 8000 --path /mcp
     ```

     Once the HTTP server is up, you can sanity-check it with the short client sketch below.
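
If you started the server with the Streamable HTTP transport, a quick way to verify it is reachable is to list its tools with the official `mcp` Python SDK, the same protocol any IDE client speaks. This is a minimal sketch, assuming `pip install mcp` and the host/port/path values from step 7:

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Connect to the URL from step 7 (--host 127.0.0.1 --port 8000 --path /mcp)
    async with streamablehttp_client("http://127.0.0.1:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])


asyncio.run(main())
```

If the server is reachable, this prints the tool names described under Basic Usage below.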

For more advanced configuration, create a `.env` file using our [template](https://github.com/topoteretes/cognee/blob/main/.env.template). To use different LLM providers or database configurations, and for more info, check out our [documentation](https://docs.cognee.ai/).

🐳 Docker Usage
---------------

If you’d rather run cognee-mcp in a container, you have two options:

1.   **Build locally**

     1.   Make sure you are in the cognee root directory and have a fresh `.env` containing only your `LLM_API_KEY` (and your chosen settings).

     2.   Remove any old image and rebuild:

          ```bash
          docker rmi cognee/cognee-mcp:main || true
          docker build --no-cache -f cognee-mcp/Dockerfile -t cognee/cognee-mcp:main .
          ```

     3.   Run it:

          ```bash
          # For HTTP transport (recommended for web deployments)
          docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport http

          # For SSE transport
          docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport sse

          # For stdio transport (default)
          docker run --env-file ./.env --rm -it cognee/cognee-mcp:main
          ```

2.   **Pull from Docker Hub** (no build required):

     ```bash
     # With HTTP transport (recommended for web deployments)
     docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport http

     # With SSE transport
     docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport sse

     # With stdio transport (default)
     docker run --env-file ./.env --rm -it cognee/cognee-mcp:main
     ```

💻 Basic Usage
--------------

The MCP server exposes its functionality through tools. Call them from any MCP client (Cursor, Claude Desktop, Cline, Roo and more).

### Available Tools

*   **cognify**: Turns your data into a structured knowledge graph and stores it in memory

*   **codify**: Analyzes a code repository, builds a code graph, and stores it in memory

*   **search**: Query memory – supports GRAPH_COMPLETION, RAG_COMPLETION, CODE, CHUNKS, INSIGHTS

*   **list_data**: List all datasets and their data items with IDs for deletion operations

*   **delete**: Delete specific data from a dataset (supports soft/hard deletion modes)

*   **prune**: Reset cognee for a fresh start (removes all data)

*   **cognify_status / codify_status**: Track pipeline progress

**Data Management Examples:**

```
# List all available datasets and data items
list_data()

# List data items in a specific dataset
list_data(dataset_id="your-dataset-id-here")

# Delete specific data (soft deletion – safer, preserves shared entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="soft")

# Delete specific data (hard deletion – removes orphaned entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="hard")
```

Remember: use the CODE search type to query your code graph. For huge repos, run codify on modules incrementally and cache results.
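
For a concrete end-to-end flow, here is a minimal sketch that drives these tools from the official `mcp` Python SDK over stdio. The tool argument names used below (`data`, `search_query`, `search_type`) are assumptions for illustration: check the schemas returned by `list_tools()` for the authoritative parameter names.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server as a stdio subprocess, mirroring the Quick Start command
server = StdioServerParameters(command="python", args=["src/server.py"])


async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Add text to memory and build the knowledge graph
            # (the "data" argument name is an assumption -- verify via list_tools())
            await session.call_tool(
                "cognify",
                {"data": "Cognee turns documents into a knowledge graph."},
            )

            # Query memory with one of the supported search types
            # (argument names are assumptions -- verify via list_tools())
            result = await session.call_tool(
                "search",
                {"search_query": "What does cognee do?", "search_type": "GRAPH_COMPLETION"},
            )
            print(result.content)


asyncio.run(main())
```

Because cognify runs as a background pipeline, you may need to poll `cognify_status` until it completes before the search returns useful results.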

### IDE Example: Cursor

1.   After you run the server as described in the [Quick Start](https://github.com/topoteretes/cognee/tree/dev/cognee-mcp#-quickstart), create a run script for cognee. Here is a simple example:

```bash
#!/bin/bash
export ENV=local
export TOKENIZERS_PARALLELISM=false
export EMBEDDING_PROVIDER="fastembed"
export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
export EMBEDDING_DIMENSIONS=384
export EMBEDDING_MAX_TOKENS=256
export LLM_API_KEY=your-OpenAI-API-key

uv --directory /{cognee_root_path}/cognee-mcp run cognee
```

Remember to replace _your-OpenAI-API-key_ and _{cognee_root_path}_ with the correct values.

2.   Install Cursor and navigate to Settings → MCP Tools → New MCP Server

3.   Cursor will open the _mcp.json_ file in a new tab. Configure your cognee MCP server by copy-pasting the following:

{ "mcpServers": { "cognee": { "command": "sh", "args": [ "/{path-to-your-script}/run-cognee.sh" ] } } }

Remember to replace _{path-to-your-script}_ with the path to the script you created in the first step.

That's it! You can refresh the server from the toggle next to your new cognee server. Check the green dot and the available tools to verify your server is running.

Now you can open your Cursor Agent and start using cognee tools from it via prompting.

Development and Debugging
-------------------------

### Debugging

To use the debugger, run: `mcp dev src/server.py`

Open the inspector with a timeout passed as a query parameter: `http://localhost:5173?timeout=120000`

To apply new changes while developing cognee, you need to:

1.   Run `poetry lock` in the cognee folder
2.   `uv sync --dev --all-extras --reinstall`
3.   `mcp dev src/server.py`

### Development

To use a local cognee build:

1.   Uncomment the following line in the cognee-mcp [`pyproject.toml`](https://github.com/topoteretes/cognee/blob/dev/cognee-mcp/pyproject.toml) file and set the cognee root path.

#"cognee[postgres,codegraph,gemini,huggingface,docs,neo4j] @ file:/Users//Desktop/cognee"

Remember to replace `file:/Users/<username>/Desktop/cognee` with your actual cognee root path.

2.   Install dependencies with uv in the mcp folder:

```bash
uv sync --reinstall
```


Code of Conduct
---------------

We are committed to making open source an enjoyable and respectful experience for our community. See [`CODE_OF_CONDUCT`](https://github.com/topoteretes/cognee/blob/main/CODE_OF_CONDUCT.md) for more information.

💫 Contributors
---------------

[![contributors](https://contrib.rocks/image?repo=topoteretes/cognee)](https://github.com/topoteretes/cognee/graphs/contributors)

Star History
------------

[![Star History Chart](https://api.star-history.com/svg?repos=topoteretes/cognee&type=Date)](https://star-history.com/#topoteretes/cognee&Date)
