Pinecone Assistant
The Pinecone Assistant MCP server allows you to integrate Pinecone's powerful vector database and assistant capabilities with various AI agents and applications. It provides a standardized Model Context Protocol (MCP) interface for retrieving relevant context snippets from an assistant, making it easy to plug Pinecone Assistant into advanced AI workflows.
What it does:
- Provides an MCP endpoint for Pinecone Assistant.
- Facilitates retrieval of relevant context snippets for AI models.
- Enables integration with popular AI frameworks and tools like LangChain, Claude Desktop, and Cursor.
How to use: There are two primary ways to use the Pinecone Assistant MCP server:

- Remote MCP server (Streamable HTTP/SSE): Connect directly to your hosted Pinecone Assistant instance over HTTP. The endpoint URL follows the format `https://<YOUR_PINECONE_ASSISTANT_HOST>/mcp/assistants/<YOUR_ASSISTANT_NAME>` and supports both the Streamable HTTP and SSE transports. Example LangChain integration:

  ```python
  from langchain_mcp_adapters.client import MultiServerMCPClient
  # ... (other imports and setup)

  async with MultiServerMCPClient(
      {
          "assistant_ai_news": {
              "url": "https://prod-1-data.ke.pinecone.io/mcp/assistants/ai-news",
              "transport": "streamable_http",
              "headers": {"Authorization": f"Bearer {pinecone_api_key}"},
          }
      }
  ) as client:
      ...  # use client.get_tools() with your agent
  ```
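The LangChain adapter hides the wire format, but the Streamable HTTP transport is JSON-RPC 2.0 over HTTP POST, so you can also construct requests yourself. A minimal sketch of building such a request (the host and assistant name below are the example values from above; the exact response shape depends on the server):

```python
import json


def mcp_request(host: str, assistant: str, method: str,
                params: dict, req_id: int = 1) -> tuple[str, bytes]:
    """Build the endpoint URL and JSON-RPC 2.0 payload for one MCP call.

    Substitute your own Pinecone Assistant host and assistant name.
    """
    url = f"https://{host}/mcp/assistants/{assistant}"
    payload = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    return url, json.dumps(payload).encode("utf-8")


# Example: ask the assistant which tools it exposes.
url, body = mcp_request("prod-1-data.ke.pinecone.io", "ai-news", "tools/list", {})
# POST `body` to `url` with headers:
#   Authorization: Bearer <PINECONE_API_KEY>
#   Content-Type: application/json
#   Accept: application/json, text/event-stream
```

This is only the request-building half; an actual client also needs to send an `initialize` call first and handle the server's streamed responses.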
- Local MCP server (stdio via Docker): Run the Pinecone Assistant MCP server locally using Docker, which exposes a stdio interface. To start the server:

  ```shell
  docker run -i --rm \
    -e PINECONE_API_KEY=<PINECONE_API_KEY> \
    -e PINECONE_ASSISTANT_HOST=<PINECONE_ASSISTANT_HOST> \
    pinecone/assistant-mcp
  ```

  You can configure this in your `mcp.json` for tools like Claude Desktop or Cursor:

  ```json
  {
    "mcpServers": {
      "pinecone-assistant": {
        "command": "docker",
        "args": [
          "run", "-i", "--rm",
          "-e", "PINECONE_API_KEY",
          "-e", "PINECONE_ASSISTANT_HOST",
          "pinecone/assistant-mcp"
        ],
        "env": {
          "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY>",
          "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST>"
        }
      }
    }
  }
  ```
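With the stdio transport, a client such as Claude Desktop simply spawns the `command`/`args` from the `mcp.json` entry and exchanges JSON-RPC messages over the container's stdin/stdout. A rough sketch of that spawning step, assuming Docker and the image are available locally (the process is not actually started here):

```python
import json

# The same server entry as in mcp.json above.
config = {
    "command": "docker",
    "args": [
        "run", "-i", "--rm",
        "-e", "PINECONE_API_KEY",
        "-e", "PINECONE_ASSISTANT_HOST",
        "pinecone/assistant-mcp",
    ],
}


def build_argv(cfg: dict) -> list[str]:
    """Turn an mcp.json server entry into the argv a client would spawn."""
    return [cfg["command"], *cfg["args"]]


argv = build_argv(config)
# A client would then launch the process with the env vars from the config
# and write JSON-RPC over its stdin, e.g.:
#   proc = subprocess.Popen(argv, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
#   init = {"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}}
#   proc.stdin.write((json.dumps(init) + "\n").encode())
```

The `-i` flag keeps the container's stdin open, which is what makes this transport work.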
Implementation Details:
- The remote server leverages Pinecone's cloud infrastructure.
- The local server is provided as a Docker image (`pinecone/assistant-mcp`) for easy deployment and integration.