Pinecone Assistant
The Pinecone Assistant MCP server allows you to integrate Pinecone's powerful vector database and assistant capabilities with various AI agents and applications. It provides a standardized Model Context Protocol (MCP) interface for retrieving context snippets, building assistants, and enabling advanced AI workflows.
What it does:
- Provides an MCP endpoint for Pinecone Assistant.
- Facilitates retrieval of relevant context snippets for AI models.
- Enables integration with popular AI frameworks and tools like LangChain, Claude Desktop, and Cursor.
How to use: There are two primary ways to use the Pinecone Assistant MCP server:

- Remote MCP Server (SSE/Streamable HTTP): Connect directly to your hosted Pinecone Assistant instance over HTTP. The endpoint URL follows the format `https://<YOUR_PINECONE_ASSISTANT_HOST>/mcp/assistants/<YOUR_ASSISTANT_NAME>` and supports both the Streamable HTTP and SSE transports. Example LangChain integration:

  ```python
  from langchain_mcp_adapters.client import MultiServerMCPClient
  # ... (other imports and setup)

  async with MultiServerMCPClient(
      {
          "assistant_ai_news": {
              "url": "https://prod-1-data.ke.pinecone.io/mcp/assistants/ai-news",
              "transport": "streamable_http",
              "headers": {"Authorization": f"Bearer {pinecone_api_key}"}
          }
      }
  ) as client:
      ...  # use client.get_tools() with your agent
  ```
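The endpoint format above can be sketched as a small helper. Note that `build_endpoint` is a hypothetical name for illustration, not part of any Pinecone SDK:

```python
def build_endpoint(host: str, assistant_name: str) -> str:
    """Build the MCP endpoint URL for a hosted Pinecone Assistant.

    Follows the documented format:
    https://<YOUR_PINECONE_ASSISTANT_HOST>/mcp/assistants/<YOUR_ASSISTANT_NAME>
    """
    # Tolerate a host given with or without the scheme or a trailing slash.
    host = host.removeprefix("https://").rstrip("/")
    return f"https://{host}/mcp/assistants/{assistant_name}"

print(build_endpoint("prod-1-data.ke.pinecone.io", "ai-news"))
# https://prod-1-data.ke.pinecone.io/mcp/assistants/ai-news
```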
- Local MCP Server (Stdio via Docker): Run the Pinecone Assistant MCP server locally using Docker. This provides a stdio interface. To start the server:

  ```shell
  docker run -i --rm \
    -e PINECONE_API_KEY=<PINECONE_API_KEY> \
    -e PINECONE_ASSISTANT_HOST=<PINECONE_ASSISTANT_HOST> \
    pinecone/assistant-mcp
  ```

  You can configure this in your `mcp.json` for tools like Claude Desktop or Cursor:

  ```json
  {
    "mcpServers": {
      "pinecone-assistant": {
        "command": "docker",
        "args": [
          "run", "-i", "--rm",
          "-e", "PINECONE_API_KEY",
          "-e", "PINECONE_ASSISTANT_HOST",
          "pinecone/assistant-mcp"
        ],
        "env": {
          "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY>",
          "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST>"
        }
      }
    }
  }
  ```
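Since the container reads both variables from the environment, a launcher script can fail fast when one is unset. This is a hypothetical pre-flight sketch (the `missing_env` helper is not part of the `pinecone/assistant-mcp` image):

```python
# Names the local MCP server expects, per the docker command above.
REQUIRED_VARS = ("PINECONE_API_KEY", "PINECONE_ASSISTANT_HOST")

def missing_env(env: dict) -> list:
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# With only the API key set, the host is reported as missing:
print(missing_env({"PINECONE_API_KEY": "pk-example"}))
# ['PINECONE_ASSISTANT_HOST']
```

In a real launcher you would pass `dict(os.environ)` and abort before invoking `docker run` if the list is non-empty.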
Implementation Details:
- The remote server leverages Pinecone's cloud infrastructure.
- The local server is provided as a Docker image (`pinecone/assistant-mcp`) for easy deployment and integration.