Pinecone Assistant
The Pinecone Assistant MCP server allows you to integrate Pinecone's powerful vector database and assistant capabilities with various AI agents and applications. It provides a standardized Model Context Protocol (MCP) interface for retrieving context snippets, building assistants, and enabling advanced AI workflows.
What it does:
- Provides an MCP endpoint for Pinecone Assistant.
- Facilitates retrieval of relevant context snippets for AI models.
- Enables integration with popular AI frameworks and tools like LangChain, Claude Desktop, and Cursor.
How to use: There are two primary ways to use the Pinecone Assistant MCP server:
- Remote MCP Server (SSE/Streamable HTTP): Connect directly to your hosted Pinecone Assistant instance over HTTP. The endpoint URL follows the format `https://<YOUR_PINECONE_ASSISTANT_HOST>/mcp/assistants/<YOUR_ASSISTANT_NAME>` and supports both Streamable HTTP and SSE transports. Example LangChain integration:

  ```python
  from langchain_mcp_adapters.client import MultiServerMCPClient
  # ... (other imports and setup; pinecone_api_key is defined here)

  async with MultiServerMCPClient(
      {
          "assistant_ai_news": {
              "url": "https://prod-1-data.ke.pinecone.io/mcp/assistants/ai-news",
              "transport": "streamable_http",
              "headers": {"Authorization": f"Bearer {pinecone_api_key}"},
          }
      }
  ) as client:
      ...  # use client.get_tools() with your agent
  ```

- Local MCP Server (Stdio via Docker): Run the Pinecone Assistant MCP server locally using Docker, which exposes a stdio interface. To start the server:

  ```shell
  docker run -i --rm \
    -e PINECONE_API_KEY=<PINECONE_API_KEY> \
    -e PINECONE_ASSISTANT_HOST=<PINECONE_ASSISTANT_HOST> \
    pinecone/assistant-mcp
  ```

  You can configure this in your `mcp.json` for tools like Claude Desktop or Cursor:

  ```json
  {
    "mcpServers": {
      "pinecone-assistant": {
        "command": "docker",
        "args": [
          "run", "-i", "--rm",
          "-e", "PINECONE_API_KEY",
          "-e", "PINECONE_ASSISTANT_HOST",
          "pinecone/assistant-mcp"
        ],
        "env": {
          "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY>",
          "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST>"
        }
      }
    }
  }
  ```
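As a quick sanity check, the remote endpoint format can be sketched as a small Python helper. This is illustrative only; `assistant_mcp_url` is not part of any official Pinecone SDK.

```python
def assistant_mcp_url(host: str, assistant_name: str) -> str:
    # Hypothetical helper: builds the endpoint format
    # https://<YOUR_PINECONE_ASSISTANT_HOST>/mcp/assistants/<YOUR_ASSISTANT_NAME>
    return f"https://{host}/mcp/assistants/{assistant_name}"

# Reproduces the endpoint used in the LangChain example.
url = assistant_mcp_url("prod-1-data.ke.pinecone.io", "ai-news")
print(url)  # https://prod-1-data.ke.pinecone.io/mcp/assistants/ai-news
```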
Implementation Details:
- The remote server leverages Pinecone's cloud infrastructure.
- The local server is provided as a Docker image (`pinecone/assistant-mcp`) for easy deployment and integration.
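If you manage editor configuration from scripts, the local-server `mcp.json` entry can be generated programmatically. A minimal sketch, assuming the config shape shown above; the `local_server_config` helper is hypothetical, not part of any Pinecone tooling.

```python
import json

def local_server_config(api_key: str, host: str) -> dict:
    # Hypothetical helper: builds the mcp.json entry for the
    # Dockerized Pinecone Assistant MCP server.
    return {
        "mcpServers": {
            "pinecone-assistant": {
                "command": "docker",
                "args": [
                    "run", "-i", "--rm",
                    "-e", "PINECONE_API_KEY",
                    "-e", "PINECONE_ASSISTANT_HOST",
                    "pinecone/assistant-mcp",
                ],
                "env": {
                    "PINECONE_API_KEY": api_key,
                    "PINECONE_ASSISTANT_HOST": host,
                },
            }
        }
    }

config = local_server_config("<YOUR_PINECONE_API_KEY>", "<YOUR_PINECONE_ASSISTANT_HOST>")
print(json.dumps(config, indent=2))
```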