Pinecone Assistant
The Pinecone Assistant MCP server allows you to integrate Pinecone's powerful vector database and assistant capabilities with various AI agents and applications. It provides a standardized Model Context Protocol (MCP) interface for retrieving context snippets, building assistants, and enabling advanced AI workflows.
What it does:
- Provides an MCP endpoint for Pinecone Assistant.
- Facilitates retrieval of relevant context snippets for AI models.
- Enables integration with popular AI frameworks and tools like LangChain, Claude Desktop, and Cursor.
How to use: There are two primary ways to use the Pinecone Assistant MCP server:
- Remote MCP Server (SSE/Streamable HTTP): Connect directly to your hosted Pinecone Assistant instance over HTTP. The endpoint URL follows the format `https://<YOUR_PINECONE_ASSISTANT_HOST>/mcp/assistants/<YOUR_ASSISTANT_NAME>` and supports both the Streamable HTTP and SSE transports. Example LangChain integration:

  ```python
  from langchain_mcp_adapters.client import MultiServerMCPClient
  # ... (other imports and setup)

  async with MultiServerMCPClient(
      {
          "assistant_ai_news": {
              "url": "https://prod-1-data.ke.pinecone.io/mcp/assistants/ai-news",
              "transport": "streamable_http",
              "headers": {"Authorization": f"Bearer {pinecone_api_key}"}
          }
      }
  ) as client:
      ...  # use client.get_tools() with your agent
  ```

- Local MCP Server (Stdio via Docker): Run the Pinecone Assistant MCP server locally using Docker; it exposes a stdio interface. To start the server:

  ```shell
  docker run -i --rm \
    -e PINECONE_API_KEY=<PINECONE_API_KEY> \
    -e PINECONE_ASSISTANT_HOST=<PINECONE_ASSISTANT_HOST> \
    pinecone/assistant-mcp
  ```

  You can configure this in your `mcp.json` for tools like Claude Desktop or Cursor:

  ```json
  {
    "mcpServers": {
      "pinecone-assistant": {
        "command": "docker",
        "args": [
          "run", "-i", "--rm",
          "-e", "PINECONE_API_KEY",
          "-e", "PINECONE_ASSISTANT_HOST",
          "pinecone/assistant-mcp"
        ],
        "env": {
          "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY>",
          "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST>"
        }
      }
    }
  }
  ```
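The remote endpoint is not tied to LangChain: any MCP client can speak to it directly. The sketch below uses the official `mcp` Python SDK's Streamable HTTP client to connect and list the assistant's tools. The endpoint format comes from this document; the specific SDK calls (`streamablehttp_client`, `ClientSession`) are assumed from the current MCP Python SDK, and the sample host/assistant names are taken from the LangChain example above:

```python
import asyncio
import os


def assistant_mcp_url(host: str, assistant_name: str) -> str:
    # Endpoint format documented above: https://<host>/mcp/assistants/<name>
    return f"https://{host}/mcp/assistants/{assistant_name}"


async def list_assistant_tools(host: str, assistant_name: str) -> list[str]:
    # SDK imports are kept local so the URL helper works without `mcp` installed.
    from mcp import ClientSession
    from mcp.client.streamable_http import streamablehttp_client

    # Authenticate with your Pinecone API key, as in the LangChain example.
    headers = {"Authorization": f"Bearer {os.environ['PINECONE_API_KEY']}"}

    # Open a Streamable HTTP transport, run the MCP handshake, list tools.
    async with streamablehttp_client(
        assistant_mcp_url(host, assistant_name), headers=headers
    ) as (read, write, _get_session_id):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            return [tool.name for tool in result.tools]


# Example (needs a real assistant and PINECONE_API_KEY set):
# print(asyncio.run(list_assistant_tools("prod-1-data.ke.pinecone.io", "ai-news")))
```

The tool names returned here are the same ones LangChain surfaces via `client.get_tools()`.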
Implementation Details:
- The remote server leverages Pinecone's cloud infrastructure.
- The local server is provided as a Docker image (`pinecone/assistant-mcp`) for easy deployment and integration.
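For clients that support remote MCP servers natively, the Docker step can be skipped and the hosted endpoint configured directly in `mcp.json`. This is a sketch under the assumption that your client accepts url-type server entries with custom headers (recent Cursor versions do; check your client's documentation):

```json
{
  "mcpServers": {
    "pinecone-assistant": {
      "url": "https://<YOUR_PINECONE_ASSISTANT_HOST>/mcp/assistants/<YOUR_ASSISTANT_NAME>",
      "headers": {
        "Authorization": "Bearer <YOUR_PINECONE_API_KEY>"
      }
    }
  }
}
```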