
# mcp-simple-openai-assistant

@andybrandt33

A simple MCP server for interacting with OpenAI assistants. This server allows other tools (like Claude Desktop) to create and interact with OpenAI assistants through the Model Context Protocol.

[![smithery badge](https://smithery.ai/badge/mcp-simple-openai-assistant)](https://smithery.ai/mcp/known/mcp-simple-openai-assistant) [![MseeP.ai Security Assessment Badge](https://mseep.net/pr/andybrandt-mcp-simple-openai-assistant-badge.png)](https://mseep.ai/app/andybrandt-mcp-simple-openai-assistant)

## Features

This server provides a suite of tools to manage and interact with OpenAI Assistants. The new streaming capabilities provide a much-improved, real-time user experience.

### Available Tools

- **`create_assistant`** (Create OpenAI Assistant): Create a new assistant with a name, instructions, and model.
- **`list_assistants`** (List OpenAI Assistants): List all available assistants associated with your API key.
- **`retrieve_assistant`** (Retrieve OpenAI Assistant): Get detailed information about a specific assistant.
- **`update_assistant`** (Update OpenAI Assistant): Modify an existing assistant's name, instructions, or model.
- **`create_new_assistant_thread`** (Create New Assistant Thread): Creates a new, persistent conversation thread with a user-defined name and description for easy identification and reuse. This is the recommended way to start a new conversation.
- **`list_threads`** (List Managed Threads): Lists all locally managed conversation threads from the database, showing their ID, name, description, and last-used time.
- **`delete_thread`** (Delete Managed Thread): Deletes a conversation thread from both OpenAI's servers and the local database.
- **`ask_assistant_in_thread`** (Ask Assistant in Thread and Stream Response): The primary tool for conversation. Sends a message to an assistant within a thread and streams the response back in real time.

Because OpenAI assistants can take quite a long time to respond, this server uses a streaming approach for the main `ask_assistant_in_thread` tool. This provides real-time progress updates to the client and avoids timeouts.

The server also persists threads locally. Since the OpenAI API does not allow listing threads, the server manages them for you by storing their IDs and metadata in a local SQLite database, so you can easily find, reuse, and manage your conversation threads across sessions.
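As a rough illustration of how the streaming and thread-persistence pieces could fit together (not the server's actual code), here is a minimal sketch. It assumes the `openai` Python SDK's Assistants streaming helper (`client.beta.threads.runs.stream`) and the standard-library `sqlite3` module; the function names, table schema, and the `asst_...` placeholder are illustrative only.

```python
import os
import sqlite3

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Local registry of thread IDs and metadata, since the OpenAI API cannot list threads.
db = sqlite3.connect("threads.db")
db.execute(
    """CREATE TABLE IF NOT EXISTS threads (
           id TEXT PRIMARY KEY,
           name TEXT,
           description TEXT,
           last_used TEXT DEFAULT CURRENT_TIMESTAMP
       )"""
)


def create_named_thread(name: str, description: str) -> str:
    """Create an OpenAI thread and record it locally for later reuse."""
    thread = client.beta.threads.create()
    db.execute(
        "INSERT INTO threads (id, name, description) VALUES (?, ?, ?)",
        (thread.id, name, description),
    )
    db.commit()
    return thread.id


def ask_in_thread(thread_id: str, assistant_id: str, message: str):
    """Send a message and yield the assistant's reply as it streams in."""
    client.beta.threads.messages.create(
        thread_id=thread_id, role="user", content=message
    )
    with client.beta.threads.runs.stream(
        thread_id=thread_id, assistant_id=assistant_id
    ) as stream:
        for delta in stream.text_deltas:  # plain-text chunks as they arrive
            yield delta
    db.execute(
        "UPDATE threads SET last_used = CURRENT_TIMESTAMP WHERE id = ?",
        (thread_id,),
    )
    db.commit()


if __name__ == "__main__":
    tid = create_named_thread("demo", "Scratch conversation")
    for chunk in ask_in_thread(tid, "asst_...", "Hello!"):  # hypothetical assistant ID
        print(chunk, end="", flush=True)
    print()
```

Streaming the reply as a generator is what lets an MCP client report progress instead of blocking until the run completes.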

Tags: OpenAI, GPTs, Claude, AI Assistant

# MCP Server Config

```json
{
  "mcpServers": {
    "openai-assistant": {
      "command": "python",
      "args": [
        "-m",
        "mcp_simple_openai_assistant"
      ],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```
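The same command, args, and env from the config above can also be exercised programmatically. The sketch below assumes the official `mcp` Python SDK (its stdio client helpers) and calls the `list_assistants` tool described earlier; it is an illustration, not part of this project.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Mirrors the "openai-assistant" entry from the config above.
    params = StdioServerParameters(
        command="python",
        args=["-m", "mcp_simple_openai_assistant"],
        env={"OPENAI_API_KEY": os.environ["OPENAI_API_KEY"]},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
            result = await session.call_tool("list_assistants", {})
            print(result)


asyncio.run(main())
```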

# Installing via Smithery

```bash
npx -y @smithery/cli install mcp-simple-openai-assistant --client claude
```
Transport: stdio
Language: Python
Updated: 7/23/2025