

# any-chat-completions-mcp MCP Server

Integrate Claude with Any OpenAI SDK Compatible Chat Completion API - OpenAI, Perplexity, Groq, xAI, PyroPrompts and more.

This implements a Model Context Protocol server. Learn more: https://modelcontextprotocol.io

This is a TypeScript-based MCP server that integrates with any OpenAI SDK compatible Chat Completions API.

It has one tool, `chat`, which relays a question to the configured AI chat provider.
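
For orientation, the core of such a server is small: read the provider settings from the environment, point the OpenAI SDK at the configured base URL, and expose a single MCP tool that forwards the question. The following is a minimal sketch of that idea (not the project's actual source), assuming the standard `openai` and `@modelcontextprotocol/sdk` packages and the `AI_CHAT_*` variables described in the Installation section below:

```typescript
// Illustrative sketch only; the real server's implementation may differ.
import OpenAI from "openai";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// The four AI_CHAT_* variables come from the `env` blocks in the configs below.
const apiKey = process.env.AI_CHAT_KEY!;
const providerName = process.env.AI_CHAT_NAME ?? "AI";
const model = process.env.AI_CHAT_MODEL!;
const baseURL = process.env.AI_CHAT_BASE_URL!;

// Any OpenAI SDK compatible endpoint works, because only `baseURL` changes.
const client = new OpenAI({ apiKey, baseURL });

const server = new McpServer({ name: "any-chat-completions-mcp", version: "0.1.0" });

// Register a single `chat` tool that relays the question and returns the reply.
// (The real server may fold AI_CHAT_NAME into the tool name or description so
// each configured provider shows up as its own tool in Claude Desktop.)
server.tool(
  "chat",
  `Ask ${providerName} a question`,
  { content: z.string().describe("The question to relay") },
  async ({ content }) => {
    const completion = await client.chat.completions.create({
      model,
      messages: [{ role: "user", content }],
    });
    const text = completion.choices[0]?.message?.content ?? "";
    return { content: [{ type: "text", text }] };
  }
);

// MCP servers for Claude Desktop communicate over stdio.
await server.connect(new StdioServerTransport());
```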

[![smithery badge](https://smithery.ai/badge/any-chat-completions-mcp-server)](https://smithery.ai/server/any-chat-completions-mcp-server)

## Development

Install dependencies:

```bash
npm install
```

Build the server:

```bash
npm run build
```

For development with auto-rebuild:

```bash
npm run watch
```

## Installation

To add OpenAI to Claude Desktop, add the server config:

On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

On Windows: `%APPDATA%/Claude/claude_desktop_config.json`

You can use it via `npx` in your Claude Desktop configuration like this:

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": [
        "@pyroprompts/any-chat-completions-mcp"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
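
Here, `AI_CHAT_KEY` is the provider's API key (replace `OPENAI_KEY` with your actual key), `AI_CHAT_NAME` is a label for the provider, `AI_CHAT_MODEL` is the model to request, and `AI_CHAT_BASE_URL` is the provider's OpenAI-compatible endpoint.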

Or, if you clone the repo, you can build it and reference it in your Claude Desktop configuration like this:

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
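
The `args` path should point at the `build/index.js` produced by `npm run build` in the Development section above.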

You can add multiple providers by referencing the same MCP server multiple times, but with different env arguments:

```json
{
  "mcpServers": {
    "chat-pyroprompts": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PYROPROMPTS_KEY",
        "AI_CHAT_NAME": "PyroPrompts",
        "AI_CHAT_MODEL": "ash",
        "AI_CHAT_BASE_URL": "https://api.pyroprompts.com/openaiv1"
      }
    },
    "chat-perplexity": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PERPLEXITY_KEY",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "sonar",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    },
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```

With these three, you'll see a tool for each in the Claude Desktop Home:

![Claude Desktop Home with Chat Tools](img/claude_desktop_home.png)

You can then chat with the other LLMs, and it shows up in the conversation like this:

![Claude Chat with OpenAI](img/claude_chat_openai.png)

Or, configure in [LibreChat](https://www.librechat.ai/) like:

```yaml
  chat-perplexity:
    type: stdio
    command: npx
    args:
      - -y
      - @pyroprompts/any-chat-completions-mcp
    env:
      AI_CHAT_KEY: "pplx-012345679"
      AI_CHAT_NAME: Perplexity
      AI_CHAT_MODEL: sonar
      AI_CHAT_BASE_URL: "https://api.perplexity.ai"
      PATH: '/usr/local/bin:/usr/bin:/bin'
```

And it shows in LibreChat:

![LibreChat with Perplexity Chat](img/librechat.png)

### Installing via Smithery

To install Any OpenAI Compatible API Integrations for Claude Desktop automatically via [Smithery](https://smithery.ai/server/any-chat-completions-mcp-server):

```bash
npx -y @smithery/cli install any-chat-completions-mcp-server --client claude
```

### Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the [MCP Inspector](https://github.com/modelcontextprotocol/inspector), which is available as a package script:

```bash
npm run inspector
```

The Inspector will provide a URL to access debugging tools in your browser.
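
Under the hood, the `inspector` script typically just wraps the Inspector around the built server, e.g. `npx @modelcontextprotocol/inspector build/index.js` (an assumption based on the standard MCP TypeScript server template, not this project's package.json).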

### Acknowledgements

- Obviously the modelcontextprotocol and Anthropic team for the MCP Specification and integration into Claude Desktop. [https://modelcontextprotocol.io/introduction](https://modelcontextprotocol.io/introduction)
- [PyroPrompts](https://pyroprompts.com?ref=github-any-chat-completions-mcp) for sponsoring this project. Use code `CLAUDEANYCHAT` for 20 free automation credits on PyroPrompts.

---

Run directly over stdio:

```bash
npx -y @pyroprompts/any-chat-completions-mcp
```

Transport: stdio
Language: TypeScript
Updated: 7/23/2025