# any-chat-completions-mcp MCP Server

Integrate Claude with any OpenAI SDK compatible Chat Completions API - OpenAI, Perplexity, Groq, xAI, PyroPrompts and more.

This implements a Model Context Protocol (MCP) server. Learn more: [https://modelcontextprotocol.io](https://modelcontextprotocol.io)

This is a TypeScript-based MCP server that integrates with any OpenAI SDK compatible Chat Completions API. It has one tool, `chat`, which relays a question to the configured AI chat provider.
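Under the hood, the relay amounts to forwarding the question to whichever OpenAI-compatible endpoint the `AI_CHAT_*` environment variables point at. Here is a minimal sketch of that pattern using the `openai` npm package; it is illustrative only, not the project's actual source:

```typescript
// Minimal sketch (not the project's actual source): relay a question to an
// OpenAI SDK compatible endpoint configured via the AI_CHAT_* env variables.
import OpenAI from "openai";

async function relayChat(question: string): Promise<string> {
  const client = new OpenAI({
    apiKey: process.env.AI_CHAT_KEY,       // provider API key
    baseURL: process.env.AI_CHAT_BASE_URL, // e.g. https://api.openai.com/v1
  });

  const response = await client.chat.completions.create({
    model: process.env.AI_CHAT_MODEL ?? "gpt-4o",
    messages: [{ role: "user", content: question }],
  });

  return response.choices[0]?.message?.content ?? "";
}
```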
## Development

Install dependencies:

```bash
npm install
```

Build the server:

```bash
npm run build
```

For development with auto-rebuild:

```bash
npm run watch
```
## Installation

To add OpenAI to Claude Desktop, add the server config:

On MacOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

On Windows: `%APPDATA%/Claude/claude_desktop_config.json`

You can use it via `npx` in your Claude Desktop configuration like this:
```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": [
        "@pyroprompts/any-chat-completions-mcp"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
Or, if you clone the repo, you can build it and use it in your Claude Desktop configuration like this:
```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
You can add multiple providers by referencing the same MCP server multiple times, but with different env arguments:
```json
{
  "mcpServers": {
    "chat-pyroprompts": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PYROPROMPTS_KEY",
        "AI_CHAT_NAME": "PyroPrompts",
        "AI_CHAT_MODEL": "ash",
        "AI_CHAT_BASE_URL": "https://api.pyroprompts.com/openaiv1"
      }
    },
    "chat-perplexity": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PERPLEXITY_KEY",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "sonar",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    },
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
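Before restarting Claude Desktop, it can help to smoke-test each endpoint with the same key, model, and base URL you plan to put in `env`. A throwaway script along these lines works; the key variable names and models below are placeholders:

```typescript
// Hypothetical smoke test: confirm each OpenAI-compatible endpoint answers
// before wiring it into claude_desktop_config.json. Values are placeholders.
import OpenAI from "openai";

const providers = [
  { name: "PyroPrompts", key: process.env.PYROPROMPTS_KEY, baseURL: "https://api.pyroprompts.com/openaiv1", model: "ash" },
  { name: "Perplexity", key: process.env.PERPLEXITY_KEY, baseURL: "https://api.perplexity.ai", model: "sonar" },
  { name: "OpenAI", key: process.env.OPENAI_KEY, baseURL: "https://api.openai.com/v1", model: "gpt-4o" },
];

async function main() {
  for (const p of providers) {
    const client = new OpenAI({ apiKey: p.key, baseURL: p.baseURL });
    const res = await client.chat.completions.create({
      model: p.model,
      messages: [{ role: "user", content: "Reply with OK" }],
    });
    console.log(`${p.name}: ${res.choices[0]?.message?.content}`);
  }
}

main().catch(console.error);
```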
With these three configured, you'll see a tool for each provider in the Claude Desktop Home, and you can chat with the other LLMs right from a Claude conversation.
Or, configure in [LibreChat](https://www.librechat.ai/) like:
```yaml
chat-perplexity:
  type: stdio
  command: npx
  args:
    - -y
    - @pyroprompts/any-chat-completions-mcp
  env:
    AI_CHAT_KEY: "pplx-012345679"
    AI_CHAT_NAME: Perplexity
    AI_CHAT_MODEL: sonar
    AI_CHAT_BASE_URL: "https://api.perplexity.ai"
    PATH: '/usr/local/bin:/usr/bin:/bin'
```
And the tool then shows up in LibreChat.
### Installing via Smithery

To install Any OpenAI Compatible API Integrations for Claude Desktop automatically via [Smithery](https://smithery.ai/server/any-chat-completions-mcp-server):

```bash
npx -y @smithery/cli install any-chat-completions-mcp-server --client claude
```
### Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the [MCP Inspector](https://github.com/modelcontextprotocol/inspector), which is available as a package script:

```bash
npm run inspector
```

The Inspector will provide a URL to access debugging tools in your browser.
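Alternatively, you can exercise the server programmatically with the MCP TypeScript SDK. A rough sketch, assuming a stdio connection to the built server; the tool and argument names passed to `callTool` are assumptions, so check the `listTools()` output first:

```typescript
// Rough sketch: connect to the built server over stdio, list its tools, and
// call the chat tool. Tool/argument names below are assumptions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/any-chat-completions-mcp/build/index.js"],
    env: {
      AI_CHAT_KEY: process.env.OPENAI_KEY ?? "",
      AI_CHAT_NAME: "OpenAI",
      AI_CHAT_MODEL: "gpt-4o",
      AI_CHAT_BASE_URL: "https://api.openai.com/v1",
    },
  });

  const client = new Client({ name: "debug-client", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  console.log(await client.listTools()); // see what the server actually exposes
  const result = await client.callTool({
    name: "chat",                        // assumed tool name
    arguments: { content: "Say hello" }, // assumed argument name
  });
  console.log(JSON.stringify(result, null, 2));
  await client.close();
}

main().catch(console.error);
```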
### Acknowledgements

- Obviously the modelcontextprotocol and Anthropic team for the MCP Specification and integration into Claude Desktop. [https://modelcontextprotocol.io/introduction](https://modelcontextprotocol.io/introduction)
- [PyroPrompts](https://pyroprompts.com?ref=github-any-chat-completions-mcp) for sponsoring this project. Use code `CLAUDEANYCHAT` for 20 free automation credits on PyroPrompts.