# any-chat-completions-mcp MCP Server
Integrate Claude with any OpenAI SDK-compatible Chat Completions API: OpenAI, Perplexity, Groq, xAI, PyroPrompts, and more.

This server implements the Model Context Protocol. Learn more: https://modelcontextprotocol.io

This is a TypeScript-based MCP server that integrates with any OpenAI SDK-compatible Chat Completions API. It provides one tool, `chat`, which relays a question to the configured AI Chat Provider.
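The configuration pattern can be sketched as follows. This is an illustrative simplification, not the server's actual source: the `readProviderConfig` helper is hypothetical, but the four `AI_CHAT_*` environment variables are the ones the server documents below.

```typescript
// Illustrative sketch (not the actual server source): the server is
// configured entirely through four AI_CHAT_* environment variables,
// which identify the provider that `chat` questions are relayed to.
interface ProviderConfig {
  apiKey: string;
  name: string;
  model: string;
  baseURL: string;
}

// Read the four required variables, failing fast if one is missing.
function readProviderConfig(
  env: Record<string, string | undefined>
): ProviderConfig {
  const need = (key: string): string => {
    const value = env[key];
    if (!value) throw new Error(`Missing required env var: ${key}`);
    return value;
  };
  return {
    apiKey: need("AI_CHAT_KEY"),
    name: need("AI_CHAT_NAME"),
    model: need("AI_CHAT_MODEL"),
    baseURL: need("AI_CHAT_BASE_URL"),
  };
}

const config = readProviderConfig({
  AI_CHAT_KEY: "sk-example",
  AI_CHAT_NAME: "OpenAI",
  AI_CHAT_MODEL: "gpt-4o",
  AI_CHAT_BASE_URL: "https://api.openai.com/v1",
});
console.log(`${config.name} via ${config.baseURL} (${config.model})`);
// → OpenAI via https://api.openai.com/v1 (gpt-4o)
```

Because everything is environment-driven, the same binary can be registered several times under different names, each pointing at a different provider (see the multi-provider configuration below).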
## Development

Install dependencies:

```bash
npm install
```

Build the server:

```bash
npm run build
```

For development with auto-rebuild:

```bash
npm run watch
```
## Installation

To add OpenAI to Claude Desktop, add the server config:

On MacOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

On Windows: `%APPDATA%/Claude/claude_desktop_config.json`
You can use it via `npx` in your Claude Desktop configuration like this:

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": [
        "@pyroprompts/any-chat-completions-mcp"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
Or, if you clone the repo, you can build it and use it in your Claude Desktop configuration like this:

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
You can add multiple providers by referencing the same MCP server multiple times, but with different `env` arguments:

```json
{
  "mcpServers": {
    "chat-pyroprompts": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PYROPROMPTS_KEY",
        "AI_CHAT_NAME": "PyroPrompts",
        "AI_CHAT_MODEL": "ash",
        "AI_CHAT_BASE_URL": "https://api.pyroprompts.com/openaiv1"
      }
    },
    "chat-perplexity": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PERPLEXITY_KEY",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "sonar",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    },
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```
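Since every entry reuses the same build and differs only in its `AI_CHAT_*` env block, the section above can also be generated programmatically. A hedged sketch — the `mcpServersConfig` helper is illustrative, not part of the project:

```typescript
// Illustrative helper (not part of the project): every provider entry
// runs the same built server and differs only in its AI_CHAT_* env.
interface Provider { key: string; name: string; model: string; baseURL: string }

function mcpServersConfig(serverPath: string, providers: Provider[]) {
  const mcpServers: Record<string, object> = {};
  for (const p of providers) {
    mcpServers[`chat-${p.name.toLowerCase()}`] = {
      command: "node",
      args: [serverPath],
      env: {
        AI_CHAT_KEY: p.key,
        AI_CHAT_NAME: p.name,
        AI_CHAT_MODEL: p.model,
        AI_CHAT_BASE_URL: p.baseURL,
      },
    };
  }
  return { mcpServers };
}

const config = mcpServersConfig(
  "/path/to/any-chat-completions-mcp/build/index.js",
  [
    { key: "OPENAI_KEY", name: "OpenAI", model: "gpt-4o", baseURL: "https://api.openai.com/v1" },
    { key: "PERPLEXITY_KEY", name: "Perplexity", model: "sonar", baseURL: "https://api.perplexity.ai" },
  ]
);
console.log(JSON.stringify(config, null, 2));
```

The printed object can be merged into `claude_desktop_config.json` as-is.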
With these three, you'll see a tool for each in the Claude Desktop Home:

And then you can chat with other LLMs and it shows in chat like this:

Or, configure in [LibreChat](https://www.librechat.ai/) like:

```yaml
chat-perplexity:
  type: stdio
  command: npx
  args:
    - -y
    - @pyroprompts/any-chat-completions-mcp
  env:
    AI_CHAT_KEY: "pplx-012345679"
    AI_CHAT_NAME: Perplexity
    AI_CHAT_MODEL: sonar
    AI_CHAT_BASE_URL: "https://api.perplexity.ai"
    PATH: '/usr/local/bin:/usr/bin:/bin'
```
And it shows in LibreChat:

### Installing via Smithery

To install Any OpenAI Compatible API Integrations for Claude Desktop automatically via [Smithery](https://smithery.ai/server/any-chat-completions-mcp-server):

```bash
npx -y @smithery/cli install any-chat-completions-mcp-server --client claude
```
### Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the [MCP Inspector](https://github.com/modelcontextprotocol/inspector), which is available as a package script:

```bash
npm run inspector
```
The Inspector will provide a URL to access debugging tools in your browser.
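For a quick manual smoke test without the Inspector, you can hand-craft the first JSON-RPC messages an MCP client sends over stdio. This sketch assumes the server follows the standard MCP stdio transport (newline-delimited JSON-RPC); the `rpcLine` helper is illustrative:

```typescript
// Build the newline-delimited JSON-RPC lines an MCP client sends first:
// an "initialize" request, the "initialized" notification, and then
// "tools/list". Paste the printed lines into a running
// `node build/index.js` (with AI_CHAT_* set) to see the chat tool listed.
function rpcLine(msg: object): string {
  return JSON.stringify({ jsonrpc: "2.0", ...msg }) + "\n";
}

const handshake =
  rpcLine({
    id: 1,
    method: "initialize",
    params: {
      protocolVersion: "2024-11-05",
      capabilities: {},
      clientInfo: { name: "smoke-test", version: "0.0.1" },
    },
  }) +
  rpcLine({ method: "notifications/initialized" }) +
  rpcLine({ id: 2, method: "tools/list", params: {} });

process.stdout.write(handshake);
```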
### Acknowledgements
- Obviously, the modelcontextprotocol and Anthropic team for the MCP Specification and integration into Claude Desktop. https://modelcontextprotocol.io/introduction
- [PyroPrompts](https://pyroprompts.com?ref=github-any-chat-completions-mcp) for sponsoring this project. Use code `CLAUDEANYCHAT` for 20 free automation credits on PyroPrompts.