# OpenSearch

A Model Context Protocol (MCP) server that provides read-only access to OpenSearch clusters, enabling LLMs to inspect indices and execute read-only queries.

To learn more about MCP servers, see the Model Context Protocol documentation.

This OpenSearch MCP Server was designed for seamless integration with skeet.build.
## Components

### Tools

- **search**
  - Execute read-only search queries against the connected OpenSearch cluster
  - Input:
    - `query` (string): The OpenSearch query to execute
    - `index` (string): The index to search (optional)
  - All queries are executed with read-only permissions
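As an illustration only (the exact accepted shape depends on the server), a typical read-only `query` input would be a standard OpenSearch query DSL body; the `title` field here is a hypothetical example:

```json
{
  "query": {
    "match": {
      "title": "error logs"
    }
  },
  "size": 10
}
```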
### Resources

The server provides schema information for each index in the OpenSearch cluster:

- **Index Mappings** (`opensearch://<host>/<index>/mapping`)
  - JSON schema information for each index
  - Includes field names and data types
  - Automatically discovered from cluster metadata
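The resource URI scheme above can be handled with standard URL tooling. A minimal sketch (the host and index names are hypothetical, and this helper is not part of the server itself) of splitting a mapping URI into its host and index parts:

```python
from urllib.parse import urlparse

def parse_mapping_uri(uri: str) -> tuple[str, str]:
    """Split an opensearch://<host>/<index>/mapping URI into (host, index)."""
    parsed = urlparse(uri)
    # The path looks like "/<index>/mapping"; keep only the index segment.
    parts = parsed.path.strip("/").split("/")
    if len(parts) != 2 or parts[1] != "mapping":
        raise ValueError(f"not a mapping URI: {uri}")
    return parsed.netloc, parts[0]

host, index = parse_mapping_uri("opensearch://localhost:9200/logs-2024/mapping")
```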
## Usage with Claude Desktop

To use this server with the Claude Desktop app, add the following configuration to the "mcpServers" section of your `claude_desktop_config.json`:

### NPX

```json
{
  "mcpServers": {
    "opensearch": {
      "command": "npx",
      "args": [
        "-y",
        "@skeetbuild/opensearch",
        "https://username:password@localhost:9200"
      ]
    }
  }
}
```
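The last argument is a standard URL with embedded credentials, so passwords containing characters like `@`, `:`, or `/` must be percent-encoded. A small sketch of building such a URL safely, using only Python's standard library (the credentials shown are placeholders):

```python
from urllib.parse import quote

def build_connection_url(user: str, password: str, host: str, port: int = 9200) -> str:
    """Build an https URL with percent-encoded credentials for the server argument."""
    # safe="" ensures reserved characters such as "@", ":" and "/" are encoded too.
    return f"https://{quote(user, safe='')}:{quote(password, safe='')}@{host}:{port}"

url = build_connection_url("admin", "p@ss:w/rd", "localhost")
```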
## Usage with Cursor

To use this server with Cursor, add the following configuration to your global (`~/.cursor/mcp.json`) or project-specific (`.cursor/mcp.json`) configuration file:

### Global Configuration

```json
{
  "mcpServers": {
    "opensearch": {
      "command": "npx",
      "args": [
        "-y",
        "@skeetbuild/opensearch",
        "https://username:password@localhost:9200"
      ]
    }
  }
}
```
For more details on setting up MCP with Cursor, see the Cursor MCP documentation.
## Usage with GitHub Copilot in VS Code

To use this server with GitHub Copilot in VS Code, add a new MCP server using the VS Code command palette:

- Press `Cmd+Shift+P` and search for "Add MCP Server"
- Select "SSE MCP Server" and use the following configuration:

```json
{
  "mcp": {
    "servers": {
      "opensearch": {
        "command": "npx",
        "args": [
          "-y",
          "@skeetbuild/opensearch",
          "https://username:password@localhost:9200"
        ]
      }
    }
  }
}
```
For detailed setup instructions, see the GitHub Copilot MCP documentation.
## Usage with Windsurf

To use this server with Windsurf, add the following configuration to your Windsurf MCP settings:

```json
{
  "mcpServers": {
    "opensearch": {
      "command": "npx",
      "args": [
        "-y",
        "@skeetbuild/opensearch",
        "https://username:password@localhost:9200"
      ]
    }
  }
}
```
For more information on configuring MCP with Windsurf, refer to the Windsurf MCP documentation.
## License

This MCP server is licensed under the MIT License: you are free to use, modify, and distribute the software, subject to the license's terms. For details, see the LICENSE file in the project repository.