MCP Telemetry
A Model Context Protocol (MCP) server for telemetry within chat systems using Weights & Biases Weave
Overview
MCP Telemetry provides a simple interface for logging and tracking conversations between users and LLMs. It leverages the Model Context Protocol to expose telemetry tools that can be used to trace and analyze conversations.
Features
- Start tracing sessions with custom identifiers
- Log comprehensive conversation data including:
  - User inputs
  - LLM responses
  - LLM actions
  - Tool calls and their results
- Seamless integration with Weights & Biases Weave for visualization and analysis (a minimal sketch follows this list)
- Real-time monitoring of conversation flows
- Export and share conversation analytics
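The Weave integration comes down to initializing a W&B project and wrapping the functions that handle conversation data in Weave ops, so their inputs and outputs are captured as traces. A minimal sketch of that pattern, using an illustrative project name ("mcp-telemetry") and function name (log_turn) rather than the actual server.py API:

import weave

weave.init("mcp-telemetry")  # assumed project name; authenticates with WANDB_API_KEY

@weave.op()
def log_turn(user_input: str, llm_response: str, tool_calls: list) -> dict:
    # Weave records this call's arguments and return value as a trace.
    return {
        "user_input": user_input,
        "llm_response": llm_response,
        "tool_calls": tool_calls,
    }

log_turn("What do cats eat?", "Mostly meat.", [])

Every call to a decorated function shows up as a trace in the project's Weave dashboard, which is what drives the visualizations described below.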
Installation
First, get a Weights & Biases API key from https://wandb.ai/settings#api
This server can be installed by adding the following JSON to your Claude Desktop config:
{
  "mcpServers": {
    "MCP Telemetry": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp[cli]",
        "--with",
        "weave",
        "mcp",
        "run",
        "~/mcp-telemetry/server.py"
      ],
      "env": {
        "WANDB_API_KEY": "..."
      }
    }
  }
}

Notes:
- "command" must point to where uv is available; check with 'which uv' and use the full path if needed.
- Set "WANDB_API_KEY" to the key you created above at https://wandb.ai/settings#api.
Usage
Once installed, the MCP Telemetry server starts automatically when you launch Claude Desktop and begins collecting telemetry data for all conversations. You can view this data in the Weights & Biases dashboard.
Basic Usage
- Start a conversation with Claude
- The server will automatically track:
  - User messages
  - LLM responses
  - Tool calls and their results
  - Conversation metadata
Configuration
The server can be configured through environment variables:
WANDB_API_KEY - Your Weights & Biases API key (required)
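Since the key is required, a server like this would typically fail fast when it is missing rather than surfacing an opaque authentication error later. A small sketch of that check, with an assumed project name:

import os
import weave

# Assumed behavior: refuse to start without the required key.
if not os.environ.get("WANDB_API_KEY"):
    raise RuntimeError(
        "WANDB_API_KEY is not set; add it to the 'env' block of the Claude Desktop config."
    )

weave.init("mcp-telemetry")  # assumed project name; uses WANDB_API_KEY to authenticate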
Examples
Starting a Tracing Session
Prompt Claude to trace the current conversation. Example prompt: "Log this conversation with MCP Telemetry, topic will be Cats"
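On the server side, a prompt like this would be handled by an MCP tool that starts the Weave session. A sketch of what such a tool could look like, using the FastMCP helper from mcp[cli]; the tool name start_trace and the project name are assumptions, not necessarily the actual server.py API:

import weave
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("MCP Telemetry")

@mcp.tool()
def start_trace(topic: str) -> str:
    # Hypothetical tool: begin a Weave-backed tracing session for this conversation.
    weave.init("mcp-telemetry")  # assumed project name; reads WANDB_API_KEY
    return f"Tracing started (topic: {topic})"

if __name__ == "__main__":
    mcp.run()

With a server shaped like this, Claude can respond to the example prompt by calling start_trace with topic="Cats", and subsequent logging lands in that Weave project.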
Viewing Telemetry Data
- Log in to your Weights & Biases account
- Navigate to your project
- You'll see various visualizations including:
  - Conversation flows
  - Tool usage patterns
  - Response times
  - Error rates
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.