Netra exposes a remote MCP server so you can pull observability context directly into your editor or agent workspace.
Documentation Index
Fetch the complete documentation index at: https://docs.getnetra.ai/llms.txt
Use this file to discover all available pages before exploring further.
Choose the MCP endpoint that matches your Netra data region, then replace the
API key placeholder with your own project API key.
Server Details
- Region-specific MCP endpoints (EU: `https://api.eu.getnetra.ai/mcp`)
- Auth header: `x-api-key`
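As an illustration, an authenticated request to the MCP endpoint carries your project API key in that header. This is a sketch using the EU endpoint; your region's host may differ:

```http
GET /mcp HTTP/1.1
Host: api.eu.getnetra.ai
x-api-key: <your-project-api-key>
```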
Client Setup
- Cursor
- Claude Code
- Windsurf
- Antigravity
- Codex
Add this `netra` entry to your MCP server configuration in `mcp.json`. If you're in the EU region, use the URL `https://api.eu.getnetra.ai/mcp`.
Why Netra MCP
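A minimal `netra` entry might look like the following. This is a sketch only: the exact schema and key names vary by client, non-EU regions use their own endpoint, and the placeholder must be replaced with your project API key:

```json
{
  "mcpServers": {
    "netra": {
      "url": "https://api.eu.getnetra.ai/mcp",
      "headers": {
        "x-api-key": "<your-project-api-key>"
      }
    }
  }
}
```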
Integrate Netra’s observability context directly into your development workflow. The Netra MCP server enables your AI coding assistant to query traces, filter performance data, and gain deep system insights without leaving the IDE. By bridging the gap between telemetry and your code, it helps you identify and resolve complex LLM performance issues as they happen.
Available Tools
| Tool | Description |
|---|---|
| `netra_get_trace_by_id` | Retrieve all spans for a given trace ID |
| `netra_query_traces` | Get traces using trace filters (like the filter field in the dashboard) |
Troubleshooting
- Ensure your JSON/TOML configs are properly formatted
- Confirm the MCP server is reachable at your region-specific MCP endpoint
- Verify your API key is correct and has not expired
- Restart the client after making config changes
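The formatting check above can be done with Python's built-in `json.tool`, which exits non-zero on a parse error. A sketch: the file path and contents here are placeholders so the example is self-contained; in practice, point it at your client's actual `mcp.json`:

```shell
# Write a placeholder config so the example is self-contained;
# in practice, validate your client's real mcp.json instead.
cat > /tmp/mcp.json <<'EOF'
{"mcpServers": {"netra": {"url": "https://api.eu.getnetra.ai/mcp"}}}
EOF

# json.tool fails on parse errors (trailing commas, missing quotes),
# catching typos before you restart the client.
python3 -m json.tool /tmp/mcp.json > /dev/null && echo "valid JSON"
```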
Next Step
After saving your config, restart the client and confirm the `netra` MCP server shows up in the available tools or MCP server list.