LLM Providers

  • OpenAI - GPT models and completions API
  • Anthropic Claude - Claude 3 models and messaging API
  • Cohere - Command models and generation API
  • Google GenAI (Gemini) - Gemini Pro and other Google AI models
  • Mistral AI - Mistral models and chat completions
  • Aleph Alpha - Advanced European AI models
  • AWS Bedrock - Amazon’s managed AI service
  • Groq - High-performance AI inference
  • Ollama - Local LLM deployment and management
  • Replicate - Cloud-based model hosting platform
  • Together AI - Cloud platform for running open-source models
  • Transformers - Hugging Face transformers library
  • Vertex AI - Google Cloud AI platform
  • Watson X - IBM’s enterprise AI platform
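
Once the SDK is initialized, calls made through any of these provider clients are traced automatically, with no changes to the calling code. A minimal sketch, assuming the openai package is installed and an OPENAI_API_KEY is set in the environment (the app name and model are illustrative):

from netra import Netra
from openai import OpenAI

# Initialize once at application startup; the installed openai
# package is detected and instrumented automatically
Netra.init(app_name="LLM App")

# A regular client call; no tracing-specific code is needed
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)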

Vector Databases

  • Weaviate - Open-source vector database with GraphQL
  • Qdrant - High-performance vector similarity search
  • Pinecone - Managed vector database service
  • Chroma - Open-source embedding database
  • LanceDB - Fast vector database for AI applications
  • Marqo - Tensor-based search engine
  • Milvus - Open-source vector database at scale
  • Redis - Vector search with Redis Stack

HTTP Clients & Web Frameworks

  • HTTPX - Modern HTTP client with sync and async support
  • AIOHTTP - Asynchronous HTTP client/server
  • FastAPI - Modern web framework for APIs
  • Requests - Popular HTTP library for Python
  • Django - High-level Python web framework
  • Flask - Lightweight WSGI web application framework
  • Falcon - High-performance Python web framework
  • Starlette - Lightweight ASGI framework/toolkit
  • Tornado - Asynchronous networking library and web framework
  • gRPC - High-performance, open-source universal RPC framework
  • Urllib - Standard Python HTTP client library
  • Urllib3 - Powerful, user-friendly HTTP client for Python

Database Clients

  • PyMySQL - Pure Python MySQL client
  • Redis - In-memory data structure store
  • SQLAlchemy - SQL toolkit and Object-Relational Mapper
  • Psycopg - Modern PostgreSQL database adapter for Python
  • Pymongo - Python driver for MongoDB
  • Elasticsearch - Distributed, RESTful search and analytics engine
  • Cassandra - Distributed NoSQL database
  • PyMSSQL - Simple Microsoft SQL Server client
  • MySQL Connector - Official MySQL driver
  • Sqlite3 - Python's built-in SQLite database module
  • Aiopg - Asynchronous PostgreSQL client
  • Asyncpg - Fast asynchronous PostgreSQL client
  • Pymemcache - Comprehensive Memcached client
  • Tortoise ORM - Easy-to-use asyncio ORM

Messaging & Task Queues

  • Celery - Distributed task queue
  • Pika - Pure-Python implementation of the AMQP 0-9-1 protocol
  • AIO Pika - Asynchronous AMQP client
  • Kafka-Python - Python client for Apache Kafka
  • AIOKafka - Asynchronous Python client for Kafka
  • Confluent-Kafka - Confluent’s Python client for Apache Kafka
  • Boto3 SQS - Amazon SQS client via Boto3

AI Frameworks & Orchestration

  • LangChain - Framework for developing LLM applications
  • LangGraph - Graph-based framework for stateful, multi-agent LLM applications
  • LlamaIndex - Data framework for LLM applications
  • Haystack - End-to-end NLP framework
  • CrewAI - Multi-agent AI systems
  • MCP (Model Context Protocol) - Open protocol for connecting LLM applications to external tools and data sources

Custom Instrumentation Selection

Control which instrumentations are enabled for your application. By default, every supported package that is installed is instrumented when the SDK is initialized. To narrow this down, selectively enable or disable instrumentations with the instruments and block_instruments parameters of Netra.init() (see Example Usage below). The complete list of available InstrumentSet enum members is:

LLM Providers

  • InstrumentSet.OPENAI
  • InstrumentSet.ANTHROPIC
  • InstrumentSet.COHERE
  • InstrumentSet.GOOGLE_GENAI
  • InstrumentSet.MISTRAL
  • InstrumentSet.ALEPH_ALPHA
  • InstrumentSet.BEDROCK
  • InstrumentSet.GROQ
  • InstrumentSet.OLLAMA
  • InstrumentSet.REPLICATE
  • InstrumentSet.TOGETHER
  • InstrumentSet.TRANSFORMERS
  • InstrumentSet.VERTEXAI
  • InstrumentSet.WATSONX

Vector Databases

  • InstrumentSet.WEAVIATE
  • InstrumentSet.QDRANT
  • InstrumentSet.PINECONE
  • InstrumentSet.CHROMA
  • InstrumentSet.LANCEDB
  • InstrumentSet.MARQO
  • InstrumentSet.MILVUS
  • InstrumentSet.REDIS

Web Frameworks

  • InstrumentSet.FASTAPI
  • InstrumentSet.DJANGO
  • InstrumentSet.FLASK
  • InstrumentSet.FALCON
  • InstrumentSet.STARLETTE
  • InstrumentSet.TORNADO
  • InstrumentSet.GRPC

HTTP Clients

  • InstrumentSet.HTTPX
  • InstrumentSet.AIOHTTP
  • InstrumentSet.REQUESTS
  • InstrumentSet.URLLIB
  • InstrumentSet.URLLIB3

Database Clients

  • InstrumentSet.PYMYSQL
  • InstrumentSet.SQLALCHEMY
  • InstrumentSet.PSYCOPG
  • InstrumentSet.PYMONGO
  • InstrumentSet.ELASTICSEARCH
  • InstrumentSet.CASSANDRA
  • InstrumentSet.PYMSSQL
  • InstrumentSet.MYSQL
  • InstrumentSet.SQLITE3
  • InstrumentSet.AIOPG
  • InstrumentSet.ASYNCPG
  • InstrumentSet.PYMEMCACHE
  • InstrumentSet.TORTOISEORM

Task Queues

  • InstrumentSet.CELERY
  • InstrumentSet.PIKA
  • InstrumentSet.AIOPika
  • InstrumentSet.KAFKA
  • InstrumentSet.AIOKAFKA
  • InstrumentSet.CONFLUENT_KAFKA
  • InstrumentSet.BOTO3_SQS

AI Frameworks

  • InstrumentSet.LANGCHAIN
  • InstrumentSet.LANGGRAPH
  • InstrumentSet.LLAMAINDEX
  • InstrumentSet.HAYSTACK
  • InstrumentSet.CREWAI
  • InstrumentSet.MCP
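
Since InstrumentSet is a Python enum, you can also discover the available members programmatically rather than consulting the list above; a small sketch, assuming standard enum iteration:

from netra.instrumentation.instruments import InstrumentSet

# Print every available instrumentation name, e.g. OPENAI, WEAVIATE, FASTAPI
for member in InstrumentSet:
    print(member.name)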

Example Usage

from netra import Netra
from netra.instrumentation.instruments import InstrumentSet

# Enable specific instruments only
Netra.init(
    app_name="Selective App",
    instruments={
        InstrumentSet.OPENAI,
        InstrumentSet.WEAVIATE,
        InstrumentSet.FASTAPI
    }
)

# Block specific instruments
Netra.init(
    app_name="Blocked App",
    block_instruments={
        InstrumentSet.HTTPX,  # Don't trace HTTPX calls
        InstrumentSet.REDIS   # Don't trace Redis operations
    }
)
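
If neither instruments nor block_instruments is passed, the default behavior described above applies: every supported package found in the environment is instrumented. A minimal sketch of that default initialization:

from netra import Netra

# Default behavior: no selection parameters, so every installed
# supported package is instrumented automatically
Netra.init(app_name="Default App")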