Model Providers

Google

If you use Google AI models (Gemini) in your application, you can trace, monitor, and analyze them with Judgment in either of two ways:

  1. Judgeval Wrap: Wrap your client at initialization with our wrap helper.
  2. OpenTelemetry: Use the opentelemetry-instrumentation-google-generativeai package's GoogleGenerativeAiInstrumentor to instrument Google SDK calls and send OTel spans to Judgment.

Judgeval Wrap

import google.generativeai as genai
from judgeval.tracer import wrap

# Configure the API key
genai.configure(api_key="your-api-key")

# Create and wrap the model
model = genai.GenerativeModel('gemini-pro')
wrapped_model = wrap(model)  # Use wrapped_model from here on
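The wrapped model is a drop-in replacement, so existing calls work unchanged. As an illustration of why that holds, here is a minimal sketch of the forwarding pattern a wrapper like this can follow; the classes below are hypothetical stand-ins, not judgeval's actual implementation:

```python
# Sketch of a drop-in wrapper: forward every attribute to the underlying
# client while recording which methods were called. Illustrative only --
# judgeval's wrap() additionally emits trace data to Judgment.

class RecordingWrapper:
    def __init__(self, client):
        self._client = client
        self.calls = []  # names of methods invoked through the wrapper

    def __getattr__(self, name):
        attr = getattr(self._client, name)
        if callable(attr):
            def traced(*args, **kwargs):
                self.calls.append(name)          # record the call
                return attr(*args, **kwargs)     # forward unchanged
            return traced
        return attr

class FakeModel:
    """Hypothetical stand-in for genai.GenerativeModel."""
    def generate_content(self, prompt):
        return f"echo: {prompt}"

wrapped = RecordingWrapper(FakeModel())
print(wrapped.generate_content("hi"))  # same interface as the raw model
print(wrapped.calls)
```

Because the wrapper only intercepts and forwards, no call sites need to change after swapping in the wrapped object.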

OpenTelemetry

from judgeval.tracer import Tracer
# Make sure you installed the dependency: opentelemetry-instrumentation-google-generativeai
from opentelemetry.instrumentation.google_generativeai import GoogleGenerativeAiInstrumentor


tracer = Tracer(project_name="google_ai_project")
GoogleGenerativeAiInstrumentor().instrument()

Always initialize the Tracer before calling GoogleGenerativeAiInstrumentor().instrument() to ensure proper trace routing.
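Putting that ordering together, a sketch of a fully instrumented script follows. The project name, model name, and prompt are illustrative, and it assumes your Judgment and Google API credentials are configured; it is a setup fragment, not a definitive implementation:

```python
import google.generativeai as genai
from judgeval.tracer import Tracer
from opentelemetry.instrumentation.google_generativeai import (
    GoogleGenerativeAiInstrumentor,
)

# 1. Initialize the Tracer first so spans are routed to Judgment.
tracer = Tracer(project_name="google_ai_project")

# 2. Then instrument the Google SDK.
GoogleGenerativeAiInstrumentor().instrument()

# 3. SDK calls made from here on are traced automatically.
genai.configure(api_key="your-api-key")
model = genai.GenerativeModel("gemini-pro")
response = model.generate_content("Hello!")  # emits an OTel span
```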

Use only one of these options to trace Google AI calls; enabling both may produce duplicate traces.