Model Providers

Google

If you use Google AI models (Gemini) in your application, you can trace, monitor, and analyze them with Judgment in either of two ways:

  1. Judgeval Wrap: Wrap your model at initialization with the tracer's wrap method.
  2. OpenTelemetry: Use the GoogleGenerativeAiInstrumentor from the opentelemetry-instrumentation-google-generativeai library to instrument Google SDK calls and send OpenTelemetry spans to Judgment.
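To build intuition for what option 1 does, here is a minimal, illustrative sketch of a tracing wrapper in plain Python. It is not Judgeval's actual implementation; the `wrap` function, `spans` list, and `FakeModel` class are stand-ins invented for this example. The idea is that the wrapped object forwards every method call to the original model while recording a span for each call.

```python
import functools
import time

def wrap(model, spans):
    """Illustrative tracing wrapper (not Judgeval's implementation).

    Returns a proxy that forwards attribute access to the original model
    and records each method call as a simple span dict in `spans`.
    """
    class TracedModel:
        def __getattr__(self, name):
            attr = getattr(model, name)
            if not callable(attr):
                return attr

            @functools.wraps(attr)
            def traced(*args, **kwargs):
                start = time.time()
                try:
                    # Forward the call to the real model unchanged
                    return attr(*args, **kwargs)
                finally:
                    # Record a span whether the call succeeded or raised
                    spans.append({
                        "method": name,
                        "duration_s": time.time() - start,
                    })
            return traced
    return TracedModel()

# Demo with a stand-in model object (no network calls)
class FakeModel:
    def generate_content(self, prompt):
        return f"echo: {prompt}"

spans = []
model = wrap(FakeModel(), spans)
print(model.generate_content("hi"))  # → echo: hi
print(spans[0]["method"])            # → generate_content
```

Because the proxy intercepts calls transparently, application code uses the wrapped model exactly as it would the original.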

Judgeval Wrap

import google.generativeai as genai
from judgeval import Judgeval

client = Judgeval(project_name="google_ai_project")
tracer = client.tracer.create()

# Configure the API key
genai.configure(api_key="your-api-key")

# Create and wrap the model
model = genai.GenerativeModel('gemini-pro')
wrapped_model = tracer.wrap(model)  # Use wrapped_model from here on

OpenTelemetry

from judgeval import Judgeval
# Requires: pip install opentelemetry-instrumentation-google-generativeai
from opentelemetry.instrumentation.google_generativeai import GoogleGenerativeAiInstrumentor

client = Judgeval(project_name="google_ai_project")
tracer = client.tracer.create()
GoogleGenerativeAiInstrumentor().instrument()

Always initialize the Tracer before calling GoogleGenerativeAiInstrumentor().instrument() to ensure proper trace routing.
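For intuition, OpenTelemetry instrumentors generally work by monkey-patching SDK methods so that each call emits a span. The sketch below illustrates that pattern with stdlib Python only; the `SketchInstrumentor` and `FakeClient` classes are invented for this example and are not the real GoogleGenerativeAiInstrumentor.

```python
import time

class FakeClient:
    """Stand-in for an SDK client class."""
    def generate(self, prompt):
        return prompt.upper()

class SketchInstrumentor:
    """Illustrative instrumentor (not the real GoogleGenerativeAiInstrumentor).

    Patches a method on a class so that every call records a span,
    mirroring the monkey-patching approach OTel instrumentors use.
    """
    def __init__(self):
        self.spans = []

    def instrument(self, cls, method_name):
        original = getattr(cls, method_name)

        def patched(obj, *args, **kwargs):
            start = time.time()
            try:
                # Delegate to the original, unpatched method
                return original(obj, *args, **kwargs)
            finally:
                self.spans.append({
                    "name": f"{cls.__name__}.{method_name}",
                    "duration_s": time.time() - start,
                })

        setattr(cls, method_name, patched)

instrumentor = SketchInstrumentor()
instrumentor.instrument(FakeClient, "generate")

client = FakeClient()
print(client.generate("hello"))       # → HELLO
print(instrumentor.spans[0]["name"])  # → FakeClient.generate
```

This is why initialization order matters: the tracer must exist before `instrument()` patches the SDK, so that emitted spans have somewhere to go.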

Use only one of these options to trace Google AI calls; enabling both may produce duplicate traces.