Model Providers

Google

If you use Google AI models (Gemini) in your application, you can trace, monitor, and analyze them with Judgment in either of two ways:

  1. Judgeval Wrap: wrap your client with our wrap function at initialization.
  2. OpenTelemetry: use the GoogleGenerativeAiInstrumentor from the OpenTelemetry instrumentation library to instrument Google SDK calls and send OTel spans to Judgment.
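Both options assume the judgeval and Google SDK packages are installed; the second additionally needs the instrumentation package named below. A minimal setup sketch (package names as published on PyPI):

```shell
pip install judgeval google-generativeai

# Only needed for the OpenTelemetry option:
pip install opentelemetry-instrumentation-google-generativeai
```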

Judgeval Wrap

import google.generativeai as genai
from judgeval import Tracer, wrap

# Initialize the tracer before making any model calls
Tracer.init(project_name="google_ai_project")

genai.configure(api_key="your-api-key")

model = genai.GenerativeModel("gemini-pro")
wrapped_model = wrap(model)  # use wrapped_model from here on; its calls are traced
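Conceptually, wrap returns an object that behaves like the original client but records each call as a span before forwarding it. A minimal, runnable sketch of that pattern — the names here (RecordingWrapper, FakeModel, calls) are illustrative stand-ins, not Judgeval's actual implementation:

```python
class RecordingWrapper:
    """Forwards attribute access to the target; records method calls."""

    def __init__(self, target):
        self._target = target
        self.calls = []  # recorded (method, args) pairs, standing in for spans

    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if callable(attr):
            def traced(*args, **kwargs):
                self.calls.append((name, args))  # record, then forward unchanged
                return attr(*args, **kwargs)
            return traced
        return attr


class FakeModel:
    """Stand-in for a GenerativeModel, so the sketch runs without an API key."""

    def generate_content(self, prompt):
        return f"response to {prompt!r}"


model = RecordingWrapper(FakeModel())
model.generate_content("hello")  # call is forwarded; a record of it is kept
print(model.calls)
```

Because the wrapper forwards everything, existing code that used the unwrapped client keeps working after the swap — which is why the real wrap can be a one-line change.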

OpenTelemetry

from judgeval import Tracer
# Requires the opentelemetry-instrumentation-google-generativeai package
from opentelemetry.instrumentation.google_generativeai import GoogleGenerativeAiInstrumentor

Tracer.init(project_name="google_ai_project")
Tracer.registerOTELInstrumentation(GoogleGenerativeAiInstrumentor())

Use only one of these options to trace Google AI calls; enabling both may produce duplicate spans.