Model Providers
If you use Google AI models (Gemini) within your application, you can trace, monitor, and analyze them with Judgment. You can do this in either of two ways:
- Judgeval Wrap: Wrap your client with our `wrap` function when you initialize it.
- OpenTelemetry: Use the GoogleGenerativeAiInstrumentor library to instrument Google SDK calls and send OTel spans to Judgment.
Judgeval Wrap
```python
import google.generativeai as genai
from judgeval.tracer import wrap

# Configure the API key
genai.configure(api_key="your-api-key")

# Create and wrap the model
model = genai.GenerativeModel('gemini-pro')
wrapped_model = wrap(model)  # Use wrapped_model from here on
```
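Once wrapped, you use the model as you normally would. The sketch below is illustrative and assumes the wrapped model exposes the same `generate_content` interface as the underlying `GenerativeModel`; the prompt is a placeholder:

```python
# Calls made through the wrapped model are traced by Judgment.
response = wrapped_model.generate_content("Summarize the main idea of retrieval-augmented generation.")
print(response.text)
```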
OpenTelemetry
```python
from judgeval.tracer import Tracer
# Make sure you installed the dependency: opentelemetry-instrumentation-google-generativeai
from opentelemetry.instrumentation.google_generativeai import GoogleGenerativeAiInstrumentor

tracer = Tracer(project_name="google_ai_project")
GoogleGenerativeAiInstrumentor().instrument()
```
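After instrumentation, ordinary Google SDK calls emit OTel spans that should be exported to Judgment under the configured project. The sketch below is a minimal, illustrative example; the model name and prompt are placeholders:

```python
import google.generativeai as genai

genai.configure(api_key="your-api-key")

# Any call made after instrument() is captured as an OTel span.
model = genai.GenerativeModel('gemini-pro')
response = model.generate_content("Give one sentence on distributed tracing.")
print(response.text)
```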