# Model Providers

If you use Google AI models (Gemini) in your application, you can trace, monitor, and analyze them with Judgment in either of two ways:

- Judgeval Wrap: wrap your client with our `wrap` function at initialization.
- OpenTelemetry: use the `GoogleGenerativeAiInstrumentor` from the `opentelemetry-instrumentation-google-generativeai` package to instrument Google SDK calls and send OTel spans to Judgment.
## Judgeval Wrap
```python
import google.generativeai as genai
from judgeval import Tracer, wrap

Tracer.init(project_name="google_ai_project")

genai.configure(api_key="your-api-key")
model = genai.GenerativeModel('gemini-pro')
wrapped_model = wrap(model)  # Use wrapped_model from here on
```

## OpenTelemetry
```python
from judgeval import Tracer
# Make sure you have installed the dependency:
#   pip install opentelemetry-instrumentation-google-generativeai
from opentelemetry.instrumentation.google_generativeai import GoogleGenerativeAiInstrumentor

Tracer.init(project_name="google_ai_project")
Tracer.registerOTELInstrumentation(GoogleGenerativeAiInstrumentor())
```
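Both approaches come down to the same idea: intercept each SDK call and record a span around it. A minimal, illustrative sketch of that wrapping pattern (this is not Judgment's actual implementation; `FakeModel`, `trace_wrap`, and the span fields are stand-ins for demonstration):

```python
import functools
import time

def trace_wrap(obj, method_name, spans):
    """Illustrative only: patch one method so each call records a span dict."""
    original = getattr(obj, method_name)

    @functools.wraps(original)
    def traced(*args, **kwargs):
        start = time.perf_counter()
        result = original(*args, **kwargs)
        # Record a minimal "span" with the method name and call duration
        spans.append({
            "name": method_name,
            "duration_s": time.perf_counter() - start,
        })
        return result

    # Replace the method on the instance so callers need no code changes
    setattr(obj, method_name, traced)
    return obj

class FakeModel:
    """Stand-in for a model client, used here instead of a real SDK call."""
    def generate_content(self, prompt):
        return f"echo: {prompt}"

spans = []
model = trace_wrap(FakeModel(), "generate_content", spans)
print(model.generate_content("hi"))  # echo: hi
print(spans[0]["name"])              # generate_content
```

In the Judgeval Wrap approach you apply the wrapper explicitly; with the OpenTelemetry instrumentor, the library patches the SDK for you and exports real OTel spans instead of plain dicts.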