Model Providers
If you use Google AI models (Gemini) in your application, you can trace, monitor, and analyze them with Judgment. You can do this in either of two ways:
- Judgeval Wrap: Wrap your client with the tracer's wrap method when you initialize it.
- OpenTelemetry: Use the GoogleGenerativeAiInstrumentor from the opentelemetry-instrumentation-google-generativeai package to instrument Google SDK calls and send OTel spans to Judgment.
Judgeval Wrap
import google.generativeai as genai
from judgeval import Judgeval
client = Judgeval(project_name="google_ai_project")
tracer = client.tracer.create()
# Configure the API key
genai.configure(api_key="your-api-key")
# Create and wrap the model
model = genai.GenerativeModel('gemini-pro')
wrapped_model = tracer.wrap(model)  # Use wrapped_model from here on
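The wrapped model keeps the regular GenerativeModel interface, so you call it as usual. A minimal usage sketch, assuming the wrapper passes generate_content through unchanged (the prompt is a placeholder):
# Example call through the wrapped model; Judgment records the trace
response = wrapped_model.generate_content("Summarize the plot of Hamlet in one sentence.")
print(response.text)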
OpenTelemetry
from judgeval import Judgeval
# Make sure you have installed the dependency: opentelemetry-instrumentation-google-generativeai
from opentelemetry.instrumentation.google_generativeai import GoogleGenerativeAiInstrumentor
client = Judgeval(project_name="google_ai_project")
tracer = client.tracer.create()
GoogleGenerativeAiInstrumentor().instrument()
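Once instrument() has been called, calls made through the google.generativeai SDK are captured as OTel spans and sent to Judgment via the tracer configured above. A minimal sketch of instrumented usage, with a placeholder API key and prompt:
import google.generativeai as genai

genai.configure(api_key="your-api-key")
model = genai.GenerativeModel('gemini-pro')

# This call is intercepted by the instrumentor and exported as a span
response = model.generate_content("Write a haiku about distributed tracing.")
print(response.text)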