Model Providers

OpenAI

If you use OpenAI models in your application, you can trace, monitor, and analyze them with Judgment in either of two ways:

  1. Judgeval Wrap: wrap your OpenAI client with Judgeval's wrap() after initializing it.
  2. OpenTelemetry: use the OpenAIInstrumentor library to instrument OpenAI SDK calls and send OTel spans to Judgment.

Judgeval Wrap

from openai import OpenAI
from judgeval import Tracer, wrap

Tracer.init(project_name="openai_project")

openai_client = OpenAI()
wrapped_client = wrap(openai_client)  # Use wrapped_client from here on
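Conceptually, a wrap-style integration intercepts each SDK call, records a span (method name, duration, arguments), and then returns the original result unchanged, so your application code stays the same. The sketch below illustrates that pattern with a plain-Python stand-in for the client; it is not Judgeval's actual implementation, and FakeCompletions, wrap_method, and on_span are hypothetical names used only for illustration.

```python
import functools
import time

def wrap_method(obj, method_name, on_span):
    """Replace obj.<method_name> with a version that records a span per call."""
    original = getattr(obj, method_name)

    @functools.wraps(original)
    def traced(*args, **kwargs):
        start = time.perf_counter()
        result = original(*args, **kwargs)  # delegate to the real method
        on_span({
            "name": method_name,
            "duration_s": time.perf_counter() - start,
            "kwargs": kwargs,
        })
        return result  # caller sees the unmodified return value

    setattr(obj, method_name, traced)
    return obj

# Hypothetical stand-in for an SDK client method.
class FakeCompletions:
    def create(self, **kwargs):
        return {"choices": [{"message": {"content": "hi"}}]}

spans = []
client = wrap_method(FakeCompletions(), "create", spans.append)

# The call site is unchanged; the span is captured as a side effect.
resp = client.create(model="gpt-4o-mini",
                     messages=[{"role": "user", "content": "Hello"}])
print(resp["choices"][0]["message"]["content"])  # hi
print(spans[0]["name"])  # create
```

Because the wrapper only decorates the client you pass in, code that uses a different, unwrapped client instance is not traced, which is why the example above reuses wrapped_client everywhere.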

OpenTelemetry

from judgeval import Tracer
# Make sure you have installed the dependency: opentelemetry-instrumentation-openai
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

Tracer.init(project_name="openai_project")
Tracer.registerOTELInstrumentation(OpenAIInstrumentor())

Use only one of these options to trace OpenAI calls, or you may see duplicate traces for the same request.