Model Providers

Together AI

If you use Together AI models in your application, you can trace, monitor, and analyze them with Judgment in either of two ways:

  1. Judgeval Wrap: wrap your Together AI client with our wrap function at initialization.
  2. OpenTelemetry: use the TogetherAiInstrumentor from the opentelemetry-instrumentation-together library to instrument Together AI SDK calls and send OTel spans to Judgment.

Judgeval Wrap

from together import Together
from judgeval import Tracer, wrap

Tracer.init(project_name="together_ai_project")

together_client = Together()
wrapped_client = wrap(together_client)  # Use wrapped_client from here on
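Conceptually, wrap returns an object that forwards every call to the underlying client while recording each call as a span. The sketch below is purely illustrative, not judgeval's actual implementation; the class and method names are invented to show the proxy pattern involved:

```python
# Illustrative only: a minimal sketch of the proxy pattern a tracing wrapper
# uses. The wrapped object forwards attribute and method access to the real
# client while recording what was called.

class TracingProxy:
    """Forward attribute access to the target object, logging call names."""

    def __init__(self, target, calls=None):
        self._target = target
        self.calls = calls if calls is not None else []

    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if callable(attr):
            def traced(*args, **kwargs):
                self.calls.append(name)  # a real tracer would emit a span here
                return attr(*args, **kwargs)
            return traced
        # Nested namespaces (e.g. client.chat.completions) are proxied too,
        # sharing the same call log.
        return TracingProxy(attr, self.calls)


class FakeClient:
    def ping(self):
        return "pong"


client = TracingProxy(FakeClient())
result = client.ping()  # forwarded to FakeClient.ping, and the call is recorded
```

Because the proxy forwards everything it does not intercept, your existing code keeps working unchanged after wrapping, which is why the snippet above can simply use wrapped_client in place of the original client.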

OpenTelemetry

from judgeval import Tracer
# Requires the opentelemetry-instrumentation-together package
from opentelemetry.instrumentation.together import TogetherAiInstrumentor

Tracer.init(project_name="together_ai_project")
Tracer.registerOTELInstrumentation(TogetherAiInstrumentor())

Together AI uses an OpenAI-compatible API, so you can also use the OpenAI SDK with Together AI's base URL and trace it using the OpenAI integration.

Use only one of these options to trace Together AI calls; enabling both may produce duplicate spans.