Model Providers

Together AI

If you use Together AI models within your application, you can trace, monitor, and analyze them with Judgment. There are two options:

  1. Judgeval Wrap: Wrap your client at initialization with Judgeval's wrap function.
  2. OpenTelemetry: Use the TogetherAiInstrumentor from the opentelemetry-instrumentation-together package to instrument Together AI SDK calls and send OTel spans to Judgment.

Judgeval Wrap

from together import Together
from judgeval.tracer import wrap

client = Together()
wrapped_client = wrap(client)  # Use wrapped_client from here on

OpenTelemetry

from judgeval.tracer import Tracer
# Make sure you installed the dependency: opentelemetry-instrumentation-together
from opentelemetry.instrumentation.together import TogetherAiInstrumentor

tracer = Tracer(project_name="together_ai_project")
TogetherAiInstrumentor().instrument()

Always initialize the Tracer before calling TogetherAiInstrumentor().instrument() to ensure spans are routed to Judgment correctly.

Together AI uses an OpenAI-compatible API, so you can also use the OpenAI SDK with Together AI's base URL and trace it using the OpenAI integration.

Use only one of these options to trace Together AI calls; enabling both may produce duplicate spans.