Model Providers
Together AI
If you use Together AI models in your application, you can trace, monitor, and analyze them with Judgment in either of two ways:
- Judgeval wrap: wrap your client at initialization with our wrap helper.
- OpenTelemetry: use the opentelemetry-instrumentation-together library to instrument Together AI SDK calls and send OTel spans to Judgment.
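Both approaches assume the relevant packages are available in your environment. As a sketch, the package names below are inferred from the imports in the snippets that follow and may differ in your setup:

```shell
# Core dependencies for tracing Together AI with Judgment
pip install judgeval together

# Only needed for the OpenTelemetry approach
pip install opentelemetry-instrumentation-together
```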
Judgeval Wrap
from together import Together
from judgeval import Tracer, wrap
Tracer.init(project_name="together_ai_project")
together_client = Together()
wrapped_client = wrap(together_client)  # Use wrapped_client from here on

OpenTelemetry
from judgeval import Tracer
# Make sure you have installed the dependency: opentelemetry-instrumentation-together
from opentelemetry.instrumentation.together import TogetherInstrumentor
Tracer.init(project_name="together_ai_project")
Tracer.registerOTELInstrumentation(TogetherInstrumentor())