Model Providers

Ollama

If you use Ollama within your application, you can trace, monitor, and analyze your Ollama calls with Judgment using the OllamaInstrumentor.

OpenTelemetry

pip install opentelemetry-instrumentation-ollama
from judgeval import Tracer
# Make sure you installed the dependency: opentelemetry-instrumentation-ollama
from opentelemetry.instrumentation.ollama import OllamaInstrumentor

Tracer.init(project_name="ollama_project")
Tracer.registerOTELInstrumentation(OllamaInstrumentor())

Ollama allows you to run large language models locally. This instrumentation automatically traces all Ollama API calls.
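As a minimal end-to-end sketch, the setup above can be combined with an ordinary Ollama client call; once the instrumentor is registered, no per-call changes are needed. This assumes the ollama Python client is installed, a local Ollama server is running, and the llama3 model has been pulled; the model name and prompt are illustrative.

```python
from judgeval import Tracer
# Requires: pip install opentelemetry-instrumentation-ollama
from opentelemetry.instrumentation.ollama import OllamaInstrumentor
import ollama  # assumes the `ollama` client package and a running local server

Tracer.init(project_name="ollama_project")
Tracer.registerOTELInstrumentation(OllamaInstrumentor())

# Any subsequent Ollama call is captured automatically as a trace span.
response = ollama.chat(
    model="llama3",  # hypothetical: any locally pulled model works
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])
```

Because the instrumentation hooks in at the client level, streaming and embedding calls made through the same library are traced the same way.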
