Model Providers

Ollama

If you use Ollama within your application, you can trace, monitor, and analyze its calls with Judgment using OllamaInstrumentor.

OpenTelemetry

pip install opentelemetry-instrumentation-ollama

from judgeval.tracer import Tracer
# Make sure you installed the dependency: opentelemetry-instrumentation-ollama
from opentelemetry.instrumentation.ollama import OllamaInstrumentor


tracer = Tracer(project_name="ollama_project")
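# Patches the Ollama client so its API calls are automatically captured as traces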
OllamaInstrumentor().instrument()

Always initialize the Tracer before calling OllamaInstrumentor().instrument() to ensure proper trace routing.

Ollama allows you to run large language models locally. This instrumentation automatically traces all Ollama API calls.
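For example, once the instrumentor is active, a call made through the ollama Python client is captured without any extra wrapping. The model name below is a placeholder; use whichever model you have pulled locally.

import ollama

# This call is traced automatically by OllamaInstrumentor.
response = ollama.chat(
    model="llama3",  # placeholder: any locally pulled model works
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])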