Model Providers

OpenRouter

If you use OpenRouter models in your application, you can trace, monitor, and analyze them with Judgment. You can do this by using:

  1. Judgeval Wrap: Wrap your client with our wrap function at initialization

OpenRouter provides a unified interface to access multiple LLM providers through a single API. Since OpenRouter uses an OpenAI-compatible API, you can use the OpenAI client with OpenRouter's base URL.

Judgeval Wrap

from openai import OpenAI
from judgeval.tracer import wrap

client = OpenAI(
    api_key="YOUR_OPENROUTER_API_KEY",
    base_url="https://openrouter.ai/api/v1",
    default_headers={
        "HTTP-Referer": "YOUR_SITE_URL",  # Optional: your app's URL, for attribution on openrouter.ai
        "X-Title": "YOUR_APP_NAME",  # Optional: your app's name
    }
)
wrapped_client = wrap(client)  # Use wrapped_client from here on
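The wrapped client reports traces to the Judgment backend, which reads its credentials from environment variables. A minimal sketch, assuming a standard shell and the variable names used by Judgeval (verify these against your Judgment account settings):

```shell
# Credentials for the Judgment backend that wrap() reports traces to
export JUDGMENT_API_KEY="your_judgment_api_key"
export JUDGMENT_ORG_ID="your_judgment_org_id"
```

Set these before starting your application so the wrapped client can authenticate when exporting traces.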

Cost Tracking

Once wrapped, you can use the client as normal and all calls will be traced:

response = wrapped_client.chat.completions.create(
    model="openai/gpt-4",
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ],
    extra_body={"usage": {"include": True}}, # Include this to capture costs
)

To enable cost tracking, include the additional parameters supported by OpenRouter's Usage Accounting (the `extra_body={"usage": {"include": True}}` shown above) so that cost and token usage are returned with each response and captured in your traces.
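With usage accounting enabled, OpenRouter includes token counts and cost in the response's usage data. Below is a minimal sketch of summarizing those fields; the sample payload is an assumption modeled on OpenRouter's usage accounting format (field names like `cost` should be verified against your actual responses):

```python
# Sample usage payload in the shape OpenRouter's usage accounting returns
# (an assumed example for illustration -- check your real responses).
sample_usage = {
    "prompt_tokens": 12,
    "completion_tokens": 9,
    "total_tokens": 21,
    "cost": 0.00063,  # USD; present when usage accounting is enabled
}

def summarize_usage(usage: dict) -> str:
    """Format token counts and, if present, cost from a usage dict."""
    line = (
        f"{usage['total_tokens']} tokens "
        f"({usage['prompt_tokens']} in / {usage['completion_tokens']} out)"
    )
    cost = usage.get("cost")
    if cost is not None:
        line += f", ${cost:.5f}"
    return line

print(summarize_usage(sample_usage))  # → 21 tokens (12 in / 9 out), $0.00063
```

In a real application you would pass the response's usage object (or its dict form) into a helper like this for logging alongside the trace.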