Use a Judge

Once you have created your Judge with the scorable tool, you can integrate it into your application via the SDKs, the API, or the CLI.

Because OpenAI's API has become the lingua franca of LLMs, the OpenAI-compatible proxy is one of the most popular ways to use Judges. Root Signals integrates with your existing client by a single change: point its base URL at the Root Signals OpenAI proxy.

🔍 Run a Judge to evaluate the quality of returns policy claims

Let's walk through an example where we have a Judge that evaluates the quality of returns policy claims.

# pip install openai
from openai import OpenAI

client = OpenAI(
    api_key="$MY_ROOT_SIGNALS_API_KEY",
    base_url="https://api.app.rootsignals.ai/v1/judges/$MY_JUDGE_ID/openai/"
)
response = client.chat.completions.create(
    model="claude-sonnet-4",
    messages=[
        {"role": "user", "content": "I want to return my product"}
    ]
)

The response will include the Judge's evaluation results in the model_extra field:

print(response.model_extra.get('evaluator_results'))

[
  {
    "name": "Returns Policy Claims",
    "score": 0.95,
    "justification": "The policy claims are clear and easy to understand..."
  },
  ...
]
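In application code you will typically want to act on these scores rather than just print them. A minimal sketch, assuming the result shape shown above; the `passes_judge` helper and the 0.7 threshold are hypothetical examples, not part of the Root Signals API:

```python
# Hypothetical helper: gate a response on the Judge's scores.
# Each result dict has the shape shown above ("name", "score", "justification").
def passes_judge(evaluator_results, threshold=0.7):
    """Return True only if every evaluator scored at or above the threshold."""
    return all(r["score"] >= threshold for r in evaluator_results)

# Example payload mirroring the response above:
results = [
    {
        "name": "Returns Policy Claims",
        "score": 0.95,
        "justification": "The policy claims are clear and easy to understand...",
    },
]
print(passes_judge(results))  # → True
```

You might use such a check to fall back to a canned answer, retry the request, or route the conversation to a human when a score comes in below your threshold.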

Background Execution

You can also run the Judge in the background and check the results later through the monitoring dashboard:

client = OpenAI(
    api_key="$MY_ROOT_SIGNALS_API_KEY",
    # 💡 Add ?async_judge=true to the base_url
    base_url="https://api.app.rootsignals.ai/v1/judges/$MY_JUDGE_ID/openai/?async_judge=true"
)
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "I want to return my product"}
    ]
)
Judge results are stored and can be viewed from the monitoring pages or fetched from the API.

Judges automatically run in the background when you stream responses.

✨ Use the Judge to improve model responses automatically

By switching the base URL to the refine endpoint of the Root Signals OpenAI proxy, the Judge's evaluation is used to automatically improve model responses.

client = OpenAI(
    api_key="$MY_ROOT_SIGNALS_API_KEY",
    base_url="https://api.app.rootsignals.ai/v1/judges/$MY_JUDGE_ID/refine/openai/"
)
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "I want to return my product"}
    ]
)

Here, based on the Judge's evaluation, the Root Signals platform will ensure that the model response aligns with the safeguards you have configured in the Judge.

Summary

Integrating Judges into your application is straightforward — simply change the base URL in your existing OpenAI client configuration. Root Signals supports all major LLMs and providers, and you can bring your own models if needed.
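The three base-URL variants used above differ only in their path and query string. A small sketch that assembles them, assuming the URL patterns shown in the examples; the `judge_base_url` helper itself is hypothetical, not part of any SDK:

```python
# Base URL pattern taken from the examples in this section.
BASE = "https://api.app.rootsignals.ai/v1/judges"

def judge_base_url(judge_id, refine=False, async_judge=False):
    """Build the proxy base URL for a judge.

    refine=True      -> refine endpoint (Judge improves the response)
    async_judge=True -> background execution (results appear in monitoring)
    """
    path = f"{BASE}/{judge_id}/" + ("refine/openai/" if refine else "openai/")
    return path + ("?async_judge=true" if async_judge else "")

print(judge_base_url("abc123", async_judge=True))
# → https://api.app.rootsignals.ai/v1/judges/abc123/openai/?async_judge=true
```

Passing the resulting string as `base_url` when constructing the OpenAI client, as in the examples above, is the only change needed to switch modes.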
