Submit Evaluations


Overview

In the context of LLM applications, it’s important to track user feedback and evaluate the quality of your LLM application’s responses. While LLM Observability provides a few out-of-the-box evaluations for your traces, you can submit your own evaluations to LLM Observability in two ways: with Datadog’s Python SDK, or with the LLM Observability API. See Naming custom metrics for guidelines on how to choose an appropriate label for your evaluations.

Submitting evaluations with the SDK

To submit evaluations from your traced LLM application to Datadog, associate each evaluation with a span using the following steps:

  1. Extract the span context from the given span by using LLMObs.export_span(span). If span is not provided (as when using function decorators), the SDK exports the current active span. See Exporting a span for more details.
  2. Use LLMObs.submit_evaluation() with the extracted span context and evaluation information. See Submitting evaluations in the SDK documentation for details.

Example

from ddtrace.llmobs import LLMObs
from ddtrace.llmobs.decorators import llm

@llm(model_name="claude", name="invoke_llm", model_provider="anthropic")
def llm_call():
    completion = ... # user application logic to invoke LLM
    # Export the current active span's context (span=None inside a decorated function)
    span_context = LLMObs.export_span(span=None)
    # Attach a score evaluation to that span
    LLMObs.submit_evaluation(
        span_context,
        label="sentiment",
        metric_type="score",
        value=10,
    )
    return completion

Submitting evaluations with the API

You can use the evaluations API provided by LLM Observability to send evaluations associated with spans to Datadog. See the Evaluations API for more details on the API specifications.

Example

{
  "data": {
    "type": "evaluation_metric",
    "attributes": {
      "metrics": [
        {
          "span_id": "61399242116139924211",
          "trace_id": "13932955089405749200",
          "timestamp": 1609459200,
          "metric_type": "categorical",
          "label": "Sentiment",
          "categorical_value": "Positive"
        },
        {
          "span_id": "20245611112024561111",
          "trace_id": "13932955089405749200",
          "metric_type": "score",
          "label": "Accuracy",
          "score_value": 3
        }
      ]
    }
  }
}
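As a sketch, a payload like the one above can be built and posted with Python's standard library. The endpoint URL below is an assumption for illustration (routes vary by Datadog site); check the Evaluations API reference for the exact path, and authenticate with your `DD-API-KEY` header.

```python
import json
import urllib.request

def build_eval_payload(metrics):
    """Wrap a list of evaluation metric objects in the request envelope."""
    return {
        "data": {
            "type": "evaluation_metric",
            "attributes": {"metrics": metrics},
        }
    }

def submit_evaluations(api_key, metrics, url):
    """POST the evaluations payload to the given endpoint URL.

    `url` is passed in explicitly because the exact route depends on
    your Datadog site; see the Evaluations API reference.
    """
    req = urllib.request.Request(
        url,
        data=json.dumps(build_eval_payload(metrics)).encode("utf-8"),
        headers={"Content-Type": "application/json", "DD-API-KEY": api_key},
        method="POST",
    )
    return urllib.request.urlopen(req)  # raises on HTTP errors

# Build (without sending) the first metric from the example above:
payload = build_eval_payload([
    {
        "span_id": "61399242116139924211",
        "trace_id": "13932955089405749200",
        "metric_type": "categorical",
        "label": "Sentiment",
        "categorical_value": "Positive",
    }
])
```

Separating payload construction from the network call makes the envelope easy to validate before sending.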

Further Reading
