Datadog’s LLM Observability Python SDK provides integrations that automatically trace and annotate calls to LLM frameworks and libraries. Without changing your code, you can get out-of-the-box traces and observability for calls that your LLM application makes to the following frameworks:
| Framework | Supported Versions |
|---|---|
| OpenAI | >= 0.26.5 |
| LangChain | >= 0.0.192, < 0.2.0 |
| AWS Bedrock | >= 1.31.57 |
| Anthropic | >= 0.28.0 |
In addition to capturing latency and errors, the integrations capture the input parameters, input and output messages, and token usage (when available) of each traced call.
All integrations are enabled by default.

To disable all integrations, use the in-code SDK setup and specify `integrations_enabled=False`.

To only enable specific integrations:

1. Use the in-code SDK setup and specify `integrations_enabled=False`.
2. Manually enable selected integrations with `ddtrace.patch()` at the top of the entrypoint file of your LLM application:

```python
from ddtrace import patch
from ddtrace.llmobs import LLMObs

LLMObs.enable(integrations_enabled=False, ...)
patch(<INTEGRATION_NAME_IN_LOWERCASE>=True)
```

Note: Use `botocore` as the name of the AWS Bedrock integration when manually enabling it.
The OpenAI integration provides automatic tracing for the OpenAI Python SDK’s completion and chat completion endpoints.
The OpenAI integration instruments the following methods, including streamed calls:
- `OpenAI().completions.create()`
- `AsyncOpenAI().completions.create()`
- `OpenAI().chat.completions.create()`
- `AsyncOpenAI().chat.completions.create()`
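For example, once the integration is enabled, a chat completion call shaped like the following sketch is traced automatically. The model name and messages are illustrative, and the network call itself is commented out so the sketch stays self-contained:

```python
# Request arguments for OpenAI().chat.completions.create(). With the
# integration enabled, the SDK captures these inputs, the output
# messages, and token usage when the call runs.
request = {
    "model": "gpt-4o-mini",  # illustrative model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What does LLM Observability trace?"},
    ],
}

# from openai import OpenAI           # requires the openai package
# client = OpenAI()                   # reads OPENAI_API_KEY from the env
# response = client.chat.completions.create(**request)  # traced call
```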
The LangChain integration provides automatic tracing for the LangChain Python SDK’s LLM, chat model, and chain calls.
The LangChain integration instruments the following methods:
- `llm.invoke()`, `llm.ainvoke()`
- `chat_model.invoke()`, `chat_model.ainvoke()`
- `chain.invoke()`, `chain.ainvoke()`
- `chain.batch()`, `chain.abatch()`
Note: The LangChain integration does not yet support tracing streamed calls.
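As a sketch, a traced chain invocation might look like the following. Building the chain requires `langchain` and a provider API key, so those lines are commented out, and the input variable names are illustrative:

```python
# Inputs for chain.invoke() and chain.batch(); the integration records
# these inputs and the chain's outputs for each traced call. The
# "topic" prompt variable is illustrative.
inputs = {"topic": "observability"}
batch_inputs = [inputs, {"topic": "tracing"}]

# from langchain.chains import LLMChain   # requires langchain < 0.2.0
# result = chain.invoke(inputs)           # traced
# results = chain.batch(batch_inputs)     # each batch call is traced
```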
The AWS Bedrock integration provides automatic tracing for the AWS Bedrock Runtime Python SDK’s chat model calls (using Boto3/Botocore).
The AWS Bedrock integration instruments the following methods:
- `InvokeModel`
- `InvokeModelWithResponseStream`
Note: The AWS Bedrock integration does not yet support tracing embedding calls.
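A minimal sketch of an `InvokeModel` request the integration would trace: the model ID and body shape are illustrative (here, an Anthropic model on Bedrock), and the `boto3` call itself is commented out so the sketch stays self-contained:

```python
import json

# Request body for bedrock-runtime's InvokeModel; the integration
# captures the request and response when the call runs. The body
# schema shown is the illustrative Anthropic-on-Bedrock format.
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}],
})

# import boto3
# client = boto3.client("bedrock-runtime")          # requires AWS credentials
# response = client.invoke_model(                    # traced call
#     modelId="anthropic.claude-3-haiku-20240307-v1:0",  # illustrative model ID
#     body=body,
# )
```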
The Anthropic integration provides automatic tracing for the Anthropic Python SDK’s chat message calls.
The Anthropic integration instruments the following methods:
- `Anthropic().messages.create()`, `AsyncAnthropic().messages.create()`
- `Anthropic().messages.stream()`, `AsyncAnthropic().messages.stream()`
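For example, arguments like the following would be captured for a traced `Anthropic().messages.create()` call. The model name and prompt are illustrative, and the API call is commented out so the sketch stays self-contained:

```python
# Arguments for Anthropic().messages.create(); with the integration
# enabled, these inputs, the output message, and token usage are
# captured automatically. Model name and prompt are illustrative.
kwargs = {
    "model": "claude-3-haiku-20240307",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}],
}

# from anthropic import Anthropic     # requires the anthropic package
# client = Anthropic()                # reads ANTHROPIC_API_KEY from the env
# message = client.messages.create(**kwargs)  # traced call
```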