Monitor, troubleshoot, and evaluate your LLM-powered applications built on OpenAI, such as chatbots or data extraction tools. With LLM Observability, you can investigate the root cause of issues, monitor operational performance, and evaluate the quality, privacy, and safety of your LLM applications.
Get cost estimation, prompt and completion sampling, error tracking, performance metrics, and more from your OpenAI account-level usage and from Python, Node.js, and PHP library requests, using Datadog metrics and APM.
Note: This setup method only collects openai.api.usage.* metrics. To collect all metrics provided by this integration, also follow the APM setup instructions.
Datadog’s OpenAI integration lets you collect usage metrics and cost data, and enables LLM Observability to monitor your OpenAI models. Follow the steps below to generate an OpenAI API key and configure the integration.
Under Account Name, enter a name for your account. Under API Key, enter your OpenAI API key. Optionally, add a comma-separated list of tags for metrics associated with this account.
Under Resources, enable toggles depending on your use case:
Validate that LLM Observability is properly capturing spans by checking your application logs for successful span creation. You can also run the following command to check the status of the ddtrace integration:
ddtrace-run --info
Look for the following message to confirm the setup:
Non-US1 customers must set DD_SITE on the application command to the correct Datadog site parameter as specified in the table in the Datadog Site page (for example, datadoghq.eu for EU1 customers).
If the Agent is using a non-default hostname or port, be sure to also set DD_AGENT_HOST, DD_TRACE_AGENT_PORT, or DD_DOGSTATSD_PORT.
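For example, an EU1 customer whose Agent runs on a non-default host might launch the application as follows (app.py and the Agent hostname placeholder are illustrative):

DD_SITE=datadoghq.eu DD_AGENT_HOST=&lt;AGENT_HOSTNAME&gt; ddtrace-run python app.py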
Pass the --debug flag to ddtrace-run to enable debug logging.
ddtrace-run --debug
This displays any errors sending data:
ERROR:ddtrace.internal.writer.writer:failed to send, dropping 1 traces to intake at http://localhost:8126/v0.5/traces after 3 retries ([Errno 61] Connection refused)
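If no spans appear at all, confirm that the application is actually making instrumented OpenAI calls. The following is a minimal sketch (the model name and prompt are illustrative); the ddtrace OpenAI integration traces supported openai client methods automatically when the process is started with ddtrace-run:

from openai import OpenAI

# The client reads OPENAI_API_KEY from the environment.
client = OpenAI()

# A single chat completion; ddtrace traces this call automatically
# when the script is launched with ddtrace-run.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)

Assuming a ddtrace version with LLM Observability support, run it with DD_LLMOBS_ENABLED=1 DD_LLMOBS_ML_APP=&lt;YOUR_ML_APP&gt; ddtrace-run python app.py, then check your application logs or the debug output described above for the resulting spans.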
Note: This setup method does not collect openai.api.usage.* metrics. To collect these metrics, also follow the API key setup instructions.
const llmobs = require('dd-trace').llmobs;

// or, if dd-trace was not initialized via NODE_OPTIONS
const llmobs = require('dd-trace').init({
  llmobs: {
    mlApp: <YOUR_ML_APP>,
  },
}).llmobs; // with DD_API_KEY and DD_SITE being set at the environment level
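In serverless environments, call llmobs.flush() at the end of the handler so that any remaining spans are submitted before the function finishes: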
async function handler (event, context) {
  ...
  llmobs.flush()
  return ...
}
Validate that the APM Node.js library can communicate with your Agent by examining the debugging output from the application process. Within the section titled “Encoding payload,” you should see an entry with a name field whose value is openai.request. See below for a truncated example of this output:
The library is automatically injected into your OpenAI PHP application.
Notes:
Non-US1 customers must set DD_SITE on the application command to the correct Datadog site parameter as specified in the table in the Datadog Site page (for example, datadoghq.eu for EU1 customers).
If the Agent is using a non-default hostname or port, set DD_AGENT_HOST, DD_TRACE_AGENT_PORT, or DD_DOGSTATSD_PORT.
To validate that the APM PHP library can communicate with your Agent, examine the phpinfo output of your service. Under the ddtrace section, Diagnostic checks should be passed.
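If the same ddtrace extension is also loaded for the CLI SAPI (for example, for a worker process), one quick way to inspect this from a shell is the sketch below; for web SAPIs, expose a temporary page that calls phpinfo() instead:

php -i | grep -iA 10 ddtrace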