Forwarding Audit Events to Custom Destinations

Note: Audit Event Forwarding is in beta and is not available on the US1-FED site.

Overview

Audit Event Forwarding allows you to send audit events from Datadog to custom destinations like Splunk, Elasticsearch, and HTTP endpoints. Audit events are forwarded in JSON format. You can add up to three destinations for each Datadog org.

[Screenshot] The Custom Destinations section, showing an active Login-Event-to-SIEM destination with an estimated 10.4 MB of audit event volume in the last 24 hours and @action:login as the filter query.

Note: Only Datadog users with the audit_trail_write permission can create, edit, or delete custom destinations for forwarding audit events.

Set up audit event forwarding to custom destinations

  1. If your destination restricts inbound traffic, add the webhook IPs from the Datadog IP ranges list to your allowlist.
  2. Navigate to Audit Trail Settings.
  3. Click Add Destination in the Audit Event Forwarding section.
  4. Enter the query that filters which audit events are forwarded. For example, enter @action:login if you only want to forward login events to your SIEM or custom destination. See Search Syntax for more information.
  5. Select the Destination Type (HTTP, Splunk, or Elasticsearch). The remaining steps depend on the destination type you select.
For an HTTP endpoint:
  1. Enter a name for the destination.
  2. In the Define endpoint field, enter the endpoint to which you want to send the audit events. The endpoint must start with https://.
  3. In the Configure Authentication section, select one of the following authentication types and provide the relevant details:
    • Basic Authentication: Provide the username and password for the account to which you want to send the audit events.
    • Request Header: Provide the header name and value. For example, if you use the Authorization header, and the account to which you want to send the audit events has the username myaccount and the password mypassword:
      • Enter Authorization for the Header Name.
      • The header value is in the format of Basic username:password, where username:password is encoded in base64. For this example, the header value is Basic bXlhY2NvdW50Om15cGFzc3dvcmQ=.
  4. Click Save.
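The header value in the example above can also be computed programmatically. A minimal sketch, using the myaccount/mypassword placeholders from the example (not real credentials):

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Build a Basic auth header value: base64-encode "username:password"."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

# Placeholder credentials from the example above.
print(basic_auth_header("myaccount", "mypassword"))
# Basic bXlhY2NvdW50Om15cGFzc3dvcmQ=
```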
For Splunk:
  1. Enter a name for the destination.
  2. In the Configure Destination section, enter the endpoint to which you want to send the audit events. The endpoint must start with https://. For example, enter https://<your_account>.splunkcloud.com:8088. Note: /services/collector/event is automatically appended to the endpoint.
  3. In the Configure Authentication section, enter the Splunk HEC token. See Set up and use HTTP Event Collector for more information about the Splunk HEC token.
  4. Click Save.

Note: Indexer acknowledgment must be disabled on the HEC token.
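Because the collector path is appended automatically, enter only the scheme, host, and port. A sketch of the resulting request target and the Authorization: Splunk <token> header that Splunk HEC expects (the host and token below are placeholders):

```python
def splunk_hec_request(base_endpoint: str, hec_token: str):
    """Sketch of the forwarded request: the HEC collector path is appended
    to the configured endpoint, and Splunk HEC authenticates with an
    "Authorization: Splunk <token>" header."""
    url = base_endpoint.rstrip("/") + "/services/collector/event"
    headers = {"Authorization": "Splunk " + hec_token}
    return url, headers

# Placeholder host and token, not real credentials.
url, headers = splunk_hec_request(
    "https://example.splunkcloud.com:8088", "11111111-2222-3333-4444-555555555555"
)
print(url)  # https://example.splunkcloud.com:8088/services/collector/event
```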

For Elasticsearch:

  1. Enter a name for the destination.

  2. In the Configure Destination section, enter the following details:

    a. The endpoint to which you want to send the audit events. The endpoint must start with https://. An example Elasticsearch endpoint: https://<your_account>.us-central1.gcp.cloud.es.io.

    b. The name of the destination index to which you want to send the audit events.

    c. Optionally, select the index rotation, which controls how often a new index is created: No Rotation, Every Hour, Every Day, Every Week, or Every Month. The default is No Rotation.

  3. In the Configure Authentication section, enter the username and password for your Elasticsearch account.

  4. Click Save.
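The index rotation options above roll the destination index over on a schedule. As an illustration only, the sketch below assumes a hypothetical date-suffix naming scheme (a common Elasticsearch convention; the exact index names Datadog produces are not specified here):

```python
from datetime import datetime

# Hypothetical date suffixes per rotation period; the exact naming scheme
# Datadog applies is an assumption for illustration, not documented behavior.
ROTATION_SUFFIX = {
    "No Rotation": None,
    "Every Hour": "%Y-%m-%d-%H",
    "Every Day": "%Y-%m-%d",
    "Every Week": "%Y-W%W",
    "Every Month": "%Y-%m",
}

def rotated_index(base: str, rotation: str, when: datetime) -> str:
    """Return the index name for a given rotation period and timestamp."""
    fmt = ROTATION_SUFFIX[rotation]
    return base if fmt is None else f"{base}-{when.strftime(fmt)}"

print(rotated_index("audit-events", "Every Month", datetime(2024, 5, 1)))
# audit-events-2024-05
```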
