Amazon Security Lake Destination

Use Observability Pipelines’ Amazon Security Lake destination to send logs to Amazon Security Lake.

Prerequisites

You need to do the following before setting up the Amazon Security Lake destination:

  1. Follow the Getting Started with Amazon Security Lake guide to set up Amazon Security Lake, and make sure to:
    • Enable Amazon Security Lake for the AWS account.
    • Select the AWS regions where S3 buckets will be created for OCSF data.
  2. Follow Collecting data from custom sources in Security Lake to create a custom source in Amazon Security Lake.
    • When you configure a custom log source in Security Lake in the AWS console:
      • Enter a source name.
      • Select the OCSF event class and type for the log source.
      • Enter the account details for the AWS account that will write logs to Amazon Security Lake:
        • AWS account ID
        • External ID
    • Select Create and use a new service role for service access.
    • Take note of the name of the bucket that is created because you need it when you set up the Amazon Security Lake destination later on.
      • To find the bucket name, navigate to Custom Sources. The bucket name is in the location for your custom source. For example, if the location is s3://aws-security-data-lake-us-east-2-qjh9pr8hy/ext/op-api-activity-test, the bucket name is aws-security-data-lake-us-east-2-qjh9pr8hy.
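The bucket name is the host portion of the custom source's S3 location URI. As an illustration only (this is not part of the Worker), the extraction can be sketched with the standard library:

```python
from urllib.parse import urlparse

def bucket_from_location(location: str) -> str:
    """Extract the S3 bucket name from a custom source location URI."""
    parsed = urlparse(location)
    if parsed.scheme != "s3":
        raise ValueError(f"not an S3 URI: {location}")
    return parsed.netloc  # for s3:// URIs, the netloc is the bucket name

# The example location from above:
location = "s3://aws-security-data-lake-us-east-2-qjh9pr8hy/ext/op-api-activity-test"
print(bucket_from_location(location))  # prints "aws-security-data-lake-us-east-2-qjh9pr8hy"
```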

Setup

Set up the Amazon Security Lake destination and its environment variables when you set up a pipeline. The information below is configured in the pipelines UI.

Set up the destination

  1. Enter your S3 bucket name.
  2. Enter the AWS region.
  3. Enter the custom source name.
  4. Optionally, select an AWS authentication option.
    1. Enter the ARN of the IAM role you want to assume.
    2. Optionally, enter the assumed role session name and external ID.
  5. Optionally, toggle the switch to enable TLS. If you enable TLS, the following certificate and key files are required.
    Note: All file paths are relative to the configuration data directory, which is /var/lib/observability-pipelines-worker/config/ by default. See Advanced Configurations for more information. The files must be owned by the observability-pipelines-worker user and observability-pipelines-worker group, or at least be readable by that user or group.
    • Server Certificate Path: The path to the certificate file signed by your Certificate Authority (CA) root file, in DER or PEM (X.509) format.
    • CA Certificate Path: The path to your Certificate Authority (CA) root certificate file, in DER or PEM (X.509) format.
    • Private Key Path: The path to the .key private key file that belongs to your server certificate, in DER or PEM (PKCS#8) format.
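For the assume-role option in step 4, the IAM role you assume must trust the AWS account the Worker runs as, optionally scoped by the external ID you enter. A hedged sketch of such a trust policy follows; the account ID and external ID are placeholders, not values from this guide:

```python
import json

# Placeholder values -- substitute your own Worker account ID and external ID.
worker_account_id = "123456789012"
external_id = "my-external-id"

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{worker_account_id}:root"},
            "Action": "sts:AssumeRole",
            # Require the external ID so only callers that present it can assume the role.
            "Condition": {"StringEquals": {"sts:ExternalId": external_id}},
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```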

Notes:

  • When you add the Amazon Security Lake destination, the OCSF processor is automatically added so that you can remap your logs to OCSF before they are sent to Amazon Security Lake. See the Remap to OCSF documentation for setup instructions.
  • Only logs formatted by the OCSF processor are converted to Parquet.

Set the environment variables

There are no environment variables to configure for the Amazon Security Lake destination.

How the destination works

AWS Authentication

The Observability Pipelines Worker uses the standard AWS credential provider chain for authentication. See AWS SDKs and Tools standardized credential providers for more information.
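The chain checks credential sources in a fixed order: environment variables first, then the shared credentials file, then further providers (such as instance metadata). The following is a simplified sketch of the first two links, not the Worker's actual implementation:

```python
import configparser
import os

def resolve_credentials(credentials_path="~/.aws/credentials", profile="default"):
    """Return (access_key, secret_key), checking env vars before the shared file."""
    access = os.environ.get("AWS_ACCESS_KEY_ID")
    secret = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if access and secret:
        return access, secret  # environment variables take precedence

    # Fall back to the shared credentials file (INI format).
    config = configparser.ConfigParser()
    config.read(os.path.expanduser(credentials_path))
    if profile in config:
        section = config[profile]
        return section.get("aws_access_key_id"), section.get("aws_secret_access_key")

    return None, None  # later links in the chain (IMDS, etc.) would run here
```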

Permissions

For Observability Pipelines to send logs to Amazon Security Lake, the following policy permissions are required:

  • s3:ListBucket
  • s3:PutObject
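A minimal identity policy granting these permissions might look like the following sketch. The bucket name is the example from the prerequisites; scope the Resource ARNs to your own bucket:

```python
import json

bucket = "aws-security-data-lake-us-east-2-qjh9pr8hy"  # example bucket name from above

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{bucket}",  # ListBucket targets the bucket itself
        },
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": f"arn:aws:s3:::{bucket}/*",  # PutObject targets objects in it
        },
    ],
}

print(json.dumps(policy, indent=2))
```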

Event batching

A batch of events is flushed when any one of these parameters is met. See event batching for more information.

  • Max Events: None
  • Max Bytes: 256,000,000
  • Timeout (seconds): 300
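In other words, a batch is flushed once it accumulates 256,000,000 bytes or 300 seconds have elapsed, whichever comes first; there is no event-count limit. A simplified sketch of that flush rule (not the Worker's implementation):

```python
import time

MAX_BYTES = 256_000_000
TIMEOUT_SECONDS = 300

class Batch:
    """Accumulates encoded events and reports when the batch should be flushed."""

    def __init__(self):
        self.bytes = 0
        self.started = time.monotonic()

    def add(self, event: bytes) -> None:
        self.bytes += len(event)

    def should_flush(self) -> bool:
        # Flush on whichever limit is reached first; there is no max-events limit.
        return (
            self.bytes >= MAX_BYTES
            or time.monotonic() - self.started >= TIMEOUT_SECONDS
        )
```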