Storage Monitoring for Amazon S3, Google Cloud Storage, and Azure Blob Storage provides deep, prefix-level analytics to help you understand exactly how your storage is being used, detect potential issues before they impact operations, and make data-driven decisions about storage optimization. Use these insights to help you track storage growth, investigate access patterns, and optimize costs.
This guide explains how to configure Storage Monitoring in Datadog for your S3 buckets, GCS buckets, and Azure Storage Accounts.
Access your Storage Monitoring data by navigating to Infrastructure -> Storage Monitoring.
Setup for Amazon S3
Installation
The fastest way to set up Storage Monitoring is using the provided CloudFormation templates. This process involves two steps:
Step 1: Configure inventory generation
This template configures your existing S3 bucket to generate inventory reports, which Datadog uses to generate detailed metrics about your bucket prefixes.
In AWS CloudFormation, click Create stack in the top right corner and select With existing resources (import resources).
In the Specify template step, select Upload a template file.
Click Choose file and select the source-bucket-inventory-cfn.yaml file, then click Next.
Enter the bucket name you want AWS to start generating inventories for, and click Next.
Fill in the required parameters:
DestinationBucketName: The bucket for storing inventory files. Note: Use only one destination bucket for all inventory files generated in an AWS account.
SourceBucketName: The bucket you want to monitor and start generating inventory files for
Optional parameters:
SourceBucketPrefix: Limit monitoring to a specific path in the source bucket
DestinationBucketPrefix: Specific path within the destination bucket. Ensure this path doesn’t include trailing slashes (/)
Click Next.
Wait for AWS to locate your source bucket, and click Import resources in the bottom right corner.
Note: This CloudFormation template can be rolled back, but rolling back doesn't delete the created resources. This ensures the existing bucket is not deleted. You can manually delete the inventory configurations by going to the Management tab in the bucket view.
Note: Review Amazon S3 pricing for costs related to inventory generation.
Step 2: Configure required permissions
This template creates two IAM policies:
A policy to allow Datadog to read inventory files from the destination bucket
A policy to allow your source bucket to write inventory files to the destination bucket
In AWS CloudFormation, click Create stack in the top right corner and select With new resources (standard).
In the Specify template step, select Upload a template file.
Click Choose file and select the cloud-inventory-policies-cfn.yaml file, then click Next.
Fill in the required parameters:
DatadogIntegrationRole: Your Datadog AWS integration role name
DestinationBucketName: The name of the bucket that receives your inventory files. Note: Use only one destination bucket for all inventory files generated in an AWS account.
SourceBucketName: The name of the bucket you want to start generating inventory files for
Optional parameters:
SourceBucketPrefix: This parameter limits the inventory generation to a specific prefix in the source bucket
DestinationBucketPrefix: If you want to reuse an existing bucket as the destination, this parameter allows the inventory files to be shipped to a specific prefix in that bucket. Ensure that any prefixes do not include trailing slashes (/)
On the Review and create step, verify the parameters have been entered correctly, and click Submit.
Post-setup steps
After completing the CloudFormation setup, fill out the post-setup form with the following required information:
Name of the destination bucket holding the inventory files.
Prefix where the files are stored in the destination bucket (if any).
Name of the source bucket you want to monitor (the bucket producing inventory files).
AWS region of the destination bucket holding the inventory files.
AWS account ID containing the buckets.
Datadog org ID.
To manually set up the required Amazon S3 Inventory and related configuration, follow these steps:
Step 1: Create a destination bucket
Create an S3 bucket to store your inventory files. This bucket acts as the central location for inventory reports. Note: Use only one destination bucket for all inventory files generated in an AWS account.
Create a prefix within the destination bucket (optional).
Step 2: Configure the bucket and integration role policies
Follow the steps in the Amazon S3 user guide to add a bucket policy to your destination bucket allowing write access (s3:PutObject) from your source buckets.
Ensure the Datadog AWS integration role has s3:GetObject and s3:ListBucket permissions on the destination bucket. These permissions allow Datadog to read the generated inventory files.
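For reference, a minimal Terraform sketch of these two policies, using placeholder bucket names and a placeholder Datadog integration role name (this is a sketch, not the Datadog-provided CloudFormation template):

data "aws_caller_identity" "current" {}

# Allow Amazon S3 Inventory to deliver reports from the source bucket
# into the destination bucket.
resource "aws_s3_bucket_policy" "inventory_destination" {
  bucket = "your-destination-bucket" # placeholder

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Sid       = "AllowS3InventoryDelivery"
      Effect    = "Allow"
      Principal = { Service = "s3.amazonaws.com" }
      Action    = "s3:PutObject"
      Resource  = "arn:aws:s3:::your-destination-bucket/*"
      Condition = {
        StringEquals = {
          "aws:SourceAccount" = data.aws_caller_identity.current.account_id
          "s3:x-amz-acl"      = "bucket-owner-full-control"
        }
        ArnLike = {
          "aws:SourceArn" = "arn:aws:s3:::your-source-bucket"
        }
      }
    }]
  })
}

# Let the Datadog AWS integration role read the generated inventory files.
resource "aws_iam_role_policy" "datadog_inventory_read" {
  name = "datadog-storage-monitoring-inventory-read"
  role = "YourDatadogIntegrationRole" # placeholder: existing Datadog integration role name

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = ["s3:GetObject"]
        Resource = "arn:aws:s3:::your-destination-bucket/*"
      },
      {
        Effect   = "Allow"
        Action   = ["s3:ListBucket"]
        Resource = "arn:aws:s3:::your-destination-bucket"
      }
    ]
  })
}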
Step 3: Configure inventory generation on the source bucket
In the source bucket's Management tab, create an S3 Inventory configuration with the following settings (a Terraform sketch of an equivalent configuration follows this list):
Object versions: Datadog recommends selecting Current Versions Only
Destination: Select the common destination bucket for inventory files in your AWS account. For example, if the bucket is named your-destination-bucket, enter s3://your-destination-bucket. Note: If you want to use a prefix on the destination bucket, append it to this path as well
Frequency: Datadog recommends choosing Daily. This setting determines how often your prefix-level metrics are updated in Datadog
Output format: CSV
Status: Enabled
Server-side encryption: Don’t specify an encryption key
Select the following Additional metadata fields:
Size
Last Modified
Storage Class
Note: Review Amazon S3 pricing for costs related to inventory generation.
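For reference, a minimal Terraform sketch of an equivalent inventory configuration, using placeholder bucket and configuration names:

resource "aws_s3_bucket_inventory" "datadog_storage_monitoring" {
  bucket = "your-source-bucket" # placeholder: the bucket to monitor
  name   = "datadog-storage-monitoring"

  # Current versions only, generated daily, in CSV format.
  included_object_versions = "Current"

  schedule {
    frequency = "Daily"
  }

  destination {
    bucket {
      format     = "CSV"
      bucket_arn = "arn:aws:s3:::your-destination-bucket" # placeholder
      # prefix   = "your-destination-prefix"              # optional, no trailing slash
    }
  }

  # Additional metadata fields included in each report.
  optional_fields = ["Size", "LastModifiedDate", "StorageClass"]
}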
Post-setup steps
After completing the above steps, fill out the post-setup form with the following required information:
Name of the destination bucket where inventories are stored.
Prefix where the files are stored in the destination bucket (optional).
Region of the destination bucket.
AWS account ID containing the bucket.
Name of the Datadog AWS integration role that has permission to read objects in the destination bucket.
Datadog org ID.
Validation
To verify your setup:
Wait for the first inventory report to generate (up to 24 hours for daily inventories)
Check the destination bucket for inventory files
Confirm the Datadog integration can access the files:
Navigate to Infrastructure -> Storage Monitoring -> Installation Recommendations to see if the bucket you configured is showing in the list
Troubleshooting
If you encounter any issues or need assistance:
Make sure to use only one destination bucket for all inventory files per AWS account
Verify all permissions are correctly configured
If you're still encountering issues, reach out to Datadog support with your bucket details, AWS account ID, and Datadog org ID
Setup for Google Cloud Storage
Installation
The process involves the following steps:
Step 1: Install the GCP integration and enable resource collection
To collect GCP Storage metrics from your GCP project, install the GCP integration in Datadog. Enable Resource Collection for the project containing the buckets you want to monitor. Resource Collection allows Datadog to associate your buckets’ labels with the metrics collected through storage monitoring.
Note: While you can disable specific metric namespaces, keep the Cloud Storage namespace (gcp.storage) enabled.
Step 2: Enable the Storage Insights API
Enable the Storage Insights API for the project containing the buckets you want to monitor. After enabling the Storage Insights API, a project-level service agent is created automatically with the following format: service-PROJECT_NUMBER@gcp-sa-storageinsights.iam.gserviceaccount.com
Step 3: Grant the required IAM roles to the service agent
The service agent requires these IAM roles (a Terraform sketch of the grants follows this list):
roles/storage.insightsCollectorService on the source bucket (includes the storage.buckets.getObjectInsights and storage.buckets.get permissions)
roles/storage.objectCreator on the destination bucket (includes the storage.objects.create permission)
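The following is a minimal Terraform sketch of these service agent grants, assuming the source and destination buckets already exist; the bucket names are placeholders:

data "google_project" "project" {}

locals {
  insights_service_agent = "serviceAccount:service-${data.google_project.project.number}@gcp-sa-storageinsights.iam.gserviceaccount.com"
}

# Allow the service agent to read object insights from the source bucket.
resource "google_storage_bucket_iam_member" "source_insights_collector" {
  bucket = "your-source-bucket" # placeholder
  role   = "roles/storage.insightsCollectorService"
  member = local.insights_service_agent
}

# Allow the service agent to write inventory reports to the destination bucket.
resource "google_storage_bucket_iam_member" "destination_object_creator" {
  bucket = "your-destination-bucket" # placeholder
  role   = "roles/storage.objectCreator"
  member = local.insights_service_agent
}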
Step 4: Create an inventory report configuration
You can create an inventory report configuration in multiple ways. The quickest methods use the Google Cloud CLI or Terraform templates. Regardless of the method, ensure the configuration:
Includes these metadata fields: "bucket", "name", "project", "size", "updated", "storageClass"
Generates CSV reports with '\n' as the record separator and ',' as the delimiter
Uses this destination path format: <Bucket>/{{date}}, where <Bucket> is the name of the monitored bucket
Copy the following Terraform template, substitute the necessary arguments, and apply it in the GCP project that contains your bucket.
Terraform configuration for inventory reports
locals {
  source_bucket      = "" # The name of the bucket you want to monitor
  destination_bucket = "" # The bucket where inventory reports are written
  frequency          = "" # Possible values: Daily, Weekly (report generation frequency)
  location           = "" # The location of your source and destination buckets
}

data "google_project" "project" {
}

resource "google_storage_insights_report_config" "config" {
  display_name = "datadog-storage-monitoring"
  location     = local.location

  frequency_options {
    frequency = local.frequency
    start_date {
      day   = "" # Fill in the day
      month = "" # Fill in the month
      year  = "" # Fill in the year
    }
    end_date {
      day   = "" # Fill in the day
      month = "" # Fill in the month
      year  = "" # Fill in the year
    }
  }

  csv_options {
    record_separator = "\n"
    delimiter        = ","
    header_required  = false
  }

  object_metadata_report_options {
    metadata_fields = ["bucket", "name", "project", "size", "updated", "storageClass"]
    storage_filters {
      bucket = local.source_bucket
    }
    storage_destination_options {
      bucket           = google_storage_bucket.report_bucket.name
      destination_path = "${local.source_bucket}/{{date}}"
    }
  }

  depends_on = [google_storage_bucket_iam_member.admin]
}

resource "google_storage_bucket" "report_bucket" {
  name                        = local.destination_bucket
  location                    = local.location
  force_destroy               = true
  uniform_bucket_level_access = true
}

resource "google_storage_bucket_iam_member" "admin" {
  bucket = google_storage_bucket.report_bucket.name
  role   = "roles/storage.admin"
  member = "serviceAccount:service-${data.google_project.project.number}@gcp-sa-storageinsights.iam.gserviceaccount.com"
}
Alternatively, you can allow Datadog to manage the inventory report configuration by granting the proper permissions to your Datadog service account (a Terraform sketch of these grants follows the steps below):
Navigate to IAM & Admin -> Service accounts
Find your Datadog service account and add the roles/storageinsights.admin role
Navigate to the source bucket you want to monitor and grant these permissions:
roles/storage.insightsCollectorService
roles/storage.objectViewer
Navigate to the destination bucket and grant these permissions:
roles/storage.objectCreator
roles/storage.insightsCollectorService
Alternatively, you can create a custom role specifically for Datadog that includes only the permissions contained in the roles above.
After granting the necessary permissions, Datadog can create the inventory report configuration with your setup details.
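For reference, a minimal Terraform sketch of these grants, assuming your Datadog service account email is supplied through a (hypothetical) datadog_service_account variable and that the bucket names are placeholders:

variable "datadog_service_account" {
  type        = string
  description = "Email of the Datadog service account used by the Google Cloud integration"
}

data "google_project" "project" {}

# Allow Datadog to manage inventory report configurations in the project.
resource "google_project_iam_member" "datadog_storageinsights_admin" {
  project = data.google_project.project.project_id
  role    = "roles/storageinsights.admin"
  member  = "serviceAccount:${var.datadog_service_account}"
}

# Source bucket grants
resource "google_storage_bucket_iam_member" "datadog_source_insights" {
  bucket = "your-source-bucket" # placeholder
  role   = "roles/storage.insightsCollectorService"
  member = "serviceAccount:${var.datadog_service_account}"
}

resource "google_storage_bucket_iam_member" "datadog_source_object_viewer" {
  bucket = "your-source-bucket" # placeholder
  role   = "roles/storage.objectViewer"
  member = "serviceAccount:${var.datadog_service_account}"
}

# Destination bucket grants
resource "google_storage_bucket_iam_member" "datadog_destination_object_creator" {
  bucket = "your-destination-bucket" # placeholder
  role   = "roles/storage.objectCreator"
  member = "serviceAccount:${var.datadog_service_account}"
}

resource "google_storage_bucket_iam_member" "datadog_destination_insights" {
  bucket = "your-destination-bucket" # placeholder
  role   = "roles/storage.insightsCollectorService"
  member = "serviceAccount:${var.datadog_service_account}"
}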
Step 5: Add the Storage Object Viewer role to your Datadog service account
Grant Datadog permission to access and extract the generated inventory reports from Google. Grant this role on the destination bucket where the inventory reports are stored (a Terraform sketch of the grant follows the steps below).
Select the destination bucket for your inventory reports
In the bucket details page, click the Permissions tab
Under Permissions, click Grant Access to add a new principal
Principal: Enter the Datadog Service Account email
Role: Select Storage Object Viewer (roles/storage.objectViewer)
Click Save
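A minimal Terraform sketch of this grant, using placeholder values for the destination bucket name and the Datadog service account email:

resource "google_storage_bucket_iam_member" "datadog_report_reader" {
  bucket = "your-destination-bucket" # placeholder: the inventory report bucket
  role   = "roles/storage.objectViewer"
  member = "serviceAccount:datadog-integration@your-project.iam.gserviceaccount.com" # placeholder
}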
Setup for Azure Blob Storage
Installation
Ensure your shell environment is set to Bash before running the script. Ensure that you replace the placeholder inputs with the correct values:
<client_id>: The client ID of an App Registration already set up using the Datadog Azure integration
<subscription_id>: The subscription ID of the Azure subscription containing the storage accounts
<comma_separated_storage_account_names>: A comma-separated list of the storage accounts you want to monitor. For example, storageaccount1,storageaccount2
For each storage account you want to monitor:
Create a blob inventory policy
In the Azure portal, navigate to your Storage Account.
Go to Data management -> Blob inventory.
Click Add.
Configure the policy (a Terraform sketch of an equivalent policy follows these steps):
Name: datadog-storage-monitoring
Destination container:
Click Create new, and enter the name datadog-storage-monitoring.
Object type to inventory: Blob
Schedule: Daily
Blob types: Select Block blobs, Append blobs, and Page blobs.
Subtypes: Select Include blob versions
Schema fields: Select All, or ensure that at least the following are selected:
Name
Access tier
Last modified
Content length
Server encrypted
Current version status
Version ID
Exclude prefix: datadog-storage-monitoring
Click Add.
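For reference, a minimal Terraform sketch of an equivalent blob inventory policy, assuming the storage account and the datadog-storage-monitoring container are managed elsewhere in your configuration; the schema field identifiers follow the Azure Blob Inventory API names and should be checked against the azurerm provider documentation:

resource "azurerm_storage_blob_inventory_policy" "datadog" {
  storage_account_id = azurerm_storage_account.example.id # your storage account

  rules {
    name                   = "datadog-storage-monitoring"
    storage_container_name = "datadog-storage-monitoring"
    format                 = "Csv"
    schedule               = "Daily"
    scope                  = "Blob"

    schema_fields = [
      "Name",
      "AccessTier",
      "Last-Modified",
      "Content-Length",
      "ServerEncrypted",
      "IsCurrentVersion",
      "VersionId",
    ]

    filter {
      blob_types            = ["blockBlob", "appendBlob", "pageBlob"]
      include_blob_versions = true
      # The portal's "Exclude prefix: datadog-storage-monitoring" setting can also be
      # expressed here on provider versions that support prefix exclusions.
    }
  }
}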
Add the role assignment
In the Azure portal, navigate to your Storage Account.
Go to Data storage -> Containers.
Click on the datadog-storage-monitoring container.
Click on Access control (IAM) in the left-hand menu.
Click Add -> Add role assignment.
Fill out the role assignment (a Terraform sketch of an equivalent assignment follows these steps):
Role: Select Storage Blob Data Reader. Click Next.
Assign access to: User, group, or service principal.
Members: Click + Select members, then search for your App Registration by name and select it.
Note: This should be an App Registration set up in the Datadog Azure integration. Keep its Client ID for later.
Click Review + assign.
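A minimal Terraform sketch of an equivalent role assignment, using placeholder values for the storage account name and the object ID of the service principal behind your Datadog App Registration:

data "azurerm_storage_container" "inventory" {
  name                 = "datadog-storage-monitoring"
  storage_account_name = "yourstorageaccount" # placeholder
}

resource "azurerm_role_assignment" "datadog_blob_reader" {
  scope                = data.azurerm_storage_container.inventory.resource_manager_id
  role_definition_name = "Storage Blob Data Reader"
  principal_id         = "00000000-0000-0000-0000-000000000000" # placeholder: service principal object ID
}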
Post-Installation
After completing the above steps, fill out the post-setup form.