SaaS Cost Integrations

Join the Beta!

SaaS Cost Integrations are in public beta.

Overview

SaaS Cost Integrations allow you to send cost data directly from your providers to Datadog by configuring the accounts associated with your cloud cost data.

- Databricks
- Confluent Cloud
- MongoDB
- Snowflake
- Elastic Cloud
- OpenAI
- Fastly
- Twilio

If your provider is not supported, use Custom Costs to upload any cost data source to Datadog and understand the total cost of your services.

Setup

To use SaaS Cost Integrations, you must configure Cloud Cost Management for AWS, Azure, or Google Cloud.

See the respective documentation for your cloud provider:

- AWS
- Azure
- Google Cloud

Configure your SaaS accounts

Navigate to Infrastructure > Cloud Costs > Settings > Accounts and click Configure on a provider to collect cost data.

Add your AWS, Azure, or Google Cloud accounts to collect cost data. You can also add accounts for Databricks, Confluent Cloud, MongoDB, Snowflake, Elastic Cloud, OpenAI, Fastly, and Twilio.

Databricks

Integrate with Databricks to collect cost data.
  1. Navigate to the Databricks integration tile in Datadog and click Configure.
  2. Enter the workspace name, URL, and access token corresponding to your Databricks account.
  3. Under the Select products to set up integration section, click the toggle for each account to enable Databricks Cloud Cost Management.
  4. Enter a System Tables SQL Warehouse ID corresponding to your Databricks instance’s warehouse to query for system table billing data.
  5. Click Save Databricks Workspace.

Your Databricks cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
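
To confirm that the warehouse you entered can read system billing data, you can run the same kind of query yourself. The following is a minimal sketch using the databricks-sql-connector Python package; the hostname, warehouse ID, and access token are placeholders for your own values, and the query aggregates the last seven days of usage from the system.billing.usage table:

    # Minimal sketch: verify the SQL warehouse can query system billing data.
    # Requires `pip install databricks-sql-connector`; all credentials are placeholders.
    from databricks import sql

    with sql.connect(
        server_hostname="<WORKSPACE_HOSTNAME>",          # for example, dbc-abc123.cloud.databricks.com
        http_path="/sql/1.0/warehouses/<WAREHOUSE_ID>",  # the System Tables SQL Warehouse ID
        access_token="<ACCESS_TOKEN>",
    ) as connection:
        with connection.cursor() as cursor:
            cursor.execute(
                """
                SELECT usage_date, billing_origin_product, SUM(usage_quantity) AS usage
                FROM system.billing.usage
                WHERE usage_date >= date_sub(current_date(), 7)
                GROUP BY usage_date, billing_origin_product
                ORDER BY usage_date
                """
            )
            for usage_date, product, usage in cursor.fetchall():
                print(usage_date, product, usage)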

Confluent Cloud

Integrate with Confluent Cloud to collect cost data.
  1. Create or acquire an API key with the billing admin role in Confluent Cloud.
  2. Navigate to the Confluent Cloud integration tile in Datadog and click Add Account.
  3. Enter your Confluent Cloud account name, API key, API secret, and optionally, specify tags.
  4. Under the Resources section, click the toggle for Collect cost data to view in Cloud Cost Management.
  5. Click Save.

Your Confluent Cloud cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.

To collect cluster-level tags or business metadata tags for your costs, you can also add a Schema Registry API key and secret. See Schema Management on Confluent Cloud for more information.
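
To check that an API key has billing access before adding it, you can call the Confluent Cloud Billing API directly. The sketch below is a minimal example, assuming the /billing/v1/costs endpoint with the API key and secret passed as HTTP basic auth; the key, secret, and printed fields are placeholders or illustrative:

    # Minimal sketch: confirm the API key can read Confluent Cloud cost data.
    # Requires `pip install requests`; the key and secret are placeholders.
    import datetime

    import requests

    API_KEY = "<CONFLUENT_CLOUD_API_KEY>"
    API_SECRET = "<CONFLUENT_CLOUD_API_SECRET>"

    end = datetime.date.today()
    start = end - datetime.timedelta(days=7)

    resp = requests.get(
        "https://api.confluent.cloud/billing/v1/costs",
        params={"start_date": start.isoformat(), "end_date": end.isoformat()},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()  # a 401/403 here usually means the key lacks the billing admin role
    for item in resp.json().get("data", []):
        print(item.get("product"), item.get("amount"))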

MongoDB

Integrate with MongoDB to collect cost data.
  1. Create an API token in MongoDB with Organizational Billing Viewer permissions, and add Organizational Read Only permissions for cluster resource tags.
  2. Navigate to the MongoDB Cost Management integration tile in Datadog and click Add New.
  3. Enter your MongoDB account name, public key, private key, organizational ID, and optionally, specify tags.
  4. Click Save.

Your MongoDB cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
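
Atlas programmatic API keys authenticate with HTTP digest. If you want to sanity-check the key pair and organization ID before saving, a minimal sketch like the following lists recent invoices through the v1.0 Atlas Admin API (keys and the organization ID are placeholders):

    # Minimal sketch: verify the Atlas key pair can read organization billing data.
    # Requires `pip install requests`; keys and the org ID are placeholders.
    import requests
    from requests.auth import HTTPDigestAuth

    PUBLIC_KEY = "<ATLAS_PUBLIC_KEY>"
    PRIVATE_KEY = "<ATLAS_PRIVATE_KEY>"
    ORG_ID = "<ATLAS_ORG_ID>"

    resp = requests.get(
        f"https://cloud.mongodb.com/api/atlas/v1.0/orgs/{ORG_ID}/invoices",
        auth=HTTPDigestAuth(PUBLIC_KEY, PRIVATE_KEY),
    )
    resp.raise_for_status()
    for invoice in resp.json().get("results", []):
        print(invoice.get("id"), invoice.get("statusName"))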

Snowflake

Integrate with Snowflake to collect cost data.
  1. Navigate to the Snowflake integration tile in Datadog and click Add Snowflake Account.

  2. Enter your Snowflake account URL, for example: https://xyz12345.us-east-1.snowflakecomputing.com.

  3. Under the Connect your Snowflake account section, click the toggle to enable Snowflake in Cloud Cost Management.

  4. Enter your Snowflake user name in the User Name field.

  5. Create a Datadog-specific role and user to monitor Snowflake.

    Run the following in Snowflake to create a custom role:

    -- Create a new role intended to monitor Snowflake usage.
    create role DATADOG;
    
    -- Grant privileges on the SNOWFLAKE database to the new role.
    grant imported privileges on database SNOWFLAKE to role DATADOG;
    
    -- Grant usage on your default warehouse to the role DATADOG.
    grant usage on warehouse <WAREHOUSE> to role DATADOG;
    
    -- If you have cost usage collection enabled, ensure that your credentials have permission to view the ORGANIZATION_USAGE schema.
    grant role orgadmin to role DATADOG;
    
    -- Create a user.
    create user DATADOG_USER
    LOGIN_NAME = DATADOG_USER
    password = '<PASSWORD>'
    default_warehouse = <WAREHOUSE>
    default_role = DATADOG;
    
    -- Grant the monitor role to the user.
    grant role DATADOG to user DATADOG_USER;
    
  6. Configure key pair authentication: assign a public key to the user you created in Snowflake and provide the matching private key in the integration tile.

  7. Click Save.

Your Snowflake cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
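
To confirm that the DATADOG role can actually read organization-level billing views before relying on the integration, you can connect as the new user and query ORGANIZATION_USAGE directly. A minimal sketch with the snowflake-connector-python package (credentials are placeholders):

    # Minimal sketch: confirm the DATADOG role can read ORGANIZATION_USAGE views.
    # Requires `pip install snowflake-connector-python`; credentials are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="DATADOG_USER",
        password="<PASSWORD>",
        account="<ACCOUNT_LOCATOR>",  # for example, xyz12345.us-east-1
        warehouse="<WAREHOUSE>",
        role="DATADOG",
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "select usage_date, account_name, usage_in_currency "
            "from snowflake.organization_usage.usage_in_currency_daily "
            "where usage_date >= dateadd(day, -7, current_date())"
        )
        for usage_date, account_name, usage in cur.fetchall():
            print(usage_date, account_name, usage)
    finally:
        conn.close()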

Elastic Cloud

Integrate with Elastic Cloud to collect cost data.
  1. Go to the API Key section in your Elastic Cloud organization’s settings.
  2. Click Create New Key.
  3. Choose a Name and Expiration Date for your API key.
  4. Select the Billing Admin role.
  5. Click Create Key to generate the key.
  6. Go to the Elastic Cloud integration tile in Datadog.
  7. Click Add Account.
  8. Enter your Elastic Cloud Organization ID and Billing API Key in the account table.

Your Elastic Cloud cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
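
If you want to check the organization ID and key before saving, the Elastic Cloud billing API can be called directly. This is a minimal sketch assuming the v1 billing costs endpoint; Elastic Cloud API keys use the ApiKey authorization scheme, and the ID and key below are placeholders:

    # Minimal sketch: verify the Billing API key can read organization costs.
    # Requires `pip install requests`; the endpoint path reflects Elastic Cloud's
    # v1 billing API, and the ID and key are placeholders.
    import requests

    ORG_ID = "<ORGANIZATION_ID>"
    BILLING_API_KEY = "<BILLING_API_KEY>"

    resp = requests.get(
        f"https://api.elastic-cloud.com/api/v1/billing/costs/{ORG_ID}",
        headers={"Authorization": f"ApiKey {BILLING_API_KEY}"},
    )
    resp.raise_for_status()
    print(resp.json())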

OpenAI

Integrate with OpenAI to collect cost data.
  1. Create an API key in your account settings in OpenAI.
  2. Navigate to the OpenAI integration tile in Datadog and click Add Account.
  3. Enter your OpenAI account name, input your API key, and optionally, specify tags.
  4. Under the Resources section, click the toggle for each account to enable OpenAI Billing Usage Data Collection.
  5. Click Save.

Your OpenAI cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
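
A quick way to confirm that the API key authenticates before saving it in the tile is to call a lightweight endpoint such as GET /v1/models. This only validates authentication; access to billing usage data additionally depends on the key's permissions in your OpenAI organization. A minimal sketch (the key is a placeholder):

    # Minimal sketch: confirm the OpenAI API key authenticates successfully.
    # Requires `pip install requests`; the key is a placeholder.
    import requests

    OPENAI_API_KEY = "<OPENAI_API_KEY>"

    resp = requests.get(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {OPENAI_API_KEY}"},
    )
    resp.raise_for_status()
    print(f"Key is valid; {len(resp.json()['data'])} models visible.")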

Fastly

Integrate with Fastly to collect cost data.
  1. Create an API token with at least the "global:read" scope and "Billing" role on the Personal API tokens page in Fastly.
  2. Navigate to the Fastly cost management integration tile in Datadog and click Add New.
  3. Enter your Fastly account name and API token.
  4. Click Save.

Your Fastly cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
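
You can confirm a token's scope before saving it by asking Fastly to describe the token that authenticated the request. A minimal sketch using the GET /tokens/self endpoint (the token is a placeholder):

    # Minimal sketch: inspect the Fastly token to confirm it carries global:read.
    # Requires `pip install requests`; the token is a placeholder.
    import requests

    FASTLY_API_TOKEN = "<FASTLY_API_TOKEN>"

    resp = requests.get(
        "https://api.fastly.com/tokens/self",
        headers={"Fastly-Key": FASTLY_API_TOKEN},
    )
    resp.raise_for_status()
    token = resp.json()
    print(token.get("name"), token.get("scope"))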

Twilio

Integrate with Twilio to collect cost data.
  1. Navigate to the Twilio integration tile in Datadog and click Add Account.
  2. Under the Resources section, click the toggle for each account to enable Twilio in Cloud Cost Management.
  3. Enter an Account SID for your Twilio account.
  4. Click Save.

Your Twilio cost data for the past 15 months can be accessed in Cloud Cost Management after 24 hours. To access the available data collected by each SaaS Cost Integration, see the Data Collected section.
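
For reference, the Account SID identifies the account whose usage is collected, and Twilio's Usage Records API exposes the same kind of per-category usage data. A minimal sketch using HTTP basic auth; the auth token is not part of the Datadog setup above and, like the SID, is a placeholder here:

    # Minimal sketch: list recent Twilio usage records per category.
    # Requires `pip install requests`; the SID and auth token are placeholders.
    import requests

    ACCOUNT_SID = "<ACCOUNT_SID>"
    AUTH_TOKEN = "<AUTH_TOKEN>"

    resp = requests.get(
        f"https://api.twilio.com/2010-04-01/Accounts/{ACCOUNT_SID}/Usage/Records.json",
        params={"PageSize": 5},
        auth=(ACCOUNT_SID, AUTH_TOKEN),
    )
    resp.raise_for_status()
    for record in resp.json().get("usage_records", []):
        print(record["category"], record["usage"], record["usage_unit"])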


Data Collected

You can view cost data on the Cloud Costs Analytics page, the Cloud Costs Tag Explorer, and in dashboards, notebooks, or monitors. You can also combine these cost metrics with other cloud cost metrics or observability metrics.

The following tables contain a non-exhaustive list of the out-of-the-box tags associated with each SaaS Cost integration.

Databricks

| Tag Name | Tag Description |
| --- | --- |
| record_id | Unique ID for this record. |
| account_id | ID of the account this report was generated for. |
| workspace_id | ID of the Workspace this usage was associated with. |
| cloud | Cloud this usage is relevant for. Possible values are AWS, AZURE, and GCP. |
| billing_origin_product | Product or feature originating the billing event (for example, JOBS, CLUSTERS). |
| usage_type | Type of usage being billed (for example, COMPUTE_TIME). |
| job_run_id | Identifier for the specific job run (if applicable). |
| node_type | Type of node used in this billing record (for example, m5d.large). |
| destination_region | Region where the workload is directed (if applicable). |
| central_clean_room_id | ID of the central clean room associated with the workload (if applicable). |
| notebook_path | Path to the notebook in Databricks (if applicable). |
| job_name | Name of the job in Databricks (if applicable). |
| notebook_id | ID of the notebook used in this billing record (if applicable). |
| dlt_update_id | Delta Live Table update ID associated with this usage (if applicable). |
| job_id | Unique identifier for the job in Databricks. |
| dlt_maintenance_id | Maintenance ID for Delta Live Tables (if applicable). |
| run_name | Name of the current job or workflow run (if applicable). |
| instance_pool_id | ID of the instance pool used (if applicable). |
| cluster_id | ID of the cluster associated with this usage. |
| endpoint_id | ID of the endpoint for SQL-based or serving-related usage (if applicable). |
| warehouse_id | ID of the SQL warehouse (if applicable). |
| source_region | Originating region for this billing record (if applicable). |
| dlt_pipeline_id | ID of the Delta Live Tables pipeline (if applicable). |
| endpoint_name | Name of the SQL or serving endpoint (if applicable). |
| is_photon | Indicates whether Photon processing was used (true or false). |
| dlt_tier | Tier of Delta Live Tables service (if applicable). |
| jobs_tier | Tier of the job, such as CLASSIC or PREMIUM. |
| networking | Type of networking used for this job, if specified. |
| serving_type | Type of serving model used, if applicable (for example, Model Serving). |
| sql_tier | SQL tier associated with the usage (if applicable). |
| is_serverless | Indicates if the usage pertains to a serverless compute resource (true or false). |
| custom_tags | Custom tags applied to the usage, usually as key-value pairs for additional metadata or categorization. |
| usage_metadata | Metadata related to the usage, which might include details like usage type, service category, or other relevant information. |

Confluent Cloud

| Tag Name | Tag Description |
| --- | --- |
| resource_id | The unique identifier of the Confluent resource. |
| resource_name | The name of the Confluent resource. |
| environment_id | The unique identifier for the environment. |
| network_access_type | Network access type for the cluster. Possible values are INTERNET, TRANSIT_GATEWAY, PRIVATE_LINK, and PEERED_VPC. |
| product | Product name. Possible values include KAFKA, CONNECT, KSQL, AUDIT_LOG, STREAM_GOVERNANCE, CLUSTER_LINK, CUSTOM_CONNECT, FLINK, SUPPORT_CLOUD_BASIC, SUPPORT_CLOUD_DEVELOPER, SUPPORT_CLOUD_BUSINESS, and SUPPORT_CLOUD_PREMIER. |

Snowflake

| Tag Name | Tag Description |
| --- | --- |
| organization_name | Name of the organization. |
| contract_number | Snowflake contract number for the organization. |
| account_name | Name of the account where the usage was consumed. |
| account_locator | Locator for the account where the usage was consumed. |
| region | Name of the region where the account is located. |
| service_level | Service level (edition) of the Snowflake account (Standard, Enterprise, or Business Critical). |
| user_name | Name of the user or service account associated with the query. |
| warehouse_id | Identifier for the warehouse generating the cost. |
| warehouse_name | Name of the warehouse associated with this usage. |
| warehouse_size | Size of the warehouse (for example, Large, Medium). |
| cost_type | Type of cost associated with the usage. Possible values include:<br>- CLOUD_SERVICES: General costs related to Snowflake’s underlying cloud services, excluding warehouse usage.<br>- IDLE_OR_LESS_100MS: Costs from warehouse idle time or queries that completed in under 100 milliseconds. Unattributed to specific queries. Falls under the warehouse_metering service type.<br>- QUERY_ATTRIBUTION: Costs attributed to specific queries, grouped by the parameterized query hash. For these costs, the parameterized query associated with this cost can be found under the charge description. Falls under the warehouse_metering service type. |
| query_hash | Unique hash representing a parameterized version of the query for attribution purposes. Only found for query attribution costs. |
| query_hash_version | Version of the Snowflake query hash algorithm used to generate query_hash. Only found for query attribution costs. |
| database_name | Name of the database in which the query was executed (if applicable). Only found for query attribution costs. |
| balance_source | Source of the funds used to pay for the daily usage. The source can be one of the following:<br>- capacity: Usage paid with credits remaining on an organization’s capacity commitment.<br>- rollover: Usage paid with rollover credits. When an organization renews a capacity commitment, unused credits are added to the balance of the new contract as rollover credits.<br>- free usage: Usage covered by the free credits provided to the organization.<br>- overage: Usage that was paid at on-demand pricing, which occurs when an organization has exhausted its capacity, rollover, and free credits.<br>- rebate: Usage covered by the credits awarded to the organization when it shared data with another organization. |
| service_type | Type of usage. Possible service types include:<br>- automatic_clustering: Refer to Automatic Clustering.<br>- data_transfer: Refer to Understanding data transfer cost.<br>- logging: Refer to Logging and Tracing Overview.<br>- materialized_view: Refer to Working with Materialized Views.<br>- replication: Refer to Introduction to replication and failover across multiple accounts.<br>- query_acceleration: Refer to Using the Query Acceleration Service.<br>- search_optimization: Refer to Search Optimization Service.<br>- serverless_task: Refer to Introduction to tasks.<br>- snowpipe: Refer to Snowpipe.<br>- snowpipe_streaming: Refer to Snowpipe Streaming.<br>- storage: Refer to Understanding storage cost.<br>- warehouse_metering: Refer to Virtual warehouse credit usage. Does not indicate usage of serverless or cloud services compute. |
| rating_type | Indicates how the usage in the record is rated, or priced. Possible values include:<br>- compute<br>- data_transfer<br>- storage<br>- Other |
| billing_type | Indicates what is being charged or credited. Possible billing types include:<br>- consumption: Usage associated with compute credits, storage costs, and data transfer costs.<br>- rebate: Usage covered by the credits awarded to the organization when it shared data with another organization.<br>- priority support: Charges for priority support services. This charge is associated with a stipulation in a contract, not with an account.<br>- vps_deployment_fee: Charges for a Virtual Private Snowflake deployment.<br>- support_credit: Snowflake Support credited the account to reverse charges attributed to an issue in Snowflake. |

Elastic Cloud

| Tag Name | Tag Description |
| --- | --- |
| name | The unique identifier of the Elastic Cloud resource. |
| price_per_hour | The cost of the Elastic Cloud resource per hour. |
| kind | The type of resource. |

MongoDB

| Tag Name | Tag Description |
| --- | --- |
| invoice_id | The unique identifier of the invoice. |
| status | State of the payment. |
| mongo_org_id | MongoDB organization ID. |
| cluster_name | The name of the cluster that incurred the charge. |
| group_id | ID of the project with which the line item is associated. |
| replica_set_name | Name of the replica set with which the line item is associated. |
| resource_tags | Arbitrary tags on clusters set by users, usually as key-value pairs. |

OpenAI

| Tag Name | Tag Description |
| --- | --- |
| organization_id | The unique identifier of the organization. |
| project_id | The unique identifier of the project. |
| project_name | The name of the project. |
| organization_name | The name of the organization. |

Fastly

| Tag Name | Tag Description |
| --- | --- |
| credit_coupon_code | Code of any coupon or credit applied to this cost entry (if available). |
| product_name | Name of the specific product being billed (for example, “North America Bandwidth”). |
| product_group | Group or category of the product, such as “Full Site Delivery”. |
| product_line | Line of products to which this item belongs, for example, “Network Services”. |
| usage_type | Type of usage being billed (for example, “Bandwidth”). |
| region | Region where the service usage occurred (for example, “North America”). |
| service_name | Name of the service associated with this cost entry, often matching the product_name. |
| usage_type_cd | Code or label representing the type of usage, such as “North America Bandwidth”. |
| plan_name | Name of the plan under which this service falls, often matching “product_line”. |

Twilio

| Tag Name | Tag Description |
| --- | --- |
| account_sid | Alphanumeric string identifying the Twilio account. |
| category | The category of usage. For more information, see Usage Categories. |
| count_unit | The units in which count is measured, such as calls for calls or messages for SMS. |
| usage_unit | The units in which usage is measured, such as minutes for calls or messages for SMS. |
