Sync resources across Datadog organizations

Overview

Use the datadog-sync-cli tool to copy your dashboards, monitors, and other configurations from your primary Datadog account to your secondary Datadog account.

You can determine the frequency and timing of syncing based on your business requirements. However, regular syncing is essential to ensure that your secondary account is up-to-date in the event of an outage.

Datadog recommends syncing your accounts on a daily basis.
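
For example, a scheduled job can run the migrate command (described in "Syncing your resources" below) once a day. This is a minimal sketch assuming credentials and API URLs are supplied through the DD_SOURCE_* and DD_DESTINATION_* environment variables shown in the Docker setup, and that datadog-sync is on the PATH; adjust the schedule and logging for your environment.

    # Illustrative crontab entry: run a full import + sync every day at 02:00
    # (assumes the DD_SOURCE_*/DD_DESTINATION_* variables are defined in the
    # crontab or a wrapper script)
    0 2 * * * datadog-sync migrate >> /var/log/datadog-sync.log 2>&1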

Note: The datadog-sync-cli tool migrates resources across organizations, regardless of datacenter. It cannot, and is not intended to, transfer ingested data such as logs or metrics. The source organization is not modified; the sync command creates and updates resources in the destination organization.

Setup

The datadog-sync-cli tool can be installed from source, from the prebuilt releases, or using Docker:

Installing from source

Installing from source requires Python v3.9 or later.

  1. Clone the project repository and change into its directory:

    git clone https://github.com/DataDog/datadog-sync-cli.git
    cd datadog-sync-cli
    
  2. Install the datadog-sync-cli tool using pip:

    pip install .
    
  3. Invoke the CLI tool:

    datadog-sync <command> <options>
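
To confirm the installation, you can print the built-in help (assuming the conventional --help flag; the available commands may vary by version):

    datadog-sync --help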
    

Installing from releases

For Linux and macOS:

  1. Download the executable from the Releases page

  2. Make the file executable:

    chmod +x datadog-sync-cli-{system-name}-{machine-type}
    
  3. Move the executable to your bin directory:

    sudo mv datadog-sync-cli-{system-name}-{machine-type} /usr/local/bin/datadog-sync
    
  4. Invoke the CLI tool:

    datadog-sync <command> <options>
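
For example, on a Linux x86_64 host the downloaded file is typically named datadog-sync-cli-linux-amd64 (an assumed artifact name; check the Releases page for the exact file name for your platform):

    chmod +x datadog-sync-cli-linux-amd64
    sudo mv datadog-sync-cli-linux-amd64 /usr/local/bin/datadog-sync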
    
For Windows:

  1. Download the executable with the .exe extension from the Releases page

  2. Add the directory containing the .exe file to your PATH

  3. Invoke the CLI tool in cmd or PowerShell using the file name, omitting the extension:

    datadog-sync-cli-windows-amd64 <command> <options>
    

Installing using Docker

  1. Clone the project repository and change into its directory:

    git clone https://github.com/DataDog/datadog-sync-cli.git 
    cd datadog-sync-cli
    
  2. Build the Docker image from the provided Dockerfile:

    docker build . -t datadog-sync
    
  3. Run the Docker image using the entrypoint below:

     docker run --rm -v <PATH_TO_WORKING_DIR>:/datadog-sync:rw \
       -e DD_SOURCE_API_KEY=<DATADOG_API_KEY> \
       -e DD_SOURCE_APP_KEY=<DATADOG_APP_KEY> \
       -e DD_SOURCE_API_URL=<DATADOG_API_URL> \
       -e DD_DESTINATION_API_KEY=<DATADOG_API_KEY> \
       -e DD_DESTINATION_APP_KEY=<DATADOG_APP_KEY> \
       -e DD_DESTINATION_API_URL=<DATADOG_API_URL> \
       datadog-sync:latest <command> <options>
    

The docker run command mounts the specified <PATH_TO_WORKING_DIR> working directory into the container.
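
For example, to import from the source organization while keeping the resources directory in your current working directory (an illustrative invocation; substitute your own keys and, if needed, a different API URL):

    docker run --rm -v "$(pwd)":/datadog-sync:rw \
      -e DD_SOURCE_API_KEY=<DATADOG_API_KEY> \
      -e DD_SOURCE_APP_KEY=<DATADOG_APP_KEY> \
      -e DD_SOURCE_API_URL=https://api.datadoghq.com \
      datadog-sync:latest import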

Usage

These are the available sites and URLs to use for the source and destination API URL parameters when syncing your organizations:

| Site    | Site URL                  | Site Parameter    | Location     |
|---------|---------------------------|-------------------|--------------|
| US1     | https://app.datadoghq.com | datadoghq.com     | US           |
| US3     | https://us3.datadoghq.com | us3.datadoghq.com | US           |
| US5     | https://us5.datadoghq.com | us5.datadoghq.com | US           |
| EU1     | https://app.datadoghq.eu  | datadoghq.eu      | EU (Germany) |
| US1-FED | https://app.ddog-gov.com  | ddog-gov.com      | US           |
| AP1     | https://ap1.datadoghq.com | ap1.datadoghq.com | Japan        |

For all available regions, see Getting Started with Datadog Sites.
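
The API URLs can be passed as command-line options (as in the examples below) or through the DD_SOURCE_API_URL and DD_DESTINATION_API_URL environment variables shown in the Docker setup. A sketch assuming a US1 source and an EU1 destination (substitute the URLs for your own sites):

    # Illustrative only: US1 source organization, EU1 destination organization
    export DD_SOURCE_API_URL="https://api.datadoghq.com"
    export DD_DESTINATION_API_URL="https://api.datadoghq.eu"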

Syncing your resources

  1. Run the import command to read the specified resources from the source organization and store them locally into JSON files in the directory resources/source.

  2. Run the sync command, which uses the files stored by the previous import command to create or modify resources in the destination organization. The pushed resources are saved in the directory resources/destination.

    • By default, sync only pushes resources that were previously imported; if a resource's dependencies are missing, pass the --force-missing-dependencies flag to import and sync those dependencies as well.
  3. The migrate command runs an import followed immediately by a sync.

  4. The reset command deletes resources at the destination; by default it backs up those resources first and fails if the backup cannot be taken. You can skip the backup with the --do-not-backup flag, but this is not recommended.

Note: The tool uses the resources directory as the source of truth for determining what resources need to be created and modified. Hence, this directory should not be removed or corrupted.
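
To limit a run to specific resource types, the tool accepts a --resources option with a comma-separated list. The example below is a sketch; the supported resource type names are listed in the datadog-sync-cli repository.

    # Illustrative: import only monitors and dashboards from the source organization
    datadog-sync import \
        --source-api-key="..." \
        --source-app-key="..." \
        --source-api-url="https://api.datadoghq.com" \
        --resources="monitors,dashboards"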

Example usage:

# Import resources from parent organization and store them locally
$ datadog-sync import \
    --source-api-key="..." \
    --source-app-key="..." \
    --source-api-url="https://api.datadoghq.com" # example source URL; yours may differ

> 2024-03-14 14:53:54,280 - INFO - Starting import...
> ...
> 2024-03-14 15:00:46,100 - INFO - Finished import

# Check diff output to see what resources will be created/modified
$ datadog-sync diffs \
    --destination-api-key="..." \
    --destination-app-key="..." \
    --destination-api-url="https://api.datadoghq.eu" # example destination URL; yours may differ

> 2024-03-14 15:46:22,014 - INFO - Starting diffs...
> ...
> 2024-03-14 14:51:15,379 - INFO - Finished diffs

# Sync the resources to the child organization from locally stored files and save the output locally
$ datadog-sync sync \
    --destination-api-key="..." \
    --destination-app-key="..." \
    --destination-api-url="https://api.datadoghq.eu"

> 2024-03-14 14:55:56,535 - INFO - Starting sync...
> ...
> 2024-03-14 14:56:00,797 - INFO - Finished sync: 1 successes, 0 errors
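
The migrate and reset commands follow the same pattern. The invocations below are illustrative sketches: migrate combines the source and destination options used above, and reset needs only the destination options.

# Import and sync in a single step
$ datadog-sync migrate \
    --source-api-key="..." \
    --source-app-key="..." \
    --source-api-url="https://api.datadoghq.com" \
    --destination-api-key="..." \
    --destination-app-key="..." \
    --destination-api-url="https://api.datadoghq.eu"

# Delete previously synced resources from the destination (backed up first by default)
$ datadog-sync reset \
    --destination-api-key="..." \
    --destination-app-key="..." \
    --destination-api-url="https://api.datadoghq.eu"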

Further Reading
