Sync resources across Datadog organizations
Overview
Use the datadog-sync-cli tool to copy your dashboards, monitors, and other configurations from your primary Datadog account to your secondary Datadog account.
You can determine the frequency and timing of syncing based on your business requirements. However, regular syncing is essential to ensure that your secondary account is up-to-date in the event of an outage.
Datadog recommends syncing your accounts on a daily basis.
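One way to automate a daily sync is a scheduled job that runs the tool's import and sync commands (described under Usage below). The schedule, working directory, and credentials file here are illustrative assumptions, not part of the tool:

```shell
# Illustrative crontab entry: run a nightly sync at 02:00.
# /opt/datadog-sync is an assumed working directory holding the
# resources/ state, and sync.env an assumed file exporting the
# DD_SOURCE_* and DD_DESTINATION_* credentials.
0 2 * * * cd /opt/datadog-sync && . ./sync.env && datadog-sync import && datadog-sync sync >> sync.log 2>&1
```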
Note: The datadog-sync-cli tool migrates resources across organizations, regardless of data center. It cannot, and is not intended to, transfer ingested data such as logs or metrics. The source organization is not modified, but the destination organization has resources created and updated by the sync command.
Setup
The datadog-sync-cli tool can be installed from source, from a prebuilt release, or run with Docker:
Installing from source
Installing from source requires Python v3.9 or later.
Clone the project repository and cd into the directory:
git clone https://github.com/DataDog/datadog-sync-cli.git
cd datadog-sync-cli
Install the datadog-sync-cli tool using pip:
pip install .
Invoke the CLI tool using
datadog-sync <command> <options>
Installing from releases
On MacOS and Linux, download the executable from the Releases page.
Make the file executable:
chmod +x datadog-sync-cli-{system-name}-{machine-type}
Move the executable to your bin directory
sudo mv datadog-sync-cli-{system-name}-{machine-type} /usr/local/bin/datadog-sync
Invoke the CLI tool using
datadog-sync <command> <options>
On Windows, download the executable with the .exe extension from the Releases page.
Add the directory containing the .exe file to your PATH.
Invoke the CLI tool in cmd or PowerShell using the file name, omitting the extension:
datadog-sync-cli-windows-amd64 <command> <options>
Installing using Docker
Clone the project repository and cd into the directory:
git clone https://github.com/DataDog/datadog-sync-cli.git
cd datadog-sync-cli
Build the Docker image from the provided Dockerfile:
docker build . -t datadog-sync
Run the Docker image using the entrypoint below:
docker run --rm -v <PATH_TO_WORKING_DIR>:/datadog-sync:rw \
-e DD_SOURCE_API_KEY=<DATADOG_API_KEY> \
-e DD_SOURCE_APP_KEY=<DATADOG_APP_KEY> \
-e DD_SOURCE_API_URL=<DATADOG_API_URL> \
-e DD_DESTINATION_API_KEY=<DATADOG_API_KEY> \
-e DD_DESTINATION_APP_KEY=<DATADOG_APP_KEY> \
-e DD_DESTINATION_API_URL=<DATADOG_API_URL> \
datadog-sync:latest <command> <options>
The docker run command mounts the specified working directory <PATH_TO_WORKING_DIR> into the container.
Usage
The following sites are available for the source and destination API URLs when syncing your organization:
| Site | Site URL | Site Parameter | Location |
|------|----------|----------------|----------|
| US1 | https://app.datadoghq.com | datadoghq.com | US |
| US3 | https://us3.datadoghq.com | us3.datadoghq.com | US |
| US5 | https://us5.datadoghq.com | us5.datadoghq.com | US |
| EU1 | https://app.datadoghq.eu | datadoghq.eu | EU (Germany) |
| US1-FED | https://app.ddog-gov.com | ddog-gov.com | US |
| AP1 | https://ap1.datadoghq.com | ap1.datadoghq.com | Japan |
For all available regions, see Getting Started with Datadog Sites.
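Note that the API URLs passed to the tool (see the examples below) generally follow the pattern https://api.<site parameter>, for example https://api.us3.datadoghq.com for US3. A small shell sketch of that assumed pattern:

```shell
# Build a Datadog API URL from a site parameter, assuming the
# standard https://api.<site parameter> pattern.
build_api_url() {
  echo "https://api.$1"
}

build_api_url datadoghq.eu        # https://api.datadoghq.eu
build_api_url us3.datadoghq.com   # https://api.us3.datadoghq.com
```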
Syncing your resources
Run the import command to read the specified resources from the source organization and store them locally as JSON files in the resources/source directory.
Run the sync command, which uses the files stored by the previous import command to create or modify the resources in the destination organization. The pushed resources are saved in the resources/destination directory. Resources whose dependencies were not imported are skipped unless the --force-missing-dependencies flag is passed.
The migrate command runs an import followed immediately by a sync.
The reset command deletes resources at the destination. By default, it backs up those resources first and fails if the backup cannot be made; you can skip the backup with the --do-not-backup flag, but this is not recommended.
Note: The tool uses the resources directory as the source of truth for determining which resources need to be created or modified, so this directory should not be removed or corrupted.
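Since this directory is the tool's state, consider backing it up before a sync or reset. A minimal sketch, assuming the commands are run from the working directory that contains resources/ (the mkdir line only makes the sketch standalone):

```shell
# Back up the local resources/ state directory with a dated archive
# before running further sync or reset commands.
mkdir -p resources/source resources/destination  # normally created by the tool
tar -czf "resources-backup-$(date +%Y-%m-%d).tar.gz" resources/
```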
Example usage:
# Import resources from parent organization and store them locally
$ datadog-sync import \
--source-api-key="..." \
--source-app-key="..." \
--source-api-url="https://api.datadoghq.com" # This is an example of a source URL; yours may differ
> 2024-03-14 14:53:54,280 - INFO - Starting import...
> ...
> 2024-03-14 15:00:46,100 - INFO - Finished import
# Check diff output to see what resources will be created/modified
$ datadog-sync diffs \
--destination-api-key="..." \
--destination-app-key="..." \
--destination-api-url="https://api.datadoghq.eu" # This is an example of a destination URL; yours may differ
> 2024-03-14 15:46:22,014 - INFO - Starting diffs...
> ...
> 2024-03-14 14:51:15,379 - INFO - Finished diffs
# Sync the resources to the child organization from locally stored files and save the output locally
$ datadog-sync sync \
--destination-api-key="..." \
--destination-app-key="..." \
--destination-api-url="https://api.datadoghq.eu"
> 2024-03-14 14:55:56,535 - INFO - Starting sync...
> ...
> 2024-03-14 14:56:00,797 - INFO - Finished sync: 1 successes, 0 errors
Further Reading
Additional helpful documentation, links, and articles: