Overview
The pipelines and processors outlined in this documentation are specific to on-premises logging environments. To aggregate, process, and route cloud-based logs, see Log Management Pipelines.
In Observability Pipelines, a pipeline is a sequential path with three types of components: a source, processors, and destinations. The Observability Pipelines source receives logs from your log source (for example, the Datadog Agent). The processors enrich and transform your data, and the destination is where your processed logs are sent. Some templates send your logs to more than one destination. For example, if you use the Archive Logs template, your logs are sent to a cloud storage provider in addition to another specified destination.
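To make that structure concrete, the sketch below shows the general shape of a pipeline configuration as a Python dictionary: one source, an ordered chain of processors, and one or more destinations. The component types and IDs here (`datadog_agent`, `filter`, `amazon_s3`, `datadog_logs`) are illustrative assumptions, not a complete schema; see the Observability Pipelines reference for the exact fields.

```python
# Illustrative sketch of a pipeline's shape: logs flow from one source,
# through an ordered list of processors, to one or more destinations.
# Component types and field names are assumptions for illustration only.
pipeline_config = {
    "sources": [
        {"id": "datadog-agent-source", "type": "datadog_agent"}
    ],
    "processors": [
        {
            "id": "filter-processor",
            "type": "filter",
            "include": "service:my-service",    # keep only matching logs
            "inputs": ["datadog-agent-source"], # upstream component
        }
    ],
    "destinations": [
        # With a template like Archive Logs, logs go to a cloud storage
        # provider *and* another specified destination.
        {"id": "archive-destination", "type": "amazon_s3",
         "inputs": ["filter-processor"]},
        {"id": "datadog-logs-destination", "type": "datadog_logs",
         "inputs": ["filter-processor"]},
    ],
}
```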
Set up a pipeline
You can use the Datadog API to create a pipeline. After the pipeline has been created, install the Worker to start sending logs through it.
Pipelines created using the API are read-only in the UI. Use the update a pipeline endpoint to make any changes to an existing pipeline.
See Advanced Configurations for bootstrapping options.
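As a minimal sketch of creating and then updating a pipeline programmatically, assuming the v2 endpoint path shown below (verify the exact path, request body, and response shape against the API reference) and using the `requests` library:

```python
import os
import requests

API_ROOT = "https://api.datadoghq.com"
# Assumed endpoint path for illustration; verify against the API reference.
PIPELINES_PATH = "/api/v2/remote_config/products/obs_pipelines/pipelines"

headers = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    "Content-Type": "application/json",
}

pipeline_config = {  # minimal illustrative config (see the Overview sketch)
    "sources": [{"id": "src", "type": "datadog_agent"}],
    "processors": [],
    "destinations": [{"id": "dst", "type": "datadog_logs", "inputs": ["src"]}],
}

body = {
    "data": {
        "type": "pipelines",
        "attributes": {"name": "my-pipeline", "config": pipeline_config},
    }
}

# Create the pipeline.
resp = requests.post(API_ROOT + PIPELINES_PATH, headers=headers, json=body)
resp.raise_for_status()
pipeline_id = resp.json()["data"]["id"]

# Pipelines created through the API are read-only in the UI, so later
# changes go through the update endpoint (assumed to be a PUT here).
requests.put(
    f"{API_ROOT}{PIPELINES_PATH}/{pipeline_id}", headers=headers, json=body
).raise_for_status()
```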
Index your Worker logs
Make sure your Worker logs are indexed in Log Management for optimal functionality. The logs provide deployment information, such as Worker status, version, and any errors, that is shown in the UI. The logs are also helpful for troubleshooting Worker or pipeline issues. All Worker logs have the tag `source:op_worker`.
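For example, to confirm that Worker logs are flowing into Log Management, you can query for the `source:op_worker` tag with the Logs Search API. A minimal sketch using the `requests` library (the query and time range are illustrative):

```python
import os
import requests

# Search recent Worker logs by their source tag.
resp = requests.post(
    "https://api.datadoghq.com/api/v2/logs/events/search",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
        "Content-Type": "application/json",
    },
    json={
        "filter": {"query": "source:op_worker", "from": "now-15m", "to": "now"},
        "page": {"limit": 25},
    },
)
resp.raise_for_status()
for event in resp.json().get("data", []):
    # Each event's attributes include the log message and tags.
    print(event["attributes"].get("message"))
```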
Clone a pipeline
- Navigate to [Observability Pipelines][4].
- Select the pipeline you want to clone.
- Click the cog icon at the top right of the page, then select Clone.
Delete a pipeline
- Navigate to [Observability Pipelines][4].
- Select the pipeline you want to delete.
- Click the cog icon at the top right of the page, then select Delete.
Note: You cannot delete an active pipeline. You must stop all Workers for a pipeline before you can delete it.
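For pipelines managed through the API, a corresponding delete call would look like the sketch below, assuming the same endpoint path as the create example (an assumption to verify against the API reference). The same constraint applies: stop all Workers for the pipeline first.

```python
import os
import requests

API_ROOT = "https://api.datadoghq.com"
# Assumed path, matching the create example above; verify in the API reference.
PIPELINES_PATH = "/api/v2/remote_config/products/obs_pipelines/pipelines"
pipeline_id = "<PIPELINE_ID>"  # placeholder for the pipeline to delete

resp = requests.delete(
    f"{API_ROOT}{PIPELINES_PATH}/{pipeline_id}",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
# Deleting an active pipeline is expected to fail; stop its Workers first.
resp.raise_for_status()
```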
Further Reading
Additional helpful documentation, links, and articles: