You can send custom pipelines through HTTP using the public API endpoint. For more information about how pipeline executions are modeled, see Pipeline Data Model and Execution Types.
Pipeline Visibility | Platform | Definition |
---|---|---|
Custom tags and measures at runtime | Custom tags and measures at runtime | Configure custom tags and measures at runtime. |
Manual steps | Manual steps | View manually triggered pipelines. |
Parameters | Parameters | Set custom parameters when a pipeline is triggered. |
Partial retries | Partial pipelines | View partially retried pipeline executions. |
Pipeline failure reasons | Pipeline failure reasons | Identify pipeline failure reasons from error messages. |
Queue time | Queue time | View the amount of time pipeline jobs sit in the queue before processing. |
To send pipeline events programmatically to Datadog, ensure that your `DD_API_KEY` is configured.
Set the headers of your HTTP request:
- `DD-API-KEY`: Your Datadog API key.
- `Content-Type`: `application/json`.

Prepare the payload body by entering information about the pipeline execution in a cURL command:
Parameter Name | Description | Example Value |
---|---|---|
Unique ID | The UUID of the pipeline run. The ID has to be unique across retries and pipelines, including partial retries. | b3262537-a573-44eb-b777-4c0f37912b05 |
Name | The name of the pipeline. All runs of the same pipeline should have the same name. | Documentation Build |
Git Repository | The Git repository URL that triggered the pipeline. | https://github.com/Datadog/documentation |
Commit Author | The commit author email that triggered the pipeline. | contributor@github.com |
Commit SHA | The commit hash that triggered the pipeline. | cf852e17dea14008ac83036430843a1c |
Status | The final status of the pipeline. Allowed enum values: `success`, `error`, `canceled`, `skipped`, `blocked`, or `running`. | success |
Partial Retry | Whether or not the pipeline was a partial retry of a previous attempt. This field expects a boolean value (`true` or `false`). A partial retry is one which only runs a subset of the original jobs. | false |
Start | Time when the pipeline run started (it should not include any queue time). The time format must be RFC3339. | 2024-08-22T11:36:29-07:00 |
End | Time when the pipeline run finished. The time format must be RFC3339. | 2024-08-22T14:36:00-07:00 |
URL | The URL to look at the pipeline in the CI provider UI. | http://your-ci-provider.com/pipeline/{pipeline-id} |
For example, this payload sends a CI pipeline event to Datadog:
curl -X POST "https://api.datadoghq.com/api/v2/ci/pipeline" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: <YOUR_API_KEY>" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "resource": {
        "level": "pipeline",
        "unique_id": "b3262537-a573-44eb-b777-4c0f37912b05",
        "name": "Documentation Build",
        "git": {
          "repository_url": "https://github.com/Datadog/documentation",
          "author_email": "contributor@github.com",
          "sha": "cf852e17dea14008ac83036430843a1c"
        },
        "status": "success",
        "start": "2024-08-22T11:36:29-07:00",
        "end": "2024-08-22T14:36:00-07:00",
        "partial_retry": false,
        "url": "http://your-ci-provider.com/pipeline/{pipeline-id}"
      }
    },
    "type": "cipipeline_resource_request"
  }
}
EOF
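In a CI script, the unique ID, timestamps, and commit SHA are usually produced at runtime rather than hard-coded. The following is a minimal sketch of that pattern, assuming a Linux runner with `uuidgen`, GNU `date`, and `git` available, a `DD_API_KEY` environment variable, and a hypothetical `./run-build.sh` build step; it wraps the same endpoint and payload fields documented above:

```bash
#!/usr/bin/env bash
# Hypothetical wrapper: record start/end times around a build step and report
# the pipeline execution to Datadog. uuidgen, GNU date, git, and DD_API_KEY
# are assumptions about the runner environment, not requirements of the API.
set -euo pipefail

PIPELINE_ID="$(uuidgen)"                            # must be unique across retries and pipelines
START="$(date --rfc-3339=seconds | sed 's/ /T/')"   # RFC3339 start time, excluding queue time

./run-build.sh && STATUS="success" || STATUS="error"   # hypothetical build step

END="$(date --rfc-3339=seconds | sed 's/ /T/')"

curl -X POST "https://api.datadoghq.com/api/v2/ci/pipeline" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "resource": {
        "level": "pipeline",
        "unique_id": "${PIPELINE_ID}",
        "name": "Documentation Build",
        "git": {
          "repository_url": "https://github.com/Datadog/documentation",
          "author_email": "contributor@github.com",
          "sha": "$(git rev-parse HEAD)"
        },
        "status": "${STATUS}",
        "start": "${START}",
        "end": "${END}",
        "partial_retry": false,
        "url": "http://your-ci-provider.com/pipeline/${PIPELINE_ID}"
      }
    },
    "type": "cipipeline_resource_request"
  }
}
EOF
```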
After sending your pipeline event to Datadog, you can integrate additional event types such as `stage`, `job`, and `step`. For more information, see the Send Pipeline Event endpoint.
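As an illustration of what a job-level event might look like, here is a hedged sketch: the field names (`id`, `pipeline_unique_id`, `pipeline_name`) are assumptions modeled on the pipeline example above, so confirm the exact schema in the Send Pipeline Event endpoint reference before using it:

```bash
# Hypothetical job-level event. Field names below are assumptions; verify them
# against the Send Pipeline Event endpoint reference. The job references its
# parent pipeline by the pipeline's unique ID and name.
curl -X POST "https://api.datadoghq.com/api/v2/ci/pipeline" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: <YOUR_API_KEY>" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "resource": {
        "level": "job",
        "id": "f0d4c485-23cb-43c5-8875-4b7c3e3eab99",
        "name": "build-docs",
        "pipeline_unique_id": "b3262537-a573-44eb-b777-4c0f37912b05",
        "pipeline_name": "Documentation Build",
        "status": "success",
        "start": "2024-08-22T11:36:29-07:00",
        "end": "2024-08-22T12:10:00-07:00",
        "url": "http://your-ci-provider.com/pipeline/{pipeline-id}/job/{job-id}"
      }
    },
    "type": "cipipeline_resource_request"
  }
}
EOF
```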
The CI Pipeline List and Executions pages populate with data after the pipelines are accepted for processing.
The CI Pipeline List page shows data for only the default branch of each repository. For more information, see Search and Manage CI Pipelines.