This page walks Technology Partners through creating a log pipeline. A log pipeline is required if your integration sends logs to Datadog.
Use the Logs Ingestion HTTP endpoint to send logs to Datadog.
When creating a log pipeline, consider the following best practices:
- Set the source tag to the integration name. Ensure that the source tag is set to <integration_name> and that the service tag is set to the name of the service that produces the telemetry. For example, the service tag can be used to differentiate logs by product line. For cases where there aren't different services, set service to the same value as source. The source and service tags must be non-editable by the user because the tags are used to enable integration pipelines and dashboards. The tags can be set in the payload or through the query parameter, for example, ?ddsource=example&service=example.
- The source and service tags must be in lowercase.
- Send logs to the intake endpoint for the appropriate Datadog site; the host begins with http-intake.logs.
- Send any additional tags in the ddtags=<TAGS> query parameter. See the Send Logs API documentation for examples.

For information about becoming a Datadog Technology Partner and gaining access to an integration development sandbox, read Build an Integration.
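As an illustration of the tagging practices above, the following is a minimal sketch of sending a log through the HTTP intake endpoint with the requests library. It assumes a hypothetical integration named acme; the API key, service name, tags, message, and intake host are placeholders, and the intake host depends on your Datadog site.

```python
import requests

# A sketch of the Send Logs call, assuming a hypothetical integration named "acme".
# Replace the API key and intake host for your account and Datadog site
# (for example, http-intake.logs.datadoghq.eu for EU).
DD_API_KEY = "<YOUR_API_KEY>"
INTAKE_URL = "https://http-intake.logs.datadoghq.com/api/v2/logs"

payload = [
    {
        "ddsource": "acme",          # source tag: the integration name, lowercase
        "service": "acme-billing",   # service tag: the product line emitting the log
        "ddtags": "env:prod,acme_version:1.2.3",
        "hostname": "acme-host-01",
        "message": "2024-05-01T12:00:00Z INFO user login succeeded client_ip=203.0.113.7",
    }
]

response = requests.post(
    INTAKE_URL,
    headers={"DD-API-KEY": DD_API_KEY, "Content-Type": "application/json"},
    # source, service, and extra tags can also be sent as query parameters instead,
    # for example: params={"ddsource": "acme", "service": "acme-billing", "ddtags": "env:prod"}
    json=payload,
    timeout=10,
)
response.raise_for_status()  # the intake returns 202 Accepted when logs are received
```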
Logs sent to Datadog are processed in log pipelines to standardize them for easier search and analysis.
To set up a log pipeline, create a pipeline that filters on the unique source tag that defines the log source for the Technology Partner's logs, for example, source:okta for the Okta integration. Note: Make sure that logs sent through the integration are tagged with the correct source tags before they are sent to Datadog.

You can add processors within your pipelines to restructure your data and generate attributes.
Requirements:
- Use a status remapper to remap the status of a log, or a category processor for statuses mapped to a range (as with HTTP status codes).
- Map attributes to Datadog standard attributes, such as network.client.ip, so Datadog can display Technology Partner logs in out-of-the-box dashboards. Remove original attributes when remapping by using preserveSource:false to avoid duplicates.
- Use a service remapper to remap the service attribute, or set it to the same value as the source attribute.

For a list of all log processors, see Processors.
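Pipelines and processors can also be created programmatically. The following sketch uses the v1 Logs Pipelines API to create a pipeline that follows the requirements above. It is illustrative only: acme, the attribute names, and the credentials are hypothetical, and the processor field names should be checked against the Pipelines API reference.

```python
import requests

# A sketch of creating the integration pipeline through the Logs Pipelines API (v1).
# "acme", the attribute names, and the keys are placeholders; verify processor field
# names against the Pipelines API reference before relying on this.
API_URL = "https://api.datadoghq.com/api/v1/logs/config/pipelines"
HEADERS = {
    "DD-API-KEY": "<YOUR_API_KEY>",
    "DD-APPLICATION-KEY": "<YOUR_APPLICATION_KEY>",
    "Content-Type": "application/json",
}

pipeline = {
    "name": "Acme",
    # Only process logs carrying the integration's source tag.
    "filter": {"query": "source:acme"},
    "processors": [
        {
            # Remap the integration's log level to the official Datadog status.
            "type": "status-remapper",
            "name": "Define official status of the log",
            "is_enabled": True,
            "sources": ["level"],
        },
        {
            # Remap a custom attribute to a Datadog standard attribute and drop the
            # original (preserve_source false) to avoid duplicates.
            "type": "attribute-remapper",
            "name": "Map client_ip to network.client.ip",
            "is_enabled": True,
            "sources": ["client_ip"],
            "source_type": "attribute",
            "target": "network.client.ip",
            "target_type": "attribute",
            "preserve_source": False,
            "override_on_conflict": False,
        },
        {
            # Define the official service; when there is no distinct service,
            # the source value can be reused instead.
            "type": "service-remapper",
            "name": "Define official service of the log",
            "is_enabled": True,
            "sources": ["service"],
        },
    ],
}

response = requests.post(API_URL, headers=HEADERS, json=pipeline, timeout=10)
response.raise_for_status()
print(response.json()["id"])  # ID of the newly created pipeline
```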
Tip: Take the free course Going Deeper with Logs Processing for an overview of writing processors and leveraging standard attributes.
You can optionally create facets in the Log Explorer. Facets are specific attributes that can be used to filter and narrow down search results. While facets are not strictly necessary for filtering search results, they play a crucial role in helping users understand the available dimensions for refining their search.
Measures are a specific type of facet used for searches over a range. For example, adding a measure for latency duration allows users to search for all logs above a certain latency. Note: Define the unit of a measure facet based on what the attribute represents.
To add a facet or measure:
To help navigate the facet list, facets are grouped together. For fields specific to the integration logs, create a single group with the same name as the source tag.
Guidelines
Requirements:
- Set the facet source to log for attributes or tag for tags.

Datadog reviews the log integration based on the guidelines and requirements documented on this page and provides feedback to the Technology Partner through GitHub. In turn, the Technology Partner reviews and makes changes accordingly.
To start a review process, export your log pipeline and relevant custom facets using the Export icon on the Logs Configuration page.
Include sample raw logs with all the attributes you expect your integration to send to Datadog. Raw logs are the raw messages generated directly from the source, before they are ingested by Datadog.
Exporting your log pipeline produces two YAML files: one containing the pipeline and custom facets, and one containing the raw example logs.
Note: Depending on your browser, you may need to adjust your settings to allow file downloads.
After you’ve downloaded these files, navigate to your integration’s pull request on GitHub and add them in the Assets > Logs directory. If a Logs folder does not exist yet, you can create one.
Validations are run automatically in your pull request.
Three common validation errors are:
- The id field in both YAML files: Ensure that the id field matches the app_id field in your integration's manifest.json file to connect your pipeline to your integration.
- The result field in the YAML file containing the raw example logs.
- If you use service as a parameter, instead of sending it in the log payload, you must include the service field below your log samples within the YAML file.

Once validations pass, Datadog creates and deploys the new log integration assets. If you have any questions, add them as comments in your pull request. A Datadog team member will respond within 2-3 business days.
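To catch the id mismatch described above before review, you can compare the two files locally. The following is a hypothetical pre-submission check: it assumes the exported pipeline YAML exposes a top-level id key, and the file paths are placeholders for your integration's layout.

```python
import json
from pathlib import Path

import yaml  # PyYAML

# Hypothetical paths: point these at your integration's manifest and the exported
# pipeline YAML you added under Assets > Logs in the pull request.
manifest = json.loads(Path("manifest.json").read_text())
pipeline = yaml.safe_load(Path("assets/logs/acme.yaml").read_text())

# The id in the exported pipeline YAML must match app_id in manifest.json.
# This assumes id is a top-level key in the exported file.
if pipeline.get("id") != manifest.get("app_id"):
    raise SystemExit(
        f"Pipeline id {pipeline.get('id')!r} does not match app_id {manifest.get('app_id')!r}"
    )
print("id matches app_id")
```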