Scorecards are in Preview.
Datadog provides the following out-of-the-box scorecards based on a default set of rules: Production Readiness, Observability Best Practices, and Ownership & Documentation.
To select which of the out-of-the-box rules are evaluated for each of the default scorecards:
After the default scorecards are set up, the Scorecards page in the Software Catalog shows the list of out-of-the-box rules and the percentage of services passing those rules. Click on a rule to see more details about passing and failing services and the teams that own them.
The Production Readiness score for all services (unless otherwise indicated) is based on these rules:
The Observability Best Practices score is based on the following rules:
The Ownership & Documentation score is based on the following rules:
Each out-of-the-box scorecard (Production Readiness, Observability Best Practices, Ownership & Documentation) is made up of a default set of rules. These reflect pass-fail conditions and are automatically evaluated once per day. A service's score against custom rules is based on outcomes sent using the Scorecards API. To exclude a particular custom rule from a service's score calculation, set its outcome to skip in the Scorecards API.
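For reference, the sketch below shows what sending a skip outcome might look like using Python's requests library. This is a hedged illustration, not the authoritative request shape: the endpoint path, payload field names, rule ID, and service name are assumptions and placeholders, so confirm them against the Scorecards API reference before use.

```python
# Hedged sketch: report a "skip" outcome for one custom rule so it is
# excluded from a service's score. The endpoint path and payload fields
# are assumptions; check the Scorecards API reference for the exact shape.
import os
import requests

DD_SITE = os.environ.get("DD_SITE", "datadoghq.com")
headers = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    "Content-Type": "application/json",
}

payload = {
    "data": {
        "type": "batched-outcome",
        "attributes": {
            "results": [
                {
                    "rule_id": "abc-123",             # placeholder rule ID
                    "service_name": "shopping-cart",  # placeholder service
                    "state": "skip",                  # excludes this rule from the score
                    "remarks": "Rule not applicable to this service.",
                }
            ]
        },
    }
}

resp = requests.post(
    f"https://api.{DD_SITE}/api/v2/scorecard/outcomes/batch",
    headers=headers,
    json=payload,
    timeout=10,
)
resp.raise_for_status()
```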
Individual rules may have restrictions based on data availability. For example, deployment-related rules rely on the availability of version tags through APM Unified Service Tagging.
Each rule lists the percentage of services that pass it. Each scorecard also has an overall score: the percentage of passing evaluations across all of its rules, not the percentage of services that pass every rule. Skipped and disabled rules are not included in this calculation.
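To illustrate that distinction (this is an illustration only, not Datadog's implementation), the following Python sketch computes both numbers from a small set of hypothetical rule names and outcomes:

```python
# Illustrative only: how an overall scorecard percentage differs from
# "services passing every rule". Each (service, rule) outcome is
# "pass", "fail", or "skip"; skipped rules are left out of the math.
outcomes = {
    "checkout":  {"has-oncall": "pass", "has-runbook": "fail", "custom-rule": "skip"},
    "payments":  {"has-oncall": "pass", "has-runbook": "pass", "custom-rule": "pass"},
    "inventory": {"has-oncall": "fail", "has-runbook": "pass", "custom-rule": "skip"},
}

evaluated = [state for rules in outcomes.values() for state in rules.values() if state != "skip"]
passing = sum(state == "pass" for state in evaluated)
overall = 100 * passing / len(evaluated)

services_passing_all = sum(
    all(state == "pass" for state in rules.values() if state != "skip")
    for rules in outcomes.values()
)

print(f"Overall score: {overall:.0f}%")  # 5 of 7 evaluations pass -> 71%
print(f"Services passing every rule: {services_passing_all} of {len(outcomes)}")  # 1 of 3
```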
You can group rules into levels to categorize them by their criticality. There are three predefined levels:
You can set levels for any out-of-the-box or custom rules. By default, rules without levels are automatically placed in Level 3. You can change this default assignment by editing the rule.
You can group rules by scorecard or level in the Scorecards UI. In the Software Catalog, you can track how a specific service is progressing through each level. Each service starts at Level 0, progresses to Level 1 once it passes all Level 1 rules, and continues level by level until it reaches Level 3.
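That progression can be pictured with a small Python sketch (illustrative only; the rule layout and the behavior for levels with no rules are assumptions, not Datadog's evaluation logic):

```python
# Illustrative sketch of the level progression described above: a service
# starts at Level 0 and only advances to the next level while it passes
# every rule assigned to that level, topping out at Level 3.
def service_level(rule_results: dict[int, list[bool]]) -> int:
    """rule_results maps a level (1-3) to the pass/fail results of its rules."""
    level = 0
    for next_level in (1, 2, 3):
        results = rule_results.get(next_level, [])
        if results and all(results):
            level = next_level
        else:
            break
    return level

# Example: all Level 1 rules pass, one Level 2 rule fails -> the service stays at Level 1.
print(service_level({1: [True, True], 2: [True, False], 3: [True]}))  # 1
```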
Scopes allow you to define which entities a rule applies to, using metadata from entity definitions in Software Catalog. Without a scope defined, a rule applies to all defined services in the catalog. You can scope by any field within an entity definition, including team, tier, and custom tags.
By default, a service must match all specified conditions to be evaluated against the rule. You can use OR statements to include multiple values for the same field.
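As a rough illustration of that matching behavior (this is not Datadog's evaluator; the field names and values are hypothetical), each scoped field must match, while multiple allowed values for a single field act as an OR:

```python
# Illustrative sketch of scope matching as described above: every field in
# the scope must match (AND across fields), and listing several allowed
# values for one field acts as an OR within that field.
def entity_in_scope(entity: dict, scope: dict[str, list[str]]) -> bool:
    return all(entity.get(field) in allowed for field, allowed in scope.items())

scope = {"team": ["payments", "checkout"], "tier": ["1"]}   # hypothetical scope
entity = {"name": "payments-api", "team": "payments", "tier": "1"}

print(entity_in_scope(entity, scope))  # True: team matches one allowed value AND tier matches
```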
You can set scopes for both out-of-the-box and custom rules. When you add a scope to a rule, any previously recorded outcomes for services that no longer match the scope are hidden from the UI and excluded from score calculations. If you later remove the scope, these outcomes reappear and are counted again.