Observability Pipelines

Observability Pipelines allows you to collect and process logs within your own infrastructure, and then route them to downstream integrations.

Note: This endpoint is in Preview.

POST https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines
POST https://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelines
POST https://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelines
POST https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines
POST https://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines
POST https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines

Overview

Create a new pipeline.

Request

Body Data (required)

Field

Type

Description

data [required]

object

Contains the pipeline’s ID, type, and configuration attributes.

attributes [required]

object

Defines the pipeline’s name and its components (sources, processors, and destinations).

config [required]

object

Specifies the pipeline's configuration, including its sources, processors, and destinations.

destinations [required]

[ <oneOf>]

A list of destination components where processed logs are sent.

Option 1

object

The datadog_logs destination forwards logs to Datadog Log Management.

id [required]

string

The unique identifier for this component.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The destination type. The value should always be datadog_logs. Allowed enum values: datadog_logs

default: datadog_logs

processors [required]

[ <oneOf>]

A list of processors that transform or enrich log data.

Option 1

object

The filter processor allows conditional processing of logs based on a Datadog search query. Logs that match the include query are passed through; others are discarded.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be filter. Allowed enum values: filter

default: filter

Option 2

object

The parse_json processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string.

field [required]

string

The name of the log field that contains a JSON string.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be parse_json. Allowed enum values: parse_json

default: parse_json
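
For illustration, a minimal sketch of a parse_json processor entry for the processors array, using only the fields documented above; the id, field, include query, and inputs values are placeholders:

{
  "id": "parse-json-processor",
  "field": "message",
  "include": "*",
  "inputs": [
    "datadog-agent-source"
  ],
  "type": "parse_json"
}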

Option 3

object

The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert.

drop_events [required]

boolean

If set to true, logs that match the quota filter and are sent after the quota has been met are dropped; only logs that do not match the filter query continue through the pipeline.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

ignore_when_missing_partitions

boolean

If true, the processor skips quota checks when partition fields are missing from the logs.

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

name [required]

string

Name for identifying the processor.

overrides

[object]

A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit.

fields [required]

[object]

A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced.

name [required]

string

The field name.

value [required]

string

The field value.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

partition_fields

[string]

A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values.

type [required]

enum

The processor type. The value should always be quota. Allowed enum values: quota

default: quota
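
For illustration, a sketch of a quota processor entry combining the fields documented above; the id, name, include query, inputs, partition field, limits, and override values are placeholders:

{
  "id": "quota-processor",
  "name": "daily-intake-quota",
  "include": "*",
  "inputs": [
    "filter-processor"
  ],
  "drop_events": true,
  "ignore_when_missing_partitions": true,
  "partition_fields": [
    "service"
  ],
  "limit": {
    "enforce": "bytes",
    "limit": 10000000
  },
  "overrides": [
    {
      "fields": [
        {
          "name": "service",
          "value": "my-service"
        }
      ],
      "limit": {
        "enforce": "events",
        "limit": 5000
      }
    }
  ],
  "type": "quota"
}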

Option 4

object

The add_fields processor adds static key-value fields to logs.

fields [required]

[object]

A list of static fields (key-value pairs) that are added to each log event processed by this component.

name [required]

string

The field name.

value [required]

string

The field value.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be add_fields. Allowed enum values: add_fields

default: add_fields
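
For illustration, a minimal sketch of an add_fields processor entry; the id, include query, inputs, and the static field added here are placeholders:

{
  "id": "add-fields-processor",
  "include": "*",
  "inputs": [
    "filter-processor"
  ],
  "fields": [
    {
      "name": "env",
      "value": "prod"
    }
  ],
  "type": "add_fields"
}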

Option 5

object

The remove_fields processor deletes specified fields from logs.

fields [required]

[string]

A list of field names to be removed from each log event.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be remove_fields. Allowed enum values: remove_fields

default: remove_fields
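
For illustration, a minimal sketch of a remove_fields processor entry; the id, include query, inputs, and field name are placeholders:

{
  "id": "remove-fields-processor",
  "include": "*",
  "inputs": [
    "add-fields-processor"
  ],
  "fields": [
    "internal_debug_field"
  ],
  "type": "remove_fields"
}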

Option 6

object

The rename_fields processor changes field names.

fields [required]

[object]

A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields.

destination [required]

string

The field name to assign the renamed value to.

preserve_source [required]

boolean

Indicates whether the original field, as received from the source, should be kept (true) or removed (false) after renaming.

source [required]

string

The original field name in the log event that should be renamed.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be rename_fields. Allowed enum values: rename_fields

default: rename_fields
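
For illustration, a minimal sketch of a rename_fields processor entry; the id, include query, inputs, and rename rule are placeholders:

{
  "id": "rename-fields-processor",
  "include": "*",
  "inputs": [
    "filter-processor"
  ],
  "fields": [
    {
      "source": "svc",
      "destination": "service",
      "preserve_source": false
    }
  ],
  "type": "rename_fields"
}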

sources [required]

[ <oneOf>]

A list of configured data sources for the pipeline.

Option 1

object

The kafka source ingests data from Apache Kafka topics.

group_id [required]

string

Consumer group ID used by the Kafka client.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

librdkafka_options

[object]

Optional list of advanced Kafka client configuration options, defined as key-value pairs.

name [required]

string

The name of the librdkafka configuration option to set.

value [required]

string

The value assigned to the specified librdkafka configuration option.

sasl

object

Specifies the SASL mechanism for authenticating with a Kafka cluster.

mechanism

enum

SASL mechanism used for Kafka authentication. Allowed enum values: PLAIN,SCRAM-SHA-256,SCRAM-SHA-512

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

topics [required]

[string]

A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified.

type [required]

enum

The source type. The value should always be kafka. Allowed enum values: kafka

default: kafka
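
For illustration, a sketch of a kafka source entry using the fields documented above; the group ID, topic names, SASL mechanism, TLS file paths, and librdkafka option are placeholders:

{
  "id": "kafka-source",
  "group_id": "consumer-group-0",
  "topics": [
    "observability-logs"
  ],
  "sasl": {
    "mechanism": "SCRAM-SHA-256"
  },
  "tls": {
    "ca_file": "/path/to/ca.pem",
    "crt_file": "/path/to/cert.crt",
    "key_file": "/path/to/cert.key"
  },
  "librdkafka_options": [
    {
      "name": "fetch.message.max.bytes",
      "value": "1048576"
    }
  ],
  "type": "kafka"
}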

Option 2

object

The datadog_agent source collects logs from the Datadog Agent.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

type [required]

enum

The source type. The value should always be datadog_agent. Allowed enum values: datadog_agent

default: datadog_agent
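
For illustration, a sketch of a datadog_agent source entry with the optional tls object set; the certificate and key paths are placeholders:

{
  "id": "datadog-agent-source",
  "tls": {
    "ca_file": "/path/to/ca.pem",
    "crt_file": "/path/to/cert.crt",
    "key_file": "/path/to/cert.key"
  },
  "type": "datadog_agent"
}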

name [required]

string

Name of the pipeline.

type [required]

string

The resource type identifier. For pipeline resources, this should always be set to pipelines.

default: pipelines

{
  "data": {
    "attributes": {
      "config": {
        "destinations": [
          {
            "id": "datadog-logs-destination",
            "inputs": [
              "filter-processor"
            ],
            "type": "datadog_logs"
          }
        ],
        "processors": [
          {
            "id": "filter-processor",
            "include": "service:my-service",
            "inputs": [
              "datadog-agent-source"
            ],
            "type": "filter"
          }
        ],
        "sources": [
          {
            "id": "datadog-agent-source",
            "type": "datadog_agent"
          }
        ]
      },
      "name": "Main Observability Pipeline"
    },
    "type": "pipelines"
  }
}

Response

OK

Top-level schema representing a pipeline.

Field

Type

Description

data [required]

object

Contains the pipeline’s ID, type, and configuration attributes.

attributes [required]

object

Defines the pipeline’s name and its components (sources, processors, and destinations).

config [required]

object

Specifies the pipeline's configuration, including its sources, processors, and destinations.

destinations [required]

[ <oneOf>]

A list of destination components where processed logs are sent.

Option 1

object

The datadog_logs destination forwards logs to Datadog Log Management.

id [required]

string

The unique identifier for this component.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The destination type. The value should always be datadog_logs. Allowed enum values: datadog_logs

default: datadog_logs

processors [required]

[ <oneOf>]

A list of processors that transform or enrich log data.

Option 1

object

The filter processor allows conditional processing of logs based on a Datadog search query. Logs that match the include query are passed through; others are discarded.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be filter. Allowed enum values: filter

default: filter

Option 2

object

The parse_json processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string.

field [required]

string

The name of the log field that contains a JSON string.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be parse_json. Allowed enum values: parse_json

default: parse_json

Option 3

object

The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert.

drop_events [required]

boolean

If set to true, logs that match the quota filter and are sent after the quota has been met are dropped; only logs that do not match the filter query continue through the pipeline.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

ignore_when_missing_partitions

boolean

If true, the processor skips quota checks when partition fields are missing from the logs.

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

name [required]

string

Name for identifying the processor.

overrides

[object]

A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit.

fields [required]

[object]

A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced.

name [required]

string

The field name.

value [required]

string

The field value.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

partition_fields

[string]

A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values.

type [required]

enum

The processor type. The value should always be quota. Allowed enum values: quota

default: quota

Option 4

object

The add_fields processor adds static key-value fields to logs.

fields [required]

[object]

A list of static fields (key-value pairs) that are added to each log event processed by this component.

name [required]

string

The field name.

value [required]

string

The field value.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be add_fields. Allowed enum values: add_fields

default: add_fields

Option 5

object

The remove_fields processor deletes specified fields from logs.

fields [required]

[string]

A list of field names to be removed from each log event.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be remove_fields. Allowed enum values: remove_fields

default: remove_fields

Option 6

object

The rename_fields processor changes field names.

fields [required]

[object]

A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields.

destination [required]

string

The field name to assign the renamed value to.

preserve_source [required]

boolean

Indicates whether the original field, as received from the source, should be kept (true) or removed (false) after renaming.

source [required]

string

The original field name in the log event that should be renamed.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be rename_fields. Allowed enum values: rename_fields

default: rename_fields

sources [required]

[ <oneOf>]

A list of configured data sources for the pipeline.

Option 1

object

The kafka source ingests data from Apache Kafka topics.

group_id [required]

string

Consumer group ID used by the Kafka client.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

librdkafka_options

[object]

Optional list of advanced Kafka client configuration options, defined as key-value pairs.

name [required]

string

The name of the librdkafka configuration option to set.

value [required]

string

The value assigned to the specified librdkafka configuration option.

sasl

object

Specifies the SASL mechanism for authenticating with a Kafka cluster.

mechanism

enum

SASL mechanism used for Kafka authentication. Allowed enum values: PLAIN,SCRAM-SHA-256,SCRAM-SHA-512

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

topics [required]

[string]

A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified.

type [required]

enum

The source type. The value should always be kafka. Allowed enum values: kafka

default: kafka

Option 2

object

The datadog_agent source collects logs from the Datadog Agent.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

type [required]

enum

The source type. The value should always be datadog_agent. Allowed enum values: datadog_agent

default: datadog_agent

name [required]

string

Name of the pipeline.

id [required]

string

Unique identifier for the pipeline.

type [required]

string

The resource type identifier. For pipeline resources, this should always be set to pipelines.

default: pipelines

{
  "data": {
    "attributes": {
      "config": {
        "destinations": [
          {
            "id": "datadog-logs-destination",
            "inputs": [
              "filter-processor"
            ],
            "type": "datadog_logs"
          }
        ],
        "processors": [
          {
            "id": "filter-processor",
            "include": "service:my-service",
            "inputs": [
              "datadog-agent-source"
            ],
            "type": "filter"
          }
        ],
        "sources": [
          {
            "group_id": "consumer-group-0",
            "id": "kafka-source",
            "librdkafka_options": [
              {
                "name": "fetch.message.max.bytes",
                "value": "1048576"
              }
            ],
            "sasl": {
              "mechanism": "string"
            },
            "tls": {
              "ca_file": "string",
              "crt_file": "/path/to/cert.crt",
              "key_file": "string"
            },
            "topics": [
              "topic1",
              "topic2"
            ],
            "type": "kafka"
          }
        ]
      },
      "name": "Main Observability Pipeline"
    },
    "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "type": "pipelines"
  }
}

Bad Request

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Forbidden

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Conflict

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Too many requests

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Code example

# Curl command
curl -X POST "https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "config": {
        "destinations": [
          {
            "id": "datadog-logs-destination",
            "inputs": [
              "filter-processor"
            ],
            "type": "datadog_logs"
          }
        ],
        "processors": [
          {
            "id": "filter-processor",
            "include": "service:my-service",
            "inputs": [
              "datadog-agent-source"
            ],
            "type": "filter"
          }
        ],
        "sources": [
          {
            "id": "datadog-agent-source",
            "type": "datadog_agent"
          }
        ]
      },
      "name": "Main Observability Pipeline"
    },
    "type": "pipelines"
  }
}
EOF
// Create a new pipeline returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	body := datadogV2.ObservabilityPipelineCreateRequest{
		Data: datadogV2.ObservabilityPipelineCreateRequestData{
			Attributes: datadogV2.ObservabilityPipelineDataAttributes{
				Config: datadogV2.ObservabilityPipelineConfig{
					Destinations: []datadogV2.ObservabilityPipelineConfigDestinationItem{
						datadogV2.ObservabilityPipelineConfigDestinationItem{
							ObservabilityPipelineDatadogLogsDestination: &datadogV2.ObservabilityPipelineDatadogLogsDestination{
								Id: "datadog-logs-destination",
								Inputs: []string{
									"filter-processor",
								},
								Type: datadogV2.OBSERVABILITYPIPELINEDATADOGLOGSDESTINATIONTYPE_DATADOG_LOGS,
							}},
					},
					Processors: []datadogV2.ObservabilityPipelineConfigProcessorItem{
						datadogV2.ObservabilityPipelineConfigProcessorItem{
							ObservabilityPipelineFilterProcessor: &datadogV2.ObservabilityPipelineFilterProcessor{
								Id:      "filter-processor",
								Include: "service:my-service",
								Inputs: []string{
									"datadog-agent-source",
								},
								Type: datadogV2.OBSERVABILITYPIPELINEFILTERPROCESSORTYPE_FILTER,
							}},
					},
					Sources: []datadogV2.ObservabilityPipelineConfigSourceItem{
						datadogV2.ObservabilityPipelineConfigSourceItem{
							ObservabilityPipelineDatadogAgentSource: &datadogV2.ObservabilityPipelineDatadogAgentSource{
								Id:   "datadog-agent-source",
								Type: datadogV2.OBSERVABILITYPIPELINEDATADOGAGENTSOURCETYPE_DATADOG_AGENT,
							}},
					},
				},
				Name: "Main Observability Pipeline",
			},
			Type: "pipelines",
		},
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	configuration.SetUnstableOperationEnabled("v2.CreatePipeline", true)
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewObservabilityPipelinesApi(apiClient)
	resp, r, err := api.CreatePipeline(ctx, body)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.CreatePipeline`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `ObservabilityPipelinesApi.CreatePipeline`:\n%s\n", responseContent)
}

Instructions

First install the library and its dependencies, then save the example to main.go and run the following command:

    
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comddog-gov.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" go run "main.go"
// Create a new pipeline returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.ObservabilityPipelinesApi;
import com.datadog.api.client.v2.model.ObservabilityPipeline;
import com.datadog.api.client.v2.model.ObservabilityPipelineConfig;
import com.datadog.api.client.v2.model.ObservabilityPipelineConfigDestinationItem;
import com.datadog.api.client.v2.model.ObservabilityPipelineConfigProcessorItem;
import com.datadog.api.client.v2.model.ObservabilityPipelineConfigSourceItem;
import com.datadog.api.client.v2.model.ObservabilityPipelineCreateRequest;
import com.datadog.api.client.v2.model.ObservabilityPipelineCreateRequestData;
import com.datadog.api.client.v2.model.ObservabilityPipelineDataAttributes;
import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSource;
import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSourceType;
import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestination;
import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestinationType;
import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessor;
import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessorType;
import java.util.Collections;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    defaultClient.setUnstableOperationEnabled("v2.createPipeline", true);
    ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient);

    ObservabilityPipelineCreateRequest body =
        new ObservabilityPipelineCreateRequest()
            .data(
                new ObservabilityPipelineCreateRequestData()
                    .attributes(
                        new ObservabilityPipelineDataAttributes()
                            .config(
                                new ObservabilityPipelineConfig()
                                    .destinations(
                                        Collections.singletonList(
                                            new ObservabilityPipelineConfigDestinationItem(
                                                new ObservabilityPipelineDatadogLogsDestination()
                                                    .id("datadog-logs-destination")
                                                    .inputs(
                                                        Collections.singletonList(
                                                            "filter-processor"))
                                                    .type(
                                                        ObservabilityPipelineDatadogLogsDestinationType
                                                            .DATADOG_LOGS))))
                                    .processors(
                                        Collections.singletonList(
                                            new ObservabilityPipelineConfigProcessorItem(
                                                new ObservabilityPipelineFilterProcessor()
                                                    .id("filter-processor")
                                                    .include("service:my-service")
                                                    .inputs(
                                                        Collections.singletonList(
                                                            "datadog-agent-source"))
                                                    .type(
                                                        ObservabilityPipelineFilterProcessorType
                                                            .FILTER))))
                                    .sources(
                                        Collections.singletonList(
                                            new ObservabilityPipelineConfigSourceItem(
                                                new ObservabilityPipelineDatadogAgentSource()
                                                    .id("datadog-agent-source")
                                                    .type(
                                                        ObservabilityPipelineDatadogAgentSourceType
                                                            .DATADOG_AGENT)))))
                            .name("Main Observability Pipeline"))
                    .type("pipelines"));

    try {
      ObservabilityPipeline result = apiInstance.createPipeline(body);
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling ObservabilityPipelinesApi#createPipeline");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}

Instructions

First install the library and its dependencies, then save the example to Example.java and run the following command:

    
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comddog-gov.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" java "Example.java"
"""
Create a new pipeline returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi
from datadog_api_client.v2.model.observability_pipeline_config import ObservabilityPipelineConfig
from datadog_api_client.v2.model.observability_pipeline_create_request import ObservabilityPipelineCreateRequest
from datadog_api_client.v2.model.observability_pipeline_create_request_data import (
    ObservabilityPipelineCreateRequestData,
)
from datadog_api_client.v2.model.observability_pipeline_data_attributes import ObservabilityPipelineDataAttributes
from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source import (
    ObservabilityPipelineDatadogAgentSource,
)
from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source_type import (
    ObservabilityPipelineDatadogAgentSourceType,
)
from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination import (
    ObservabilityPipelineDatadogLogsDestination,
)
from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination_type import (
    ObservabilityPipelineDatadogLogsDestinationType,
)
from datadog_api_client.v2.model.observability_pipeline_filter_processor import ObservabilityPipelineFilterProcessor
from datadog_api_client.v2.model.observability_pipeline_filter_processor_type import (
    ObservabilityPipelineFilterProcessorType,
)

body = ObservabilityPipelineCreateRequest(
    data=ObservabilityPipelineCreateRequestData(
        attributes=ObservabilityPipelineDataAttributes(
            config=ObservabilityPipelineConfig(
                destinations=[
                    ObservabilityPipelineDatadogLogsDestination(
                        id="datadog-logs-destination",
                        inputs=[
                            "filter-processor",
                        ],
                        type=ObservabilityPipelineDatadogLogsDestinationType.DATADOG_LOGS,
                    ),
                ],
                processors=[
                    ObservabilityPipelineFilterProcessor(
                        id="filter-processor",
                        include="service:my-service",
                        inputs=[
                            "datadog-agent-source",
                        ],
                        type=ObservabilityPipelineFilterProcessorType.FILTER,
                    ),
                ],
                sources=[
                    ObservabilityPipelineDatadogAgentSource(
                        id="datadog-agent-source",
                        type=ObservabilityPipelineDatadogAgentSourceType.DATADOG_AGENT,
                    ),
                ],
            ),
            name="Main Observability Pipeline",
        ),
        type="pipelines",
    ),
)

configuration = Configuration()
configuration.unstable_operations["create_pipeline"] = True
with ApiClient(configuration) as api_client:
    api_instance = ObservabilityPipelinesApi(api_client)
    response = api_instance.create_pipeline(body=body)

    print(response)

Instructions

First install the library and its dependencies, then save the example to example.py and run the following command:

    
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comddog-gov.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" python3 "example.py"
# Create a new pipeline returns "OK" response

require "datadog_api_client"
DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.create_pipeline".to_sym] = true
end
api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new

body = DatadogAPIClient::V2::ObservabilityPipelineCreateRequest.new({
  data: DatadogAPIClient::V2::ObservabilityPipelineCreateRequestData.new({
    attributes: DatadogAPIClient::V2::ObservabilityPipelineDataAttributes.new({
      config: DatadogAPIClient::V2::ObservabilityPipelineConfig.new({
        destinations: [
          DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestination.new({
            id: "datadog-logs-destination",
            inputs: [
              "filter-processor",
            ],
            type: DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS,
          }),
        ],
        processors: [
          DatadogAPIClient::V2::ObservabilityPipelineFilterProcessor.new({
            id: "filter-processor",
            include: "service:my-service",
            inputs: [
              "datadog-agent-source",
            ],
            type: DatadogAPIClient::V2::ObservabilityPipelineFilterProcessorType::FILTER,
          }),
        ],
        sources: [
          DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSource.new({
            id: "datadog-agent-source",
            type: DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT,
          }),
        ],
      }),
      name: "Main Observability Pipeline",
    }),
    type: "pipelines",
  }),
})
p api_instance.create_pipeline(body)

Instructions

First install the library and its dependencies, then save the example to example.rb and run the following command:

    
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comddog-gov.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" rb "example.rb"
// Create a new pipeline returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI;
use datadog_api_client::datadogV2::model::ObservabilityPipelineConfig;
use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigDestinationItem;
use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigProcessorItem;
use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigSourceItem;
use datadog_api_client::datadogV2::model::ObservabilityPipelineCreateRequest;
use datadog_api_client::datadogV2::model::ObservabilityPipelineCreateRequestData;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDataAttributes;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSource;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSourceType;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestination;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestinationType;
use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessor;
use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessorType;

#[tokio::main]
async fn main() {
    let body =
        ObservabilityPipelineCreateRequest::new(
            ObservabilityPipelineCreateRequestData::new(
                ObservabilityPipelineDataAttributes::new(
                    ObservabilityPipelineConfig::new(
                        vec![
                            ObservabilityPipelineConfigDestinationItem::ObservabilityPipelineDatadogLogsDestination(
                                Box::new(
                                    ObservabilityPipelineDatadogLogsDestination::new(
                                        "datadog-logs-destination".to_string(),
                                        vec!["filter-processor".to_string()],
                                        ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS,
                                    ),
                                ),
                            )
                        ],
                        vec![
                            ObservabilityPipelineConfigProcessorItem::ObservabilityPipelineFilterProcessor(
                                Box::new(
                                    ObservabilityPipelineFilterProcessor::new(
                                        "filter-processor".to_string(),
                                        "service:my-service".to_string(),
                                        vec!["datadog-agent-source".to_string()],
                                        ObservabilityPipelineFilterProcessorType::FILTER,
                                    ),
                                ),
                            )
                        ],
                        vec![
                            ObservabilityPipelineConfigSourceItem::ObservabilityPipelineDatadogAgentSource(
                                Box::new(
                                    ObservabilityPipelineDatadogAgentSource::new(
                                        "datadog-agent-source".to_string(),
                                        ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT,
                                    ),
                                ),
                            )
                        ],
                    ),
                    "Main Observability Pipeline".to_string(),
                ),
                "pipelines".to_string(),
            ),
        );
    let mut configuration = datadog::Configuration::new();
    configuration.set_unstable_operation_enabled("v2.CreatePipeline", true);
    let api = ObservabilityPipelinesAPI::with_config(configuration);
    let resp = api.create_pipeline(body).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}

Instructions

First install the library and its dependencies, then save the example to src/main.rs and run the following command:

    
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comddog-gov.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" cargo run
/**
 * Create a new pipeline returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.createPipeline"] = true;
const apiInstance = new v2.ObservabilityPipelinesApi(configuration);

const params: v2.ObservabilityPipelinesApiCreatePipelineRequest = {
  body: {
    data: {
      attributes: {
        config: {
          destinations: [
            {
              id: "datadog-logs-destination",
              inputs: ["filter-processor"],
              type: "datadog_logs",
            },
          ],
          processors: [
            {
              id: "filter-processor",
              include: "service:my-service",
              inputs: ["datadog-agent-source"],
              type: "filter",
            },
          ],
          sources: [
            {
              id: "datadog-agent-source",
              type: "datadog_agent",
            },
          ],
        },
        name: "Main Observability Pipeline",
      },
      type: "pipelines",
    },
  },
};

apiInstance
  .createPipeline(params)
  .then((data: v2.ObservabilityPipeline) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));

Instructions

First install the library and its dependencies, then save the example to example.ts and run the following command:

    
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comddog-gov.com" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" tsc "example.ts"

Note: This endpoint is in Preview.

GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
GET https://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
GET https://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
GET https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
GET https://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
GET https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}

Overview

Get a specific pipeline by its ID.

Arguments

Path parameters

Name

Type

Description

pipeline_id [required]

string

The ID of the pipeline to retrieve.

Response

OK

Top-level schema representing a pipeline.

Field

Type

Description

data [required]

object

Contains the pipeline’s ID, type, and configuration attributes.

attributes [required]

object

Defines the pipeline’s name and its components (sources, processors, and destinations).

config [required]

object

Specifies the pipeline's configuration, including its sources, processors, and destinations.

destinations [required]

[ <oneOf>]

A list of destination components where processed logs are sent.

Option 1

object

The datadog_logs destination forwards logs to Datadog Log Management.

id [required]

string

The unique identifier for this component.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The destination type. The value should always be datadog_logs. Allowed enum values: datadog_logs

default: datadog_logs

processors [required]

[ <oneOf>]

A list of processors that transform or enrich log data.

Option 1

object

The filter processor allows conditional processing of logs based on a Datadog search query. Logs that match the include query are passed through; others are discarded.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be filter. Allowed enum values: filter

default: filter

Option 2

object

The parse_json processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string.

field [required]

string

The name of the log field that contains a JSON string.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be parse_json. Allowed enum values: parse_json

default: parse_json

Option 3

object

The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert.

drop_events [required]

boolean

If set to true, logs that match the quota filter and are sent after the quota has been met are dropped; only logs that do not match the filter query continue through the pipeline.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

ignore_when_missing_partitions

boolean

If true, the processor skips quota checks when partition fields are missing from the logs.

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

name [required]

string

Name for identifying the processor.

overrides

[object]

A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit.

fields [required]

[object]

A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced.

name [required]

string

The field name.

value [required]

string

The field value.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

partition_fields

[string]

A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values.

type [required]

enum

The processor type. The value should always be quota. Allowed enum values: quota

default: quota

Option 4

object

The add_fields processor adds static key-value fields to logs.

fields [required]

[object]

A list of static fields (key-value pairs) that are added to each log event processed by this component.

name [required]

string

The field name.

value [required]

string

The field value.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be add_fields. Allowed enum values: add_fields

default: add_fields

Option 5

object

The remove_fields processor deletes specified fields from logs.

fields [required]

[string]

A list of field names to be removed from each log event.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be remove_fields. Allowed enum values: remove_fields

default: remove_fields

Option 6

object

The rename_fields processor changes field names.

fields [required]

[object]

A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields.

destination [required]

string

The field name to assign the renamed value to.

preserve_source [required]

boolean

Indicates whether the original field, as received from the source, should be kept (true) or removed (false) after renaming.

source [required]

string

The original field name in the log event that should be renamed.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be rename_fields. Allowed enum values: rename_fields

default: rename_fields

sources [required]

[ <oneOf>]

A list of configured data sources for the pipeline.

Option 1

object

The kafka source ingests data from Apache Kafka topics.

group_id [required]

string

Consumer group ID used by the Kafka client.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

librdkafka_options

[object]

Optional list of advanced Kafka client configuration options, defined as key-value pairs.

name [required]

string

The name of the librdkafka configuration option to set.

value [required]

string

The value assigned to the specified librdkafka configuration option.

sasl

object

Specifies the SASL mechanism for authenticating with a Kafka cluster.

mechanism

enum

SASL mechanism used for Kafka authentication. Allowed enum values: PLAIN,SCRAM-SHA-256,SCRAM-SHA-512

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

topics [required]

[string]

A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified.

type [required]

enum

The source type. The value should always be kafka. Allowed enum values: kafka

default: kafka

Option 2

object

The datadog_agent source collects logs from the Datadog Agent.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

type [required]

enum

The source type. The value should always be datadog_agent. Allowed enum values: datadog_agent

default: datadog_agent

name [required]

string

Name of the pipeline.

id [required]

string

Unique identifier for the pipeline.

type [required]

string

The resource type identifier. For pipeline resources, this should always be set to pipelines.

default: pipelines

{
  "data": {
    "attributes": {
      "config": {
        "destinations": [
          {
            "id": "datadog-logs-destination",
            "inputs": [
              "filter-processor"
            ],
            "type": "datadog_logs"
          }
        ],
        "processors": [
          {
            "id": "filter-processor",
            "include": "service:my-service",
            "inputs": [
              "datadog-agent-source"
            ],
            "type": "filter"
          }
        ],
        "sources": [
          {
            "group_id": "consumer-group-0",
            "id": "kafka-source",
            "librdkafka_options": [
              {
                "name": "fetch.message.max.bytes",
                "value": "1048576"
              }
            ],
            "sasl": {
              "mechanism": "string"
            },
            "tls": {
              "ca_file": "string",
              "crt_file": "/path/to/cert.crt",
              "key_file": "string"
            },
            "topics": [
              "topic1",
              "topic2"
            ],
            "type": "kafka"
          }
        ]
      },
      "name": "Main Observability Pipeline"
    },
    "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "type": "pipelines"
  }
}

Forbidden

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Too many requests

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Code example

# Path parameters
export pipeline_id="CHANGE_ME"
# Curl command (use the api.* host for your Datadog site, for example api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/${pipeline_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
"""
Get a specific pipeline returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi

# there is a valid "pipeline" in the system
PIPELINE_DATA_ID = environ["PIPELINE_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["get_pipeline"] = True
with ApiClient(configuration) as api_client:
    api_instance = ObservabilityPipelinesApi(api_client)
    response = api_instance.get_pipeline(
        pipeline_id=PIPELINE_DATA_ID,
    )

    print(response)

Instructions

First install the library and its dependencies, then save the example to example.py and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" python3 "example.py"
# Get a specific pipeline returns "OK" response

require "datadog_api_client"
DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.get_pipeline".to_sym] = true
end
api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new

# there is a valid "pipeline" in the system
PIPELINE_DATA_ID = ENV["PIPELINE_DATA_ID"]
p api_instance.get_pipeline(PIPELINE_DATA_ID)

Instructions

First install the library and its dependencies, then save the example to example.rb and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" ruby "example.rb"
// Get a specific pipeline returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "pipeline" in the system
	PipelineDataID := os.Getenv("PIPELINE_DATA_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	configuration.SetUnstableOperationEnabled("v2.GetPipeline", true)
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewObservabilityPipelinesApi(apiClient)
	resp, r, err := api.GetPipeline(ctx, PipelineDataID)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.GetPipeline`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `ObservabilityPipelinesApi.GetPipeline`:\n%s\n", responseContent)
}

Instructions

First install the library and its dependencies, then save the example to main.go and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" go run "main.go"
// Get a specific pipeline returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.ObservabilityPipelinesApi;
import com.datadog.api.client.v2.model.ObservabilityPipeline;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    defaultClient.setUnstableOperationEnabled("v2.getPipeline", true);
    ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient);

    // there is a valid "pipeline" in the system
    String PIPELINE_DATA_ID = System.getenv("PIPELINE_DATA_ID");

    try {
      ObservabilityPipeline result = apiInstance.getPipeline(PIPELINE_DATA_ID);
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling ObservabilityPipelinesApi#getPipeline");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}

Instructions

First install the library and its dependencies, then save the example to Example.java and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" java "Example.java"
// Get a specific pipeline returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI;

#[tokio::main]
async fn main() {
    // there is a valid "pipeline" in the system
    let pipeline_data_id = std::env::var("PIPELINE_DATA_ID").unwrap();
    let mut configuration = datadog::Configuration::new();
    configuration.set_unstable_operation_enabled("v2.GetPipeline", true);
    let api = ObservabilityPipelinesAPI::with_config(configuration);
    let resp = api.get_pipeline(pipeline_data_id.clone()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}

Instructions

First install the library and its dependencies, then save the example to src/main.rs and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" cargo run
/**
 * Get a specific pipeline returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.getPipeline"] = true;
const apiInstance = new v2.ObservabilityPipelinesApi(configuration);

// there is a valid "pipeline" in the system
const PIPELINE_DATA_ID = process.env.PIPELINE_DATA_ID as string;

const params: v2.ObservabilityPipelinesApiGetPipelineRequest = {
  pipelineId: PIPELINE_DATA_ID,
};

apiInstance
  .getPipeline(params)
  .then((data: v2.ObservabilityPipeline) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));

Instructions

First install the library and its dependencies, then save the example to example.ts and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" tsc "example.ts"

Note: This endpoint is in Preview.

PUT https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}

Overview

Update a pipeline.

Arguments

Path parameters

Name

Type

Description

pipeline_id [required]

string

The ID of the pipeline to update.

Request

Body Data (required)

Field

Type

Description

data [required]

object

Contains the pipeline’s ID, type, and configuration attributes.

attributes [required]

object

Defines the pipeline’s name and its components (sources, processors, and destinations).

config [required]

object

Specifies the pipeline's configuration, including its sources, processors, and destinations.

destinations [required]

[ <oneOf>]

A list of destination components where processed logs are sent.

Option 1

object

The datadog_logs destination forwards logs to Datadog Log Management.

id [required]

string

The unique identifier for this component.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The destination type. The value should always be datadog_logs. Allowed enum values: datadog_logs

default: datadog_logs

processors [required]

[ <oneOf>]

A list of processors that transform or enrich log data.

Option 1

object

The filter processor allows conditional processing of logs based on a Datadog search query. Logs that match the include query are passed through; others are discarded.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be filter. Allowed enum values: filter

default: filter

Option 2

object

The parse_json processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string.

field [required]

string

The name of the log field that contains a JSON string.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be parse_json. Allowed enum values: parse_json

default: parse_json

Option 3

object

The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert.

drop_events [required]

boolean

If set to true, logs that match the quota filter and are sent after the quota has been met are dropped; only logs that did not match the filter query continue through the pipeline.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

ignore_when_missing_partitions

boolean

If true, the processor skips quota checks when partition fields are missing from the logs.

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement: bytes for data size or events for event count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

name [required]

string

Name for identifying the processor.

overrides

[object]

A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit.

fields [required]

[object]

A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced.

name [required]

string

The field name.

value [required]

string

The field value.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement: bytes for data size or events for event count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

partition_fields

[string]

A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values.

type [required]

enum

The processor type. The value should always be quota. Allowed enum values: quota

default: quota

Option 4

object

The add_fields processor adds static key-value fields to logs.

fields [required]

[object]

A list of static fields (key-value pairs) that is added to each log event processed by this component.

name [required]

string

The field name.

value [required]

string

The field value.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be add_fields. Allowed enum values: add_fields

default: add_fields

Option 5

object

The remove_fields processor deletes specified fields from logs.

fields [required]

[string]

A list of field names to be removed from each log event.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be remove_fields. Allowed enum values: remove_fields

default: remove_fields

Option 6

object

The rename_fields processor changes field names.

fields [required]

[object]

A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields.

destination [required]

string

The field name to assign the renamed value to.

preserve_source [required]

boolean

Indicates whether the original field received from the source should be kept (true) or removed (false) after renaming.

source [required]

string

The original field name in the log event that should be renamed.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be rename_fields. Allowed enum values: rename_fields

default: rename_fields

sources [required]

[ <oneOf>]

A list of configured data sources for the pipeline.

Option 1

object

The kafka source ingests data from Apache Kafka topics.

group_id [required]

string

Consumer group ID used by the Kafka client.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

librdkafka_options

[object]

Optional list of advanced Kafka client configuration options, defined as key-value pairs.

name [required]

string

The name of the librdkafka configuration option to set.

value [required]

string

The value assigned to the specified librdkafka configuration option.

sasl

object

Specifies the SASL mechanism for authenticating with a Kafka cluster.

mechanism

enum

SASL mechanism used for Kafka authentication. Allowed enum values: PLAIN,SCRAM-SHA-256,SCRAM-SHA-512

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

topics [required]

[string]

A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified.

type [required]

enum

The source type. The value should always be kafka. Allowed enum values: kafka

default: kafka

Option 2

object

The datadog_agent source collects logs from the Datadog Agent.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

type [required]

enum

The source type. The value should always be datadog_agent. Allowed enum values: datadog_agent

default: datadog_agent

name [required]

string

Name of the pipeline.

id [required]

string

Unique identifier for the pipeline.

type [required]

string

The resource type identifier. For pipeline resources, this should always be set to pipelines.

default: pipelines

{
  "data": {
    "attributes": {
      "config": {
        "destinations": [
          {
            "id": "updated-datadog-logs-destination-id",
            "inputs": [
              "filter-processor"
            ],
            "type": "datadog_logs"
          }
        ],
        "processors": [
          {
            "id": "filter-processor",
            "include": "service:my-service",
            "inputs": [
              "datadog-agent-source"
            ],
            "type": "filter"
          }
        ],
        "sources": [
          {
            "id": "datadog-agent-source",
            "type": "datadog_agent"
          }
        ]
      },
      "name": "Updated Pipeline Name"
    },
    "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "type": "pipelines"
  }
}
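
The request body above wires a single filter processor between a datadog_agent source and a datadog_logs destination. As an illustrative sketch only, based on the processor fields documented in this section (all component IDs, search queries, quota limits, and field names below are placeholders, not values required by the API), a body that chains several of the other processor types could look like this:

{
  "data": {
    "attributes": {
      "config": {
        "sources": [
          { "id": "datadog-agent-source", "type": "datadog_agent" }
        ],
        "processors": [
          {
            "id": "parse-json-processor",
            "type": "parse_json",
            "field": "message",
            "include": "service:my-service",
            "inputs": ["datadog-agent-source"]
          },
          {
            "id": "quota-processor",
            "type": "quota",
            "name": "daily-ingest-quota",
            "drop_events": true,
            "ignore_when_missing_partitions": true,
            "include": "service:my-service",
            "inputs": ["parse-json-processor"],
            "limit": { "enforce": "bytes", "limit": 10000000000 },
            "partition_fields": ["env"],
            "overrides": [
              {
                "fields": [{ "name": "env", "value": "staging" }],
                "limit": { "enforce": "events", "limit": 500000 }
              }
            ]
          },
          {
            "id": "add-fields-processor",
            "type": "add_fields",
            "fields": [{ "name": "team", "value": "platform" }],
            "include": "service:my-service",
            "inputs": ["quota-processor"]
          },
          {
            "id": "rename-fields-processor",
            "type": "rename_fields",
            "fields": [
              { "source": "hostname", "destination": "host", "preserve_source": false }
            ],
            "include": "service:my-service",
            "inputs": ["add-fields-processor"]
          },
          {
            "id": "remove-fields-processor",
            "type": "remove_fields",
            "fields": ["internal_debug"],
            "include": "service:my-service",
            "inputs": ["rename-fields-processor"]
          }
        ],
        "destinations": [
          {
            "id": "datadog-logs-destination",
            "type": "datadog_logs",
            "inputs": ["remove-fields-processor"]
          }
        ]
      },
      "name": "Updated Pipeline Name"
    },
    "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "type": "pipelines"
  }
}

Each processor lists the component that feeds it in inputs, and the destination takes its input from the last processor in the chain; this is how the documented inputs field links components together.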

Response

OK

Top-level schema representing a pipeline.

Field

Type

Description

data [required]

object

Contains the pipeline’s ID, type, and configuration attributes.

attributes [required]

object

Defines the pipeline’s name and its components (sources, processors, and destinations).

config [required]

object

Specifies the pipeline's configuration, including its sources, processors, and destinations.

destinations [required]

[ <oneOf>]

A list of destination components where processed logs are sent.

Option 1

object

The datadog_logs destination forwards logs to Datadog Log Management.

id [required]

string

The unique identifier for this component.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The destination type. The value should always be datadog_logs. Allowed enum values: datadog_logs

default: datadog_logs

processors [required]

[ <oneOf>]

A list of processors that transform or enrich log data.

Option 1

object

The filter processor allows conditional processing of logs based on a Datadog search query. Logs that match the include query are passed through; others are discarded.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be filter. Allowed enum values: filter

default: filter

Option 2

object

The parse_json processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string.

field [required]

string

The name of the log field that contains a JSON string.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be parse_json. Allowed enum values: parse_json

default: parse_json

Option 3

object

The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert.

drop_events [required]

boolean

If set to true, logs that match the quota filter and are sent after the quota has been met are dropped; only logs that did not match the filter query continue through the pipeline.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

ignore_when_missing_partitions

boolean

If true, the processor skips quota checks when partition fields are missing from the logs.

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement: bytes for data size or events for event count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

name [required]

string

Name for identifying the processor.

overrides

[object]

A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit.

fields [required]

[object]

A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced.

name [required]

string

The field name.

value [required]

string

The field value.

limit [required]

object

The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events.

enforce [required]

enum

Unit for quota enforcement: bytes for data size or events for event count. Allowed enum values: bytes,events

limit [required]

int64

The limit for quota enforcement.

partition_fields

[string]

A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values.

type [required]

enum

The processor type. The value should always be quota. Allowed enum values: quota

default: quota

Option 4

object

The add_fields processor adds static key-value fields to logs.

fields [required]

[object]

A list of static fields (key-value pairs) that is added to each log event processed by this component.

name [required]

string

The field name.

value [required]

string

The field value.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be add_fields. Allowed enum values: add_fields

default: add_fields

Option 5

object

The remove_fields processor deletes specified fields from logs.

fields [required]

[string]

A list of field names to be removed from each log event.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be remove_fields. Allowed enum values: remove_fields

default: remove_fields

Option 6

object

The rename_fields processor changes field names.

fields [required]

[object]

A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields.

destination [required]

string

The field name to assign the renamed value to.

preserve_source [required]

boolean

Indicates whether the original field received from the source should be kept (true) or removed (false) after renaming.

source [required]

string

The original field name in the log event that should be renamed.

id [required]

string

A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

include [required]

string

A Datadog search query used to determine which logs this processor targets.

inputs [required]

[string]

A list of component IDs whose output is used as the input for this component.

type [required]

enum

The processor type. The value should always be rename_fields. Allowed enum values: rename_fields

default: rename_fields

sources [required]

[ <oneOf>]

A list of configured data sources for the pipeline.

Option 1

object

The kafka source ingests data from Apache Kafka topics.

group_id [required]

string

Consumer group ID used by the Kafka client.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

librdkafka_options

[object]

Optional list of advanced Kafka client configuration options, defined as key-value pairs.

name [required]

string

The name of the librdkafka configuration option to set.

value [required]

string

The value assigned to the specified librdkafka configuration option.

sasl

object

Specifies the SASL mechanism for authenticating with a Kafka cluster.

mechanism

enum

SASL mechanism used for Kafka authentication. Allowed enum values: PLAIN,SCRAM-SHA-256,SCRAM-SHA-512

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

topics [required]

[string]

A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified.

type [required]

enum

The source type. The value should always be kafka. Allowed enum values: kafka

default: kafka

Option 2

object

The datadog_agent source collects logs from the Datadog Agent.

id [required]

string

The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).

tls

object

Configuration for enabling TLS encryption.

ca_file

string

Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate.

crt_file [required]

string

Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services.

key_file

string

Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication.

type [required]

enum

The source type. The value should always be datadog_agent. Allowed enum values: datadog_agent

default: datadog_agent

name [required]

string

Name of the pipeline.

id [required]

string

Unique identifier for the pipeline.

type [required]

string

The resource type identifier. For pipeline resources, this should always be set to pipelines.

default: pipelines

{
  "data": {
    "attributes": {
      "config": {
        "destinations": [
          {
            "id": "datadog-logs-destination",
            "inputs": [
              "filter-processor"
            ],
            "type": "datadog_logs"
          }
        ],
        "processors": [
          {
            "id": "filter-processor",
            "include": "service:my-service",
            "inputs": [
              "datadog-agent-source"
            ],
            "type": "filter"
          }
        ],
        "sources": [
          {
            "group_id": "consumer-group-0",
            "id": "kafka-source",
            "librdkafka_options": [
              {
                "name": "fetch.message.max.bytes",
                "value": "1048576"
              }
            ],
            "sasl": {
              "mechanism": "string"
            },
            "tls": {
              "ca_file": "string",
              "crt_file": "/path/to/cert.crt",
              "key_file": "string"
            },
            "topics": [
              "topic1",
              "topic2"
            ],
            "type": "kafka"
          }
        ]
      },
      "name": "Main Observability Pipeline"
    },
    "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "type": "pipelines"
  }
}

Bad Request

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Forbidden

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Not Found

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Conflict

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Too many requests

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Code example

# Path parameters
export pipeline_id="CHANGE_ME"
# Curl command (use the api.* host for your Datadog site, for example api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, or api.ddog-gov.com)
curl -X PUT "https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/${pipeline_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "config": {
        "destinations": [
          {
            "id": "updated-datadog-logs-destination-id",
            "inputs": [
              "filter-processor"
            ],
            "type": "datadog_logs"
          }
        ],
        "processors": [
          {
            "id": "filter-processor",
            "include": "service:my-service",
            "inputs": [
              "datadog-agent-source"
            ],
            "type": "filter"
          }
        ],
        "sources": [
          {
            "id": "datadog-agent-source",
            "type": "datadog_agent"
          }
        ]
      },
      "name": "Updated Pipeline Name"
    },
    "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "type": "pipelines"
  }
}
EOF
// Update a pipeline returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "pipeline" in the system
	PipelineDataID := os.Getenv("PIPELINE_DATA_ID")

	body := datadogV2.ObservabilityPipeline{
		Data: datadogV2.ObservabilityPipelineData{
			Attributes: datadogV2.ObservabilityPipelineDataAttributes{
				Config: datadogV2.ObservabilityPipelineConfig{
					Destinations: []datadogV2.ObservabilityPipelineConfigDestinationItem{
						datadogV2.ObservabilityPipelineConfigDestinationItem{
							ObservabilityPipelineDatadogLogsDestination: &datadogV2.ObservabilityPipelineDatadogLogsDestination{
								Id: "updated-datadog-logs-destination-id",
								Inputs: []string{
									"filter-processor",
								},
								Type: datadogV2.OBSERVABILITYPIPELINEDATADOGLOGSDESTINATIONTYPE_DATADOG_LOGS,
							}},
					},
					Processors: []datadogV2.ObservabilityPipelineConfigProcessorItem{
						datadogV2.ObservabilityPipelineConfigProcessorItem{
							ObservabilityPipelineFilterProcessor: &datadogV2.ObservabilityPipelineFilterProcessor{
								Id:      "filter-processor",
								Include: "service:my-service",
								Inputs: []string{
									"datadog-agent-source",
								},
								Type: datadogV2.OBSERVABILITYPIPELINEFILTERPROCESSORTYPE_FILTER,
							}},
					},
					Sources: []datadogV2.ObservabilityPipelineConfigSourceItem{
						datadogV2.ObservabilityPipelineConfigSourceItem{
							ObservabilityPipelineDatadogAgentSource: &datadogV2.ObservabilityPipelineDatadogAgentSource{
								Id:   "datadog-agent-source",
								Type: datadogV2.OBSERVABILITYPIPELINEDATADOGAGENTSOURCETYPE_DATADOG_AGENT,
							}},
					},
				},
				Name: "Updated Pipeline Name",
			},
			Id:   PipelineDataID,
			Type: "pipelines",
		},
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	configuration.SetUnstableOperationEnabled("v2.UpdatePipeline", true)
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewObservabilityPipelinesApi(apiClient)
	resp, r, err := api.UpdatePipeline(ctx, PipelineDataID, body)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.UpdatePipeline`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `ObservabilityPipelinesApi.UpdatePipeline`:\n%s\n", responseContent)
}

Instructions

First install the library and its dependencies, then save the example to main.go and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" go run "main.go"
// Update a pipeline returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.ObservabilityPipelinesApi;
import com.datadog.api.client.v2.model.ObservabilityPipeline;
import com.datadog.api.client.v2.model.ObservabilityPipelineConfig;
import com.datadog.api.client.v2.model.ObservabilityPipelineConfigDestinationItem;
import com.datadog.api.client.v2.model.ObservabilityPipelineConfigProcessorItem;
import com.datadog.api.client.v2.model.ObservabilityPipelineConfigSourceItem;
import com.datadog.api.client.v2.model.ObservabilityPipelineData;
import com.datadog.api.client.v2.model.ObservabilityPipelineDataAttributes;
import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSource;
import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSourceType;
import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestination;
import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestinationType;
import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessor;
import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessorType;
import java.util.Collections;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    defaultClient.setUnstableOperationEnabled("v2.updatePipeline", true);
    ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient);

    // there is a valid "pipeline" in the system
    String PIPELINE_DATA_ID = System.getenv("PIPELINE_DATA_ID");

    ObservabilityPipeline body =
        new ObservabilityPipeline()
            .data(
                new ObservabilityPipelineData()
                    .attributes(
                        new ObservabilityPipelineDataAttributes()
                            .config(
                                new ObservabilityPipelineConfig()
                                    .destinations(
                                        Collections.singletonList(
                                            new ObservabilityPipelineConfigDestinationItem(
                                                new ObservabilityPipelineDatadogLogsDestination()
                                                    .id("updated-datadog-logs-destination-id")
                                                    .inputs(
                                                        Collections.singletonList(
                                                            "filter-processor"))
                                                    .type(
                                                        ObservabilityPipelineDatadogLogsDestinationType
                                                            .DATADOG_LOGS))))
                                    .processors(
                                        Collections.singletonList(
                                            new ObservabilityPipelineConfigProcessorItem(
                                                new ObservabilityPipelineFilterProcessor()
                                                    .id("filter-processor")
                                                    .include("service:my-service")
                                                    .inputs(
                                                        Collections.singletonList(
                                                            "datadog-agent-source"))
                                                    .type(
                                                        ObservabilityPipelineFilterProcessorType
                                                            .FILTER))))
                                    .sources(
                                        Collections.singletonList(
                                            new ObservabilityPipelineConfigSourceItem(
                                                new ObservabilityPipelineDatadogAgentSource()
                                                    .id("datadog-agent-source")
                                                    .type(
                                                        ObservabilityPipelineDatadogAgentSourceType
                                                            .DATADOG_AGENT)))))
                            .name("Updated Pipeline Name"))
                    .id(PIPELINE_DATA_ID)
                    .type("pipelines"));

    try {
      ObservabilityPipeline result = apiInstance.updatePipeline(PIPELINE_DATA_ID, body);
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling ObservabilityPipelinesApi#updatePipeline");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}

Instructions

First install the library and its dependencies, then save the example to Example.java and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" java "Example.java"
"""
Update a pipeline returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi
from datadog_api_client.v2.model.observability_pipeline import ObservabilityPipeline
from datadog_api_client.v2.model.observability_pipeline_config import ObservabilityPipelineConfig
from datadog_api_client.v2.model.observability_pipeline_data import ObservabilityPipelineData
from datadog_api_client.v2.model.observability_pipeline_data_attributes import ObservabilityPipelineDataAttributes
from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source import (
    ObservabilityPipelineDatadogAgentSource,
)
from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source_type import (
    ObservabilityPipelineDatadogAgentSourceType,
)
from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination import (
    ObservabilityPipelineDatadogLogsDestination,
)
from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination_type import (
    ObservabilityPipelineDatadogLogsDestinationType,
)
from datadog_api_client.v2.model.observability_pipeline_filter_processor import ObservabilityPipelineFilterProcessor
from datadog_api_client.v2.model.observability_pipeline_filter_processor_type import (
    ObservabilityPipelineFilterProcessorType,
)

# there is a valid "pipeline" in the system
PIPELINE_DATA_ID = environ["PIPELINE_DATA_ID"]

body = ObservabilityPipeline(
    data=ObservabilityPipelineData(
        attributes=ObservabilityPipelineDataAttributes(
            config=ObservabilityPipelineConfig(
                destinations=[
                    ObservabilityPipelineDatadogLogsDestination(
                        id="updated-datadog-logs-destination-id",
                        inputs=[
                            "filter-processor",
                        ],
                        type=ObservabilityPipelineDatadogLogsDestinationType.DATADOG_LOGS,
                    ),
                ],
                processors=[
                    ObservabilityPipelineFilterProcessor(
                        id="filter-processor",
                        include="service:my-service",
                        inputs=[
                            "datadog-agent-source",
                        ],
                        type=ObservabilityPipelineFilterProcessorType.FILTER,
                    ),
                ],
                sources=[
                    ObservabilityPipelineDatadogAgentSource(
                        id="datadog-agent-source",
                        type=ObservabilityPipelineDatadogAgentSourceType.DATADOG_AGENT,
                    ),
                ],
            ),
            name="Updated Pipeline Name",
        ),
        id=PIPELINE_DATA_ID,
        type="pipelines",
    ),
)

configuration = Configuration()
configuration.unstable_operations["update_pipeline"] = True
with ApiClient(configuration) as api_client:
    api_instance = ObservabilityPipelinesApi(api_client)
    response = api_instance.update_pipeline(pipeline_id=PIPELINE_DATA_ID, body=body)

    print(response)

Instructions

First install the library and its dependencies, then save the example to example.py and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" python3 "example.py"
# Update a pipeline returns "OK" response

require "datadog_api_client"
DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.update_pipeline".to_sym] = true
end
api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new

# there is a valid "pipeline" in the system
PIPELINE_DATA_ID = ENV["PIPELINE_DATA_ID"]

body = DatadogAPIClient::V2::ObservabilityPipeline.new({
  data: DatadogAPIClient::V2::ObservabilityPipelineData.new({
    attributes: DatadogAPIClient::V2::ObservabilityPipelineDataAttributes.new({
      config: DatadogAPIClient::V2::ObservabilityPipelineConfig.new({
        destinations: [
          DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestination.new({
            id: "updated-datadog-logs-destination-id",
            inputs: [
              "filter-processor",
            ],
            type: DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS,
          }),
        ],
        processors: [
          DatadogAPIClient::V2::ObservabilityPipelineFilterProcessor.new({
            id: "filter-processor",
            include: "service:my-service",
            inputs: [
              "datadog-agent-source",
            ],
            type: DatadogAPIClient::V2::ObservabilityPipelineFilterProcessorType::FILTER,
          }),
        ],
        sources: [
          DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSource.new({
            id: "datadog-agent-source",
            type: DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT,
          }),
        ],
      }),
      name: "Updated Pipeline Name",
    }),
    id: PIPELINE_DATA_ID,
    type: "pipelines",
  }),
})
p api_instance.update_pipeline(PIPELINE_DATA_ID, body)

Instructions

First install the library and its dependencies, then save the example to example.rb and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" ruby "example.rb"
// Update a pipeline returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI;
use datadog_api_client::datadogV2::model::ObservabilityPipeline;
use datadog_api_client::datadogV2::model::ObservabilityPipelineConfig;
use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigDestinationItem;
use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigProcessorItem;
use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigSourceItem;
use datadog_api_client::datadogV2::model::ObservabilityPipelineData;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDataAttributes;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSource;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSourceType;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestination;
use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestinationType;
use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessor;
use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessorType;

#[tokio::main]
async fn main() {
    // there is a valid "pipeline" in the system
    let pipeline_data_id = std::env::var("PIPELINE_DATA_ID").unwrap();
    let body =
        ObservabilityPipeline::new(
            ObservabilityPipelineData::new(
                ObservabilityPipelineDataAttributes::new(
                    ObservabilityPipelineConfig::new(
                        vec![
                            ObservabilityPipelineConfigDestinationItem::ObservabilityPipelineDatadogLogsDestination(
                                Box::new(
                                    ObservabilityPipelineDatadogLogsDestination::new(
                                        "updated-datadog-logs-destination-id".to_string(),
                                        vec!["filter-processor".to_string()],
                                        ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS,
                                    ),
                                ),
                            )
                        ],
                        vec![
                            ObservabilityPipelineConfigProcessorItem::ObservabilityPipelineFilterProcessor(
                                Box::new(
                                    ObservabilityPipelineFilterProcessor::new(
                                        "filter-processor".to_string(),
                                        "service:my-service".to_string(),
                                        vec!["datadog-agent-source".to_string()],
                                        ObservabilityPipelineFilterProcessorType::FILTER,
                                    ),
                                ),
                            )
                        ],
                        vec![
                            ObservabilityPipelineConfigSourceItem::ObservabilityPipelineDatadogAgentSource(
                                Box::new(
                                    ObservabilityPipelineDatadogAgentSource::new(
                                        "datadog-agent-source".to_string(),
                                        ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT,
                                    ),
                                ),
                            )
                        ],
                    ),
                    "Updated Pipeline Name".to_string(),
                ),
                pipeline_data_id.clone(),
                "pipelines".to_string(),
            ),
        );
    let mut configuration = datadog::Configuration::new();
    configuration.set_unstable_operation_enabled("v2.UpdatePipeline", true);
    let api = ObservabilityPipelinesAPI::with_config(configuration);
    let resp = api.update_pipeline(pipeline_data_id.clone(), body).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}

Instructions

First install the library and its dependencies, then save the example to src/main.rs and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" cargo run
/**
 * Update a pipeline returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.updatePipeline"] = true;
const apiInstance = new v2.ObservabilityPipelinesApi(configuration);

// there is a valid "pipeline" in the system
const PIPELINE_DATA_ID = process.env.PIPELINE_DATA_ID as string;

const params: v2.ObservabilityPipelinesApiUpdatePipelineRequest = {
  body: {
    data: {
      attributes: {
        config: {
          destinations: [
            {
              id: "updated-datadog-logs-destination-id",
              inputs: ["filter-processor"],
              type: "datadog_logs",
            },
          ],
          processors: [
            {
              id: "filter-processor",
              include: "service:my-service",
              inputs: ["datadog-agent-source"],
              type: "filter",
            },
          ],
          sources: [
            {
              id: "datadog-agent-source",
              type: "datadog_agent",
            },
          ],
        },
        name: "Updated Pipeline Name",
      },
      id: PIPELINE_DATA_ID,
      type: "pipelines",
    },
  },
  pipelineId: PIPELINE_DATA_ID,
};

apiInstance
  .updatePipeline(params)
  .then((data: v2.ObservabilityPipeline) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));

Instructions

First install the library and its dependencies, then save the example to example.ts and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" tsc "example.ts"

Note: This endpoint is in Preview.

DELETE https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}
https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}

Overview

Delete a pipeline.

Arguments

Path parameters

Name

Type

Description

pipeline_id [required]

string

The ID of the pipeline to delete.

Response

OK

Forbidden

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Not Found

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Conflict

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Too many requests

API error response.

Field

Type

Description

errors [required]

[string]

A list of errors.

{
  "errors": [
    "Bad Request"
  ]
}

Code example

# Path parameters
export pipeline_id="CHANGE_ME"
# Curl command (use the api.* host for your Datadog site, for example api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/${pipeline_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
"""
Delete a pipeline returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi

# there is a valid "pipeline" in the system
PIPELINE_DATA_ID = environ["PIPELINE_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["delete_pipeline"] = True
with ApiClient(configuration) as api_client:
    api_instance = ObservabilityPipelinesApi(api_client)
    api_instance.delete_pipeline(
        pipeline_id=PIPELINE_DATA_ID,
    )

Instructions

First install the library and its dependencies, then save the example to example.py and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" python3 "example.py"
# Delete a pipeline returns "OK" response

require "datadog_api_client"
DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.delete_pipeline".to_sym] = true
end
api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new

# there is a valid "pipeline" in the system
PIPELINE_DATA_ID = ENV["PIPELINE_DATA_ID"]
api_instance.delete_pipeline(PIPELINE_DATA_ID)

Instructions

First install the library and its dependencies, then save the example to example.rb and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" ruby "example.rb"
// Delete a pipeline returns "OK" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "pipeline" in the system
	PipelineDataID := os.Getenv("PIPELINE_DATA_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	configuration.SetUnstableOperationEnabled("v2.DeletePipeline", true)
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewObservabilityPipelinesApi(apiClient)
	r, err := api.DeletePipeline(ctx, PipelineDataID)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.DeletePipeline`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}

Instructions

First install the library and its dependencies, then save the example to main.go and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" go run "main.go"
// Delete a pipeline returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.ObservabilityPipelinesApi;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    defaultClient.setUnstableOperationEnabled("v2.deletePipeline", true);
    ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient);

    // there is a valid "pipeline" in the system
    String PIPELINE_DATA_ID = System.getenv("PIPELINE_DATA_ID");

    try {
      apiInstance.deletePipeline(PIPELINE_DATA_ID);
    } catch (ApiException e) {
      System.err.println("Exception when calling ObservabilityPipelinesApi#deletePipeline");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}

Instructions

First install the library and its dependencies, then save the example to Example.java and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" java "Example.java"
// Delete a pipeline returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI;

#[tokio::main]
async fn main() {
    // there is a valid "pipeline" in the system
    let pipeline_data_id = std::env::var("PIPELINE_DATA_ID").unwrap();
    let mut configuration = datadog::Configuration::new();
    configuration.set_unstable_operation_enabled("v2.DeletePipeline", true);
    let api = ObservabilityPipelinesAPI::with_config(configuration);
    let resp = api.delete_pipeline(pipeline_data_id.clone()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}

Instructions

First install the library and its dependencies, then save the example to src/main.rs and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" cargo run
/**
 * Delete a pipeline returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.deletePipeline"] = true;
const apiInstance = new v2.ObservabilityPipelinesApi(configuration);

// there is a valid "pipeline" in the system
const PIPELINE_DATA_ID = process.env.PIPELINE_DATA_ID as string;

const params: v2.ObservabilityPipelinesApiDeletePipelineRequest = {
  pipelineId: PIPELINE_DATA_ID,
};

apiInstance
  .deletePipeline(params)
  .then((data: any) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));

Instructions

First install the library and its dependencies, then save the example to example.ts and run the following command, replacing <DD-SITE> with your Datadog site (for example, datadoghq.com):

DD_SITE="<DD-SITE>" DD_API_KEY="<API-KEY>" DD_APP_KEY="<APP-KEY>" tsc "example.ts"
