Google Cloud Configuration Guide for Cloud SIEM
Overview
Datadog Cloud SIEM applies detection rules to all logs processed in Datadog to detect threats, such as a targeted attack, an IP listed in threat intelligence communicating with your systems, or an insecure resource modification. The threats are surfaced as Security Signals in the Security Signals Explorer for triage.
Use Google Cloud Dataflow and the Datadog template to forward logs from your Google Cloud services to Datadog. This guide walks you through the following steps so that you can start detecting threats with your Google Cloud audit logs:
- Enable Data Access audit logs
- Create a Google Cloud publish/subscribe (Pub/Sub) topic and pull subscription to receive logs from a configured log sink
- Create a custom Dataflow worker service account
- Create a log sink to publish logs to the Pub/Sub topic
- Create and run the Dataflow job
- Triage security signals in Cloud SIEM
Collecting Google Cloud logs with a Pub/Sub Push subscription is in the process of being deprecated for the following reasons:
- If you have a Google Cloud VPC, the Push subscription cannot access endpoints outside the VPC.
- The Push subscription does not provide compression or batching of events, and so is only suitable for a low volume of logs.
Documentation for the Push subscription is only maintained for troubleshooting or modifying legacy setups. Use a Pull subscription with the Datadog Dataflow template to forward your Google Cloud logs to Datadog instead.
Enable Data Access audit logs
- Navigate to the IAM & Admin Console > Audit Log.
- Select the services for which you want to enable Data Access audit logs.
- In the Log Types panel, enable Admin Read, Data Read, and Data Write.
- Click Save.
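The same audit configuration can be applied with the gcloud CLI by editing the project's IAM policy, which carries the audit configuration in its auditConfigs section. This is a minimal sketch: the project ID and service are placeholders, and it assumes the policy has no existing auditConfigs section (if one exists, merge the entry into it instead of appending).

```shell
# Hypothetical project ID; replace with your own.
PROJECT_ID="my-project"

# Download the current IAM policy, which also holds auditConfigs.
gcloud projects get-iam-policy "$PROJECT_ID" > policy.yaml

# Append an auditConfigs entry enabling all three log types for one
# service (BigQuery here, as an example), then re-apply the policy.
# Assumes policy.yaml has no auditConfigs section yet.
cat >> policy.yaml <<'EOF'
auditConfigs:
- service: bigquery.googleapis.com
  auditLogConfigs:
  - logType: ADMIN_READ
  - logType: DATA_READ
  - logType: DATA_WRITE
EOF

gcloud projects set-iam-policy "$PROJECT_ID" policy.yaml
```

Using `allServices` instead of a specific service name applies the configuration as the project-wide default.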
Change the default configuration for new services
If a new Google Cloud service is added, it inherits your default audit configuration.
To ensure Data Access audit logs are captured for new Google Cloud services, modify your default audit configuration.
- Navigate to the IAM & Admin Console > Audit Log.
- Enable Admin Read, Data Read, and Data Write.
- Click Save.
Create a Google Cloud publish/subscribe (Pub/Sub) system
- Navigate to Pub/Sub > Topics.
- Click Create Topic.
- Enter a descriptive topic name. For example, export-audit-logs-to-datadog.
- Leave Add a default subscription selected, which creates a subscription with default configuration values. The name of the subscription is automatically generated as your topic name with “-sub” appended to it. This subscription name is used when you create your Dataflow job later.
- Click Create.
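The topic and subscription above can also be created with the gcloud CLI. Note that, unlike the console's Add a default subscription option, gcloud does not create the “-sub” subscription automatically, so this sketch creates it explicitly (the names are the example ones from the step above):

```shell
# Create the topic that the log sink publishes to.
gcloud pubsub topics create export-audit-logs-to-datadog

# Create the pull subscription the Dataflow job reads from; the "-sub"
# suffix mirrors what the console would generate automatically.
gcloud pubsub subscriptions create export-audit-logs-to-datadog-sub \
    --topic=export-audit-logs-to-datadog
```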
Create an additional topic and subscription for the outputDeadletterTopic parameter
Create an additional topic and default subscription to handle any log messages rejected by the Datadog API. This topic is used when you set up the Dataflow job later.
- Navigate back to Pub/Sub > Topics.
- Click Create Topic.
- Enter a descriptive topic name.
- Leave Add a default subscription selected.
- Click Create.
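Equivalently, with the gcloud CLI (the topic name here is a hypothetical placeholder):

```shell
# Topic and subscription for log messages rejected by the Datadog API.
gcloud pubsub topics create datadog-deadletter

gcloud pubsub subscriptions create datadog-deadletter-sub \
    --topic=datadog-deadletter
```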
Warning: Pub/Sub topics and subscriptions are subject to Google Cloud quotas and limitations. If your log volume exceeds those limitations, Datadog recommends splitting your logs over several topics. See Monitor the Log Forwarding for information on how to set up a monitor that notifies you when you are close to those limits.
Create a secret in Secret Manager
Datadog recommends creating a secret in Secret Manager with your valid Datadog API key value. This secret is used when you set up the Dataflow job later.
- Navigate to Security > Secret Manager.
- Click Create Secret.
- Enter a name for the secret.
- Copy your Datadog API key and paste it into the Secret value section.
- Optionally, set the other configurations based on your use case.
- Click Create Secret.
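The secret can also be created with the gcloud CLI. This sketch assumes your Datadog API key is available in the DD_API_KEY environment variable; the secret name is a hypothetical placeholder:

```shell
# Create the secret with the API key as its initial version.
# printf avoids a trailing newline being stored in the secret value.
printf '%s' "$DD_API_KEY" | gcloud secrets create datadog-api-key \
    --replication-policy=automatic \
    --data-file=-
```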
Create a custom Dataflow worker service account
The default behavior for Dataflow pipeline workers is to use your project’s Compute Engine default service account, which grants permissions to all resources in the project. If you are forwarding logs from a production environment, create a custom worker service account with only the necessary roles and permissions, and assign this service account to your Dataflow pipeline workers.
Note: If you are not creating a custom service account for the Dataflow pipeline workers, ensure that the default Compute Engine service account has the required permissions below.
- Navigate to Google Cloud’s Service Account page.
- Select your project.
- Click Create Service Account.
- Enter a descriptive name for the service account.
- Click Create and Continue.
- Add the following roles:
Required permissions
Role | Path | Description
---|---|---
Dataflow Admin | roles/dataflow.admin | Allow this service account to perform Dataflow administrative tasks
Dataflow Worker | roles/dataflow.worker | Allow this service account to perform Dataflow job operations
Pub/Sub Viewer | roles/pubsub.viewer | Allow this service account to view messages from the Pub/Sub subscription with your Google Cloud logs
Pub/Sub Subscriber | roles/pubsub.subscriber | Allow this service account to consume messages from the Pub/Sub subscription with your Google Cloud logs
Pub/Sub Publisher | roles/pubsub.publisher | Allow this service account to publish failed messages to a separate subscription, which allows for analysis or resending the logs
Secret Manager Secret Accessor | roles/secretmanager.secretAccessor | Allow this service account to access the Datadog API key in Secret Manager
Storage Object Admin | roles/storage.objectAdmin | Allow this service account to read and write to the Cloud Storage bucket specified for staging files
- Click Continue.
- Click Done.
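The steps above can be sketched with the gcloud CLI as follows; the project ID and service account name are hypothetical placeholders, and the role list matches the table above:

```shell
PROJECT_ID="my-project"            # hypothetical project ID
SA_NAME="datadog-dataflow-worker"  # hypothetical service account name
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the custom worker service account.
gcloud iam service-accounts create "$SA_NAME" \
    --display-name="Datadog Dataflow worker"

# Grant each role listed in the Required permissions table.
for role in roles/dataflow.admin roles/dataflow.worker \
            roles/pubsub.viewer roles/pubsub.subscriber \
            roles/pubsub.publisher roles/secretmanager.secretAccessor \
            roles/storage.objectAdmin; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
      --member="serviceAccount:${SA_EMAIL}" \
      --role="$role"
done
```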
Create a log sink to publish logs to the Pub/Sub topic
- Navigate to Google Cloud’s Logs Explorer.
- Select Log Router from the left side menu.
- Create Sink をクリックします。
- Enter a descriptive name for the sink.
- Click Next.
- In the Select Sink Service dropdown menu, select Cloud Pub/Sub topic.
Note: The Cloud Pub/Sub topic can be located in a different project.
- In Select a Cloud Pub/Sub topic, select the Pub/Sub topic you created earlier.
- Click Next.
- Enter an inclusion filter for the logs you want to send to Datadog.
- Click Next.
- Optionally, enter an exclusion filter to exclude logs you do not want sent to Datadog.
- Click Create Sink.
Note: You can create multiple exports from Google Cloud Logging to the same Pub/Sub topic with different sinks.
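The sink can also be created with the gcloud CLI. This is a sketch under assumptions: the project ID, sink name, and topic name are hypothetical placeholders, and the inclusion filter shown (matching Cloud Audit Logs) is only an example — use whatever filter fits your use case:

```shell
PROJECT_ID="my-project"  # hypothetical project ID

# Create the sink with a Pub/Sub topic as its destination and an
# example inclusion filter matching audit logs.
gcloud logging sinks create datadog-audit-sink \
    "pubsub.googleapis.com/projects/${PROJECT_ID}/topics/export-audit-logs-to-datadog" \
    --log-filter='logName:"cloudaudit.googleapis.com"'

# The sink writes with its own generated identity; grant that identity
# permission to publish to the topic.
WRITER_IDENTITY=$(gcloud logging sinks describe datadog-audit-sink \
    --format='value(writerIdentity)')
gcloud pubsub topics add-iam-policy-binding export-audit-logs-to-datadog \
    --member="$WRITER_IDENTITY" \
    --role=roles/pubsub.publisher
```

Granting the writer identity the Pub/Sub Publisher role is required regardless of how the sink is created; the console prompts for this, while the CLI leaves it to you.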
Create and run the Dataflow job
- Navigate to Google Cloud Dataflow.
- Click Create job from template.
- Enter a name for the job.
- Select a regional endpoint.
- In the Dataflow template dropdown menu, select Pub/Sub to Datadog.
- In the Required Parameters section:
a. In the Pub/Sub input subscription dropdown menu, select the default subscription that was created earlier when you created a new Pub/Sub system.
b. In the Datadog Logs API URL field, enter the Datadog Logs API URL for your Datadog site. Note: Ensure that the Datadog site selector on the right of this documentation page is set to your Datadog site before copying the URL.
c. In the Output deadletter Pub/Sub topic field, select the additional topic you created earlier for receiving messages rejected by the Datadog API.
d. Specify a path for temporary files in your storage bucket in the Temporary location field.
- If you created a secret in Secret Manager with your Datadog API key value earlier:
a. Click Optional Parameters to see the additional fields.
b. Enter the resource name of the secret in the Google Cloud Secret Manager ID field.
To get the resource name, go to your secret in Secret Manager. Click on your secret. Click on the three dots under Action and select Copy resource name.
c. Enter SECRET_MANAGER in the Source of the API key passed field.
- If you are not using a secret for your Datadog API key value:
  - Recommended: Set Source of API key passed to KMS. Set Google Cloud KMS key for the API key to your Cloud KMS key ID. Set Logs API Key to the encrypted API key.
  - Not recommended: Set Source of API key passed to PLAINTEXT with Logs API Key set to the plaintext API key.
- See Template parameters in the Dataflow template for details on other available options.
- If you created a custom worker service account, select it in the Service account email dropdown menu.
- Click Run Job.
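The job can also be launched from the gcloud CLI using the Google-provided Pub/Sub to Datadog template. This is a sketch under assumptions: all names below (project, region, bucket, subscription, secret, deadletter topic, service account) are hypothetical placeholders matching the earlier steps, the URL must match your Datadog site, and the template path and parameter names should be verified against the template's own documentation:

```shell
PROJECT_ID="my-project"       # hypothetical values throughout
REGION="us-central1"
BUCKET="my-dataflow-bucket"

gcloud dataflow jobs run export-logs-to-datadog \
    --region="$REGION" \
    --gcs-location="gs://dataflow-templates-${REGION}/latest/Cloud_PubSub_to_Datadog" \
    --staging-location="gs://${BUCKET}/temp" \
    --service-account-email="datadog-dataflow-worker@${PROJECT_ID}.iam.gserviceaccount.com" \
    --parameters="inputSubscription=projects/${PROJECT_ID}/subscriptions/export-audit-logs-to-datadog-sub,url=https://http-intake.logs.datadoghq.com,apiKeySource=SECRET_MANAGER,apiKeySecretId=projects/${PROJECT_ID}/secrets/datadog-api-key/versions/latest,outputDeadletterTopic=projects/${PROJECT_ID}/topics/datadog-deadletter"
```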
In the Datadog Log Explorer, check for the new log events delivered from your Cloud Pub/Sub topic.
Triage security signals in Cloud SIEM
Cloud SIEM immediately applies detection rules to all processed logs, including the Google Cloud audit logs you configured. When a detection rule detects a threat, a security signal is generated and can be viewed in the Security Signals Explorer.
Further reading