Log Workspaces

Log Workspaces is in private beta.

Overview

During an incident investigation, you might need to run complex queries, such as combining attributes from multiple log sources or transforming log data, to analyze your logs. Use Log Workspaces to run queries to:

  • Correlate multiple data sources
  • Aggregate multiple levels of data
  • Join data across multiple log sources and other datasets
  • Extract data or add a calculated field at query time
  • Add visualizations for your transformed datasets

Create a workspace and add a data source

You can create a workspace from the Workspaces page or from the Log Explorer.

On the Log Workspaces page:

  1. Click New Workspace.
  2. Click the Data source tile.
  3. Enter a query. The reserved attributes of the filtered logs are added as columns.

In the Log Explorer:

  1. Enter a query.
  2. Click More, next to Download as CSV, and select Open in Workspace.
  3. The workspace adds the log query to a data source cell. By default, the columns in Log Explorer are added to the data source cell.

Add a column to your workspace

An example workspace cell, with an open detail side panel that highlights the option to add an attribute as a column

In addition to the default columns, you can add your own columns to your workspace:

  1. From your workspace cell, click on a log to open the detail side panel.
  2. Click the attribute you want to add as a column.
  3. From the pop-up option, select Add “@your_column” to “your workspace” dataset.

Analyze, transform, and visualize your logs

You can add the following types of cells to your workspace to:

  • Include additional data sources such as reference tables
  • Use SQL to join data
  • Transform, correlate, and visualize the data

Cells that depend on other cells update automatically when any cell they depend on changes.

At the bottom of your workspace, click a cell tile to add that type of cell to your workspace. After adding a cell, you can click its dataset on the left side of the workspace page to go directly to that cell.

Data source cell

You can add a logs query or a reference table as a data source.

  1. Click on the Data source tile.
    • To add a reference table:
      1. Select Reference table in the Data source dropdown.
      2. Select the reference table you want to use.
    • To add a logs data source:
      1. Enter a query. The reserved attributes of the filtered logs are added as columns.
      2. Click datasource_x at the top of the cell to rename the data source.
      3. Click Columns to see the columns available. Click as for a column to add an alias.
      4. To add additional columns to the dataset:
        a. Click on a log.
        b. Click the cog next to the facet you want to add as a column.
        c. Select Add…to…dataset.
  2. Click the download icon to export the dataset as a CSV.

Analysis cell

  1. Click the Analysis tile to add a cell where you can use SQL to query data from any of your data sources. You can write the query in SQL or in natural language. An example using natural language: select only timestamp, customer id, transaction id from the transaction logs.
  2. If you are using SQL, click Run to run the SQL commands.
  3. Click the download icon to export the dataset as a CSV.

Visualization cell

Add the Visualization cell to display your data as a:

  • Table
  • Top list
  • Timeseries
  • Treemap
  • Pie chart
  • Scatterplot

  1. Click the Visualization tile.
  2. Select the data source you want to visualize in the Source dataset dropdown menu.
  3. Select your visualization method in the Visualize as dropdown menu.
  4. Enter a filter if you want to filter to a subset of the data. For example, status:error. If you are using an analysis cell as your data source, you can also filter the data in SQL first.
  5. If you want to group your data, click Add Aggregation and select the information you want to group by.
  6. Click the download button to export the data as a CSV.

Transformation cell

Click the Transformation tile to add a cell for filtering, aggregating, and extracting data.

  1. Click the Transformation tile.
  2. Select the data source you want to transform in the Source dataset dropdown menu.
  3. Click the plus icon to add a Filter, Parse, Aggregate, or Limit function.
    • For Filter, add a filter query for the dataset.
    • For Parse, enter grok syntax to extract data into a separate column. In the from dropdown menu, select the column the data is getting extracted from. See the column extraction example.
    • For Aggregate, select what you want to group the data by in the dropdown menus.
    • For Limit, enter the number of rows of the dataset you want to display.
  4. Click the download icon to export the dataset into a CSV.

Column extraction example

The following is an example dataset:

timestamp | host | message
May 29 11:09:28.000 | shopist.internal | Submitted order for customer 21392
May 29 10:59:29.000 | shopist.internal | Submitted order for customer 38554
May 29 10:58:54.000 | shopist.internal | Submitted order for customer 32200

Use the following grok syntax to extract the customer ID from the message and add it to a new column called customer_id:

Submitted order for customer %{notSpace:customer_id}

This is the resulting dataset in the transformation cell after the extraction:

timestamp | host | message | customer_id
May 29 11:09:28.000 | shopist.internal | Submitted order for customer 21392 | 21392
May 29 10:59:29.000 | shopist.internal | Submitted order for customer 38554 | 38554
May 29 10:58:54.000 | shopist.internal | Submitted order for customer 32200 | 32200
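As a rough illustration of what this parse step does, here is a Python sketch that mimics the grok extraction with a regular expression (assuming grok's %{notSpace:...} matches a run of non-whitespace characters, equivalent to the regex \S+; the row data is the sample dataset above):

```python
import re

# Hypothetical stand-in for the grok pattern
# "Submitted order for customer %{notSpace:customer_id}":
# %{notSpace:name} is roughly the named group (?P<name>\S+).
pattern = re.compile(r"Submitted order for customer (?P<customer_id>\S+)")

rows = [
    {"timestamp": "May 29 11:09:28.000", "host": "shopist.internal",
     "message": "Submitted order for customer 21392"},
    {"timestamp": "May 29 10:59:29.000", "host": "shopist.internal",
     "message": "Submitted order for customer 38554"},
    {"timestamp": "May 29 10:58:54.000", "host": "shopist.internal",
     "message": "Submitted order for customer 32200"},
]

for row in rows:
    match = pattern.search(row["message"])
    # Add the extracted value as a new column; leave it empty if no match.
    row["customer_id"] = match.group("customer_id") if match else None

print([row["customer_id"] for row in rows])  # → ['21392', '38554', '32200']
```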

Text cell

Click the Text tile to add a Markdown cell where you can record information and notes.

An example workspace

The workspace datasets

This example workspace has:

  • Three data sources:

    • trade_start_logs
    • trade_execution_logs
    • trading_platform_users
  • Three derived datasets, produced by filtering, grouping, or querying the data sources with SQL:

    • parsed_execution_logs
    • transaction_record
    • transaction_record_with_names
  • One treemap visualization.

This diagram shows the different transformation and analysis cells the data sources go through.

A flowchart showing the steps that the data sources go through

Example walkthrough

The example starts with two logs data sources:

  • trade_start_logs
  • trade_execution_logs

The next cell in the workspace is the transform cell parsed_execution_logs. It uses the following grok parsing syntax to extract the transaction ID from the message column of the trade_execution_logs dataset and add it to a new column called transaction_id:

transaction %{notSpace:transaction_id}

An example of the resulting parsed_execution_logs dataset:

timestamp | host | message | transaction_id
May 29 11:09:28.000 | shopist.internal | Executing trade for transaction 56519 | 56519
May 29 10:59:29.000 | shopist.internal | Executing trade for transaction 23269 | 23269
May 29 10:58:54.000 | shopist.internal | Executing trade for transaction 96870 | 96870
May 31 12:20:01.152 | shopist.internal | Executing trade for transaction 80207 | 80207

The analysis cell transaction_record uses the following SQL command to select specific columns from the trade_start_logs and trade_execution_logs datasets, rename the status INFO to OK, and join the two datasets:

SELECT
    start_logs.timestamp,
    start_logs.customer_id,
    start_logs.transaction_id,
    start_logs.dollar_value,
    CASE
        WHEN executed_logs.status = 'INFO' THEN 'OK'
        ELSE executed_logs.status
    END AS status
FROM
    trade_start_logs AS start_logs
JOIN
    trade_execution_logs AS executed_logs
ON
    start_logs.transaction_id = executed_logs.transaction_id;
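To see the join and the CASE rewrite in action outside of a workspace, you can run the same statement against an in-memory SQLite database. This is an illustrative sketch, not part of the product; the table definitions and sample values are taken from the example datasets in this walkthrough:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trade_start_logs (
    timestamp TEXT, customer_id TEXT, transaction_id TEXT, dollar_value REAL
);
CREATE TABLE trade_execution_logs (
    transaction_id TEXT, status TEXT
);
INSERT INTO trade_start_logs VALUES
    ('May 29 11:09:28.000', '92446', '085cc56c-a54f', 838.32),
    ('May 31 12:20:01.152', '80207', '2c75b835-4194', 386.21);
INSERT INTO trade_execution_logs VALUES
    ('085cc56c-a54f', 'INFO'),   -- rewritten to OK by the CASE expression
    ('2c75b835-4194', 'ERROR');  -- any other status passes through unchanged
""")

rows = conn.execute("""
SELECT
    start_logs.timestamp,
    start_logs.customer_id,
    start_logs.transaction_id,
    start_logs.dollar_value,
    CASE
        WHEN executed_logs.status = 'INFO' THEN 'OK'
        ELSE executed_logs.status
    END AS status
FROM trade_start_logs AS start_logs
JOIN trade_execution_logs AS executed_logs
    ON start_logs.transaction_id = executed_logs.transaction_id
""").fetchall()

print(rows)
```

The INFO row comes back with status OK, while the ERROR row is unchanged.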

An example of the resulting transaction_record dataset:

timestamp | customer_id | transaction_id | dollar_value | status
May 29 11:09:28.000 | 92446 | 085cc56c-a54f | 838.32 | OK
May 29 10:59:29.000 | 78037 | b1fad476-fd4f | 479.96 | OK
May 29 10:58:54.000 | 47694 | cb23d1a7-c0cb | 703.71 | OK
May 31 12:20:01.152 | 80207 | 2c75b835-4194 | 386.21 | ERROR

Then the reference table trading_platform_users is added as a data source:

customer_name | customer_id | account_status
Meghan Key | 92446 | verified
Anthony Gill | 78037 | verified
Tanya Mejia | 47694 | verified
Michael Kaiser | 80207 | fraudulent

The analysis cell transaction_record_with_names runs the following SQL command to join the transaction_record dataset with trading_platform_users, adding the customer name and account status as columns:

SELECT tr.timestamp, tr.customer_id, tpu.customer_name, tpu.account_status, tr.transaction_id, tr.dollar_value, tr.status
FROM transaction_record AS tr
LEFT JOIN trading_platform_users AS tpu ON tr.customer_id = tpu.customer_id;
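A LEFT JOIN keeps every transaction row even when no matching user exists in the reference table. The following sketch runs the same statement in an in-memory SQLite database with two of the sample rows from this walkthrough (illustrative only, not part of the product):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transaction_record (
    timestamp TEXT, customer_id TEXT, transaction_id TEXT,
    dollar_value REAL, status TEXT
);
CREATE TABLE trading_platform_users (
    customer_name TEXT, customer_id TEXT, account_status TEXT
);
INSERT INTO transaction_record VALUES
    ('May 29 11:09:28.000', '92446', '085cc56c-a54f', 838.32, 'OK'),
    ('May 31 12:20:01.152', '80207', '2c75b835-4194', 386.21, 'ERROR');
INSERT INTO trading_platform_users VALUES
    ('Meghan Key', '92446', 'verified'),
    ('Michael Kaiser', '80207', 'fraudulent');
""")

rows = conn.execute("""
SELECT tr.timestamp, tr.customer_id, tpu.customer_name, tpu.account_status,
       tr.transaction_id, tr.dollar_value, tr.status
FROM transaction_record AS tr
LEFT JOIN trading_platform_users AS tpu ON tr.customer_id = tpu.customer_id
""").fetchall()

# A customer_id missing from trading_platform_users would still produce a
# row, with NULL (None) in customer_name and account_status.
print(rows)
```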

An example of the resulting transaction_record_with_names dataset:

timestamp | customer_id | customer_name | account_status | transaction_id | dollar_value | status
May 29 11:09:28.000 | 92446 | Meghan Key | verified | 085cc56c-a54f | 838.32 | OK
May 29 10:59:29.000 | 78037 | Anthony Gill | verified | b1fad476-fd4f | 479.96 | OK
May 29 10:58:54.000 | 47694 | Tanya Mejia | verified | cb23d1a7-c0cb | 703.71 | OK
May 31 12:20:01.152 | 80207 | Michael Kaiser | fraudulent | 2c75b835-4194 | 386.21 | ERROR

Finally, a treemap visualization cell is created with the transaction_record_with_names dataset filtered for status:error logs and grouped by dollar_value, account_status, and customer_name.


