Operations

Why Would You Use Operations APIs?

The Operations APIs allow you to report operational changes that were made to a given Dataset or Table using the 'Operation' concept. These operations may be viewed on the Dataset Profile (e.g. as last modified time), accessed via the DataHub GraphQL API, or used as inputs to DataHub Cloud Freshness Assertions.

Goal Of This Guide

This guide will show you how to report and query Operations for a Dataset.

Supported Sources

Some ingestion sources can automatically capture operations from native audit logs, query history, table history, or object timestamps. The table below lists sources that emit dataset-level DataHub operation aspects during ingestion.

| Source | Notes |
| --- | --- |
| ABS Data Lake | Enabled by default as UPDATE operations from blob timestamps. |
| BigQuery | Enabled by default via usage extraction, can be disabled via usage.include_operational_stats. |
| ClickHouse | Optionally enabled via include_query_log_operations. |
| Databricks | Enabled by default via usage extraction, can be disabled via include_operational_stats. |
| Delta Lake | Enabled by default from Delta table history. |
| Dremio | Optionally enabled via include_query_lineage; generated from Dremio job history. |
| Fabric OneLake | Optionally enabled via usage.include_usage_statistics and usage.include_operational_stats. |
| Glue | Enabled by default from Glue table created and last modified timestamps. |
| Oracle | Optionally enabled via include_query_usage and include_operational_stats. |
| Redshift | Optionally enabled via include_usage_statistics; controlled by include_operational_stats. |
| S3 / Local Files | Enabled by default as UPDATE operations from object timestamps. |
| Salesforce | Enabled by default from Salesforce object created and last modified timestamps. |
| Snowflake | Enabled by default, can be disabled via the include_operational_stats configuration. |
| SQL Queries | Parsed from non-SELECT SQL queries. |
| Teradata | Optionally enabled via include_usage_statistics; controlled by usage.include_operational_stats. |
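For sources where operation extraction is enabled by default, you can turn it off in the ingestion recipe. The sketch below shows this for Snowflake; the connection values are placeholders, and only the include_operational_stats flag is the point of the example.

```yaml
# Sketch of a Snowflake ingestion recipe (connection values are placeholders).
source:
  type: snowflake
  config:
    account_id: "<account-id>"
    warehouse: "<warehouse>"
    username: "<username>"
    password: "<password>"
    # Operations are extracted by default for Snowflake;
    # set this to false to disable them.
    include_operational_stats: false
```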

Prerequisites

For this tutorial, you need to deploy DataHub Quickstart and ingest sample data. For detailed steps, please refer to DataHub Quickstart Guide.

note

Before reporting operations for a dataset, you need to ensure the targeted dataset is already present in DataHub.

Report Operations

You can report dataset operations to DataHub using the following GraphQL API.

mutation reportOperation {
  reportOperation(
    input: {
      urn: "urn:li:dataset:(urn:li:dataPlatform:hive,fct_users_created,PROD)"
      operationType: INSERT
      sourceType: DATA_PROCESS
    }
  )
}
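The same mutation can be issued programmatically. The sketch below uses only Python's standard library; the endpoint URL and bearer-token authentication are assumptions for a local Quickstart deployment, and the `ReportOperationInput` variable type is based on DataHub's GraphQL schema.

```python
import json
import urllib.request

# Assumption: GraphQL endpoint of a local Quickstart deployment.
GRAPHQL_URL = "http://localhost:8080/api/graphql"


def build_report_operation_payload(dataset_urn, operation_type, source_type):
    """Build the GraphQL request body for the reportOperation mutation."""
    mutation = (
        "mutation reportOperation($input: ReportOperationInput!) { "
        "reportOperation(input: $input) }"
    )
    return {
        "query": mutation,
        "variables": {
            "input": {
                "urn": dataset_urn,
                "operationType": operation_type,
                "sourceType": source_type,
            }
        },
    }


def report_operation(payload, token):
    """POST the mutation to DataHub (requires a running server and a token)."""
    req = urllib.request.Request(
        GRAPHQL_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


payload = build_report_operation_payload(
    "urn:li:dataset:(urn:li:dataPlatform:hive,fct_users_created,PROD)",
    "INSERT",
    "DATA_PROCESS",
)
```

Using GraphQL variables instead of inlining values avoids escaping issues when URNs contain special characters.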

Where supported operation types include

  • INSERT
  • UPDATE
  • DELETE
  • CREATE
  • ALTER
  • DROP
  • CUSTOM

If you want to report an operation that happened at a specific time, you can also optionally provide the timestampMillis field. If not provided, the current server time will be used as the operation time.
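Since timestampMillis is epoch time in milliseconds, a datetime must be converted before it is sent. A minimal sketch:

```python
from datetime import datetime, timezone


def to_millis(dt):
    """Convert a timezone-aware datetime to epoch milliseconds,
    the format expected by the timestampMillis field."""
    return int(dt.timestamp() * 1000)


# Example: an operation that happened at 2024-01-01T00:00:00 UTC.
ts = to_millis(datetime(2024, 1, 1, tzinfo=timezone.utc))
print(ts)  # prints 1704067200000
```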

If you see the following response, the operation was successful:

{
  "data": {
    "reportOperation": true
  },
  "extensions": {}
}

Read Operations

You can read dataset operations from DataHub using the following GraphQL API.

query dataset {
  dataset(urn: "urn:li:dataset:(urn:li:dataPlatform:hive,fct_users_created,PROD)") {
    operations(
      limit: 10, filter: [], startTimeMillis: <start-timestamp-ms>, endTimeMillis: <end-timestamp-ms>
    ) {
      timestampMillis
      operationType
      sourceType
    }
  }
}

Where startTimeMillis and endTimeMillis are optional. By default, operations are sorted by time descending.
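Because both time bounds are optional, a programmatic caller can omit them from the GraphQL variables. The sketch below builds the request body in plain Python; the variable type names (Int, Long) are assumptions based on the query above.

```python
def build_operations_query_payload(dataset_urn, limit=10,
                                   start_time_millis=None,
                                   end_time_millis=None):
    """Build the GraphQL request body for reading a dataset's operations.

    startTimeMillis / endTimeMillis are optional, matching the API;
    omitted bounds are simply left out of the variables.
    """
    query = (
        "query dataset($urn: String!, $limit: Int, $start: Long, $end: Long) {"
        " dataset(urn: $urn) {"
        " operations(limit: $limit, startTimeMillis: $start,"
        " endTimeMillis: $end) {"
        " timestampMillis operationType sourceType } } }"
    )
    variables = {"urn": dataset_urn, "limit": limit}
    if start_time_millis is not None:
        variables["start"] = start_time_millis
    if end_time_millis is not None:
        variables["end"] = end_time_millis
    return {"query": query, "variables": variables}


payload = build_operations_query_payload(
    "urn:li:dataset:(urn:li:dataPlatform:hive,fct_users_created,PROD)"
)
```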

If you see the following response, the operation was successful:

{
  "data": {
    "dataset": {
      "operations": [
        {
          "timestampMillis": 1231232332,
          "operationType": "INSERT",
          "sourceType": "DATA_PROCESS"
        }
      ]
    }
  },
  "extensions": {}
}
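A client typically needs only the most recent operation from such a response. Since the API returns operations sorted by time descending, that is the first list entry; a minimal sketch using the sample response:

```python
import json

# The sample GraphQL response from this guide.
response_text = """
{
  "data": {
    "dataset": {
      "operations": [
        {
          "timestampMillis": 1231232332,
          "operationType": "INSERT",
          "sourceType": "DATA_PROCESS"
        }
      ]
    }
  },
  "extensions": {}
}
"""

operations = json.loads(response_text)["data"]["dataset"]["operations"]
# Operations are sorted by time descending, so index 0 is the latest one.
latest = operations[0]
print(latest["operationType"])  # prints "INSERT"
```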

Expected Outcomes of Reporting Operations

Reported Operations will appear as the Last Updated time on a Dataset's DataHub Profile. They are also used when the DataHub Operation source type is selected under the Advanced settings of a Freshness Assertion.