Metabase
Overview
Metabase is a business intelligence and analytics platform. Learn more in the official Metabase documentation.
The DataHub integration for Metabase covers BI entities such as dashboards, charts, datasets, and related ownership context. Depending on module capabilities, it can also capture features such as lineage, usage, profiling, ownership, tags, and stateful deletion detection.
Concept Mapping
A Metabase-specific concept mapping is still pending; the table below shows the generic mapping of source concepts to DataHub concepts.
| Source Concept | DataHub Concept | Notes |
|---|---|---|
| Platform/account/project scope | Platform Instance, Container | Organizes assets within the platform context. |
| Core technical asset (for example table/view/topic/file) | Dataset | Primary ingested technical asset. |
| Schema fields / columns | SchemaField | Included when schema extraction is supported. |
| Ownership and collaboration principals | CorpUser, CorpGroup | Emitted by modules that support ownership and identity metadata. |
| Dependencies and processing relationships | Lineage edges | Available when lineage extraction is supported and enabled. |
Module metabase
Important Capabilities
| Capability | Status | Notes |
|---|---|---|
| Detect Deleted Entities | ✅ | Enabled by default via stateful ingestion. |
| Platform Instance | ✅ | Enabled by default. |
| Table-Level Lineage | ✅ | Supported by default. |
Overview
The metabase module ingests metadata from Metabase into DataHub. It is intended for production ingestion workflows; module-specific capabilities are documented below.
Compatibility
Tested with Metabase version v0.48.3.
Prerequisites
Before running ingestion, ensure network connectivity to the source, valid authentication credentials, and read permissions for metadata APIs required by this module.
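Before a full ingestion run, it can help to confirm that your credentials actually authenticate against the Metabase API. The helper below is a hypothetical pre-flight sketch, not part of the connector; the header names follow Metabase's REST API conventions (API keys via x-api-key, session tokens via X-Metabase-Session), but verify them against your Metabase version.

```python
# Hypothetical pre-flight helper (not connector code): builds the HTTP auth
# header for whichever credential is configured, mirroring the connector's
# rule that api_key takes precedence over username/password.

def metabase_auth_header(api_key=None, session_token=None):
    if api_key:  # api_key wins; username/password are ignored when it is set
        return {"x-api-key": api_key}
    if session_token:  # a token previously obtained via POST /api/session
        return {"X-Metabase-Session": session_token}
    raise ValueError("either api_key or session_token is required")

if __name__ == "__main__":
    # Probe an authenticated endpoint to confirm credentials and connectivity.
    import requests  # third-party; pip install requests

    headers = metabase_auth_header(api_key="your-api-key")
    resp = requests.get(
        "http://localhost:3000/api/user/current", headers=headers, timeout=10
    )
    print(resp.status_code)  # 200 indicates working credentials
```

A 401 or 403 from such a probe usually points at the credential itself, while connection errors point at networking, which narrows down ingestion failures before involving DataHub at all.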
Install the Plugin
pip install 'acryl-datahub[metabase]'
Starter Recipe
Check out the following recipe to get started with ingestion! See below for full configuration options.
For general pointers on writing and running a recipe, see our main recipe guide.
source:
  type: metabase
  config:
    connect_uri: "http://localhost:3000"
    username: "user@example.com"
    password: "${METABASE_PASSWORD}"

sink:
  # sink configs
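As a sketch of the sink section, a common choice is the DataHub REST sink; the server URL below is illustrative and should point at your own DataHub instance:

```yaml
sink:
  type: datahub-rest
  config:
    server: "http://localhost:8080"   # your DataHub GMS endpoint
```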
Config Details
Note that a `.` is used to denote nested fields in the YAML recipe.
| Field | Description |
|---|---|
| `api_key` One of string(password), null | Metabase API key. If provided, the username and password will be ignored. This is the recommended authentication method. Default: None |
| `connect_uri` string | Metabase host URL. Default: localhost:3000 |
| `convert_urns_to_lowercase` boolean | Whether to convert dataset URNs to lowercase. Default: False |
| `database_alias_map` One of object, null | Database name map to use when constructing dataset URNs. Default: None |
| `database_id_to_instance_map` One of object, null | Custom mappings between Metabase database ids and DataHub platform instances. Default: None |
| `default_schema` string | Default schema name to use when the schema is not provided in an SQL query. Default: public |
| `display_uri` One of string, null | Optional URL to use in links (if `connect_uri` is only for ingestion). Default: None |
| `engine_platform_map` One of object, null | Custom mappings between Metabase database engines and DataHub platforms. Default: None |
| `exclude_other_user_collections` boolean | If true, exclude other users' collections. Default: False |
| `password` One of string(password), null | Metabase password, used when an API key is not provided. Default: None |
| `platform_instance_map` One of object, null | A holder for platform -> platform_instance mappings to generate correct dataset URNs. Default: None |
| `username` One of string, null | Metabase username, used when an API key is not provided. Default: None |
| `env` string | The environment that all assets produced by this connector belong to. Default: PROD |
| `stateful_ingestion` One of StatefulStaleMetadataRemovalConfig, null | Default: None |
| `stateful_ingestion.enabled` boolean | Whether or not to enable stateful ingestion. Defaults to True if a pipeline_name is set and either a datahub-rest sink or datahub_api is specified; otherwise False. Default: False |
| `stateful_ingestion.fail_safe_threshold` number | Prevents a large number of soft deletes, and the state from committing, after accidental changes to the source configuration: the run aborts if the relative change in entities compared to the previous state exceeds this percentage. Default: 75.0 |
| `stateful_ingestion.remove_stale_metadata` boolean | Soft-deletes entities that were present in the last successful run but are missing in the current run, when stateful_ingestion is enabled. Default: True |
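The `.` notation in the rows above corresponds to nesting in the YAML recipe. For example, the `stateful_ingestion.*` rows map to a nested block like the following (the values shown are illustrative):

```yaml
source:
  type: metabase
  config:
    connect_uri: "http://localhost:3000"
    api_key: "${METABASE_API_KEY}"
    stateful_ingestion:
      enabled: true              # row: stateful_ingestion.enabled
      remove_stale_metadata: true
      fail_safe_threshold: 75.0
```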
The JSONSchema for this configuration is inlined below.
{
"$defs": {
"StatefulStaleMetadataRemovalConfig": {
"additionalProperties": false,
"description": "Base specialized config for Stateful Ingestion with stale metadata removal capability.",
"properties": {
"enabled": {
"default": false,
"description": "Whether or not to enable stateful ingest. Default: True if a pipeline_name is set and either a datahub-rest sink or `datahub_api` is specified, otherwise False",
"title": "Enabled",
"type": "boolean"
},
"remove_stale_metadata": {
"default": true,
"description": "Soft-deletes the entities present in the last successful run but missing in the current run with stateful_ingestion enabled.",
"title": "Remove Stale Metadata",
"type": "boolean"
},
"fail_safe_threshold": {
"default": 75.0,
"description": "Prevents large amount of soft deletes & the state from committing from accidental changes to the source configuration if the relative change percent in entities compared to the previous state is above the 'fail_safe_threshold'.",
"maximum": 100.0,
"minimum": 0.0,
"title": "Fail Safe Threshold",
"type": "number"
}
},
"title": "StatefulStaleMetadataRemovalConfig",
"type": "object"
}
},
"additionalProperties": false,
"properties": {
"convert_urns_to_lowercase": {
"default": false,
"description": "Whether to convert dataset urns to lowercase.",
"title": "Convert Urns To Lowercase",
"type": "boolean"
},
"stateful_ingestion": {
"anyOf": [
{
"$ref": "#/$defs/StatefulStaleMetadataRemovalConfig"
},
{
"type": "null"
}
],
"default": null
},
"env": {
"default": "PROD",
"description": "The environment that all assets produced by this connector belong to",
"title": "Env",
"type": "string"
},
"platform_instance_map": {
"anyOf": [
{
"additionalProperties": {
"type": "string"
},
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"description": "A holder for platform -> platform_instance mappings to generate correct dataset urns",
"title": "Platform Instance Map"
},
"connect_uri": {
"default": "localhost:3000",
"description": "Metabase host URL.",
"title": "Connect Uri",
"type": "string"
},
"display_uri": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"description": "optional URL to use in links (if `connect_uri` is only for ingestion)",
"title": "Display Uri"
},
"username": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"description": "Metabase username, used when an API key is not provided.",
"title": "Username"
},
"password": {
"anyOf": [
{
"format": "password",
"type": "string",
"writeOnly": true
},
{
"type": "null"
}
],
"default": null,
"description": "Metabase password, used when an API key is not provided.",
"title": "Password"
},
"api_key": {
"anyOf": [
{
"format": "password",
"type": "string",
"writeOnly": true
},
{
"type": "null"
}
],
"default": null,
"description": "Metabase API key. If provided, the username and password will be ignored. Recommended method.",
"title": "Api Key"
},
"database_alias_map": {
"anyOf": [
{
"additionalProperties": true,
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"description": "Database name map to use when constructing dataset URN.",
"title": "Database Alias Map"
},
"engine_platform_map": {
"anyOf": [
{
"additionalProperties": {
"type": "string"
},
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"description": "Custom mappings between metabase database engines and DataHub platforms",
"title": "Engine Platform Map"
},
"database_id_to_instance_map": {
"anyOf": [
{
"additionalProperties": {
"type": "string"
},
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"description": "Custom mappings between metabase database id and DataHub platform instance",
"title": "Database Id To Instance Map"
},
"default_schema": {
"default": "public",
"description": "Default schema name to use when schema is not provided in an SQL query",
"title": "Default Schema",
"type": "string"
},
"exclude_other_user_collections": {
"default": false,
"description": "Flag that if true, exclude other user collections",
"title": "Exclude Other User Collections",
"type": "boolean"
}
},
"title": "MetabaseConfig",
"type": "object"
}
Capabilities
Use the Important Capabilities table above as the source of truth for supported features and whether additional configuration is required.
Collection
The connector uses the Metabase collection APIs to discover the collection hierarchy and to list the dashboards within each collection.
Dashboard
Dashboard details are extracted from the Metabase dashboard APIs. This includes titles, descriptions, ownership context, and dashboard URL metadata.
Chart
Chart metadata is extracted from the Metabase card (question) APIs. This captures chart definitions, query relationships, and chart-level attributes used for DataHub Chart entities.
Database Mapping
Metabase databases will be mapped to a DataHub platform based on the engine listed in the
api/database response. This mapping can be
customized by using the engine_platform_map config option. For example, to map databases using the athena engine to
the underlying datasets in the glue platform, the following snippet can be used:
engine_platform_map:
athena: glue
DataHub will try to determine the database name from the Metabase api/database payload. However, the name can be overridden via database_alias_map for a given database connected to Metabase.
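As a sketch, an alias override might look like the following; the database names are hypothetical, and you should confirm the key/value direction of the alias map against your connector version:

```yaml
database_alias_map:
  warehouse_replica: warehouse   # hypothetical names, for illustration only
```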
If several platform instances of the same platform (e.g. several distinct ClickHouse clusters) are present in DataHub, the mapping between a Metabase database id and a DataHub platform instance may be configured with the following map:

database_id_to_instance_map:
  "42": platform_instance_in_datahub

The keys in this map must be strings, not integers, even though the Metabase API returns the id as a number.
If database_id_to_instance_map is not specified, platform_instance_map is used for platform instance mapping. If neither is specified, no platform instance is used when constructing URNs to search for dataset relations.
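The resolution order described above can be sketched as a small helper; the function and variable names are hypothetical, and this is an illustration of the documented precedence rather than the connector's actual implementation:

```python
# Illustrative sketch of the platform-instance resolution order described
# above; names are hypothetical, not connector internals.

def resolve_platform_instance(database_id, platform,
                              database_id_to_instance_map=None,
                              platform_instance_map=None):
    if database_id_to_instance_map is not None:
        # Keys are strings, even though Metabase returns ids as numbers.
        return database_id_to_instance_map.get(str(database_id))
    if platform_instance_map is not None:
        # Fall back to the platform-level mapping.
        return platform_instance_map.get(platform)
    return None  # URNs are then built without a platform instance
```

For example, `resolve_platform_instance(42, "clickhouse", {"42": "cluster_a"})` yields `cluster_a`, while with neither map configured the result is `None` and the platform instance is simply omitted from the URN.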
If needed, collections belonging to other users can be excluded by setting the following configuration:
exclude_other_user_collections: true
Limitations
Module behavior is constrained by source APIs, permissions, and metadata exposed by the platform. Refer to capability notes for unsupported or conditional features.
Troubleshooting
If ingestion fails, validate credentials, permissions, connectivity, and scope filters first. Then review ingestion logs for source-specific errors and adjust configuration accordingly.
Code Coordinates
- Class Name: datahub.ingestion.source.metabase.MetabaseSource (Browse on GitHub)
If you've got any questions on configuring ingestion for Metabase, feel free to ping us on our Slack.
This page is auto-generated from the underlying source code. To make changes, please edit the relevant source files in the metadata-ingestion directory.
Tip: For quick typo fixes or documentation updates, you can click the ✏️ Edit icon directly in the GitHub UI to open a Pull Request. For larger changes and PR naming conventions, please refer to our Contributing Guide.