JSON Schemas
Overview
JSON Schema is a declarative format for annotating and validating the structure of JSON documents. Learn more in the official JSON Schema documentation.
The DataHub integration for JSON Schema ingests schema files as datasets, extracting schema fields and their descriptions (following references), and supports platform instances and stateful deletion detection. See the Important Capabilities table below for exactly what is supported.
Concept Mapping
While a connector-specific concept mapping is still pending, the table below shows the generic mapping of source concepts to DataHub concepts.
| Source Concept | DataHub Concept | Notes |
|---|---|---|
| Platform/account/project scope | Platform Instance, Container | Organizes assets within the platform context. |
| Core technical asset (for example table/view/topic/file) | Dataset | Primary ingested technical asset. |
| Schema fields / columns | SchemaField | Included when schema extraction is supported. |
| Ownership and collaboration principals | CorpUser, CorpGroup | Emitted by modules that support ownership and identity metadata. |
| Dependencies and processing relationships | Lineage edges | Available when lineage extraction is supported and enabled. |
Module json-schema
Important Capabilities
| Capability | Status | Notes |
|---|---|---|
| Descriptions | ✅ | Extracts descriptions at top level and field level. |
| Detect Deleted Entities | ✅ | With stateful ingestion enabled, will remove entities from DataHub if they are no longer present in the source. |
| Extract Ownership | ❌ | Does not currently support extracting ownership. |
| Extract Tags | ❌ | Does not currently support extracting tags. |
| Platform Instance | ✅ | Supports platform instance via config. |
| Schema Metadata | ✅ | Extracts schemas, following references. |
Overview
The json-schema module ingests JSON Schema files into DataHub as datasets. It is intended for production ingestion workflows; module-specific capabilities are documented below.
Prerequisites
Before running ingestion, ensure network connectivity to the source, valid authentication credentials, and read permissions for metadata APIs required by this module.
Install the Plugin
pip install 'acryl-datahub[json-schema]'
Starter Recipe
Check out the following recipe to get started with ingestion! See below for full configuration options.
For general pointers on writing and running a recipe, see our main recipe guide.
pipeline_name: json_schema_ingestion
source:
  type: json-schema
  config:
    path: <path_to_json_file_or_directory or url> # e.g. https://json.schemastore.org/petstore-v1.0.json
    platform: <choose a platform that you want schemas to live under> # e.g. schemaregistry
    # platform_instance: <add a platform_instance if there are multiple schema repositories>
    stateful_ingestion:
      enabled: true # recommended to have this turned on

# sink configs if needed
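To make the recipe concrete, the sketch below shows the kind of metadata this source pulls out of a schema file: the top-level description and per-field descriptions (per the Important Capabilities table above). This is only an illustration of the documented behavior, not the connector's actual implementation; the example schema and the `field_descriptions` helper are made up for this sketch.

```python
import json

# A small example schema like one the recipe's `path` could point at.
schema_text = """
{
  "$id": "https://example.com/pet.schema.json",
  "title": "Pet",
  "description": "A pet in the store",
  "type": "object",
  "properties": {
    "name": {"type": "string", "description": "The pet's name"},
    "tag": {"type": "string"}
  }
}
"""

schema = json.loads(schema_text)

def field_descriptions(node: dict, prefix: str = "") -> dict:
    """Collect dotted field paths -> descriptions from an object schema."""
    out = {}
    for name, sub in node.get("properties", {}).items():
        path = f"{prefix}{name}"
        if "description" in sub:
            out[path] = sub["description"]
        # Recurse into nested object schemas.
        out.update(field_descriptions(sub, prefix=f"{path}."))
    return out

print(schema.get("description"))   # top-level description
print(field_descriptions(schema))  # field-level descriptions
```

Fields without a description (like tag above) simply produce no description entry; the connector still extracts them as schema fields.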
Config Details
- Options
- Schema
Note that a . is used to denote nested fields in the YAML recipe.
| Field | Description |
|---|---|
| path ✅ One of string(file-path), string(directory-path), string(uri) | Set this to a single file-path, a directory-path (for recursive traversal), or a remote URL, e.g. https://json.schemastore.org/petstore-v1.0.json |
| platform ✅ string | Set this to the platform that you want all schemas to live under, e.g. schemaregistry or schemarepo. |
| platform_instance One of string, null | The instance of the platform that all assets produced by this recipe belong to. This should be unique within the platform. See https://docs.datahub.com/docs/platform-instances/ for more details. Default: None |
| use_id_as_base_uri boolean | When enabled, uses the `$id` field in the JSON schema as the base URI for following references. Default: False |
| env string | The environment that all assets produced by this connector belong to. Default: PROD |
| uri_replace_pattern One of URIReplacePattern, null | Use this if URIs need to be modified during reference resolution. Simple string match-and-replace is supported. Default: None |
| uri_replace_pattern.match ❓ string | Pattern to match on URIs as part of reference resolution. See the replace field. |
| uri_replace_pattern.replace ❓ string | Pattern to replace with as part of reference resolution. See the match field. |
| stateful_ingestion One of StatefulStaleMetadataRemovalConfig, null | Default: None |
| stateful_ingestion.enabled boolean | Whether or not to enable stateful ingestion. Default: True if a pipeline_name is set and either a datahub-rest sink or `datahub_api` is specified, otherwise False. |
| stateful_ingestion.fail_safe_threshold number | Prevents a large number of soft deletes, and blocks the state from committing, when an accidental change to the source configuration makes the relative change in entities (compared to the previous state) exceed this threshold. Default: 75.0 |
| stateful_ingestion.remove_stale_metadata boolean | With stateful_ingestion enabled, soft-deletes the entities present in the last successful run but missing in the current run. Default: True |
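Two of the options above are easiest to understand by example. The sketch below illustrates the documented semantics of uri_replace_pattern (a simple string match/replace applied to URIs during reference resolution) and use_id_as_base_uri (resolving a relative $ref against the schema's $id). The URIs are invented for illustration, and this is not the connector's own code.

```python
from urllib.parse import urljoin

# uri_replace_pattern: plain string match/replace on URIs during
# reference resolution (no regex).
match, replace = "http://internal.example.com", "https://schemas.example.com"
uri = "http://internal.example.com/defs/address.json"
rewritten = uri.replace(match, replace)
print(rewritten)  # https://schemas.example.com/defs/address.json

# use_id_as_base_uri: a relative $ref is resolved against the
# schema's $id, as in standard RFC 3986 reference resolution.
schema_id = "https://schemas.example.com/pet/pet.schema.json"
ref = "common/address.schema.json"
resolved = urljoin(schema_id, ref)
print(resolved)  # https://schemas.example.com/pet/common/address.schema.json
```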
The JSONSchema for this configuration is inlined below.
{
"$defs": {
"StatefulStaleMetadataRemovalConfig": {
"additionalProperties": false,
"description": "Base specialized config for Stateful Ingestion with stale metadata removal capability.",
"properties": {
"enabled": {
"default": false,
"description": "Whether or not to enable stateful ingest. Default: True if a pipeline_name is set and either a datahub-rest sink or `datahub_api` is specified, otherwise False",
"title": "Enabled",
"type": "boolean"
},
"remove_stale_metadata": {
"default": true,
"description": "Soft-deletes the entities present in the last successful run but missing in the current run with stateful_ingestion enabled.",
"title": "Remove Stale Metadata",
"type": "boolean"
},
"fail_safe_threshold": {
"default": 75.0,
"description": "Prevents large amount of soft deletes & the state from committing from accidental changes to the source configuration if the relative change percent in entities compared to the previous state is above the 'fail_safe_threshold'.",
"maximum": 100.0,
"minimum": 0.0,
"title": "Fail Safe Threshold",
"type": "number"
}
},
"title": "StatefulStaleMetadataRemovalConfig",
"type": "object"
},
"URIReplacePattern": {
"additionalProperties": false,
"properties": {
"match": {
"description": "Pattern to match on uri-s as part of reference resolution. See replace field",
"title": "Match",
"type": "string"
},
"replace": {
"description": "Pattern to replace with as part of reference resolution. See match field",
"title": "Replace",
"type": "string"
}
},
"required": [
"match",
"replace"
],
"title": "URIReplacePattern",
"type": "object"
}
},
"additionalProperties": false,
"properties": {
"env": {
"default": "PROD",
"description": "The environment that all assets produced by this connector belong to",
"title": "Env",
"type": "string"
},
"platform_instance": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"description": "The instance of the platform that all assets produced by this recipe belong to. This should be unique within the platform. See https://docs.datahub.com/docs/platform-instances/ for more details.",
"title": "Platform Instance"
},
"stateful_ingestion": {
"anyOf": [
{
"$ref": "#/$defs/StatefulStaleMetadataRemovalConfig"
},
{
"type": "null"
}
],
"default": null
},
"path": {
"anyOf": [
{
"format": "file-path",
"type": "string"
},
{
"format": "directory-path",
"type": "string"
},
{
"format": "uri",
"minLength": 1,
"type": "string"
}
],
"description": "Set this to a single file-path or a directory-path (for recursive traversal) or a remote url. e.g. https://json.schemastore.org/petstore-v1.0.json",
"title": "Path"
},
"platform": {
"description": "Set this to a platform that you want all schemas to live under. e.g. schemaregistry / schemarepo etc.",
"title": "Platform",
"type": "string"
},
"use_id_as_base_uri": {
"default": false,
"description": "When enabled, uses the `$id` field in the json schema as the base uri for following references.",
"title": "Use Id As Base Uri",
"type": "boolean"
},
"uri_replace_pattern": {
"anyOf": [
{
"$ref": "#/$defs/URIReplacePattern"
},
{
"type": "null"
}
],
"default": null,
"description": "Use this if URI-s need to be modified during reference resolution. Simple string match - replace capabilities are supported."
}
},
"required": [
"path",
"platform"
],
"title": "JsonSchemaSourceConfig",
"type": "object"
}
Configuration Notes
- You must provide a platform field. Most organizations have custom project names for their schema repositories, so pick whatever name makes sense; for example, you might call your schema platform schemaregistry. After picking a custom platform, you can use the put platform command to register your custom platform into DataHub.
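For reference, registering a custom platform looks roughly like the command below. The platform name, display name, and logo URL are placeholders, and the exact flags may vary by CLI version, so verify against the DataHub CLI documentation for the put platform subcommand:

```
datahub put platform --name schemaregistry --display_name "Schema Registry" --logo "<logo-url>"
```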
Capabilities
Use the Important Capabilities table above as the source of truth for supported features and whether additional configuration is required.
Limitations
Module behavior is constrained by source APIs, permissions, and metadata exposed by the platform. Refer to capability notes for unsupported or conditional features.
Troubleshooting
If ingestion fails, validate credentials, permissions, connectivity, and scope filters first. Then review ingestion logs for source-specific errors and adjust configuration accordingly.
Code Coordinates
- Class Name: datahub.ingestion.source.schema.json_schema.JsonSchemaSource (Browse on GitHub)
If you've got any questions on configuring ingestion for JSON Schemas, feel free to ping us on our Slack.
This page is auto-generated from the underlying source code. To make changes, please edit the relevant source files in the metadata-ingestion directory.
Tip: For quick typo fixes or documentation updates, you can click the ✏️ Edit icon directly in the GitHub UI to open a Pull Request. For larger changes and PR naming conventions, please refer to our Contributing Guide.