Forum Discussion
Scan Databricks data source at the schema level rather than the catalog level
Does anyone know how to scan Databricks scoped down to a specific schema?
I only seem to be able to scan entire catalogs, which is too broad a scope.
7 Replies
- NickDoughty
Microsoft
Today, to scope Purview scans for Databricks, you will need to use the Azure Databricks connector instead of the Azure Databricks Unity Catalog connector (see "Connect to and manage Azure Databricks in Microsoft Purview" on Microsoft Learn). The Unity Catalog connector only scopes down to the catalog level today.
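If you end up scripting this, here is a minimal sketch of scoping an existing scan by setting a custom filter through the Purview Scanning REST API. It is untested against the Databricks connector: the account, data source, and scan names are placeholders, and the URI-prefix format for a Databricks schema is an assumption, so check an existing scan's filter (GET .../filters/custom) and the current API docs before relying on it.

```python
# Sketch only: scope a registered Purview scan with a custom include filter.
# Account, source, scan names and the Databricks URI prefix below are
# placeholders / assumptions, not values from this thread.
import requests
from azure.identity import DefaultAzureCredential

PURVIEW_ACCOUNT = "contoso-purview"       # assumed Purview account name
DATA_SOURCE = "AzureDatabricks-hive"      # assumed registered source name
SCAN_NAME = "scan-sales-schema"           # assumed scan name
API_VERSION = "2022-02-01-preview"

# Token for the Purview data plane.
credential = DefaultAzureCredential()
token = credential.get_token("https://purview.azure.net/.default").token
headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

base = f"https://{PURVIEW_ACCOUNT}.purview.azure.com/scan"

# Restrict the scan to a single schema via include/exclude prefixes.
# The prefix format shown is an assumption; copy the shape your connector
# returns from a GET on the same filters endpoint.
filter_body = {
    "properties": {
        "includeUriPrefixes": [
            "databricks://adb-1234567890.12.azuredatabricks.net/hive_metastore/sales"
        ],
        "excludeUriPrefixes": [],
    }
}

resp = requests.put(
    f"{base}/datasources/{DATA_SOURCE}/scans/{SCAN_NAME}/filters/custom",
    params={"api-version": API_VERSION},
    headers=headers,
    json=filter_body,
)
resp.raise_for_status()
print(resp.json())
```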
- Bernice9
Copper Contributor
Are you referring to the Hive Metastore method for scoping a scan to a specific schema or table?
Is there a private preview feature for "Scan Unity Catalog at the schema level" that we could be granted access to? I appreciate your help.
- NickDoughty
Microsoft
Yes, it is called the Hive Metastore method in the docs, but in the Purview Data Map the connector is just called Azure Databricks. There is no preview for scoping below the catalog level today; it still needs to be prioritized before we can add it to the roadmap.
- Bernice9
Copper Contributor
This is for the Unity Catalog method of scanning and not the Hive Metastore. Does anyone have any insight?
- Bernice9
Copper Contributor
Does anyone have any insight about this?