Latest Discussions
Filter Fabric objects

On one hand, I can understand pointing a connector at Fabric, letting it run, and ingesting into Purview. However, I would like to manage what gets pulled into Purview from Fabric. There are Dev/Test workspaces with reports and dashboards that we would prefer not to ingest, as well as Lakehouses and Warehouses with the same Dev/Test/Prod split. Is there a way to control which objects are ingested via the Fabric connector? The other thought I had was maybe individual connectors like SQL and PBI, but there isn't anything that I can see. Thoughts or direction?

JBNFM · Jan 15, 2026 · Copper Contributor · 6 Views · 0 likes · 0 Comments

Two sensitivity labels on PDF file

Hi everyone, first time poster here. We encountered an interesting issue yesterday: a user came to us with a PDF that had two sensitivity labels attached. In Purview Activity explorer, we can see the file hit the DLP policy and the two labels, but when we try to replicate the issue we cannot do it, or see how this was done. Has anyone else encountered a similar issue? We were able to remove the labels in our PDF editor, but in the Office suite, once a label is applied I could not see a way to remove it. We tried applying a label to a Doc file, converting it to PDF, and then checking whether the label carried over and whether it asked for another label, but it did not; it just let us change the original. Many thanks in advance!

courtney_green · Jan 14, 2026 · Copper Contributor · 24 Views · 0 likes · 1 Comment

Information Scanner - SQL connection fails

Hello everyone, we are currently deploying the Information Protection scanner. The issue appeared after the scanner was already installed successfully. SQL Server is running on a custom TCP port (49999) with an encrypted connection, and the scanner database exists with the correct owner (the service account). We also acquired the Entra token.

Error: Failed to access scanner database. Verify the database is up and running and can be accessed by scanner service account and by the currently logged in user that executes the command.

Troubleshooting steps taken:
- Diag shows: Invalid database schema or cannot access the scanner DB. To update the database schema, run Update-ScannerDatabase. Make sure all nodes run the same MIP client version.
- SQL error message: Could not obtain information about Windows NT group/user 'Domain\scanaccount', error code 0x5 (a way to reproduce this lookup outside the scanner is sketched below this post).
- Update-ScannerDatabase executed - same error.
- Login to the SQL Server is successful.
- SQL CMD: sqlcmd -S SQL.company.de,4321 -E -N -Q "SELECT @@VERSION" ## Worked
- Other configs: tried to re-register the database multiple times; the service account is sysadmin on the (shared) SQL Server.
- Used a SQL DB alias instead of the port / SQL Browser - did not work.
- Allowed everything through the firewall on the SQL Server - still fails.

Four hours of troubleshooting gone by and I am stuck - what can I do next?

BR Stephan

StephanGee · Jan 14, 2026 · Iron Contributor · 14 Views · 0 likes · 0 Comments
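
The 0x5 error above is SQL Server failing an Active Directory lookup for the scanner account rather than a connectivity problem, so one independent check is to ask SQL Server to perform the same lookup directly. A minimal sketch, assuming Python with pyodbc and ODBC Driver 18 are available on the scanner node; the server, port, and account names are placeholders taken from the post:

```python
# Minimal sketch: reproduce the Windows NT group/user lookup that SQL Server
# performs for the scanner service account, independently of the scanner.
# Assumes pyodbc and "ODBC Driver 18 for SQL Server" are installed on the
# scanner node; server, port, and account below are placeholders from the post.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=SQL.company.de,49999;"        # custom TCP port from the post
    "Trusted_Connection=yes;"             # same Windows auth the scanner uses
    "Encrypt=yes;TrustServerCertificate=yes;"
)

with pyodbc.connect(CONN_STR) as conn:
    cursor = conn.cursor()
    # xp_logininfo makes SQL Server query AD for the account; if this raises
    # "Could not obtain information about Windows NT group/user ... 0x5",
    # the lookup is failing on the SQL Server side, not in the scanner itself.
    cursor.execute("EXEC xp_logininfo 'Domain\\scanaccount'")
    for row in cursor.fetchall():
        print(row)
```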

Service Domain restrictions

I'm currently implementing an Endpoint DLP policy to enforce service domain restrictions. The goal is to prevent users from uploading documents to non-corporate domains and only allow uploads to a specific allow-list of authorized domains; we only use Microsoft Edge. I have the basic configuration working, but I have a few questions about behaviors I'm seeing:

- Dynamic Groups: Is it supported to use Microsoft 365 Dynamic Groups for the policy scope/assignment?
- File Types: How can I make the policy target all file types? Currently, I'm managing this via a defined list of extensions, but I'd like to cover everything.
- Copy/Paste vs. Upload (the main issue): When I drag and drop or use the "Upload" button from File Explorer to a blocked domain, the action is blocked as expected. However, if I copy and paste the file (or content) directly into the website, it bypasses the block and uploads successfully. Why does this happen?
- Policy Activation: It seems documents only pick up the policy restrictions after they are modified. Is this the expected behavior?

Any recommendations or insights on what I might be missing would be appreciated. Thanks!

Melvin_Maldonado03 · Jan 12, 2026 · Copper Contributor · 3 Views · 0 likes · 0 Comments

Microsoft Purview Unified Catalog – Draft Data Product Visibility (RBAC)

I have three Entra ID security groups that must be able to see all data products across the estate, including Draft, Unpublished, Published, and Retired:
- Purview.Admin.Team
- Purview.Data.Governance
- Purview.Data.Architecture.Team

What I tested: I assigned these groups to the available Microsoft Purview Unified Catalog roles at both application and governance-domain scope, including:
- Global Catalog Reader / domain reader roles
- Governance Domain Owner
- Data Governance Administrator
- Data Product Owner
- Data Steward

Observed results:
- Reader roles and Data Governance Administrator allowed users to see the list of data products, but not Draft / Unpublished items.
- Governance Domain Owner and Data Product Owner allowed draft visibility but grant ownership/control.
- Only assigning the groups as Data Steward on each governance domain consistently allowed visibility of all data product lifecycle states (Draft, Unpublished, Published, Retired) without granting ownership.

Current understanding:
- Draft and Unpublished data products are only visible to users assigned domain-level governance roles.
- Data Steward is the least-privileged role that provides draft visibility.
- To achieve estate-wide draft visibility, the groups must be assigned as Data Steward on every governance domain.
- Application-level roles alone (including Data Governance Administrator) are insufficient.

Question (seeking confirmation): Is this understanding and solution correct and aligned with Microsoft's intended Purview Unified Catalog RBAC design, or is there an alternative supported way to provide read-only draft data product visibility without assigning Data Steward per governance domain?

sashakorniakUK · Jan 09, 2026 · Brass Contributor · 43 Views · 0 likes · 0 Comments

Microsoft Purview Client side labeling issue

Hello everyone, I hope this message finds you well. I wanted to share some observations and seek your guidance on an issue I'm encountering with sensitivity label recommendations in Outlook. I have created a label with auto-labeling (client side) enabled and configured it to identify sensitive information types (SITs) such as SSN and credit card details (instance count 1 - Any). The curious part is that when I attach a Notepad file in Outlook that contains SSN and credit card information, I do not receive any sensitivity label recommendation in either the Outlook desktop or web version. However, if I paste the same content directly into the email body, I do receive the respective sensitivity label recommendation. Moreover, when I attach a Word document (not labeled) that contains SSN and credit card information, Outlook does not show any recommendation either. Interestingly, if the Word document detects the sensitive content and recommends a label, and I then save the document with the recommended label, attaching it back to Outlook does trigger the label recommendation. Could you please clarify whether this behavior is by design or whether there might be a missing configuration on my end? Your insights would be greatly appreciated. Thank you!

Afsar_Shariff · Jan 08, 2026 · Brass Contributor · 70 Views · 0 likes · 2 Comments

Data Governance... who, how, why?

In our organization, we've defined the teams responsible for Data Security (Cybersecurity) and Data Compliance (Records Management). However, there is still uncertainty around which department should own and manage Data Governance. How is it permissioned?

tmartinovv · Jan 08, 2026 · Copper Contributor · 84 Views · 1 like · 5 Comments

Request for Advice on Managing Shared Glossary Terms in Microsoft Purview

Hi everyone, I'm looking for guidance from others who are working with Microsoft Purview (Unified Catalog), especially around glossary and governance domain design.

Scenario: I have multiple governance domains, all within the same Purview tenant. We have some core business concepts from our conceptual data models, for example a term like "PARTY", that are needed in every governance domain. However, based on Microsoft's documentation:
- Glossary terms can only be created inside a specific governance domain, not at a tenant-wide or global level.
- The Enterprise Glossary is only a consolidated view, not a place to create global terms or maintain a single shared version. It simply displays all terms from all domains in one list.
- If the same term is needed across domains, Purview requires separate term objects in each domain.
- Consistency must therefore be managed manually (by re-creating the term in each domain) or by importing/exporting via CSV or automation (API/PyApacheAtlas); a minimal automation sketch follows below this post.

This leads to questions about maintainability, especially when we want one consistent definition across all domains. What I'm hoping to understand from others:
- How are you handling shared enterprise concepts (enterprise and conceptual data models) that need to appear in multiple governance domains?
- Are you duplicating terms in each domain and synchronising them manually or via automation?
- Have you adopted a "central domain" for hosting enterprise-standard terms and then linking or referencing them in other domains?
- Is there any better pattern you've found to avoid fragmentation and to ensure consistent definitions across domains?

Any advice, lessons learned, or examples of how you've structured glossary governance in Purview would be really helpful. This is a primary objective (OKR) for us: establish a unified method to consistently link individual entities (e.g., PARTY) to their associated PII-classified column-level data assets in Microsoft Purview, ensuring sensitive data is accurately identified, governed, and monitored across all domains, i.e. CDE to glossary terms. Thanks in advance!

sashakorniakUK · Dec 30, 2025 · Brass Contributor · 44 Views · 0 likes · 0 Comments
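
For the automation route mentioned above, here is a minimal sketch of what re-creating a shared term across several domains could look like against the Atlas v2 glossary REST API that Purview exposes. The account name, glossary GUIDs, and term definition are placeholders, and the newer Unified Catalog governance domains may surface different, domain-scoped endpoints, so treat this as an illustration rather than a confirmed pattern.

```python
# Minimal sketch: re-creating a shared glossary term ("PARTY") in several
# domains via the Atlas v2 glossary REST API exposed by Purview.
# Account name, glossary GUIDs, and the term definition are placeholders;
# the Unified Catalog may expose different endpoints for domain-scoped terms.
import requests
from azure.identity import DefaultAzureCredential

ACCOUNT = "contoso-purview"                      # hypothetical account name
ENDPOINT = f"https://{ACCOUNT}.purview.azure.com/catalog/api/atlas/v2"

# One glossary GUID per governance domain that should carry the term.
DOMAIN_GLOSSARY_GUIDS = ["<finance-glossary-guid>", "<hr-glossary-guid>"]

TERM_DEFINITION = {
    "name": "PARTY",
    "shortDescription": "An individual or organisation the enterprise interacts with.",
}

token = DefaultAzureCredential().get_token("https://purview.azure.net/.default").token
headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

for glossary_guid in DOMAIN_GLOSSARY_GUIDS:
    body = dict(TERM_DEFINITION, anchor={"glossaryGuid": glossary_guid})
    resp = requests.post(f"{ENDPOINT}/glossary/term", json=body, headers=headers)
    resp.raise_for_status()
    print(f"Created '{TERM_DEFINITION['name']}' in glossary {glossary_guid}: {resp.json()['guid']}")
```

The same loop could be driven from a CSV export so that one "master" definition is pushed to every domain, which keeps the wording consistent even though Purview stores separate term objects.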

Lineage in Purview - Fabric

Dears, I was checking the lineage that Purview collects from Microsoft Fabric and I saw the limitations listed on the Microsoft page. I have sources such as:
- Dataverse (I am using a Dataverse direct shortcut to bring the information from Dataverse into the Fabric raw layer)
- SharePoint (I am using a Fabric shortcut into SharePoint to bring the files into the Fabric raw layer)
- SQL Server MI (I am using Fabric mirroring into the Fabric raw layer)

I did not understand the lineage exceptions mentioned above very well, so let me ask these questions please:
1) Suppose I have a table that came from SQL Server MI. This table was first ingested into the Fabric raw layer, then into the bronze layer, then into the Fabric silver layer, and after that into the Fabric gold layer. Each layer has its own Fabric workspace. When I bring the metadata into the Purview Unified Catalog via Purview, do I get automatic lineage between the silver layer and the gold layer for that table (suppose the table is named table 1)? Or is this not possible because:
1.1) they are different workspaces, or
1.2) the table originated outside Fabric, in SQL Server MI?
2) Suppose I have a table in the silver layer (in the silver layer workspace), then, in the gold layer, I create a shortcut to that table, and then I use a notebook to transform that table into another table in the same gold layer. Will I have lineage between the silver table and the new transformed table in the gold layer?
3) To create the lineage relationship between tables that is not created automatically, I was thinking of using the Purview REST APIs. Which API should I use specifically for this (a sketch of one possible approach follows below this post)?

Thanks, Pedro

river200 · Dec 29, 2025 · Copper Contributor · 71 Views · 0 likes · 1 Comment
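
For question 3, one common way to stitch custom lineage is to create a Process entity between two assets that have already been scanned into the catalog, using the Atlas v2 entity REST API that Purview exposes. A minimal sketch follows; the account name, asset GUIDs, and qualified name are placeholders, and this illustrates the classic Atlas approach rather than a Fabric-specific API.

```python
# Minimal sketch: stitch custom lineage between two existing assets by
# creating a Process entity through the Atlas v2 entity REST API.
# Account name, source/target GUIDs, and qualified name are placeholders;
# the Process entity's inputs/outputs become the lineage edges in the catalog.
import requests
from azure.identity import DefaultAzureCredential

ACCOUNT = "contoso-purview"                      # hypothetical account name
ENDPOINT = f"https://{ACCOUNT}.purview.azure.com/catalog/api/atlas/v2"

SILVER_TABLE_GUID = "<guid-of-silver-table>"     # already-scanned source asset
GOLD_TABLE_GUID = "<guid-of-gold-table>"         # already-scanned target asset

process_entity = {
    "entity": {
        "typeName": "Process",
        "attributes": {
            "qualifiedName": "notebooks/silver_to_gold_table1",   # must be unique
            "name": "silver_to_gold_table1",
            "inputs": [{"guid": SILVER_TABLE_GUID}],
            "outputs": [{"guid": GOLD_TABLE_GUID}],
        },
    }
}

token = DefaultAzureCredential().get_token("https://purview.azure.net/.default").token
headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

resp = requests.post(f"{ENDPOINT}/entity", json=process_entity, headers=headers)
resp.raise_for_status()
print(resp.json())
```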

MIP SDK cannot read file labels if a message was encrypted by Outlook Classic.

A C++ application uses MIP SDK version 1.14.108. The application decrypts Office files and reads labels. A problem with label reading is observed. Steps to reproduce:
1. Create a docx file with a label that does not impose encryption.
2. Open Outlook Classic, compose an email, attach the document from step 1, click Encrypt, and send.
3. During message sending, our application intercepts the docx file encrypted by Outlook in the temporary folder C:\Users\UserName\AppData\Local\Temp.
4. The application decrypts the intercepted file using mipns::FileHandler::RemoveProtection. Visual inspection shows that decryption runs successfully.
5. Then a separate FileHandler for the decrypted file is created, and mipns::FileHandler::GetLabel() returns an empty label.

It means that the label was lost during decryption. Upon visual inspection of the decrypted file in Word we can see that the label is missing. Also, we do not see MSIP_Label* entries in the metadata (File -> Info -> Properties -> Advanced Properties -> Custom); a quick independent check for these properties is sketched below this post. Here is a redacted fragment of the MIP SDK log during file handler creation:

=================
file_engine_impl.cpp:327 "Creating file handler for: [D:\GitRepos\ ...reducted]" mipns::FileEngineImpl::CreateFileHandlerImpl
gsf_utils.cpp:50 "Initialized GSF" `anonymous-namespace'::InitGsfHelper
data_spaces.cpp:415 "No LabelInfo stream was found. No v1 custom properties" mipns::DataSpaces::GetLabelInfoStream
data_spaces.cpp:428 "No LabelInfo stream was found. No v1 custom properties" mipns::DataSpaces::GetXmlPropertiesV1
file_format_base.cpp:155 "Getting protection from input..." mipns::FileFormatBase::GetProtection
license_parser.cpp:233 "XPath returned no results" `anonymous-namespace'::GetXmlNodesFromPath
license_parser.cpp:233 "XPath returned no results" `anonymous-namespace'::GetXmlNodesFromPath
license_parser.cpp:299 "GetAppDataNode - Failed to get ID in PL app data section, parsing failed" `anonymous-namespace'::GetAppDataNode
api_log_cache.cpp:58 "{{============== API CACHED LOGS BEGIN ============}}" mipns::ApiLogCache::LogAllMessages
file_engine_impl.cpp:305 "Starting API call: file_create_file_handler_async scenarioId=89fd6484-7db7-4f68-8cf7-132f87825a26" mipns::FileEngineImpl::CreateFileHandlerAsync 37948
default_task_dispatcher_delegate.cpp:83 "Executing task 'ApiObserver-0' on a new detached thread" mipns::DefaultTaskDispatcherDelegate::ExecuteTaskOnIndependentThread 37948
file_engine_impl.cpp:305 "Ended API call: file_create_file_handler_async" mipns::FileEngineImpl::CreateFileHandlerAsync 37948
file_engine_impl.cpp:305 "Starting API task: file_create_file_handler_async scenarioId=89fd6484-7db7-4f68-8cf7-132f87825a26" mipns::FileEngineImpl::CreateFileHandlerAsync
file_engine_impl.cpp:327 "Creating file handler for: [D:\GitRepos\...reducted....docx]" mipns::FileEngineImpl::CreateFileHandlerImpl
file_format_factory_impl.cpp:88 "Create File Format. Extension: [.docx]" mipns::FileFormatFactoryImpl::Create
file_format_base.cpp:363 "V1 metadata is not supported for file extension .docx. Setting metadata version to 0" mipns::FileFormatBase::CalculateMetadataVersion
compound_file.cpp:183 "Open compound file for read" mipns::CompoundFile::OpenRead
gsf_utils.cpp:50 "Initialized GSF" `anonymous-namespace'::InitGsfHelper
compound_file_storage_impl.cpp:351 "Get Metadata" mipns::CompoundFileStorageImpl::GetMetadata
compound_file_storage_impl.cpp:356 "No Metadata, not creating GSF object" mipns::CompoundFileStorageImpl::GetMetadata
metadata.cpp:119 "Create Metadata" mipns::Metadata::Metadata
metadata.cpp:136 "Got [0] properties from DocumentSummaryInformation" mipns::Metadata::GetProperties
compound_file_storage_impl.cpp:351 "Get Metadata" mipns::CompoundFileStorageImpl::GetMetadata
compound_file_storage_impl.cpp:356 "No Metadata, not creating GSF object" mipns::CompoundFileStorageImpl::GetMetadata
metadata.cpp:119 "Create Metadata" mipns::Metadata::Metadata
metadata.cpp:136 "Got [0] properties from DocumentSummaryInformation" mipns::Metadata::GetProperties
=================

oleg_le · Dec 18, 2025 · Copper Contributor · 94 Views · 0 likes · 1 Comment
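
Since the post checks for MSIP_Label* entries through Word's UI, a quick way to inspect the decrypted output independently of both Word and the MIP SDK is to read the custom document properties part of the package directly. A minimal sketch, assuming the decrypted file really is a regular OPC/zip .docx; the path is a hypothetical placeholder.

```python
# Minimal sketch: look for MSIP_Label_* custom document properties in a .docx
# without Word or the MIP SDK. Assumes the decrypted output is a regular
# OPC/zip package; the path below is a hypothetical placeholder.
import zipfile
import xml.etree.ElementTree as ET

DOCX_PATH = r"C:\temp\decrypted.docx"   # hypothetical path to the decrypted file

CUSTOM_PROPS_PART = "docProps/custom.xml"
NS = "{http://schemas.openxmlformats.org/officeDocument/2006/custom-properties}"

with zipfile.ZipFile(DOCX_PATH) as package:
    if CUSTOM_PROPS_PART not in package.namelist():
        print("No custom properties part at all - the label metadata is gone.")
    else:
        root = ET.fromstring(package.read(CUSTOM_PROPS_PART))
        labels = [p.get("name") for p in root.iter(f"{NS}property")
                  if p.get("name", "").startswith("MSIP_Label")]
        print(labels or "Custom properties exist, but no MSIP_Label_* entries.")
```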