Accuracy of Purview Content Explorer for retention labels


I am finding regular discrepancies between the retention label counts shown in Microsoft Purview's Content Explorer and the results of a Content Search for the same labels, and I wonder whether others are seeing this too. I'm seeing it in my Dev tenant, and it has also been reported to me by several organisations I'm working with.

 

For example, Content Explorer here tells me there are 138 items, split between Exchange and SharePoint. Drilling down into Exchange, I see that one mailbox allegedly has 9 labelled items and another has only 1, so where does the figure of 94 come from?

PurviewContentExplorerResults.png

 

Compare the above with a Content Search, which returns 78 results, all of them in SharePoint, the only place the label was actually applied, as confirmed from the exported results.

PurviewContentSearchResults.png
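
For anyone who wants to reproduce the comparison, a minimal sketch of the same check run through Security & Compliance PowerShell is below. The label name "Project-Record" is just a placeholder, and I believe ComplianceTag is the query property that retention labels are indexed under for both mailboxes and sites, but verify that against your own tenant before relying on the count.

  # Connect to Security & Compliance PowerShell (ExchangeOnlineManagement module)
  Connect-IPPSSession

  # Search both workloads for items carrying the retention label;
  # "Project-Record" is a placeholder for whatever label you are checking
  New-ComplianceSearch -Name "RetentionLabelCheck" `
      -ExchangeLocation All -SharePointLocation All `
      -ContentMatchQuery 'ComplianceTag:"Project-Record"'
  Start-ComplianceSearch -Identity "RetentionLabelCheck"

  # Once Status shows Completed, Items is the figure to compare with Content Explorer
  Get-ComplianceSearch -Identity "RetentionLabelCheck" | Format-List Name, Status, Items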

 

Why the difference? Or is the AI/ML-driven Content Explorer making some assumptions that are completely wrong? 


4 Replies

Hello Andrew,

I would also love to know the answer to that, not just for retention labels but for anything that Content Explorer counts.

I see three different sets of numbers and have no idea which to trust:

-> the high-level number on the left

-> drilling down on the right, you get a reduced number

-> using the new PowerShell cmdlet for Content Explorer shows yet another set of numbers (see the sketch just below)
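
For reference, the cmdlet I mean is Export-ContentExplorerData in Security & Compliance PowerShell. A rough sketch is below; the label name and page size are placeholders, and if I remember rightly the output includes a TotalCount value alongside the per-item records, which is where that third set of numbers comes from.

  # Connect to Security & Compliance PowerShell (ExchangeOnlineManagement module)
  Connect-IPPSSession

  # Pull the first page of Content Explorer data for a retention label;
  # the returned records include a TotalCount to compare against the portal figures
  Export-ContentExplorerData -TagType Retention -TagName "Project-Record" -PageSize 100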

 

Microsoft does put some disclaimers when you drill down into the locations:

  • "The actual number of items in this site/folder might be different from the calculated number that's displayed on the left"
  • "The number of site/folder items listed below and on the left is a calculation and may not match the total number of actual items you'll see when you open a specific site/folder."
  • "The number of Teams items listed below and on the left is a calculation and may not match the total number of actual items you'll see when you open a specific Teams."

but I cannot find these explained anywhere in the documentation...

Thanks, Teo

For two of the other classifiers, SITs and trainable classifiers, I can kind of understand that the numbers may not be accurate, but with both retention and sensitivity labels I don't really understand why it doesn't show the actual number assigned.

 

Hi Andrew

 

Unfortunately, I don't have an answer for you, but I am experiencing the same thing, and coincidentally have an open support ticket with Microsoft to see if they can provide any insight.

 

If I get anything useful back I'll share it here.

No love on the support ticket; I'm working from a demo tenant, and there's no associated experience for viewing retention labels via Content Explorer there, so no support was given 😞