Forum Discussion
Accuracy of Purview Content Explorer for retention labels
Hello Andrew,
I would also love to know the answer to that, not necessarily for Retention labels but for anything that Content explorer counts.
I see three different sets of numbers and have no idea which to trust:
-> the high-level number on the left
-> drilling down on the right, you get a reduced number
-> the new PowerShell command for Content explorer shows yet another set of numbers (sketch below).
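For reference, here is roughly how I'm pulling that third set of numbers. I'm assuming the "new PowerShell command" is Export-ContentExplorerData in Security & Compliance PowerShell; the label name "Contract" is just a placeholder, and the shape of the output (first object = paging metadata, remaining objects = items) is how it looks in my tenant and may differ in yours:

# Sketch only: counting items that carry a given retention label via Content explorer export.
# "Contract" is a placeholder label name; property names may vary by module version.
Connect-IPPSSession

$label = "Contract"
$page  = Export-ContentExplorerData -TagType Retention -TagName $label -PageSize 100

# First object returned appears to carry the paging metadata; the rest are the matched items.
$meta  = $page[0]
$items = @($page | Select-Object -Skip 1)

"Reported TotalCount: {0}" -f $meta.TotalCount
"Items on this page:  {0}" -f $items.Count

# Keep paging while more results are available, then compare the running total
# against the numbers shown in the Content explorer UI.
while ($meta.MorePagesAvailable -eq $true) {
    $page   = Export-ContentExplorerData -TagType Retention -TagName $label -PageSize 100 -PageCookie $meta.PageCookie
    $meta   = $page[0]
    $items += @($page | Select-Object -Skip 1)
}

"Items actually returned: {0}" -f $items.Count

The TotalCount from this output is the third figure I mentioned above, and it matches neither of the numbers shown in the portal.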
Microsoft does put some disclaimers when you drill down into the locations:
- "The actual number of items in this site/folder might be different from the calculated number that's displayed on the left"
- "The number of site/folder items listed below and on the left is a calculation and may not match the total number of actual items you'll see when you open a specific site/folder."
- "The number of Teams items listed below and on the left is a calculation and may not match the total number of actual items you'll see when you open a specific Teams."
but I cannot find these explained anywhere in the documentation...
- AndrewWarland, Jan 03, 2024 (Steel Contributor)
Thanks, Teo.
For two of the other classifiers, SITs and trainable classifiers, I can kind of understand that the numbers may not be accurate, but for retention and sensitivity labels I don't really understand why it doesn't show the actual number assigned.
- IM_Shane, Jan 04, 2024 (Copper Contributor)
Hi Andrew
Unfortunately, I don't have an answer for you, but I am experiencing the same thing and coincidentally have an open support ticket with Microsoft to see if they can provide any insight.
If I get anything useful back I'll share it here.
- IM_Shane, Jan 10, 2024 (Copper Contributor)
No love on the support ticket; I'm working from a demo tenant and there's no associated experience for viewing retention labels via Content explorer, so no support was given 😞