Jan 20 2021 09:12 AM
Back in October, I created a script that dumps daily Microsoft Stream audit data into CSVs so that Power BI reports can be built from it. It worked fine until the 16th of December, but it now very inconsistently returns thousands of duplicate records with a ResultIndex of -1. A good run returns ResultIndex values going from 1 up to however many records were found. Does anyone know why this might be occurring?
Do {
    try {
        $currentResults = Search-UnifiedAuditLog -StartDate $startDate -EndDate $enddate -SessionId $sessionName -SessionCommand ReturnLargeSet -ResultSize 1000 -RecordType MicrosoftStream -Operations StreamCreateVideo,StreamEditVideo,StreamDeleteVideo,StreamInvokeVideoUpload,StreamInvokeVideoDownload,StreamEditVideoPermissions,StreamInvokeVideoView,StreamInvokeVideoShare,StreamInvokeVideoLike,StreamInvokeVideoUnLike,StreamCreateVideoComment,StreamDeleteVideoComment,StreamInvokeVideoTextTrackUpload,StreamInvokeVideoThumbnailUpload,StreamDeleteVideoThumbnail,StreamInvokeVideoMakePublic,StreamInvokeVideoMakePrivate,StreamCreateGroup,StreamEditGroup,StreamDeleteGroup,StreamEditGroupMemberships,StreamCreateChannel,StreamEditChannel,StreamDeleteChannel
    } catch [System.Exception] {
        $errorCount++
        $ErrorMessage = " Error: " + $_.Exception.Message
        $logmsg = $ErrorMessage
        Write-Output $logmsg
        LogToFile -EntryType "ERROR" -Message $logmsg -Logfile $lgfile
    }
    if ($currentResults.Count -gt 0) {
        $logmsg = (" Finished search #{1}, {2} records: {0} min" -f [math]::Round((New-TimeSpan -Start $scriptStart).TotalMinutes,4), $i, $currentResults.Count)
        Write-Output $logmsg
        LogToFile -EntryType "INFORMATION" -Message $logmsg -Logfile $lgfile
        # Accumulate the data
        $aggregateResults += $currentResults # Adds up to 1000 records per page
        # If this page holds fewer than 1000 records, it was the last page, so stop the loop
        if ($currentResults.Count -lt 1000) {
            $currentResults = @()
        } else {
            $i++
        }
    }
} Until ($currentResults.Count -eq 0) # --- End of Session Search Loop --- #
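For context, once the loop finishes, the accumulated records get deduplicated and written out to CSV, roughly along these lines (a simplified sketch only; the output path and column selection here are placeholders rather than my exact code):

# Dedupe on the record Identity (each audit record carries a unique GUID),
# since ReturnLargeSet can hand back the same record more than once
$uniqueResults = $aggregateResults | Sort-Object -Property Identity -Unique

# Keep the columns the Power BI report needs and write the daily CSV
$uniqueResults |
    Select-Object CreationDate, UserIds, Operations, AuditData |
    Export-Csv -Path "C:\StreamAudit\StreamAudit_$(Get-Date -Format 'yyyyMMdd').csv" -NoTypeInformation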
Jan 21 2021 02:03 AM
@swhitestrath I've noted some issues with audit log retrieval recently too. The best idea is to log a support incident. If you don't, no engineer will ever look at it to figure out what's going wrong.
Jan 26 2021 12:44 AM - edited Jan 26 2021 12:45 AM
@Tony Redmond I have opened a ticket with Microsoft and will update if they come back with anything. In the meantime I have updated my code to check for this -1 ResultIndex; if it detects it, it waits 2 minutes and tries the command again.
Jan 26 2021 02:03 AM
@swhitestrath Just to be sure I understand, you see -1 returned in the search results as in $currentresults[0].resultindex?
That makes sense, because ResultIndex tracks the position of each record within the retrieved results... so -1 gives you an indication that the search results returned aren't good.
Question: When you resume, does ResultIndex recommence at the right value? For instance, let's assume that you have retrieved 3000 records so far, the first record returned by the next successful call should have a value of 3001 in ResultIndex. Is that what you see?
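If it helps to check, something along these lines after each page would show whether the paging is picking up where it left off (a rough sketch only; the variable names are illustrative):

# Note where the server thinks this page sits within the full result set
$firstIndex = $currentResults[0].ResultIndex
$lastIndex  = $currentResults[-1].ResultIndex
$total      = $currentResults[0].ResultCount
Write-Output ("Page covers ResultIndex {0} to {1} of {2}" -f $firstIndex, $lastIndex, $total)
# A healthy continuation starts one past the last index of the previous page;
# -1 anywhere means the session data can't be trusted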
Jan 26 2021 02:12 AM
@Tony Redmond I just reset everything and try again so the index returns to 1. The second time is usually successful.
if ($null -eq $currentResults -or $currentResults[0].ResultIndex -eq -1) {
    # Start a fresh search session and throw away anything collected so far
    $sessionName = (Get-Date -Format 'u') + 'streamauditlog' + $i
    $aggregateResults = @()
    if ($currentResults[0].ResultIndex -eq -1) {
        $logmsg = "-1 Index Found, Loop Restarted"
    }
    if ($null -eq $currentResults) {
        $logmsg = "No Results returned, retrying..."
    }
    $errorCount++
    LogToFile -EntryType "ERROR" -Message $logmsg -Logfile $lgfile
    Start-Sleep -Seconds 120
    continue
}
Jan 26 2021 02:31 AM
@swhitestrath OK, so you basically conclude that -1 means the search is bad and irrecoverable and therefore needs to be restarted. That's fair. I wonder whether, if you threw away the page of duplicate results, paused, and then reran the same search, it might resume properly. Did you try that?
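Something like this is what I had in mind (a rough sketch only, reusing your existing $sessionName and date variables rather than generating a new session):

# Discard the bad page, pause, then re-issue the call with the SAME SessionId
# so the server-side session can (hopefully) resume where it left off
if ($currentResults[0].ResultIndex -eq -1) {
    Start-Sleep -Seconds 120
    $currentResults = Search-UnifiedAuditLog -StartDate $startDate -EndDate $enddate -SessionId $sessionName -SessionCommand ReturnLargeSet -ResultSize 1000 -RecordType MicrosoftStream
}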
Jan 26 2021 04:01 AM
@Tony Redmond Yeah, basically. It seems to return only one page of correct results, and then the rest of the pages are a bunch of duplicate entries with this -1 ResultIndex. I'm always conscious it could be something I'm doing on my side, but when the script loops back around and tries again, it runs the exact same command and it works, which makes me think it's more likely something at Microsoft's end.
Jan 26 2021 04:48 AM - edited Jan 26 2021 06:02 AM
@Tony Redmond I thought I would just run the command bare-bones, with no loops or other code, and I still get this -1 ResultIndex.
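i.e. nothing more than something like this (dates hard-coded purely for the test):

# Bare-bones repro: one call, no session loop, no retry logic
$test = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-1) -EndDate (Get-Date) -SessionCommand ReturnLargeSet -ResultSize 1000 -RecordType MicrosoftStream
$test[0].ResultIndex   # sometimes 1 as expected, sometimes -1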
Jan 26 2021 05:00 AM
@swhitestrath You don't really have any other levers to pull, so it makes sense to reset and restart when bad data is returned. At least you have a method to figure out when you're getting information you can't use. But it would be interesting if Microsoft Support can throw any further light onto this topic.
Jun 01 2022 05:11 AM - edited Jun 01 2022 05:12 AM
Well, it's June 2022 and I am seeing the same issue with duplicate results. I implemented a simple while loop to check for it and repeat the run until the data is good. I set a maximum of 10 loops, but I have noticed that the data is usually OK on the 2nd query (but not always!).
# Attempt 1
$AuditResults = Search-UnifiedAuditLog -StartDate $startOfMonth -EndDate $endOfMonth -SessionCommand ReturnLargeSet -ResultSize 1000 -Operations FileDownloaded -SiteIds "<Insert Site ID>" -FreeText Certification

# Additional attempts
if ($AuditResults.Count -gt 0)
{
    # We have some results. Check whether the data is good; a ResultIndex of -1 means it isn't.
    $loop = 0
    while ($AuditResults[0].ResultIndex -eq -1)
    {
        $AuditResults = Search-UnifiedAuditLog -StartDate $startOfMonth -EndDate $endOfMonth -SessionCommand ReturnLargeSet -ResultSize 1000 -Operations FileDownloaded -SiteIds "<Insert Site ID>" -FreeText Certification
        $loop++
        # Prevent an infinite (or very long) loop by capping at 10 iterations
        if ($loop -gt 10) {
            break
        }
    }
}
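One thing worth adding after the loop (sketch only) is a final sanity check, so nothing downstream runs against a bad result set if all 10 retries came back with -1:

# If we dropped out after 10 attempts the data may still be bad; flag it rather than use it
if ($AuditResults.Count -gt 0 -and $AuditResults[0].ResultIndex -eq -1) {
    Write-Warning "Still getting ResultIndex -1 after 10 retries - giving up on this run"
}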