SOLVED

Searching Audit Log: Strange Behavior

Brass Contributor

Back in October, I created a script that dumps daily Microsoft Stream audit data into CSVs so that Power BI reports can be created. It was working fine until 16 December, but it now seems to be returning, very inconsistently, thousands of duplicate records with a ResultIndex of -1. A correct return of data usually has ResultIndex running from 1 up to the total number of records. Does anyone know why this may be occurring?

Do {
    try {
        $currentResults = Search-UnifiedAuditLog -StartDate $startDate -EndDate $endDate -SessionId $sessionName -SessionCommand ReturnLargeSet -ResultSize 1000 -RecordType MicrosoftStream -Operations StreamCreateVideo,StreamEditVideo,StreamDeleteVideo,StreamInvokeVideoUpload,StreamInvokeVideoDownload,StreamEditVideoPermissions,StreamInvokeVideoView,StreamInvokeVideoShare,StreamInvokeVideoLike,StreamInvokeVideoUnLike,StreamCreateVideoComment,StreamDeleteVideoComment,StreamInvokeVideoTextTrackUpload,StreamInvokeVideoThumbnailUpload,StreamDeleteVideoThumbnail,StreamInvokeVideoMakePublic,StreamInvokeVideoMakePrivate,StreamCreateGroup,StreamEditGroup,StreamDeleteGroup,StreamEditGroupMemberships,StreamCreateChannel,StreamEditChannel,StreamDeleteChannel
    } catch [System.Exception] {
        $errorCount++
        $ErrorMessage = " Error: " + $_.Exception.Message
        $logmsg = $ErrorMessage
        Write-Output $logmsg
        LogToFile -EntryType "ERROR" -Message $logmsg -Logfile $lgfile
    }
    if ($currentResults.Count -gt 0) {
        $logmsg = ("  Finished search #{0}, {1} records: {2} min" -f $i, $currentResults.Count, [math]::Round((New-TimeSpan -Start $scriptStart).TotalMinutes, 4))
        Write-Output $logmsg
        LogToFile -EntryType "INFORMATION" -Message $logmsg -Logfile $lgfile
        # Accumulate the data, 1000 records at a time
        $aggregateResults += $currentResults
        # If the batch is below 1000 records, this was the last page; stop the loop
        if ($currentResults.Count -lt 1000) {
            $currentResults = @()
        } else {
            $i++
        }
    }
} Until ($currentResults.Count -eq 0) # --- End of Session Search Loop --- #

12 Replies

@swhitestrath I've noticed some issues with audit log retrieval recently too. The best idea is to log a support incident; if you don't, no engineer will ever look into what's going wrong.

@Tony Redmond I have opened a ticket with Microsoft and will update if they come back with anything. In the meantime, I have updated my code to check for this -1 ResultIndex; if it detects it, it waits 2 minutes and tries the command again.

@swhitestrath Just to be sure I understand: you see -1 returned in the search results, as in $currentResults[0].ResultIndex?

This makes sense, because ResultIndex tracks the position of the search in terms of retrieved results, so -1 is an indication that the search results returned aren't good.

Question: when you resume, does ResultIndex recommence at the right value? For instance, assume you have retrieved 3000 records so far: the first record returned by the next successful call should then have a ResultIndex of 3001. Is that what you see?
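A quick way to test that, assuming the pages retrieved so far are accumulated in $aggregateResults as in the loop above, is a continuity check on the first record of each new page (a sketch only, not tested against the service):

# Sketch: verify that each new page continues where the last one stopped.
# Assumes $aggregateResults holds every record retrieved so far.
$expectedIndex = $aggregateResults.Count + 1
$firstIndex    = $currentResults[0].ResultIndex
if ($firstIndex -ne $expectedIndex) {
    Write-Output ("Discontinuity: expected ResultIndex {0}, got {1}" -f $expectedIndex, $firstIndex)
}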

@Tony Redmond I just reset everything and try again, so the index returns to 1. The second time is usually successful.

if ($null -eq $currentResults -or $currentResults[0].ResultIndex -eq -1) {
    # Start a fresh search session so the retry isn't tied to the failed one
    $sessionName = (Get-Date -Format 'u') + 'streamauditlog' + $i
    $aggregateResults = @()
    if ($null -eq $currentResults) {
        $logmsg = "No Results returned, retrying..."
    } else {
        $logmsg = "-1 Index Found, Loop Restarted"
    }
    $errorCount++
    LogToFile -EntryType "ERROR" -Message $logmsg -Logfile $lgfile
    Start-Sleep -Seconds 120
    continue
}

@swhitestrath OK, so you basically conclude that -1 means the search is bad and irrecoverable, and therefore needs to be restarted. That's fair. I wonder whether, if you threw away the page of duplicate results, paused, and then resumed the same search, it might restart properly. Did you try that?

@Tony Redmond Yeah, basically. It seems to return only one page of correct results, and then the rest of the pages are a bunch of duplicate entries with this -1 ResultIndex. I'm always conscious that it could be something I am doing at my side, but when it loops back around and tries again it is the exact same command and it works, which makes me think it's more likely something at Microsoft's end.

@Tony Redmond I thought I would just run the command bare-bones, with no loops or other code, and I still get this -1 ResultIndex.

@swhitestrath You don't really have any other levers to pull, so it makes sense to reset and restart when bad data is returned. At least you have a method to figure out when you're getting information you can't use. But it would be interesting to see whether Microsoft Support can throw any further light on this topic.

best response confirmed by swhitestrath (Brass Contributor)
Solution
Just thought I would give you an update on the response I got back from Microsoft Support.
"When the ResultCount is 0 or ResultIndex is -1, the search faced internal timeout.
Please ignore the results returned when the internal timeout occurs, and (wait 5 minutes then) try search again."

Pretty generic response but oh well.
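For what it's worth, that advice is easy to wrap into a small retry helper. This is only a sketch: the function name is made up, and the 5-minute delay simply follows the support response above.

function Invoke-AuditSearchWithRetry {
    param(
        [scriptblock]$Search,        # the Search-UnifiedAuditLog call to run
        [int]$MaxAttempts  = 5,
        [int]$DelaySeconds = 300     # support suggested waiting 5 minutes
    )
    for ($attempt = 1; $attempt -le $MaxAttempts; $attempt++) {
        $results = & $Search
        # Per the support response: discard the batch on a 0 count or a -1 index
        if ($results.Count -gt 0 -and $results[0].ResultIndex -ne -1) {
            return $results
        }
        Write-Output ("Attempt {0} hit the internal timeout, retrying in {1}s" -f $attempt, $DelaySeconds)
        Start-Sleep -Seconds $DelaySeconds
    }
    return $null   # every attempt hit the timeout condition
}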
Oh well, it's an answer...
Hi,
I found this thread by chance, and I too have noticed duplicate records when exporting an audit log search into a spreadsheet.
On a separate cheeky note: I want to monitor user creation, modification, and deletion of Microsoft Teams, and as I'm not familiar with building scripts, I was wondering if you might share your script that loads audit log records into a CSV file?
- Frank

Well, it's June 2022 and I am seeing the same issue with duplicate results. I implemented a simple while loop to check for it and repeat the run until the data is good. I set a maximum of 10 loops, but I have noticed that the data is usually OK on the 2nd query (though not always!).

# Attempt 1
$loop = 0
$AuditResults = Search-UnifiedAuditLog -StartDate $startOfMonth -EndDate $endOfMonth -SessionCommand ReturnLargeSet -ResultSize 1000 -Operations FileDownloaded -SiteID "<Insert Site ID>" -FreeText Certification
# Additional attempts
if ($AuditResults.Count -gt 0) {
    # We have some results. Check whether the data is good.
    while ($AuditResults[0].ResultIndex -eq -1) {
        $AuditResults = Search-UnifiedAuditLog -StartDate $startOfMonth -EndDate $endOfMonth -SessionCommand ReturnLargeSet -ResultSize 1000 -Operations FileDownloaded -SiteID "<Insert Site ID>" -FreeText Certification
        $loop++
        # Prevent an infinite or very long loop by capping at 10 iterations
        if ($loop -gt 10) {
            break
        }
    }
}
