SOLVED

Help with parameter for Search-UnifiedAuditLog


Hi,

 

Disclaimer: I am new to PowerShell, hence I'm turning here for your input.

 

Background:

I'm creating a Power BI dashboard based on data exported from the O365 Audit Log. For the moment, I'm not using the recently launched API, but a daily scheduled PowerShell script to export the data. Everything works fine with one exception:

 

Issue:

The maximum number of objects that can be returned in one query is 5000 (see the article linked below for details). We have a fairly small tenant, so I have divided the cmdlet to run twice per 24h (AM+PM), which gives us 10k rows and is usually enough. The exception is when we run SharePoint backups via Metalogix: that operation walks through every single file, which results in logs well above 10k rows.
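
Roughly what my current split looks like, as a simplified sketch (not my exact script; the CSV path is just an example):

    # Simplified sketch of the current AM+PM split: two 12-hour windows per day
    $midnight = (Get-Date).Date
    $am = Search-UnifiedAuditLog -StartDate $midnight -EndDate $midnight.AddHours(12) -ResultSize 5000
    $pm = Search-UnifiedAuditLog -StartDate $midnight.AddHours(12) -EndDate $midnight.AddHours(24) -ResultSize 5000
    # Combine both windows into one CSV for Power BI (path is an example)
    $am + $pm | Export-Csv -Path .\AuditLog.csv -NoTypeInformation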

 

Solution?

What I want to achieve is to exclude the user account associated with the backup operation, so that only "relevant" objects/events are returned. I was hoping to achieve this by using the parameters provided with the cmdlet, but with my limited knowledge I can't figure it out. I can pass user IDs to include, but would this also allow me to exclude by user ID? If so, how?

 

I can always divide the cmdlet to run more than twice per 24h, allowing more objects to be returned, but I hope there is a better solution to this. 
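
If the cmdlet itself can't exclude a user, I suppose a fallback is filtering afterwards, something like the sketch below (backup@contoso.com is just a placeholder for our backup account). The downside is that the 5000-object cap is applied server-side, so the backup events would still count against it:

    # Fallback sketch: drop the backup account client-side after retrieval.
    # backup@contoso.com is a placeholder; the 5000 cap still counts the excluded events.
    $midnight = (Get-Date).Date
    Search-UnifiedAuditLog -StartDate $midnight -EndDate $midnight.AddHours(12) -ResultSize 5000 |
        Where-Object { $_.UserIds -notlike "*backup@contoso.com*" }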

 

https://technet.microsoft.com/en-us/library/mt238501%28v=exchg.160%29.aspx?f=255&MSPPError=-21472173...

 

Many thanks!


@Pontus T Hello, I love your script, it's exactly what we have been trying to accomplish; however, I have found a problem. It seems that many records are duplicated, and even though I am sorting the records in the command by creation date, they come out of order. It's almost as if the file is appended about 100 records at a time, and I notice the creation dates are jumbled. For example, the creation dates will be:

5/20/2019 0:19
5/19/2019 21:23
5/16/2019 22:40
5/16/2019 20:34
5/16/2019 20:30
5/16/2019 20:30
5/16/2019 20:30
5/16/2019 12:28
5/16/2019 12:26
5/24/2019 21:00
5/24/2019 20:59
5/24/2019 20:59
5/24/2019 20:58
5/24/2019 20:58
5/24/2019 20:57

This wouldn't be so bad, as I can sort the data in Excel, but because there are duplicate records it seems there may be some overlap during the append.
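
If it really is overlap, one cleanup pass I'm considering is de-duplicating on the record Identity before sorting (a sketch, assuming $results holds the returned records and that Identity is unique per record):

    # Sketch: de-duplicate on the record Identity, then restore chronological order.
    # Assumes $results holds the objects returned by Search-UnifiedAuditLog.
    $results |
        Sort-Object Identity -Unique |
        Sort-Object CreationDate |
        Export-Csv -Path .\AuditLog.csv -NoTypeInformation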

@Rajiv Chokshi Thanks Rajiv, that script functions much like the one in this post, but it has some extra logging functionality that I like. However, there is still an issue with duplicate records, and the append still writes the data out of order. When I look at the creation times they are broken up; the numbers below show how the appending is broken, when the creation times should all be written sequentially. Any idea why this is not happening? Does it need a separate flag or pipe to fix it?

 

10
9
8
7
6
5
4
3
2
1
18
17
16
15
14
13
12
11
25
24
23
22
21
20
19

I was able to get this working. I had to add "| Sort-Object CreationDate |" as a pipe right before the Export-Csv command. The default order is ascending, so that fixed the problem of the records being out of order. As for the duplicate records, looking in the Office 365 service center it seems these records are already duplicated there, so it's not a problem with overlap in the command. Furthermore, does anybody know if it's possible to download the audit logs and correct for time zone? It seems all the creation dates are listed in UTC rather than my local time zone, as they are in the online interface.
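
In case anyone tries the same, I'm guessing something like a calculated property could convert the UTC CreationDate before export (a sketch, untested):

    # Sketch: add a local-time column converted from the UTC CreationDate.
    # ToLocalTime() treats an unspecified-kind DateTime as UTC, which matches how these dates come back.
    $results | Sort-Object CreationDate |
        Select-Object *, @{ Name = "LocalCreationDate"; Expression = { $_.CreationDate.ToLocalTime() } } |
        Export-Csv -Path .\AuditLog.csv -NoTypeInformation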

@Pontus T 

Here is my approach to solving this problem. I had something similar and wanted to share it with you: there was a lot of chatter from one specific operation rendering the 5000 limit useless within the 24 hours, so I created a 4-hour iteration that ignores the bogus operation. Hope it helps you.

 

    # Ignore the noisy "set-whatever" operation (over 1000 hits every x hours)
    $day = Get-Date
    # Set the start date to today at midnight
    $minutes = $day.TimeOfDay.TotalMinutes
    $startdate = $day.AddMinutes(-$minutes)
    # Set the end date to the following midnight
    $enddate = $startdate.AddHours(24)

    $logsearch = @()
    # Iterate in $increment-hour windows, filtering out the bogus operation
    $increment = 4
    for ($i = 0; $i -le (24 - $increment); $i += $increment) {
        $i # progress output: which hour offset we are on
        $logsearch += Search-UnifiedAuditLog -StartDate $startdate.AddHours($i) -EndDate $startdate.AddHours($i + $increment) -RecordType <searchtype> -SessionCommand ReturnLargeSet -ResultSize 5000 | Where-Object Operations -NotMatch "set-whatever"
    }

You could also restrict a group of operations at once by using the -in / -notin operators if you have a bunch, as sketched below.
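
For instance, a minimal sketch of that, as a drop-in for the line inside the loop above (the operation names below are placeholders):

    # Sketch: exclude a set of noisy operations in one filter; the names are placeholders.
    $noisyOps = "set-whatever", "set-somethingelse"
    $logsearch += Search-UnifiedAuditLog -StartDate $startdate.AddHours($i) -EndDate $startdate.AddHours($i + $increment) -SessionCommand ReturnLargeSet -ResultSize 5000 |
        Where-Object { $_.Operations -notin $noisyOps }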