SOLVED

Help with parameter for Search-UnifiedAuditLog

Iron Contributor

Hi,

 

Disclaimer: I am new to PowerShell, which is why I'm turning here for your input.

 

Background:

I'm creating a Power BI dashboard based on data exported from the O365 Audit Log. For the moment I'm not using the recently launched API, but a daily scheduled PowerShell script that exports the data. Everything works fine, with one exception:

 

Issue:

The maximum number of objects that can be returned in one query is 5000 (see the article linked below for details). We have a fairly small tenant, so I have divided the cmdlet to run twice per 24 hours (AM + PM), which gives us 10k rows and is usually enough. The exception is when we run SharePoint backups via Metalogix: the operation walks through every single file, which results in logs well above 10k rows.

 

Solution?

What I want to achieve is to exclude the user account associated with the backup operation, so that only "relevant" objects/events are returned. I was hoping to achieve this using the parameters provided by the cmdlet, but with my limited knowledge I can't figure it out. I can pass user IDs to include, but is there also a way to exclude by user ID? If so, how?

 

I can always divide the cmdlet to run more than twice per 24h, allowing more objects to be returned, but I hope there is a better solution to this. 

 

https://technet.microsoft.com/en-us/library/mt238501%28v=exchg.160%29.aspx?f=255&MSPPError=-21472173...

 

Many thanks!

23 Replies

What you are looking for, I guess, is something like a "NOT" operator, which would allow you to exclude all actions from a particular user/account. Unfortunately, I'm not aware of such an option for the UserIds parameter of the Search-UnifiedAuditLog cmdlet. I guess you can do it the other way around: use the UserIds parameter to explicitly list all users, apart from the service account used for backups.
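
Something along these lines might do it (a rough, untested sketch; "backup@contoso.com" is a stand-in for the backup service account):

# Rough sketch: build an inclusion list of every user except the backup account
# ("backup@contoso.com" is a placeholder - substitute your own service account)
$allUsers = Get-Mailbox -ResultSize Unlimited | Select-Object -ExpandProperty UserPrincipalName
$included = $allUsers | Where-Object { $_ -ne "backup@contoso.com" }

# Pass the list to -UserIds so the backup account's events are never returned
Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -UserIds $included -ResultSize 5000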

 

Flagging @Tony Redmond to double-check.

Have you checked the "SessionCommand" parameter of the Search-UnifiedAuditLog cmdlet? By using it you can get all the records, but you then need to do the filtering in the database where you have stored the audit logs.

 

 

@Vasil Michev Yes, a NOT EQUAL operator would be the optimal thing, but I have not got that to work. I guess it's not supported, as no similar functionality exists for the Audit Log interface in the Security & Compliance Center. And passing all users except one is not a great option, unfortunately.

@NarasimaPerumal Chandramohan I have been fiddling around with that a bit but did not manage to get it to work. To me the instructions on how to use it are a bit vague: I don't know what they mean by paging, and a full tenant backup is likely to require more than 50k objects to be returned.

 

However, it seems like I can use multiple session IDs and that each one will return up to 50k objects if I use the ReturnLargeSet value for the SessionCommand parameter. I think it's worth trying out again. I need to make sure that the rest of the script can handle the way objects are returned through paging, as it needs to convert from JSON to allow proper export.

 

Here is a bit of the script I use today:

 

# PM half of the day: noon to one minute before midnight
$AuditOutputPM = Search-UnifiedAuditLog -StartDate "$StartDate 12:00 PM" -EndDate "$EndDate 11:59 PM" -ResultSize 5000
# AM half of the day: midnight to one minute before noon
$AuditOutputAM = Search-UnifiedAuditLog -StartDate "$StartDate 12:00 AM" -EndDate "$EndDate 11:59 AM" -ResultSize 5000

# Expand the nested AuditData property and convert it from JSON into objects
$ConvertedOutputPM = $AuditOutputPM | Select-Object -ExpandProperty AuditData | ConvertFrom-Json
$ConvertedOutputAM = $AuditOutputAM | Select-Object -ExpandProperty AuditData | ConvertFrom-Json

# Export the fields of interest, appending so both halves land in one file
$ConvertedOutputPM | Select-Object CreationTime,UserId,Operation,Workload,ObjectID,SiteUrl,SourceFileName,ClientIP,UserAgent | Export-Csv $OutputFile -NoTypeInformation -Append
$ConvertedOutputAM | Select-Object CreationTime,UserId,Operation,Workload,ObjectID,SiteUrl,SourceFileName,ClientIP,UserAgent | Export-Csv $OutputFile -NoTypeInformation -Append

I am unaware of a way to exclude accounts from a cmdlet pull for audit records. There might be an easier way for you to do this... Cogmotive Reports has a Discover and Audit module that works against the audit log records (in other words, the same data), and it comes with a pivot table capability that might let you do what you want. They also are able to provide more than 5,000 records at a time because they store the audit log data in their own stores. You could try that...

@Tony Redmond Thank you for the tip about Cogmotive. I will have a look at the product for sure. However, I like the customizability that I get in Power BI, building interactive reports for different stakeholders. And I guess it's something about the challenge of building it yourself :)

best response confirmed by Pontus T (Iron Contributor)
Solution

@NarasimaPerumal Chandramohan thanks for pointing me in the right direction. I managed to solve it by using SessionId and SessionCommand. All I needed was a while loop that keeps running until the variable holding the audit data returns null, appending to the export file on every iteration.
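
In outline, the loop looks something like this (a minimal sketch; my full script is posted further down in this thread):

# Minimal sketch: keep calling the cmdlet with the same SessionId until it
# stops returning records, appending each page of results to the CSV
$AuditOutput = 1   # any non-null value to enter the loop
while ($AuditOutput) {
    $AuditOutput = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -SessionId "1" -SessionCommand ReturnLargeSet
    $AuditOutput | Select-Object -ExpandProperty AuditData | ConvertFrom-Json | Export-Csv $OutputFile -NoTypeInformation -Append
}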

Remember that if you build a solution, you have to maintain the code... and things break, like APIs. That's one of the reasons why it is sometimes better to go with a commercial solution and let the ISV have the headache of maintaining the code and making sure that it works, even when the APIs are in flux (as does happen inside Office 365).

 

TR



Is this possible with the native PowerShell cmdlet Search-UnifiedAuditLog?

I am trying to figure out whether it is possible to get a SINGLE report of all activities on a particular SPO site collection using Search-UnifiedAuditLog.

For example, I want to know everything that happened on the site collection https://contoso.sharepoint.com/sites/projectabc in the last 30 days, and this should include all of the following:

  • File and page activities
  • Folder activities
  • Sharing and access request activities
  • Synchronization activities
  • Site administration activities

 

Thanks in advance. I was able to do this with AdminDroid, but I am looking to see if the native cmdlet can accomplish it.

@Rajiv Chokshi Sorry for the delayed reply. Yes, this should be possible. Anything you can query in the Audit log in the Security & Compliance Center can be pulled from the Audit log using Search-UnifiedAuditLog and exported to CSV. So start by going to the Office 365 Audit log and confirming that you can find the actions you are looking for.

Then you can start designing the script. To get you started, below is a copy of the scripts I am using. The second one loops the cmdlet using the session parameters until all the records are exported to the CSV file. When converting from JSON and exporting to CSV, I am selecting certain objects that are of value for my use case, but there are more objects holding data that might be valuable for you; removing the selection will bring it all in. Also, I am searching 3 days back in time, as I have noticed that some records take that long before they are saved to the audit log. You can experiment with this and compare the number of rows you get depending on how far back you set the interval.

 

I am far from an expert on PowerShell, but with patience and help from our database engineer at work, we have created a Power BI dashboard that can display all audit log actions across the whole tenant in a user-friendly and graphical way. It gets new data from the script on a daily basis, and because the data is exported you have no limits on searching back in time, compared to the (newly announced) 90 days that MSFT retains the Audit log.

 

Please note that you should start by using a script to securely save (encrypt) your credentials, which are then read at the beginning of the second script (you only need to run this once). You should probably create a service account for this and set it up with the minimal permissions needed (more under the "Before you begin" tab under this link).

 

First script to store your credentials

 

param(
    [Parameter(Mandatory=$true)]
    [string]$Path,
    [Parameter(Mandatory=$true)]
    [string]$password
    )

    # Convert the password to a secure string
    $SecurePassword = $password | ConvertTo-SecureString -AsPlainText -Force
    
    # Store the credential in the path
    $SecurePassword | ConvertFrom-SecureString | Out-File $Path
    
    # Write What we did
    Write-Host "Wrote password to $path"    
    
<# 
.SYNOPSIS
Stores a password in a file on the local computer for retrieval by scripts.

.DESCRIPTION
Used for securely storing a password on a machine for use with automated scripts.

Takes a password and encrypts it using the local account, then stores that password in a file you specify.
Only the account that creates the output file can decrypt the password stored in the file.

.PARAMETER Path
Path and file name for the password file that will be created.

.PARAMETER Password
Plain text version of password.

.OUTPUTS
File Specified in Path variable will contain an encrypted version of the password.

.EXAMPLE
.\New-StoredCredential.ps1 -Path c:\scripts\O365Account.txt -Password "Password123"

Puts the encrypted version of Password123 into the c:\scripts\O365Account.txt file
#>

Second script to pull data from the audit log (you need to define your own variables in some places, as I have removed mine from this copy)

 

# Set constants
$Username = "[YOUR ACCOUNT, RECOMMENDED TO USE THE PRINCIPLE OF LEAST PRIVILEGE]"
$PasswordPath = "[YOUR PATH DEFINED IN SCRIPT ABOVE]"

# Read the password from the file and convert to SecureString
Write-Host "Getting password from $passwordpath"
$SecurePassword = Get-Content $PasswordPath | ConvertTo-SecureString

# Build a Credential Object from the password file and the $username constant
$Credential = New-Object System.Management.Automation.PSCredential -ArgumentList $Username, $SecurePassword

# Open a session to Exchange Online, where the audit log cmdlet is located
$O365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Credential -Authentication Basic -AllowRedirection
Import-PSSession $O365Session

# Define dates for search. MSFT claims the Windows short date format on the machine running the script should be used, but testing proved "MM/dd/yyyy" must be used
$StartDate = (get-date).AddDays(-3).ToString("MM/dd/yyyy")
$EndDate = (get-date).AddDays(-2).ToString("MM/dd/yyyy")

# Define file name and path for csv export
$FileAppend = (get-date).AddDays(-3).ToString("yyyyMMdd")
# [string]$FileAppend = (Get-Date -Format yyyyMMdd)
$OutputFile = "[DEFINE OUTPUT FILE PATH]" + $FileAppend + ".csv"

# Set a non-null variable value to trigger the loop below (any value works)
$AuditOutput = 1

# Loop will run until $AuditOutput returns null, which means no more event objects exist for the specified dates
while ($AuditOutput) {

    # Search the defined date(s). SessionId + SessionCommand in combination with the loop will return and append 100 objects per iteration, up to the 50k-object maximum that ReturnLargeSet supports per session
    $AuditOutput = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -SessionId "1" -SessionCommand ReturnLargeSet

    # Select and expand the nested object (AuditData) as it holds relevant reporting data. Convert output format from default JSON to enable export to csv
    $ConvertedOutput = $AuditOutput | Select-Object -ExpandProperty AuditData | ConvertFrom-Json

    # Export results excluding type information. Append rather than overwrite if the file exists in the destination folder
    $ConvertedOutput | Select-Object CreationTime,UserId,Operation,Workload,ObjectID,SiteUrl,SourceFileName,ClientIP,UserAgent  | Export-Csv $OutputFile -NoTypeInformation -Append
}

# Close the session to Exchange Online
Remove-PSSession $O365Session

I hope this helps get you started. You can read more about the audit log and how you can limit its scope if needed here.

 

One thing that strikes me about searching the audit log with PowerShell is that unless you are very careful with the search criteria, you're going to see a heap of data that you might not want to process. There are now so many workloads providing audit records to the data mart that you might be flooded with records you don't expect... So make sure that your search parameters are tuned to request data for the operations you want to examine in the period you need to see...

@Tony Redmond fully agree, especially if you have a large tenant / many users. I set my search date interval to 24 hours and have about 200 licensed users, and I am getting around 3000-4000 rows per day (or about 1.5 MB of data). However, a majority of the records are associated with a service account crawling the tenant for daily backup. Additionally, we are only using Office 365 for SharePoint, so no Exchange, OneDrive or other service that would push the number of returned records to a whole new level.

Given what I see for Exchange Online events, if you added them into the mix, plus some compliance features like Office 365 retention policies, you might double the data you gather daily...

My needs are specific to SharePoint Online. Using the GUI, I am able to get the 5 different reports listed below, and I am working on figuring out the correct PowerShell parameters to combine all 5, using the append option, to get EVERYTHING that happened on a SINGLE site collection into one report.

 

  • File and page activities
  • Folder activities
  • Sharing and access request activities
  • Synchronization activities
  • Site administration activities

@Rajiv Chokshi referring to the TechNet article that I linked at the end of my answer: you can use the "Operations" parameter to define the specific activities you are looking for, but I would probably recommend querying all of them first, then reviewing and selecting the ones you care about.

 

Using the "RecordType" parameter, I see that you can also filter the search to for example include things like:

  • SharePointFileOperation
  • SharePointSharingOperation

The same goes for the site collection filter. I do not know offhand which parameter works best for this. As you can see there is one called "SiteIds", which could work, but I'm not sure the ID stays the same for all sites or subsites of a site collection. Maybe it would be better to use "ObjectIds", which is queried as a string and returns the object as a URL, so defining the site collection URL plus a wildcard character should hopefully work. Something like:

 

 

ObjectIds https://contoso.sharepoint.com/sites/sitecollectionname*
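
Putting the pieces together, a combined query could look something like the sketch below. This is hypothetical and untested: the URL is a placeholder, the operation names are just examples, and whether ObjectIds honors the wildcard is exactly what would need verifying.

# Hypothetical, untested sketch combining the filters discussed above
Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate `
    -RecordType SharePointFileOperation `
    -Operations FileAccessed,FileModified `
    -ObjectIds "https://contoso.sharepoint.com/sites/sitecollectionname*" `
    -ResultSize 5000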

 

Hope this helps!

 

 

Thanks, I will try again using your tips. Appreciate your help.

The SiteIds parameter doesn't work. I tried passing in WebId and Site, but I get an error:

The Site Id search is not yet supported.
    + CategoryInfo          : InvalidArgument: (:) [Search-UnifiedAuditLog], NotSupportedException
    + FullyQualifiedErrorId : [Server=VI1P191MB0240,RequestId=800fa919-711b-47ee-9915-699b8684a804,TimeStamp=28/12/2017 11:30:37] [FailureCategory=Cmdlet-NotSupportedException] 518C5EC4,Microsoft.Exchange.Management.SystemConfigurationTasks.SearchUnifiedAuditLog
    + PSComputerName        : outlook.office365.com

I believe the SiteIds parameter does work; you just need to use the Site GUID (which you can obtain from the Audit Log Search in the Security and Compliance Center).

SiteIds absolutely does work, but as you indicate, the problem is to find the GUID for the site. If you have an audit record for an action performed in the site, you can find it in the "more information" details for the record. You can then do something like:

 

PS C:\> Search-UnifiedAuditLog -SiteIds acfe74d8-edfb-436d-924b-e018666605ee -StartDate 1-Dec-2017 -EndDate 18-Apr-2018 | ft CreationDate, UserIds, Operations
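
If you don't have a suitable audit record to hand, one possible way to look up the GUID is via the PnP PowerShell module (a sketch, assuming the module is installed; the URL and credential are placeholders):

# Sketch: fetch a site collection's GUID with PnP PowerShell (assumes the module is installed)
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/projectabc" -Credentials $Credential
(Get-PnPSite -Includes Id).Id    # the GUID to pass to -SiteIds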