Forum Discussion
Help with parameter for Search-UnifiedAuditLog
- Apr 28, 2017
NarasimaPerumal Chandramohan thanks for pointing me in the right direction. I managed to solve it by using SessionId and SessionCommand. All I needed was a while loop that kept running until the variable holding the audit data returned null, appending the export file on every loop run.
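For anyone landing here later, the loop looks roughly like this (a minimal sketch, assuming you are already connected to Exchange Online; the dates, session name, and file path are placeholders you would replace):

```powershell
# Sketch of the paging loop: reusing the same SessionId with ReturnLargeSet
# makes each call return the next page, until the cmdlet returns null.
$Results = Search-UnifiedAuditLog -StartDate 05/01/2017 -EndDate 05/02/2017 `
    -SessionId "ExampleSession" -SessionCommand ReturnLargeSet
while ($Results) {
    # Append this page to the export, then request the next page
    $Results | Export-Csv .\AuditExport.csv -NoTypeInformation -Append
    $Results = Search-UnifiedAuditLog -StartDate 05/01/2017 -EndDate 05/02/2017 `
        -SessionId "ExampleSession" -SessionCommand ReturnLargeSet
}
```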
Remember that if you build a solution, you have to maintain the code... and things break, like APIs. That's one reason why it is sometimes better to go with a commercial solution and let the ISV have the headache of maintaining the code and making sure it works even when the APIs are in flux (as does happen inside Office 365).
TR
Pontus T wrote: TonyRedmond Thank you for the tip about Cogmotive. I will have a look at the product for sure. However, I like the customizability that I get in Power BI, building interactive reports for different stakeholders. And I guess it's something about the challenge of building it yourself :)
Is this possible with the native PowerShell command search-unifiedauditlog?
I am trying to figure out if it is possible to get a SINGLE report for all activities on a particular SPO site collection using the PowerShell command search-unifiedauditlog.
For example, I want to know everything that happened on the site collection https://contoso.sharepoint.com/sites/projectabc in the last 30 days, and the report should include all of these:
File and page activities
Folder activities
Sharing and access request activities
Synchronization activities
Site administration activities
Thanks in advance. I was able to do this with AdminDroid, but I'm looking to see if the native cmdlet can accomplish it.
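One way to scope the search to a single site collection is to pull the window of records and filter client-side on the parsed AuditData, since the SiteUrl/ObjectId fields vary by record type (a sketch, assuming an existing Exchange Online session; the URL is the example from the question, and the trailing-slash match should be verified against your own records):

```powershell
# Sketch: last 30 days of audit records, kept only where they relate to the target site.
$Site = "https://contoso.sharepoint.com/sites/projectabc"
$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date) `
    -SessionId "ProjectAbcReport" -SessionCommand ReturnLargeSet
$Records |
    Select-Object -ExpandProperty AuditData |
    ConvertFrom-Json |
    Where-Object { $_.SiteUrl -like "$Site*" -or $_.ObjectId -like "$Site/*" } |
    Export-Csv .\ProjectAbcAudit.csv -NoTypeInformation
```

This catches file, folder, sharing, sync, and site-administration events in one report, because the filter is on the site rather than on the activity type.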
- Pontus T, Jul 20, 2017, Iron Contributor
Rajiv Chokshi Sorry for the delayed reply. Yes, this should be possible. Anything you can query in the Audit log in the Security & Compliance Center can be pulled from it using Search-UnifiedAuditLog and exported to CSV. So start by going to the Office 365 Audit log and confirming that you can find the actions you are looking for.
Then you can start designing the script. To get you started, below is a copy of the scripts I am using. The second one loops the cmdlet using the session parameters until all the records are exported to the CSV file. When converting from JSON and exporting to CSV, I select certain objects that are of value for my use case, but there are more objects holding data that might be valuable for you; removing the selection should bring it all in. Also, I search 3 days back in time, as I have noticed that some records take that long before they are saved to the audit log. You can experiment with this and compare the number of rows you get depending on how far back you set the interval.

I am far from an expert on PowerShell, but with patience and help from our database engineer at work, we have created a Power BI dashboard that can display all audit log actions across the whole tenant in a user-friendly, graphical way. It gets new data from the script on a daily basis, and because the data is exported you have no limit on searching back in time, compared to the (newly announced) 90 days that MSFT retains the Audit log.
Please note that you should start by using a script to securely save (encrypt) your credentials, which are then read at the beginning of the second script (you only need to do this once). You should probably create a service account for this and set it up with the minimal permissions needed (more under the "Before you begin" tab at https://support.office.com/en-us/article/Search-the-audit-log-in-the-Office-365-Security-Compliance-Center-0d4d0f35-390b-4518-800e-0c7ec95e946c?ui=en-US&rs=en-US&ad=US&fromAR=1#PickTab=BYB).
First script to store your credentials
param(
    [Parameter(Mandatory=$true)]
    [string]$Path,
    [Parameter(Mandatory=$true)]
    [string]$Password
)

# Convert the password to a secure string
$SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force

# Store the credential in the path
$SecurePassword | ConvertFrom-SecureString | Out-File $Path

# Write what we did
Write-Host "Wrote password to $Path"

<#
.SYNOPSIS
Stores a password in a file on the local computer for retrieval by scripts.
.DESCRIPTION
Used for securely storing a password on a machine for use with automated scripts.
Takes a password and encrypts it using the local account, then stores that password
in a file you specify. Only the account that creates the output file can decrypt
the password stored in the file.
.PARAMETER Path
Path and file name for the password file that will be created.
.PARAMETER Password
Plain text version of the password.
.OUTPUTS
The file specified in the Path parameter will contain an encrypted version of the password.
.EXAMPLE
.\New-StoredCredential.ps1 -Path c:\scripts\O365Account.txt -Password "Password123"
Puts the encrypted version of Password123 into the c:\scripts\O365Account.txt file
#>

Second script to pull data from the audit log (you need to define your own variables in some places, as I have removed mine from this copy)
# Set constants
$Username = "[YOUR ACCOUNT, RECOMMENDED TO USE THE PRINCIPLE OF LEAST PRIVILEGE]"
$PasswordPath = "[YOUR PATH DEFINED IN SCRIPT ABOVE]"

# Read the password from the file and convert to SecureString
Write-Host "Getting password from $PasswordPath"
$SecurePassword = Get-Content $PasswordPath | ConvertTo-SecureString

# Build a credential object from the password file and the $Username constant
$Credential = New-Object System.Management.Automation.PSCredential -ArgumentList $Username, $SecurePassword

# Open a session to Exchange Online, where the audit log cmdlet is located
$O365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Credential -Authentication Basic -AllowRedirection
Import-PSSession $O365Session

# Define dates for the search. MSFT claims the Windows short date format on the machine
# running the script should be used, but testing proved "MM/dd/yyyy" must be used
$StartDate = (Get-Date).AddDays(-3).ToString("MM/dd/yyyy")
$EndDate = (Get-Date).AddDays(-2).ToString("MM/dd/yyyy")

# Define file name and path for the CSV export
$FileAppend = (Get-Date).AddDays(-3).ToString("yyyyMMdd")
# [string]$FileAppend = (Get-Date -Format yyyyMMdd)
$OutputFile = "[DEFINE OUTPUT FILE PATH]" + $FileAppend + ".csv"

# Set a variable value to trigger the loop below (can be anything non-null)
$AuditOutput = 1

# The loop runs until $AuditOutput returns null, which means no more event objects exist for the specified date
while ($AuditOutput) {
    # Search the defined date(s). SessionId + SessionCommand, in combination with the loop,
    # return and append 100 objects per iteration until all objects are returned
    # (ReturnLargeSet allows a maximum of 50,000 objects per session)
    $AuditOutput = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -SessionId "1" -SessionCommand ReturnLargeSet

    # Select and expand the nested object (AuditData), as it holds the relevant reporting data.
    # Convert the output from its default JSON format to enable export to CSV
    $ConvertedOutput = $AuditOutput | Select-Object -ExpandProperty AuditData | ConvertFrom-Json

    # Export results excluding type information. Append rather than overwrite if the file exists in the destination folder
    $ConvertedOutput | Select-Object CreationTime,UserId,Operation,Workload,ObjectID,SiteUrl,SourceFileName,ClientIP,UserAgent | Export-Csv $OutputFile -NoTypeInformation -Append
}

# Close the session to Exchange Online
Remove-PSSession $O365Session

I hope this helps get you started. You can read more about the audit log, and how you can limit its scope if needed, at https://technet.microsoft.com/en-us/library/mt238501%28v=exchg.160%29.aspx?f=255&MSPPError=-2147217396.
- benjoyner, Jun 04, 2019, Copper Contributor
Pontus T Hello, I love your script; it's exactly what we have been trying to accomplish. However, I have found a problem. It seems that many records are duplicated, and even though I am sorting the records in the command by creation date, they seem to be out of order. It's almost as if the file is appended about 100 records at a time, and I notice the creation dates are jumbled. For example, the creation dates will be:
5/20/2019 0:19, 5/19/2019 21:23, 5/16/2019 22:40, 5/16/2019 20:34, 5/16/2019 20:30, 5/16/2019 20:30, 5/16/2019 20:30, 5/16/2019 12:28, 5/16/2019 12:26, 5/24/2019 21:00, 5/24/2019 20:59, 5/24/2019 20:59, 5/24/2019 20:58, 5/24/2019 20:58, 5/24/2019 20:57. This wouldn't be so bad, as I can sort the data in Excel, but because there are duplicate records, it seems there may be some overlap during the append.
- Rajiv Chokshi, Jun 05, 2019, Microsoft
benjoyner Would this help? https://blogs.msdn.microsoft.com/tehnoonr/2018/01/26/retrieving-office-365-audit-data-using-powershell/
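In the meantime, a client-side workaround for the duplicates is possible (a sketch; it assumes the export already exists and that each audit record's Identity value is unique, so Identity would need to be added to the Select-Object list when exporting):

```powershell
# Sketch: drop duplicate rows from an existing export and restore chronological order.
$Rows = Import-Csv .\AuditExport.csv
$Rows |
    Sort-Object Identity -Unique |                # deduplicate by record identity
    Sort-Object { [datetime]$_.CreationTime } |   # sort by parsed creation time
    Export-Csv .\AuditExport-Deduped.csv -NoTypeInformation
```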
- TonyRedmond, Jul 20, 2017, MVP
One thing that strikes me about searching the audit log with PowerShell is that unless you are very careful with the search criteria, you're going to see a heap of data that you might not want to process. There are now so many workloads providing audit records to the data mart that you might be flooded with records you don't expect... So make sure that your search parameters are tuned to request data for the operations you want to examine in the period you need to see...
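For example, a tuned query might constrain the record type and operations up front rather than pulling everything (a sketch; the dates are illustrative, and the operation names should be checked against the audited activities list for your tenant):

```powershell
# Sketch: narrow the search to specific SharePoint file operations in a one-week
# window, instead of retrieving every audit record in the tenant.
Search-UnifiedAuditLog -StartDate 07/01/2017 -EndDate 07/07/2017 `
    -RecordType SharePointFileOperation `
    -Operations FileAccessed,FileModified,FileDeleted `
    -ResultSize 1000
```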
- Pontus T, Jul 20, 2017, Iron Contributor
TonyRedmond Fully agree, especially if you have a large tenant / many users. I set my search date interval to 24 hours and have about 200 licensed users, and I am getting around 3,000-4,000 rows per day (or about 1.5 MB of data). However, the majority of the records are associated with a service account crawling the tenant for daily backup. Additionally, we are only using Office 365 for SharePoint, so no Exchange, OneDrive, or other service that would push the number of returned records to a whole new level.