Almost 4 years ago, I wrote a script called GetTransactionLogStats.ps1, and originally blogged about it here (where you can also download it). At the original time of writing, the primary purpose of the script was to collect transaction log generation data, and use that to determine the percentage of transaction logs generated per hour in an environment. This could in turn be used to populate the related fields in the Exchange Server Role Requirements Calculator.
Since originally writing this script, I’ve made some significant updates to how it functions, and also added a handful of new features along the way, which were significant enough that I wanted to write a new post about it. Of note, the script now:
The script has the following requirements:
The script has the following parameters:
Runs the script in Gather mode, taking a single snapshot of the current log generation of all databases on servers server1 and server2:
PS C:\> .\GetTransactionLogStats.ps1 -Gather -TargetServers "server1","server2"
Runs the script in Gather mode, specifies an alternate directory to output LogStats.csv to, and resets the stats in LogStats.csv if it exists:
PS C:\> .\GetTransactionLogStats.ps1 -Gather -TargetServers "server1","server2" -WorkingDirectory "C:\GetTransactionLogStats" -ResetStats
Runs the script in Analyze mode:
PS C:\> .\GetTransactionLogStats.ps1 -Analyze
Runs the script in Analyze mode, and specifies an alternate directory to send the output files to:
PS C:\> .\GetTransactionLogStats.ps1 -Analyze -LogDirectoryOut "C:\GetTransactionLogStats\LogsOut"
When run in Gather mode, the log generation snapshots that are taken are sent to LogStats.csv. The following shows what this file looks like:
This is the primary output file for the script, and is what is used to populate the hourly generation rates in the Exchange Server Role Requirements Calculator. It consists of the following columns:
This file contains a heat map showing how many logs were generated for each database over the duration of the collection. This information can be used to determine whether databases, servers, or entire Database Availability Groups are over- or underutilized compared to their peers. It consists of the following columns:
This file is similar to LogGenByDB.csv, but shows the log generation rates per hour for each database. It consists of the following columns:
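To illustrate the kind of per-hour percentage math that feeds the Calculator, here is a simplified, hypothetical sketch in PowerShell. The sample counts and variable names are my own assumptions for illustration, not the script's actual implementation:

```powershell
# Illustrative only: hour-of-day -> number of logs generated in that hour.
# These counts are made-up sample data, not real output from the script.
$logsPerHour = @{ 0 = 120; 1 = 80; 2 = 40 }

# Total logs across all hours.
$total = ($logsPerHour.Values | Measure-Object -Sum).Sum

# Percentage of total log generation attributable to each hour.
$logsPerHour.Keys | Sort-Object | ForEach-Object {
    $pct = [math]::Round(($logsPerHour[$_] / $total) * 100, 2)
    "Hour {0:00}: {1} logs ({2}%)" -f $_, $logsPerHour[$_], $pct
}
```

The hourly percentages produced this way are what get copied into the corresponding fields of the Exchange Server Role Requirements Calculator.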
Since the script is designed to be run on an hourly basis, the easiest way to accomplish that is to run the script via a Scheduled Task. The way I like to do that is to create a batch file which calls PowerShell.exe and launches the script, and then create a Scheduled Task which runs the batch file. The following is an example of the command that should go in the batch file:
powershell.exe -noninteractive -noprofile -command "& {C:\LogStats\GetTransactionLogStats.ps1 -Gather -TargetServers 'server1','server2' -WorkingDirectory C:\LogStats}"
In this example, the script is located in C:\LogStats. Note that I specified a WorkingDirectory of C:\LogStats so that if the Scheduled Task runs in an alternate location (by default C:\Windows\System32), the script knows where to find and where to write LogStats.csv. Also note that the command does not load any Exchange snapin, as the script doesn’t use any Exchange-specific cmdlets.
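If you prefer to create the Scheduled Task itself from PowerShell rather than through the Task Scheduler UI, the ScheduledTasks module (Windows Server 2012 and later) can register an hourly repeating task. This is a hedged sketch; the batch file path and task name below are assumptions for illustration:

```powershell
# Assumes C:\LogStats\RunLogStats.bat contains the powershell.exe command above.
$action  = New-ScheduledTaskAction -Execute "C:\LogStats\RunLogStats.bat"

# A one-time trigger that repeats every hour from now on.
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Hours 1)

Register-ScheduledTask -TaskName "GetTransactionLogStats" `
    -Action $action -Trigger $trigger
```

Whichever way the task is created, make sure it runs under an account with permission to query the target servers and to write to the WorkingDirectory.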
Mike Hendrickson