RAID Array / Disk Performance with BitLocker - unexpected results


I've been doing some testing on a system to identify the impact that BitLocker encryption has on read/write performance, and the results have been interesting to say the least. I'm hoping someone can help explain my findings.

 

I'm using an entry-level server with a quad-core Xeon E-2224 CPU and 16 GB RAM running Windows Server 2016 Standard - a totally fresh Windows install with all of the latest relevant drivers installed.

 

My storage array is configured through the integrated controller on the motherboard and spans 4 x 10 TB WD enterprise disks, configured as follows:

 

RAID Level                      RAID5
Legacy Disk Geometry (C/H/S)    65535/255/32
Strip Size                      64 KiB
Full Stripe Size                192 KiB
Disk 0                          WDC WD102KRYZ-0
Disk 1                          WDC WD102KRYZ-0
Disk 2                          WDC WD102KRYZ-0
Disk 3                          WDC WD102KRYZ-0
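As a sanity check on the geometry above: for RAID5, the full stripe size should be the strip size multiplied by the number of data disks (one disk's worth of capacity goes to parity), which matches what the controller reports:

```python
# Sanity check of the controller's reported geometry (values from the table above).
strip_kib = 64          # per-disk strip size
disks = 4               # total disks in the RAID5 array
data_disks = disks - 1  # RAID5 reserves one disk's worth of capacity for parity

full_stripe_kib = strip_kib * data_disks
print(full_stripe_kib)  # 192, matching the reported Full Stripe Size of 192 KiB
```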

 

I've provisioned a single large logical volume and I've created two partitions for my testing as follows:

 

Drive    Capacity     File System    Allocation Unit Size
Y:       100 GB       NTFS           4096 bytes
Z:       25,600 GB    NTFS           8192 bytes

 

I ran an array of tests (results attached) on the volumes before encryption, then reformatted and encrypted the volumes and ran the same tests again.

 

The results have been very interesting. I was expecting a small performance hit across all tests due to BitLocker, but what I found was that some tests had a drastic performance decrease, in some cases up to 50%, while surprisingly other tests showed an increase in performance, in one case of 60%.

 

Can anyone help explain this? I'm thinking of running the tests again with another product to see if I get similar results, but the tool I'm using has been fairly robust and reliable in my experience.
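For a quick cross-check against a second tool, a minimal sequential write/read throughput sketch might look like the following. This is only a rough sketch, not a substitute for a proper benchmark such as DiskSpd; the file path and sizes are assumptions, and you would point TEST_FILE at the volume under test (e.g. Y:\ or Z:\):

```python
# Rough sequential throughput sketch (hypothetical cross-check, not a full
# benchmark). TEST_FILE, BLOCK, and BLOCKS are assumptions - adjust as needed.
import os
import time

TEST_FILE = "testfile.bin"   # hypothetical path; e.g. r"Y:\testfile.bin"
BLOCK = 1024 * 1024          # 1 MiB per write/read
BLOCKS = 64                  # 64 MiB total; increase for steadier numbers

def seq_write():
    """Write BLOCKS blocks of random data, return bytes/second."""
    buf = os.urandom(BLOCK)  # incompressible data, similar to encrypted output
    start = time.perf_counter()
    with open(TEST_FILE, "wb", buffering=0) as f:
        for _ in range(BLOCKS):
            f.write(buf)
        os.fsync(f.fileno())  # force data to disk, not just the OS cache
    return BLOCK * BLOCKS / (time.perf_counter() - start)

def seq_read():
    """Read the test file back in BLOCK-sized chunks, return bytes/second.
    Note: the OS file cache can inflate this number on a re-read."""
    start = time.perf_counter()
    with open(TEST_FILE, "rb", buffering=0) as f:
        while f.read(BLOCK):
            pass
    return BLOCK * BLOCKS / (time.perf_counter() - start)

if __name__ == "__main__":
    print(f"write: {seq_write() / 2**20:.1f} MiB/s")
    print(f"read:  {seq_read() / 2**20:.1f} MiB/s")
    os.remove(TEST_FILE)
```

One design note: writing random (incompressible) data matters here, because a tool that writes zeros can report misleading numbers on any layer that compresses or deduplicates, and encrypted output is effectively random.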

 

Findings are in the spreadsheet attached.
