Size bottleneck in script execution


So I have this script running against a network drive that holds about 100 GB of data. It works on a small folder, where the duplicate files get moved properly, but as the folder gets larger it just exits without moving any of the duplicates it finds. I think Group-Object can't handle holding every file hash in memory. Anyway, is there a way to fix the script so it runs against that much data?


$DateBound = (Get-Date).AddDays(-10)
$otherFolder = './DuplicateBackup'   # quote the path; unquoted, PowerShell tries to run ./DuplicateBackup as a command

# Hash every file not accessed in the last 10 days, group identical hashes,
# and move all copies after the first into the backup folder
Get-ChildItem -Recurse -Filter *.* -File |
    Where-Object { $_.LastAccessTime -lt $DateBound } |
    Get-FileHash -ErrorAction SilentlyContinue |
    Group-Object -Property Hash |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object {
        $_.Group |
            Select-Object -Skip 1 |
            Move-Item -Force -Destination $otherFolder
    }
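
One direction I was considering, in case Group-Object really is the bottleneck: a single pass that tracks seen hashes in a hashtable and moves a file the moment its hash repeats, so memory grows with the number of unique hashes instead of buffering every hash record before grouping. A rough sketch of that idea, reusing the 10-day cutoff and the ./DuplicateBackup destination from above (the Test-Path guard and -LiteralPath are my additions, untested on the full drive):

# Streaming variant: move duplicates as they are found instead of grouping.
$DateBound = (Get-Date).AddDays(-10)
$otherFolder = './DuplicateBackup'

# Make sure the destination exists before the first Move-Item
if (-not (Test-Path $otherFolder)) {
    New-Item -ItemType Directory -Path $otherFolder | Out-Null
}

$seen = @{}   # hash string -> full path of the first file seen with that hash
Get-ChildItem -Recurse -File |
    Where-Object { $_.LastAccessTime -lt $DateBound } |
    ForEach-Object {
        $h = Get-FileHash -LiteralPath $_.FullName -ErrorAction SilentlyContinue
        if (-not $h) { return }   # unreadable file: skip it and continue the pipeline
        if ($seen.ContainsKey($h.Hash)) {
            # Same hash seen before: this one is a duplicate, move it out
            Move-Item -LiteralPath $_.FullName -Destination $otherFolder -Force
        } else {
            $seen[$h.Hash] = $_.FullName   # first occurrence stays in place
        }
    }

One caveat I can see: duplicates that share a file name would overwrite each other in the backup folder because of -Force, same as in the original script. Would this be the right approach?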
