Why good scripts may start to fail on you, for instance with timestamps like “01/01/0001 00:00:00”!
Published Feb 15 2019
First published on TECHNET on Sep 11, 2010
You may have scripts that have run fine since the early DPM 2007 days but start to show unexpected results with DPM 2010. This blog explains what is likely happening and how to resolve it.

Typically, information returned from cmdlets is assumed to be valid unless an error occurred, right? That is no longer a safe assumption for some cmdlets in DPM 2010 (if it ever was), and it may surface now where it did not before. Let’s pick one sample to explain: the “Get-Datasource” cmdlet, which returns data source objects. Data source objects have properties like ‘OldestRecoveryPoint’, ‘LatestRecoveryPoint’ and ‘TotalRecoveryPoints’ that are useful for SLA and other monitoring purposes. However, these properties are computed asynchronously, and an event is signaled when the computation is done, at which point the property values are valid. This means that when the “Get-Datasource” cmdlet returns objects, their property values may not all have been computed yet, which typically shows up as timestamps like the one in the title. Depending on your script flow and efficiency this may or may not occur, with some or all objects. Yes, I know:
“… ‘may or may not’ is hideous, how do I know if and when I can use these…?”
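For reference, that timestamp in the title is simply how [DateTime]::MinValue renders, so a quick way to spot an uncomputed value would be a check like the sketch below (assuming $ds holds a data source object obtained from “Get-Datasource” and that the property is a plain DateTime):

```powershell
#an uncomputed property still holds the default value, i.e. "01/01/0001 00:00:00"
if ($ds.LatestRecoveryPoint -eq [DateTime]::MinValue) {
    Write-Host "LatestRecoveryPoint not computed yet for $($ds.Name)"
}
```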

Let’s look into solutions…

From the above we understand that we need to catch the events signaling that property values are valid, so how is that done? Data source objects have event members, one of which is called “DataSourceChangedEvent” and is signaled when the value computation completes. PowerShell v2 and later provide cmdlets to work with events. Take a look at the sample script below; we start with “Disconnect-DPMserver” to clear caches, more on that later!
For each protection group we collect all data sources into the “$dss” collection. Subsequently, for each data source we register on the ‘DataSourceChangedEvent’ event with an action block that only increases the global $RXcount variable by 1. We access the ‘LatestRecoveryPoint’ property of all data sources to trigger the events for all objects. Then we go into a wait loop until the expected number of events has been processed (one for each data source) or 30 seconds have elapsed, so that we do not wait forever. We check that we got all expected events, after which we know all “$dss” objects contain valid property values and we may continue using them. Finally, we unregister all events at once.

Disconnect-DPMserver #clear object caches
$global:RXcount = 0
$dss = @(Get-ProtectionGroup (&hostname) | foreach {Get-Datasource $_})
$dss = $dss | ?{$_} #remove blanks
for ($i = 0; $i -lt $dss.count; $i++) {
    [void](Register-ObjectEvent $dss[$i] -EventName DataSourceChangedEvent -SourceIdentifier "TEV$i" -Action {
        $global:RXcount++ #count each arrived event
    })
}
#touch properties to trigger events and wait for arrival
$dss | select LatestRecoveryPoint > $null #do not use [void] because that does not trigger
$begin = Get-Date
while (((Get-Date).Subtract($begin).TotalSeconds -lt 30) -and ($RXcount -lt $dss.count)) {sleep -Milliseconds 100}
if ($RXcount -lt $dss.count) {Write-Host "Fewer events arrived [$RXcount] than expected [$($dss.count)]"}
Unregister-Event * #remove all our event registrations

This approach is efficient when working with collections where you just need to validate that the computations for all objects in the collection completed. In other cases you may want to do the processing inside the “-Action {…}” block and unregister only the event that triggered it. See “Get-Help Register-ObjectEvent” for more on this.
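As a sketch of that per-event pattern (assuming the same “$dss” collection as in the sample above), the action block itself does the processing for the object that signaled and then removes only its own subscription via $Event.SourceIdentifier; the other registrations keep waiting:

```powershell
for ($i = 0; $i -lt $dss.count; $i++) {
    [void](Register-ObjectEvent $dss[$i] -EventName DataSourceChangedEvent -SourceIdentifier "TEV$i" -Action {
        #$Event.Sender is the data source whose property values just became valid
        Write-Host "$($Event.Sender.Name): latest recovery point $($Event.Sender.LatestRecoveryPoint)"
        #remove only this registration, not the others
        Unregister-Event -SourceIdentifier $Event.SourceIdentifier
    })
}
```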
There is a drawback to the above flow: you cannot simply execute it while connected to remote DPM servers, because by default event delivery is local to your PowerShell session. In such cases you must use the “-Forward” parameter of the “Register-ObjectEvent” cmdlet. Then again, remoting all of the above (see “Get-Help Invoke-Command”) and consuming just the end result would be more efficient. I will soon post a blog that does this, collecting recovery point status information across one or many DPM servers.
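A rough sketch of that remoting idea, assuming a DPM server named “DPMSERVER01” (hypothetical) and a remote session in which the DPM cmdlets are available; the whole eventing flow runs on the server and only the end result comes back:

```powershell
$results = Invoke-Command -ComputerName DPMSERVER01 -ScriptBlock {
    Disconnect-DPMserver #clear object caches
    $global:RXcount = 0
    $dss = @(Get-ProtectionGroup (&hostname) | foreach {Get-Datasource $_} | ?{$_})
    for ($i = 0; $i -lt $dss.count; $i++) {
        [void](Register-ObjectEvent $dss[$i] -EventName DataSourceChangedEvent -SourceIdentifier "TEV$i" -Action {
            $global:RXcount++
        })
    }
    $dss | select LatestRecoveryPoint > $null #touch to trigger computation
    $begin = Get-Date
    while (((Get-Date).Subtract($begin).TotalSeconds -lt 30) -and ($RXcount -lt $dss.count)) {sleep -Milliseconds 100}
    Unregister-Event *
    #only these few values travel back over the wire
    $dss | select Name, OldestRecoveryPoint, LatestRecoveryPoint, TotalRecoveryPoints
}
```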

A small variation…

Rather than maintaining a count in the -Action {} block, you can also add the sending object that signaled the event to a separate collection. You then have a collection of only the objects that got updated. This is more suitable if you plan to go ahead with whatever you have, regardless of whether all objects got signaled or not.

$global:RXobj = @()
for ($i = 0; $i -lt $dss.count; $i++) {
    [void](Register-ObjectEvent $dss[$i] -EventName DataSourceChangedEvent -SourceIdentifier "TEV$i" -Action {
        #look at our own sourced events only
        if ($Event.SourceIdentifier -match "TEV") {$global:RXobj += $Event.Sender}
    })
}
#touch properties to trigger events and wait for arrival
$dss | select LatestRecoveryPoint > $null #do not use [void] because that does not trigger
$begin = Get-Date
while (((Get-Date).Subtract($begin).TotalSeconds -lt 10) -and ($RXobj.count -lt $dss.count)) {sleep -Milliseconds 250}
Unregister-Event * #remove all our event registrations

Alternative to this sample…

I can imagine you do not want to go into ‘eventing’ just yet. The alternative would be to get all recovery points for each data source and sort them on ‘RepresentedPointInTime’; you then also have easy access to the oldest, the latest and the total. Be aware that with a few dozen data sources, a couple of recovery points per day and 14 days of retention, you quickly request thousands of objects just to get 3 values.
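That alternative could look like the sketch below for a single data source $ds (obtained from “Get-Datasource” as before); “Get-RecoveryPoint” enumerates all recovery points of a data source, and sorting on ‘RepresentedPointInTime’ puts them in chronological order:

```powershell
$rps = @(Get-RecoveryPoint $ds | sort RepresentedPointInTime)
$TotalRecoveryPoints = $rps.count                        #total
$OldestRecoveryPoint = $rps[0].RepresentedPointInTime    #first after sorting
$LatestRecoveryPoint = $rps[-1].RepresentedPointInTime   #last after sorting
```

This fetches every recovery point object just to derive three values, which is exactly the overhead described above.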

Some more on objects in DPM…

DPM caches objects for a variety of reasons, and not every access of a property produces an event. That’s why we start with “Disconnect-DPMserver”, which clears the object caches so that events are produced on first access. You can loop through an object collection in which every object has an event registered (like the sample above), but you cannot loop on the same DPM data source object and expect an event each time you access a property, unless the object actually changed.
For those who are more familiar with using events: this is different from objects that change more regularly, like a process object with redirected standard output and the ‘OutputDataReceived’ event; each time the process writes to standard output, another event is generated.
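For comparison, a minimal repeating-event example with a .NET process object, runnable outside DPM; here every line the child process (ipconfig, as an arbitrary example) writes to standard output raises another ‘OutputDataReceived’ event:

```powershell
$p = New-Object System.Diagnostics.Process
$p.StartInfo.FileName = "ipconfig"
$p.StartInfo.UseShellExecute = $false          #required for output redirection
$p.StartInfo.RedirectStandardOutput = $true
[void](Register-ObjectEvent $p -EventName OutputDataReceived -SourceIdentifier "ProcOut" -Action {
    #one event per line of standard output; Data is $null at end of stream
    if ($EventArgs.Data) {Write-Host $EventArgs.Data}
})
[void]$p.Start()
$p.BeginOutputReadLine() #start asynchronous reads that raise the events
$p.WaitForExit()
Unregister-Event ProcOut
```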
Version history
Last update: Mar 11 2019 08:32 AM