Possibility of a script that works through a load of data loop by loop - or from time to time

Brass Contributor

Hello Guys


Well, I don't know how to explain it, but I'll try; I hope I succeed...


 Today I have a list of about 72,000 entries that I want to upload to my Exchange Online.


What I would like is for it to process, for example, 5,000 entries, pause, another 5,000, pause, and so on until the end.
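In rough terms, the pattern I have in mind looks something like this in PowerShell (the variable names, batch size, and pause length are just placeholders):

```powershell
# Process the big list in batches of 5,000 with a pause between batches.
$allItems  = @() # the 72,000 entries would go here
$batchSize = 5000

for ($i = 0; $i -lt $allItems.Count; $i += $batchSize) {
    $end   = [Math]::Min($i + $batchSize, $allItems.Count) - 1
    $batch = $allItems[$i..$end]

    foreach ($item in $batch) {
        # per-item work (e.g. the Exchange Online cmdlet) goes here
    }

    # Pause between batches; a disconnect/reconnect could also happen here.
    Start-Sleep -Seconds 60
}
```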


An old Exchange module, for example, only allows connections of up to 30 minutes at a time.


And with that, I know there are - though I haven't seen them - script constructions that do more or less this: load the 72,000, process for up to 30 minutes... then disconnect/reconnect and process again for up to 30 minutes, and repeat this until the end, but always reusing the same load. Somehow there is a mark of what has already been processed, so each restart picks up from what hasn't.


I imagine that the concept, the idea, is the same.


I would like - if it's not too much trouble - to understand how to build, or find, a code snippet or an entire script that does something along these lines, so I can study it in depth and adapt it to my needs.


Many, many thanks for everyone's attention and help!

2 Replies
Hey there!

In this case, if you are sure the connection time limit is 30 minutes, I recommend using something that measures time.
This would be my approach:

- Create a PowerShell runspace and instance to run the workload asynchronously.
- Start a System.Diagnostics.Stopwatch.
- Enter a while loop that checks the execution state of the PowerShell instance and times out after 30 minutes.

However, there is a chance your script terminates execution in the middle of processing an item, in which case it's better to use an item count instead.

Here's a sketch. Mind that this was not tested, and you will need to adjust it to your needs.

$bigList = @(<# Your 72,000 items list #>)

# Creating a Runspace Pool to hold our PowerShell 'threads'.
$pool = [runspacefactory]::CreateRunspacePool()
$pool.ApartmentState = 'STA'
$pool.Open()

# Create a synchronized hash table to exchange data between the current and new threads.
$messenger = [hashtable]::Synchronized(@{
    Processed = [System.Collections.ArrayList]@()
    Pending   = $null
})

while ($messenger.Processed.Count -lt $bigList.Count) {

    # Separate the items you already processed. You might have to adjust this according
    # to the complexity of your object list.
    $messenger.Pending = $bigList | Where-Object { $_ -notin $messenger.Processed }

    # Creating the PowerShell instance.
    $powerShell = [powershell]::Create()
    $powerShell.RunspacePool = $pool

    [void]$powerShell.AddScript({
        param ($Messenger)

        foreach ($item in $Messenger.Pending) {
            # Your logic goes here. We used the messenger to retrieve the pending items
            # to process. After you finish processing each item, add it to the
            # 'Processed' property from the messenger:
            # [void]$Messenger.Processed.Add($item)
        }
    })

    # Passing the messenger as a parameter.
    [void]$powerShell.AddParameter('Messenger', $messenger)

    # Starting the instance asynchronously.
    $handle = $powerShell.BeginInvoke()

    # Here is your main monitoring loop. You can wait 'X' amount of time, or 'X' items
    # processed, while monitoring the messenger's processed list. If you choose time,
    # use a StopWatch.
    $stopWatch = [System.Diagnostics.Stopwatch]::StartNew()
    do {
        # Check the instance execution and leave the loop if it already finished.
        if ($powerShell.InvocationStateInfo.State -ne 'Running') {
            break
        }
        Start-Sleep -Seconds 5
    } while ($stopWatch.Elapsed.TotalMinutes -le 30)

    # Dispose of the PowerShell instance. This is important to avoid a resource leak.
    # If you care about the outcome, assign the result of EndInvoke to a variable
    # instead of voiding it.
    if ($powerShell.InvocationStateInfo.State -eq 'Running') { $powerShell.Stop() }
    $powerShell.Dispose()
}

# Clean up the pool as well.
$pool.Dispose()



There are a few ways you could approach this. Depending on the input source, you could add a field/column to the source data/file that holds a Boolean "processed" flag, which your script would update after your script logic completes.
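One way to sketch the "processed flag" idea, assuming a CSV input file with a Processed column (the file path and column name are just examples):

```powershell
# Load a CSV that carries a 'Processed' column, work through the unflagged
# rows, and write the flag back so a restart resumes where it left off.
$path = 'C:\Temp\input.csv'
$rows = Import-Csv -Path $path

foreach ($row in $rows) {
    if ($row.Processed -eq 'True') { continue }

    # per-row logic goes here

    $row.Processed = 'True'
}

# Persist the updated flags. For very large files you may prefer to export
# after every batch instead of only at the end.
$rows | Export-Csv -Path $path -NoTypeInformation
```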


An alternative method might be to load your input data, take a count of the total items/rows, and maintain an "index" of the current item/row being processed... this index would be incremented after each item is processed.
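A minimal sketch of that index approach, assuming the index is checkpointed to a small file so an interrupted run can resume (the paths are just examples):

```powershell
# Load the input, resume from the last saved index, and increment the
# index after each item, persisting it between items.
$items          = Get-Content -Path 'C:\Temp\items.txt'
$checkpointFile = 'C:\Temp\checkpoint.txt'

# Resume from the last saved index, or start at 0.
$index = 0
if (Test-Path -Path $checkpointFile) {
    $index = [int](Get-Content -Path $checkpointFile)
}

while ($index -lt $items.Count) {
    # process $items[$index] here

    $index++
    Set-Content -Path $checkpointFile -Value $index
}
```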