Forum Discussion
Externally trigger a .ps1 to exit
I'm running a .ps1 from Task Scheduler. If I stop the script from the Task Scheduler console with the End command, does that generate an event/trigger that the script can react to? (I'd like the script to shut down gracefully.) If I can't do that from the Task Scheduler console, is there another way to do it?
- Taking action within the script when an external action has stopped the scheduled job: no, this isn't possible.
- Jonathan_Allen (Brass Contributor)
The .ps1 that is stopped will exit with a non-success code, but because you are not launching that script from a process that can handle that code, you have no way to check it and take any action. Also, you are ending the task, not just the script itself, so you don't have the separation you'd need: the task stops processing entirely, and nothing remains to handle the exit code.
If you start a cmd session and then type powershell, you will be running a PowerShell session inside a command prompt session.
If you now type exit 20 at the PS> prompt, you will intentionally exit the PowerShell session with exit code 20 and return to the command prompt session.
At the prompt, run echo %errorlevel%. The result will be 20, the PowerShell session's exit value. You can consume this in whatever logic you have in the calling process...
This value is passed between the sessions because the 'inner' PowerShell session closed gracefully with an exit code. The same isn't true when, say, notepad.exe is killed with Stop-Process.
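The same exit-code handoff can be demonstrated from PowerShell itself rather than cmd.exe: launch a child shell, have it exit with a chosen code, then read the code back via the automatic variable $LASTEXITCODE. This is a minimal sketch; it assumes pwsh (or powershell.exe on Windows) is on PATH.

```powershell
# Launch a child PowerShell session that exits with code 20,
# then read that code in the parent via $LASTEXITCODE.
# (Assumes 'pwsh' is on PATH; on Windows, 'powershell' works the same way.)
& pwsh -NoProfile -Command 'exit 20'
Write-Output "Child exited with code $LASTEXITCODE"
```

From cmd.exe the same value is surfaced as %errorlevel%, as described above.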
Even if you have a cmd/bat script launch your .ps1, if you end the task then there is nothing left running to handle any onward actions.
I guess we have reached the point where we should ask what you are actually doing and trying to achieve, so that we can suggest a better option rather than proceed down the rabbit hole of process launching and exiting.
- Mark_Block (Copper Contributor)
Thanks for the quick response. Here's more detail.
I have a script in a .ps1 file. I run PowerShell.exe from a scheduled task and provide the .ps1 as a PowerShell command-line argument. The script is moving files with Get-ChildItem | ForEach-Object and may run for days. The script generates status messages via Write-Output. When the script has moved all the files, it provides a final message with stats and exits. If we need to stop the script before all files are moved, we lose the final statistics. I considered a "brute force" semaphore such as creating a file that the script checks for that signals it to shut down. Is there a more elegant solution? Environment variable? Some other sort of system trigger?
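The "brute force" stop-file idea mentioned above can be sketched roughly as follows. All paths and variable names here are illustrative (the demo uses the temp directory so it runs anywhere), and a foreach statement is used rather than ForEach-Object so that break cleanly exits the loop; the finally block ensures the final statistics are written even on an early stop.

```powershell
# Sketch of the stop-file ("semaphore") idea: check for a flag file each
# iteration and shut down gracefully, still emitting final stats.
# $Source, $Dest and the flag path are illustrative demo values.
$Source   = Join-Path ([IO.Path]::GetTempPath()) 'filemover-demo-src'
$Dest     = Join-Path ([IO.Path]::GetTempPath()) 'filemover-demo-dst'
$StopFlag = Join-Path ([IO.Path]::GetTempPath()) 'stop-filemover.flag'

# Demo setup: create source/destination folders and three sample files.
New-Item -ItemType Directory -Force -Path $Source, $Dest | Out-Null
Remove-Item $StopFlag -ErrorAction SilentlyContinue
1..3 | ForEach-Object { Set-Content -Path (Join-Path $Source "file$_.txt") -Value 'demo' }

$Moved = 0
try {
    # foreach statement (not ForEach-Object) so 'break' exits only this loop.
    foreach ($f in Get-ChildItem -Path $Source -File) {
        if (Test-Path $StopFlag) {
            Write-Output 'Stop flag detected; shutting down gracefully.'
            break
        }
        Move-Item -Path $f.FullName -Destination $Dest -Force
        $Moved++
    }
}
finally {
    # Runs whether the loop finished or was stopped early, so the
    # final statistics are always written.
    Write-Output "Files moved: $Moved"
}
```

To request a shutdown from any other session, even with nobody logged in, an operator would just create the flag file (e.g. New-Item -ItemType File $StopFlag); the script should delete the flag at startup so the next run isn't stopped immediately.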
I'm not married to running the script from the Task Scheduler. (I just need the script to run without a logged-in session.)
Thanks in advance for your help.
- Jonathan_Allen (Brass Contributor)
OK, some thoughts:
- You say the script 'may run for days'. Is this because there are lots of files, the files are large, the network is slow, etc.? Can you set the job to run for a fixed length of time and simply trigger it more frequently? That way it will stop gracefully far more often.
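The fixed-run-time idea can be sketched with a simple deadline check inside the loop. This is a minimal, self-contained illustration: the 200 ms budget stands in for a realistic one (say, 4 hours via (Get-Date).AddHours(4)), and the loop body is where each Move-Item would go.

```powershell
# Minimal sketch of the time-budget pattern: work until a deadline,
# then exit gracefully so the final stats still get written.
# (200 ms here stands in for a real budget such as 4 hours.)
$Deadline  = (Get-Date).AddMilliseconds(200)
$Processed = 0
while ((Get-Date) -lt $Deadline) {
    # Move-Item for the next file would go here.
    $Processed++
}
Write-Output "Time budget reached after $Processed iterations; exiting gracefully."
```

Because the script stops itself, it always reaches its final-statistics code, and Task Scheduler simply launches it again on the next trigger to continue where it left off.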
- You are using Write-Output to add updates to the session as (I assume) files are moved. Why not switch this to writing your progress to a static file with Add-Content?

```powershell
$Date    = "{0:yyyyMMdd_HHmm}" -f (Get-Date)   # MM = month; mm would be minutes
$Logfile = "[MySecurePathLocation]\FileMoverJob\$Date`_Process.txt"
$Logfile   # echo the log path to the session

$Msg = "{0:yyyyMMdd_HHmmss}`tProcess starting." -f (Get-Date)
$Msg | Add-Content $Logfile

# [Get-ChildItem loop, I assume]
$Msg = "{0:yyyyMMdd_HHmmss}`tMoving $ThisFile." -f (Get-Date)
$Msg | Add-Content $Logfile
# end of loop

$Msg = "{0:yyyyMMdd_HHmmss}`tProcess complete." -f (Get-Date)
$Msg | Add-Content $Logfile
```
This way, if the process stops unexpectedly, you will have a file with a log of what was completed and when it occurred...