I have a script that works great when run manually: it completes in a few seconds. However, when I schedule it in Task Scheduler, the script never ends, so the next scheduled run fails because the prior instance is still running. Here is the script:
$source = "\\server1\upload"
$destination = "\\server2\upload"
$logfile = "c:\Scripts\fileMover\log.txt"

$table = Get-ChildItem $source -Include *

foreach ($file in $table) {
    $filename = $file.FullName
    #Write-Host $filename
    try {
        Move-Item -LiteralPath $filename -Destination $destination -Force
        $body = "Successfully moved $filename to $destination"
        $subject = "fileMover Succeeded"
    }
    catch {
        $body = "Failed to move $filename to $destination"
        $subject = "fileMover Failed"
    }
    finally {
        # Log the outcome and send a notification for this file
        $body | Out-File $logfile -Append -Width 1000 -Encoding ascii
        Send-MailMessage -To "[email protected]" -From "[email protected]" -Subject $subject -SmtpServer "10.1.10.1" -Body $body
        exit
    }
}
exit
The script is scheduled with the following settings:
- Run whether user is logged on or not (the user account has been granted the "Log on as a batch job" privilege)
- Run with highest privileges
- Trigger: Daily, repeating every 2 minutes
- Action: Start a program, with the command powershell -File c:\Scripts\upload.ps1
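One variation I've seen suggested for scheduled tasks is a fuller command line that skips the profile and forbids interactive prompts, so a credential prompt fails with an error instead of hanging the host (these are standard powershell.exe switches; whether they help here is unverified):

powershell.exe -NoProfile -NonInteractive -ExecutionPolicy Bypass -File c:\Scripts\upload.ps1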
As a workaround, I configured the task to automatically stop after 1 minute. However, I'm concerned that in certain circumstances, such as a large number of large files, the script may be terminated before it completes.
The script needs to run every 2 minutes.
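If overlapping runs turn out to be unavoidable, one pattern that would let a new instance bail out immediately instead of colliding with the prior one is a machine-wide named mutex. A minimal sketch, not part of the scheduled script above; the name 'Global\fileMover' is arbitrary:

# Try to acquire a machine-wide named mutex; exit at once if a prior run still holds it.
$mutex = New-Object System.Threading.Mutex($false, 'Global\fileMover')
if (-not $mutex.WaitOne(0)) { exit }
try {
    # ... the move/log/mail work from the script above ...
}
finally {
    $mutex.ReleaseMutex()
}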
Add Start-Transcript "C:\Scripts\Upload-Transcript.txt" to the start of your script, and see if it's throwing errors when running via Task Scheduler; it may be prompting for email credentials or something, and getting stuck there. – Diandrous
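For reference, a minimal way to wire that suggestion in, assuming the transcript path from the comment; the finally block ensures the transcript is closed and flushed even if the script errors out:

Start-Transcript "C:\Scripts\Upload-Transcript.txt" -Append
try {
    # ... existing move/log/mail logic ...
}
finally {
    Stop-Transcript
}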