I have a long-running Torque/PBS job and I'd like to monitor its output. But the log file only gets copied back after the job has finished. Is there a way to convince PBS to refresh it while the job is running?
Unfortunately, AFAIK, that is not possible with PBS/Torque - the stdout/stderr streams are spooled locally on the execution host and only transferred to the submit host after the job has finished. You can redirect the standard output of the program to a file if you'd like to monitor it during execution (this makes sense only if the execution and submit hosts share a common filesystem).
I suspect the rationale is that it allows jobs to be executed on nodes that don't share a filesystem with the submit node.
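A minimal sketch of that workaround, redirecting inside the job script (the job name, log file name, and the echoed message are placeholders standing in for your real program):

```shell
#!/bin/sh
#PBS -N redirect-demo
# Hypothetical job script: write the program's output to a file on the
# shared filesystem instead of relying on PBS's spooled stdout/stderr.
cd "${PBS_O_WORKDIR:-.}"       # Torque sets PBS_O_WORKDIR to the submit dir
echo "work in progress" > myprog.log 2>&1   # stand-in for: ./myprog ...
```

While the job runs you can then `tail -f myprog.log` from the submit host, provided the working directory lives on the shared filesystem.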
There is the -k flag, which is not very nice though - so I ended up capturing stdout outside the queue. :/ – Abri
I looked for an equivalent of the bpeek command on a MOAB/Torque system and frustratingly found none. – Arria
This is possible in TORQUE. If you have a shared filesystem you can set
$spool_as_final_name true
in the mom's config file. This makes pbs_mom write directly to the final output destination instead of spooling in the spool directory. Once that is set up, you can tail -f the output file and monitor anything you want.
http://www.adaptivecomputing.com/resources/docs/torque/3-0-3/a.cmomconfig.php (search for spool_as_final_name)
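A sketch of that setup, assuming the common default location of the mom config file and a hypothetical job output name (both the path and the restart command vary per installation):

```shell
# On each execution host (as root): have pbs_mom write directly to the
# final output path. The config path below is Torque's common default
# and may differ on your installation.
echo '$spool_as_final_name true' >> /var/spool/torque/mom_priv/config

# Restart the mom so it picks up the new setting (service name varies).
service pbs_mom restart

# From any host that sees the shared filesystem, follow the job's output:
tail -f myjob.o12345    # hypothetical job output file name
```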
For me, ssh-ing to the node where the job is running and looking at files under /var/spool/torque/spool/ works, but it might be specific to this particular environment.
In case you submit a shell script, you may also put these two commands at the beginning of the script.
exec 1>file.stdout
exec 2>file.stderr
This will put the output from stdout and stderr in the working directory of your job.
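A minimal sketch of a job script using that trick (the job name and the echoed messages are placeholders):

```shell
#!/bin/sh
#PBS -N exec-redirect-demo
# Redirect this shell's stdout/stderr to files in the job's working
# directory so they can be tailed while the job runs.
cd "${PBS_O_WORKDIR:-.}"   # Torque sets PBS_O_WORKDIR to the submit dir
exec 1>file.stdout
exec 2>file.stderr

echo "job started"                # lands in file.stdout
echo "something went wrong" >&2   # lands in file.stderr
```

From the submit host you can then run `tail -f file.stdout` while the job is executing, again assuming the working directory is on a shared filesystem.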