I'm running a lot of jobs with Sun Grid Engine. Since there are so many of them (~100,000), I would like to use array jobs, which seem to be easier on the queue.
Another problem is that each job produces an stdout and an stderr file, which I need for error tracking. If I define them with `qsub -t 1-100000 -o outputdir -e errordir`,
I will end up with directories containing 100,000 files each, which is too many.
Is there a way to have each job write its output files to a subdirectory (say, a directory named after the first two characters of the job ID, which are random hex characters; or the job number modulo 1000, or something of that sort)?
Thanks
You could redirect stdout and stderr yourself inside the job script, e.g. with `&>> logs/xyz.log`. You could even use environment variables like `JOB_ID` and `SGE_TASK_ID` to build the filename, and Bash supports basic arithmetic for the modulo. – Aventine