Consider this simple example, probably a function you have written a couple of times before, but now abortable:
/**
*
* @param {number} delay
* @param {AbortSignal} [abortSignal]
* @returns {Promise<void>}
*/
export default function timeoutPromise(delay, abortSignal) {
  return new Promise((resolve, reject) => {
    if (abortSignal) {
      abortSignal.throwIfAborted();
    }
    const timeout = setTimeout(() => {
      resolve();
    }, delay);
    if (abortSignal) {
      abortSignal.addEventListener("abort", () => {
        clearTimeout(timeout);
        reject(new Error("Aborted"));
      });
    }
  });
}
The obvious issue is that this will not remove the event listener if the timeout completes normally. The cleanup can be done, but it is quite ugly:
/**
*
* @param {number} delay
* @param {AbortSignal} [abortSignal]
* @returns {Promise<void>}
*/
export default function timeoutPromise(delay, abortSignal) {
  return new Promise((resolve, reject) => {
    // note: changed to reject() to get consistent behavior regardless of the signal state
    if (abortSignal && abortSignal.aborted) {
      reject(new Error("timeoutPromise aborted"));
      return; // no point in scheduling the timer once we have already rejected
    }
    let timeout = null;
    function abortHandler() {
      clearTimeout(timeout);
      reject(new Error("timeoutPromise aborted"));
    }
    timeout = setTimeout(() => {
      if (abortSignal) {
        abortSignal.removeEventListener("abort", abortHandler);
      }
      resolve();
    }, delay);
    if (abortSignal) {
      abortSignal.addEventListener("abort", abortHandler, { once: true });
    }
  });
}
That's... a lot of code for such a simple thing. Am I doing this right or is there a better way?
Comments:

"a lot of code" - written slightly differently, it's effectively two extra lines of code – Analisaanalise

Why not abortSignal.throwIfAborted();? And what's with the if (abortSignal) in the first place? – Recovery

There's scheduler.postTask(cb, { delay, signal }), and it even returns a Promise. – Fcc

The abort event handler closes over timeout, reject, and - depending on the promise implementation - even over the promise and its value. You don't want a long-running task with many small delays to accumulate many handlers on its abort signal - typically only the last one is actually able to abort the task. – Recovery

I avoided throwIfAborted because it leads to inconsistent behavior: if the signal is already aborted, it throws when creating the promise, but if the signal were aborted just a moment later, it would throw only on await. This is relevant when you're awaiting promises later and not the instant you create them, which is needed to execute tasks concurrently. The if(abortSignal) is there because the signal is intended to be optional. – Ripley

signal.throwIfAborted() in the executor callback is exactly equivalent to if (signal.aborted) reject(signal.reason). "when you're awaiting promises later and not the instant you create them, which is needed to execute tasks concurrently" - no, it wouldn't make a difference. And you should just never do that anyway! Always use Promise.all for concurrent tasks. – Recovery

With Promise.all, the total time for your work is the time to resolve the slowest promise plus the sum of the time to handle the results. If you use Promise.race to handle responses as they come, you can handle responses in parallel to waiting for the promises, but in that case you need to keep the unresolved promises somewhere. Just recently, I heard a colleague talking about a bug where someone blindly followed the Promise.all rule, causing a significant bottleneck in the system. – Ripley

With Promise.all, just put the handling code right where you make each request (Promise.all(vals.map(async v => { const response = await request(v); return handle(response); }))). No need for Promise.race or keeping unhandled promises around for some time. – Recovery

You use Promise.race and replace the completed ones with new ones. That's probably not something a typical webdev needs to worry about. I am talking millions of tasks, doing max N in parallel. – Ripley
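For reference, the scheduler.postTask() suggestion from the comments would look roughly like the sketch below. The Prioritized Task Scheduling API is, at the time of writing, only available in Chromium-based browsers, so treat this as a sketch under that assumption rather than a drop-in replacement:

/**
 * Sketch: an abortable delay built on scheduler.postTask(), per the comment above.
 * @param {number} delay
 * @param {AbortSignal} [abortSignal]
 * @returns {Promise<void>}
 */
export default function timeoutPromise(delay, abortSignal) {
  // An empty callback turns postTask into an abortable setTimeout: the returned
  // promise resolves after `delay` milliseconds, or rejects with the signal's
  // abort reason if the signal fires first. No manual listener cleanup needed.
  return scheduler.postTask(() => {}, { delay, signal: abortSignal });
}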
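The "max N in parallel" pattern from the last comment could be sketched as follows; runWithLimit, tasks, and limit are hypothetical names, not part of the question, and tasks is assumed to be an iterable of functions that each return a promise:

async function runWithLimit(tasks, limit) {
  const inFlight = new Set();
  let firstError = null;
  for (const task of tasks) {
    if (firstError) break; // stop starting new tasks after a failure
    // Start the task; remove its promise from the set once it settles.
    const p = task()
      .catch((error) => { firstError = firstError ?? error; })
      .finally(() => inFlight.delete(p));
    inFlight.add(p);
    if (inFlight.size >= limit) {
      // Wait until at least one slot frees up before starting the next task.
      await Promise.race(inFlight);
    }
  }
  // Wait for whatever is still running, then surface the first failure, if any.
  await Promise.all(inFlight);
  if (firstError) throw firstError;
}

Each task's rejection is caught immediately, so no unhandled-rejection warnings occur, and the first error is rethrown once the remaining in-flight work has settled.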