The IDisposable pattern is expensive to implement; I've counted 17 lines of code before I even get to actually disposing resources.
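For reference, here is a minimal sketch of the standard pattern that every disposable class otherwise has to repeat (the class and field names are illustrative):

using System;

public class MyResource : IDisposable
{
    private bool _disposed;

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed)
            return;

        if (disposing)
        {
            // release managed resources here
        }

        // release unmanaged resources here
        _disposed = true;
    }

    ~MyResource()
    {
        Dispose(false);
    }
}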
Eric Lippert recently wrote a blog post making an interesting point: any time a finalizer runs, it is a bug. I think that makes perfect sense: if the IDisposable pattern is always followed, the finalizer is always suppressed and never gets a chance to run. If we accept that a finalizer running is a bug, does it make sense to have a guideline that forces developers to derive from the following abstract class and forbids implementing the IDisposable interface directly?
using System;
using System.Diagnostics;

public abstract class AbstractDisposableBase : IDisposable
{
    // The finalizer only runs if Dispose() was never called, which we treat as a leak.
    ~AbstractDisposableBase()
    {
        ReportObjectLeak();
        Dispose(false);
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected abstract void Dispose(bool disposing);

    [Conditional("DEBUG")]
    private void ReportObjectLeak()
    {
        //Debug.Assert(false, "leaked instance");
        //throw new Exception("leaked instance");
    }
}
The benefits are clear:
- Implementing dispose becomes easy and error-free, like below:

class MyClass1 : AbstractDisposableBase
{
    protected override void Dispose(bool disposing)
    {
        // dispose both managed and unmanaged resources, as though disposing were always true
    }
}

- Objects that are never disposed get reported (see the usage sketch after this list).
- The dispose pattern is always followed.
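For illustration, here is a quick usage sketch (the Program/Leak wrappers are just for this example, not part of the proposal) showing both the normal path and the leak-detection path:

using System;

class Program
{
    static void Main()
    {
        // Normal path: Dispose runs, the finalizer is suppressed, nothing is reported.
        using (var ok = new MyClass1())
        {
            // use the instance here
        }

        // Leak path: create an instance and never dispose it.
        Leak();

        // Force collection; in a DEBUG build the finalizer typically runs here and
        // calls ReportObjectLeak (which would assert or throw once one of its
        // commented-out lines is enabled).
        GC.Collect();
        GC.WaitForPendingFinalizers();
    }

    static void Leak()
    {
        // Deliberately dropped without Dispose(): simulates a leaked instance.
        new MyClass1();
    }
}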
But is there any problem with such a guideline?
One possible problem is that every disposable class now defines a finalizer. But since the finalizer is always suppressed, there should not be any performance penalty.
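If that overhead ever did matter, one possible variant (a sketch only, not part of the proposal above; the class name DebugOnlyFinalizerBase is made up) is to compile the finalizer only into debug builds, so release builds never define one:

using System;
using System.Diagnostics;

public abstract class DebugOnlyFinalizerBase : IDisposable
{
#if DEBUG
    // Only debug builds define a finalizer, so release builds never pay the
    // (small) cost of allocating a finalizable object.
    ~DebugOnlyFinalizerBase()
    {
        ReportObjectLeak();
        Dispose(false);
    }
#endif

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected abstract void Dispose(bool disposing);

    [Conditional("DEBUG")]
    private void ReportObjectLeak()
    {
        Debug.Assert(false, "leaked instance");
    }
}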
What are your thoughts?
Comments:
- "… Dispose without finalizer." – Baxter
- "… ConditionalAttribute on the finalizer itself, since classes with a finalizer defined imply some overhead." – Nebulosity
- "Dispose() = 0 betrays that you're coming from C++. In C# we don't really deal with unmanaged resources anymore, so this is all academic. Just never write a ~destructor and you're good. Look up the SafeHandle class." – Khanate