When using System.Diagnostics
tracing, is there a significant (measurable) performance impact of not removing the "Default" trace listener in a production ASP.NET application built in release mode, with the TRACE
constant defined at compile time but no debugger attached at runtime?
To clarify, the question is about the additional impact of the "Default" trace listener on an application that already uses other trace listeners, not about alternatives to System.Diagnostics tracing.
Are there any measurements of the impact of the Default trace listener when no debugger is attached? Are there any existing benchmarks of the production impact of leaving out the "remove" element from a configuration such as this:
<configuration>
  <system.diagnostics>
    <trace autoflush="false" indentsize="4">
      <listeners>
        <remove name="Default" />
        <add name="myListener" type="System.Diagnostics.TextWriterTraceListener" initializeData="c:\myListener.log" />
      </listeners>
    </trace>
  </system.diagnostics>
</configuration>
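In the absence of published numbers, the kind of rough micro-benchmark I have in mind is sketched below: time a batch of Trace.WriteLine calls with the Default listener still registered, then again after removing it. The log path and iteration count are arbitrary placeholders, and any real measurement would need to be taken on a release build with TRACE defined and no debugger attached.

using System;
using System.Diagnostics;

class TraceListenerBenchmark
{
    const int Iterations = 100000; // arbitrary; adjust to get stable timings

    static void Main()
    {
        // Mirror the config above: a file listener alongside the built-in Default listener.
        Trace.Listeners.Add(new TextWriterTraceListener(@"c:\myListener.log"));

        Time(); // warm-up pass to absorb JIT and first-write costs

        long withDefault = Time();

        // Equivalent of <remove name="Default" /> in the config.
        Trace.Listeners.Remove("Default");

        long withoutDefault = Time();

        Console.WriteLine("With Default listener:    " + withDefault + " ms");
        Console.WriteLine("Without Default listener: " + withoutDefault + " ms");
    }

    static long Time()
    {
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
        {
            Trace.WriteLine("benchmark message " + i);
        }
        Trace.Flush();
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }
}

That would only capture the synthetic cost of the extra listener in a tight loop, though, not the impact on a real ASP.NET request pipeline under load, which is what I am really after.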
This question differs from .NET Tracing: What is the “Default” listener? in that the other question focused on the impact of the Default listener when running under Visual Studio and updating a debugging UI, whereas this question is about release code in a production environment.