Reducing the overhead of a SQL Trace with filters
We have a SQL Server 2000 instance with widely varied jobs that run at different times of day, or even different days of the month. Normally we only use SQL Profiler to run traces for very short periods for performance troubleshooting, but in this case that wouldn't give me a good overall picture of the kinds of queries run against the database over the course of a day, week, or month.

How can I minimize the performance overhead of a long-running SQL trace? I already know to:

  • Execute the trace server-side (sp_trace_create), instead of using the SQL Profiler UI.
  • Trace to a file, and not to a database table (which would add extra overhead to the DB server).
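For reference, a server-side trace along the lines described above can be set up with the sp_trace_* stored procedures. This is a minimal sketch; the file path, rollover size, and the specific event/column IDs chosen are illustrative, not a recommendation:

```sql
-- Minimal sketch of a server-side trace on SQL Server 2000.
-- The file path, rollover size, and event/column choices are illustrative.
DECLARE @TraceID int, @maxfilesize bigint, @on bit
SET @maxfilesize = 100   -- MB per rollover file
SET @on = 1

-- Option 2 = TRACE_FILE_ROLLOVER; write to a disk separate from the data/log files
EXEC sp_trace_create @TraceID OUTPUT, 2, N'E:\Traces\WorkloadTrace', @maxfilesize, NULL

-- Capture SQL:BatchCompleted (event 12): TextData (col 1), Duration (13), Reads (16)
EXEC sp_trace_setevent @TraceID, 12, 1, @on
EXEC sp_trace_setevent @TraceID, 12, 13, @on
EXEC sp_trace_setevent @TraceID, 12, 16, @on

-- 1 = start the trace; use 0 to stop it and 2 to close/delete its definition later
EXEC sp_trace_setstatus @TraceID, 1
```

Note that sp_trace_setevent and sp_trace_create are strict about parameter types (bit and bigint respectively), which is why the values are passed through typed variables rather than literals.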

My question is really about filters. If I add a filter to log only queries that exceed a certain duration or number of reads, the trace still has to examine all activity on the server to decide whether to log it, right? So even with that filter, will the trace create an unacceptable level of overhead on a server that is already on the edge of unacceptable performance?

Liberty answered 13/11, 2008 at 15:47 Comment(0)
I found an article that actually measures the performance impact of a SQL profiler session vs a server-side trace:

http://sqlblog.com/blogs/linchi_shea/archive/2007/08/01/trace-profiler-test.aspx

This really was my underlying question, how to make sure that I don't bog down my production server during a trace. It appears that if you do it correctly, there is minimal overhead.

Liberty answered 30/12, 2008 at 15:34 Comment(1)
You should edit your answer to include a summary of the findings from that post: running a server-side trace (that logs its data to a file on a disk separate from the disk where the SQL Server DB files are located) does not have much impact on server performance, but [as stated in the comments] using more than 7 or 8 filters for a single trace event can have a negative impact. – Jitney
Adding filters does minimize the overhead of event collection, and it also keeps the server from logging trace entries you don't need.
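For example, a duration filter can be attached to an existing trace with sp_trace_setfilter. In this sketch, @TraceID is assumed to have been returned by an earlier sp_trace_create call, and the 5000 ms threshold is purely illustrative (SQL Server 2000 reports Duration in milliseconds; later versions switched to microseconds):

```sql
-- Hypothetical filter on an existing trace handle @TraceID (from sp_trace_create):
-- keep only events whose Duration (column 13) is >= 5000 ms.
-- logical_operator 0 = AND, comparison_operator 4 = greater-than-or-equal
DECLARE @TraceID int        -- assumed: set by a prior sp_trace_create
DECLARE @duration bigint
SET @duration = 5000
EXEC sp_trace_setfilter @TraceID, 13, 0, 4, @duration
```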

As for whether the trace will create an unacceptable level of overhead, you'll just have to test it and stop it if there are additional complaints. Feeding that production trace file to the Database Tuning Advisor could improve performance for everyone tomorrow, though.

Flanders answered 13/11, 2008 at 16:11 Comment(0)
You actually should not have the server process the trace, as that can cause problems: "When the server processes the trace, no events are dropped - even if it means sacrificing server performance to capture all the events. Whereas if Profiler is processing the trace, it will skip events if the server gets too busy." (From the SQL Server 70-431 exam book's best practices.)

Garnettgarnette answered 13/11, 2008 at 19:28 Comment(1)
So, if I read it right, run the Profiler GUI from a separate machine? This seems to fly in the face of most of the other advice I've seen out there. – Liberty
It's actually possible to collect more detailed measurements than Profiler provides, and to do it 24x7 across an entire instance, with very little overhead. That avoids having to figure out ahead of time what you need to filter, which can be tricky.

Full disclosure: I work for one of the vendors who provide such tools… but whether you use ours or someone else’s… this may get you around the core issue here.

More info on our tool here http://bit.ly/aZKerz

Nessa answered 18/11, 2010 at 17:52 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.