What to look for when setting UpdateBatchSize

I have a .NET application that merges two DataTables with a lot of rows (10,000+). There is a good chance of having a large number of updates/inserts to push to the SQL table when calling DataAdapter.Update.

Right now, I have the adapter's UpdateBatchSize property set to 200. VS warns against setting this value too high because it may decrease performance. Ok, gotcha.
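For context, here is a minimal sketch of this kind of batched update (the connection string, table name, and use of SqlCommandBuilder are placeholders, not the actual code):

    using System;
    using System.Data;
    using System.Data.SqlClient;

    class BatchUpdateSketch
    {
        static void Main()
        {
            // Hypothetical connection string and table name.
            const string connStr = "Server=.;Database=MyDb;Integrated Security=true";

            using (SqlConnection conn = new SqlConnection(connStr))
            using (SqlDataAdapter adapter =
                new SqlDataAdapter("SELECT * FROM dbo.MyTable", conn))
            using (SqlCommandBuilder builder = new SqlCommandBuilder(adapter))
            {
                DataTable table = new DataTable();
                adapter.Fill(table);

                // ... merge the second DataTable into 'table' here ...

                // Batching only kicks in when the commands don't expect
                // per-row results back from the server.
                adapter.InsertCommand = builder.GetInsertCommand();
                adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.None;
                adapter.UpdateCommand = builder.GetUpdateCommand();
                adapter.UpdateCommand.UpdatedRowSource = UpdateRowSource.None;
                adapter.DeleteCommand = builder.GetDeleteCommand();
                adapter.DeleteCommand.UpdatedRowSource = UpdateRowSource.None;

                // 0 means "one batch for everything"; 1 disables batching.
                adapter.UpdateBatchSize = 200;
                adapter.Update(table);
            }
        }
    }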

Performance-wise, what should I look for when setting this property? No matter what, updating lots of rows will take time. Running it on my machine (or on the DB server) doesn't *seem* to take that long, but I am sure this could become an issue once the system is loaded down with other work.

Is there something I can look for in the Profiler? Doing a standard trace, the Duration is usually 0. Sometimes it hits 1 or 2 (maybe 20 times overall), and out of about 20,000 updates, 3-4 hit 20. CPU is at 0 except for a couple that hit 1-2. There are 2 records that go up to around 10. Reads are always 2 and Writes are always 0.

Revoice answered 14/4, 2009 at 1:21

First thing is I'd make this setting configurable so you can test various values without recompiling. The next thing is to watch the duration of your entire batch. If you're happy with the performance, don't change it. If you're unhappy, try increasing or decreasing the setting and see how it behaves, as in the sketch below.
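For instance, a rough sketch of both suggestions together, assuming the batch size lives in an appSettings key named "UpdateBatchSize" (that key name is made up):

    using System;
    using System.Configuration;   // requires a reference to System.Configuration
    using System.Data;
    using System.Data.SqlClient;
    using System.Diagnostics;

    static class BatchSizeProbe
    {
        // Runs one Update pass with a configurable batch size and reports
        // how long it took, so values can be compared without recompiling.
        public static void TimedUpdate(SqlDataAdapter adapter, DataTable table)
        {
            int batchSize;
            if (!int.TryParse(ConfigurationManager.AppSettings["UpdateBatchSize"],
                              out batchSize))
            {
                batchSize = 200; // fall back to the current hard-coded value
            }

            adapter.UpdateBatchSize = batchSize;

            Stopwatch sw = Stopwatch.StartNew();
            adapter.Update(table);
            sw.Stop();

            Console.WriteLine("Batch size {0}: {1} ms",
                              batchSize, sw.ElapsedMilliseconds);
        }
    }

Sweeping a few values (e.g. 1, 50, 200, 1000, and 0 for "send everything in one batch") against a production-sized table gives you the duration curve you actually care about.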

Danger answered 14/4, 2009 at 1:26
