Best way to determine the number of servers needed
How much traffic can one web server handle? What's the best way to see if we're beyond that?

I have an ASP.Net application that has a couple hundred users. Aspects of it are fairly processor intensive, but thus far we have done fine with only one server to run both SqlServer and the site. It's running Windows Server 2003, 3.4 GHz with 3.5 GB of RAM.

But lately I've started to notice slowdowns at various times, and I was wondering what the best way is to determine whether the server is overloaded by the application's usage or whether I need to fix the application itself (I don't really want to spend a lot of time hunting down little optimizations if I'm just expecting too much from the box).

Leary answered 12/9, 2008 at 23:17
What you need is some info on capacity planning.

Capacity planning is the process of planning for growth and forecasting peak usage periods in order to meet system and application capacity requirements. It involves extensive performance testing to establish the application's resource utilization and transaction throughput under load. First, you measure the number of visitors the site currently receives and how much demand each user places on the server, and then you calculate the computing resources (CPU, RAM, disk space, and network bandwidth) that are necessary to support current and future usage levels.
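The calculation described above can be sketched in a few lines. This is a back-of-envelope estimate only; the inputs (concurrent users, think time, CPU cost per request) are hypothetical placeholders you would replace with measured values from your own load tests.

```python
# Back-of-envelope capacity estimate: request rate from user behavior,
# then CPU demand at that rate. All numbers here are assumptions.

def peak_requests_per_sec(concurrent_users, think_time_sec):
    """Average request rate if each user issues one request per think time."""
    return concurrent_users / think_time_sec

def cpu_utilization(requests_per_sec, cpu_sec_per_request):
    """Fraction of one CPU consumed at the given request rate."""
    return requests_per_sec * cpu_sec_per_request

rps = peak_requests_per_sec(concurrent_users=200, think_time_sec=30)
util = cpu_utilization(rps, cpu_sec_per_request=0.05)
print(f"{rps:.1f} req/sec, {util:.0%} of one CPU")
```

If the projected utilization is well under 100% (leaving headroom for peaks), one box is probably enough; the same arithmetic extended to RAM, disk, and bandwidth gives you the full picture.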

Younglove answered 12/9, 2008 at 23:31
If you have access to profiling tools (such as those in the Team Suite edition of Visual Studio), you can set up a test server, run some synthetic requests against it, and see whether any specific part of the code takes unreasonably long to run. Before doing that, check some graphs of CPU and memory usage over time to see whether the hardware is even the bottleneck. (A metric akin to the UNIX "load average", basically the average number of threads wanting CPU time in each time slice, would be useful here; I don't know if Windows exposes anything quite like it.)
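For illustration, a UNIX-style load average is just an exponentially weighted moving average of the runnable-thread count. The sketch below assumes hypothetical samples; on Windows, the raw number would come from a performance counter such as System\Processor Queue Length.

```python
# Sketch of a UNIX-style "load average": an exponentially weighted
# moving average of the number of threads waiting for CPU.
import math

def update_load(load, runnable, interval_sec=5, period_sec=60):
    """Fold one sample of the runnable-thread count into the average."""
    decay = math.exp(-interval_sec / period_sec)
    return load * decay + runnable * (1.0 - decay)

load = 0.0
for queue_length in [0, 2, 5, 5, 3, 1]:  # hypothetical 5-second samples
    load = update_load(load, queue_length)
print(f"1-minute load: {load:.2f}")
```

A sustained value above the number of CPU cores suggests threads are queuing for the processor rather than doing work.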

Also check the obvious, that you aren't running out of bandwidth.

Schram answered 12/9, 2008 at 23:35
Measure, measure, measure. Rico Mariani always says this, and he's right.

Measure req/sec, RAM, CPU, Sessions, etc.
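One cheap way to get the req/sec figure is straight from the web server logs. This is a minimal sketch using a simplified stand-in for the IIS W3C log format (real logs need the correct field index for the time column):

```python
# Minimal sketch: requests per second from web server log lines,
# grouped by their HH:MM:SS timestamp. Log format is a simplified
# stand-in for IIS W3C logs.
from collections import Counter

def requests_per_second(log_lines):
    """Count requests grouped by their HH:MM:SS timestamp."""
    return Counter(line.split()[0] for line in log_lines if line.strip())

sample = [
    "23:17:01 GET /default.aspx 200",
    "23:17:01 GET /report.aspx 200",
    "23:17:02 GET /default.aspx 200",
]
peak = max(requests_per_second(sample).values())
print(f"peak: {peak} req/sec")
```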

You may come up with a caching strategy (Output caching, data caching, caching dependencies, and so on.)

Also check how your SQL Server is doing: indexes are a good place to start, but they're not the only thing to look at.

Thready answered 29/1, 2009 at 1:9
On that hardware, a .NET application should be able to serve about 200-400 requests per second. If you have only a few hundred users, I doubt you are seeing even 2 requests per second, so I think you have a lot of capacity on that box, even with SQL server running.
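As a sanity check on that claim, here's the arithmetic with hypothetical numbers (a few hundred users, an assumed click rate, and the rough 200 req/sec ceiling above); only the capacity figure comes from the answer, the rest are placeholders:

```python
# Hedged headroom check: estimated request rate for the user base
# against a rough per-server capacity ceiling. Click rate is assumed.
import math

def servers_needed(users, clicks_per_user_per_min, capacity_rps):
    """Round up the number of servers for the estimated request rate."""
    rps = users * clicks_per_user_per_min / 60.0
    return rps, math.ceil(rps / capacity_rps)

rps, n = servers_needed(users=300, clicks_per_user_per_min=4, capacity_rps=200)
print(f"~{rps:.0f} req/sec -> {n} server(s)")
```

Even with generous assumptions the estimated rate is a small fraction of one server's capacity, which is the point being made here.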

Without knowing all of the details, I would say no, you will not see any performance improvement by adding servers.

By the way, if you're not using the Output Cache, I would start there.

Joub answered 13/9, 2008 at 2:50
