ASP.NET (MVC) Outputcache and concurrent requests
Let's say that, theoretically, I have a page / controller action in my website that does some very heavy stuff. It takes about 10 seconds to complete its operation.

Now, I use .NET's OutputCache mechanism to cache it for 15 minutes (for example, with [OutputCache(Duration = 900)]). What happens if, after those 15 minutes, the cache has expired and 100 users request the page again within the 10 seconds that the heavy processing takes?

  1. The heavy stuff is done only the first time, and there is some locking mechanism so that the other 99 users will get the cached result
  2. The heavy stuff is done 100 times (and the server is crippled as it can take up to 100 * 10 seconds)

Easy question maybe, but I'm not 100% sure. I hope it is number one, though :-)

Thanks!

Axiology answered 29/1, 2010 at 14:15 Comment(0)

Well, it depends upon how you have IIS configured. If you have less than 100 worker threads (let's say, 50), then the "heavy stuff" is done 50 times, crippling your server, and then the remaining 50 requests will be served from cache.

But no, there is no "locking mechanism" on a cached action result; that would be counterproductive, for the most part.

Edit: I believe this to be true, but Nick's tests say otherwise, and I don't have time to test now. Try it yourself! The rest of the answer is not dependent on the above, though, and I think it's more important.

Generally speaking, however, no web request, cached or otherwise, should take 10 seconds to return. If I were in your shoes, I would look at somehow pre-computing the hard part of the request. You can still cache the action result if you want to cache the HTML, but it sounds like your problem is somewhat bigger than that.

You might also want to consider asynchronous controllers. Finally, note that although IIS and ASP.NET MVC will not lock on this heavy computation, you could. If you use asynchronous controllers combined with a lock on the computation, then you would get effectively the behavior you're asking for. I can't really say whether that's the best solution without knowing more about what you're doing.
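A minimal sketch of the lock-around-the-computation idea (the controller and method names here are illustrative, not from the question). The OutputCache attribute serves repeat visitors; the lock ensures that when the cached entry has expired, only one request recomputes the result while concurrent requests wait for it:

```csharp
using System;
using System.Web.Mvc;

public class ReportController : Controller
{
    private static readonly object Sync = new object();
    private static string _html;          // last computed result
    private static DateTime _computedAt;  // when it was computed

    [OutputCache(Duration = 900, VaryByParam = "none")]
    public ActionResult Index()
    {
        lock (Sync)
        {
            // Double-check inside the lock: requests that queued up
            // behind the first recomputation reuse its fresh result
            // instead of recomputing it themselves.
            if (_html == null ||
                DateTime.UtcNow - _computedAt > TimeSpan.FromMinutes(15))
            {
                _html = ComputeExpensiveReport();
                _computedAt = DateTime.UtcNow;
            }
            return Content(_html, "text/html");
        }
    }

    // Stand-in for the ~10 second operation from the question.
    private static string ComputeExpensiveReport()
    {
        System.Threading.Thread.Sleep(10000);
        return "<p>expensive report</p>";
    }
}
```

Note that this simple version serializes every request through the critical section, which is exactly the trade-off Craig describes: safe, but only worth it when the computation is genuinely expensive.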

Christadelphian answered 29/1, 2010 at 14:26 Comment(2)
Thank heavens I really don't have a request that takes 10 seconds; I greatly exaggerated to illustrate a point. I was just curious what would happen in such a scenario. Thanks! Might consider implementing async controllers though.Axiology
been a while... just did a test myself though and I'm sure it does not lock, like you said. Thanks.Axiology

It seems to lock here, doing a simple test:

<%@ OutputCache Duration="10" VaryByParam="*" %>

protected void Page_Load(object sender, EventArgs e)
{
    System.Threading.Thread.Sleep(new Random().Next(1000, 30000));
}

The first request hits a breakpoint there, even though it's left sleeping... no other request hits the breakpoint in the Page_Load method... they wait for the first one to complete, and that result is returned to everyone who requested the page.

Note: this was simpler to test in a webforms scenario, but given this is a shared aspect of the frameworks, you can do the same test in MVC with the same result.

Here's an alternative way to test:

<asp:Literal ID="litCount" runat="server" />

public static int Count = 0;

protected void Page_Load(object sender, EventArgs e)
{
  litCount.Text = Count++.ToString();
  System.Threading.Thread.Sleep(10000);
}

All pages queued up while the first request goes to sleep will have the same count output.

Asperges answered 29/1, 2010 at 14:31 Comment(10)
Are you testing on WebDev? It behaves very differently than IIS on a multi-core server.Christadelphian
@Craig: Testing using IIS 7.5, Windows 7 x64 on a quad-coreAsperges
That seems odd. Perhaps you're using debug mode? This really shouldn't lock. The other requests should get a cache miss.Christadelphian
@Craig: Nope, release mode...just attaching the debugger in IIS, but of course it's possible this is affecting the behavior. On the other hand, this would be the optimal behavior: for IIS to have the other requests wait and all get the result of the first hit that's processing. If you're specifying output cache you're effectively saying that you don't want this thing to process often, and IIS acting this way would accomplish that and still serve all the requests as quickly as possible (the first request's thread should finish first, in most cases).Asperges
If the request is still in process, it's not at all obvious that the next request will be a cache hit even if the first request completes. I'd suggest testing with logging rather than the debugger.Christadelphian
@Craig: Can you explain a bit more what you mean? If a user makes request #2 that would hit the same output (as in the VarByWhatever matches up), shouldn't IIS wait for the request to complete and return that result to everyone who would have hit the cache had they hit the same thing moments later? It should be the same result. (If it's not, you shouldn't be using OutputCache) If it didn't do this and you got 1000 hits at once, you'd be processing the same result 1000 times, instead of once quickly and return to 1000 requests...makes sense that it would behave this way.Asperges
The cache can have a dependency. It can also be cleared by other code.Christadelphian
@Craig, That is a fair point, but they would all be asking for the page as it was at the instant they all went for it, so still a maybe. I tried a similar test as you requested here, release mode, no debugging, with a static int that gets incremented every page process...same result, the number only gets bumped once. I'll update the answer to show that approach.Asperges
I believe that something isn't right here, but I don't have time to go through it myself. So +1 for testing, and I'll update my answer to reflect what you did.Christadelphian
@Craig: I appreciate the intelligent discussion. Please do test...as I'll be using this in the near future I'd be very curious if you get different behavior.Asperges

Old question, but I ran into this problem and did some investigation.

Example code:

public static int Count;

[OutputCache(Duration = 20, VaryByParam = "*")]
public ActionResult Test()
{
    System.Threading.Thread.Sleep(4000);   // simulate the heavy work
    return Content((Count++).ToString());  // Content() expects a string
}

Run it in one browser, and the requests seem to lock and wait.

Run it in different browsers (I tested in IE and Firefox) and the requests are not put on hold.

So the apparent "locking" behaviour has more to do with which browser you are using than with anything IIS does.

Edit: To clarify - there is no lock. The server gets hit by every request that manages to arrive before the first result is cached, which for heavy requests can hit the server hard. (Or, if you call an external system, that system could be brought down when your server lets that many requests through...)
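One way to guard against that stampede (a sketch of a common pattern, not something from the answer above) is to publish a `Lazy<T>` into `MemoryCache`, so that all concurrent misses for the same key share a single computation:

```csharp
using System;
using System.Runtime.Caching;

public static class StampedeGuard
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public static string GetOrCompute(string key, Func<string> compute, TimeSpan ttl)
    {
        var candidate = new Lazy<string>(compute);

        // AddOrGetExisting returns null when our entry was inserted,
        // or the entry that another request inserted first.
        var winner = (Lazy<string>)Cache.AddOrGetExisting(
            key, candidate, DateTimeOffset.UtcNow.Add(ttl)) ?? candidate;

        // Only the winning Lazy ever runs compute(); all other callers
        // block on .Value until that single computation finishes.
        return winner.Value;
    }
}
```

Because `Lazy<T>` defaults to thread-safe initialization, the expensive delegate runs exactly once per cache window no matter how many requests race in.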

Sulla answered 11/12, 2012 at 11:46 Comment(0)

I made a small test that might help. I believe what I've discovered is that the uncached requests do not block, and each request that comes in while the cache is expired, and before the task has completed, also triggers that task.

For example, the code below takes about 6-9 seconds on my system using Cassini. If you send two requests approximately 2 seconds apart (e.g. from two browser tabs), both will receive unique results. The last request to finish is also the response that gets cached for subsequent requests.

// CachedController.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;

namespace HttpCacheTest.Controllers
{
    public class CachedController : Controller
    {
        //
        // GET: /Cached/

        [OutputCache(Duration=20, VaryByParam="*")]
        public ActionResult Index()
        {
            var start = DateTime.Now;

            var i = Int32.MaxValue;
            while (i > 0)
            {
                i--;
            }
            var end = DateTime.Now;

            return Content( end.Subtract(start).ToString() );
        }

    }
}
Hominoid answered 24/5, 2011 at 20:42 Comment(0)

You should check this information here: "You have a single client making multiple concurrent requests to the server. The default behavior is that these requests will be serialized;"

So, if the concurrent requests from a single client are serialized, the subsequent request will use the cache. That explains some of the behavior seen in the answers above (@mats-nilsson and @nick-craver).

The context you showed us is multiple users hitting your server at the same time: your server will stay busy until it has completed at least one request and created the output cache, which it then uses for the next requests. So if you want to serialize multiple users requesting the same resource, you first need to understand how the serialized requests work for a single user. Is that what you want?
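For reference, the per-client serialization quoted above comes from session state, and a controller can opt out of it (a sketch for MVC 3+; the controller name is illustrative):

```csharp
using System.Web.Mvc;
using System.Web.SessionState;

// With session state disabled, ASP.NET no longer serializes concurrent
// requests from the same client, so two tabs in one browser can hit
// the action simultaneously instead of being queued one after another.
[SessionState(SessionStateBehavior.Disabled)]
public class HeavyController : Controller
{
    [OutputCache(Duration = 20, VaryByParam = "*")]
    public ActionResult Test()
    {
        System.Threading.Thread.Sleep(4000); // simulate heavy work
        return Content("done");
    }
}
```

Even with session state disabled, browsers may still limit concurrent connections to the same host, so a same-browser test can look serialized for an unrelated reason.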

Discommode answered 21/1, 2016 at 13:4 Comment(0)
