What is the best way to lock cache in asp.net?

I know that in certain circumstances, such as long-running processes, it is important to lock the ASP.NET cache so that subsequent requests from other users for that resource don't execute the long-running process again instead of hitting the cache.

What is the best way to implement cache locking in ASP.NET using C#?

Henceforward answered 2/9, 2008 at 9:42 Comment(0)

Here's the basic pattern:

  • Check the cache for the value; return it if it's available
  • If the value is not in the cache, then acquire a lock
  • Inside the lock, check the cache again; another request might have populated it while you were blocked
  • Perform the value lookup and cache the result
  • Release the lock

In code, it looks like this (the cache key "foo" and LoadFoo() below are just placeholders for your own key and data lookup):

private static readonly object ThisLock = new object();

public string GetFoo()
{
  // try to pull from cache here
  var value = HttpRuntime.Cache["foo"] as string;
  if (value != null)
    return value;

  lock (ThisLock)
  {
    // cache was empty before we got the lock, check again inside the lock
    value = HttpRuntime.Cache["foo"] as string;

    if (value == null)
    {
      // cache is still empty, so retrieve the value here
      value = LoadFoo();

      // store the value in the cache here
      HttpRuntime.Cache["foo"] = value;
    }
  }

  // return the cached value here
  return value;
}
Antimonic answered 2/9, 2008 at 17:12 Comment(3)
If the first load of the cache takes some minutes, is there still a way to access the entries already loaded? Let's say I have GetFoo_AmazonArticlesByCategory(string categoryKey). I guess it would be something like a lock per categoryKey. – Llano
This is called "double-checked locking". en.wikipedia.org/wiki/Double-checked_locking – Cognoscenti
This answer is very bad advice and has been implemented since 2008. Explanation: https://mcmap.net/q/258291/-what-is-the-best-way-to-lock-cache-in-asp-net – Vanish

For completeness, a full example would look something like this:

private static object ThisLock = new object();
...
object dataObject = Cache["globalData"];
if( dataObject == null )
{
    lock( ThisLock )
    {
        dataObject = Cache["globalData"];

        if( dataObject == null )
        {
            //Get Data from db
             dataObject = GlobalObj.GetData();
             Cache["globalData"] = dataObject;
        }
    }
}
return dataObject;
Henceforward answered 3/9, 2008 at 8:17 Comment(5)
if( dataObject == null ) { lock(ThisLock) { if( dataObject == null ) // of course it's still null! – Tevere
@Constantin: not really, someone might have updated the cache while you were waiting to acquire the lock(). – Zionism
@John Owen - after the lock statement you have to try to get the object from the cache again! – Twoup
-1, code is wrong (read the other comments), why don't you fix it? People might try to use your example. – Delaryd
This code is actually still wrong. You're returning globalObject in a scope where it doesn't actually exist. What should happen is that dataObject should be used inside the final null check, and globalObject doesn't even need to exist at all. – Sclerodermatous

There is no need to lock the whole cache instance; you only need to lock the specific key you are inserting. I.e. there is no need to block access to the female toilet while you use the male toilet :)

The implementation below allows locking of specific cache keys using a concurrent dictionary. This way you can run GetOrAdd() for two different keys at the same time - but not for the same key at the same time.

using System;
using System.Collections.Concurrent;
using System.Web.Caching;

public static class CacheExtensions
{
    private static ConcurrentDictionary<string, object> keyLocks = new ConcurrentDictionary<string, object>();

    /// <summary>
    /// Get or Add the item to the cache using the given key. Lazily executes the value factory only if/when needed
    /// </summary>
    public static T GetOrAdd<T>(this Cache cache, string key, int durationInSeconds, Func<T> factory)
        where T : class
    {
        // Try and get value from the cache
        var value = cache.Get(key);
        if (value == null)
        {
            // If not yet cached, lock the key value and add to cache
            lock (keyLocks.GetOrAdd(key, new object()))
            {
                // Try and get from cache again in case it has been added in the meantime
                value = cache.Get(key);
                if (value == null && (value = factory()) != null)
                {
                    // TODO: Some of these parameters could be added to method signature later if required
                    cache.Insert(
                        key: key,
                        value: value,
                        dependencies: null,
                        absoluteExpiration: DateTime.Now.AddSeconds(durationInSeconds),
                        slidingExpiration: Cache.NoSlidingExpiration,
                        priority: CacheItemPriority.Default,
                        onRemoveCallback: null);
                }

                // Remove temporary key lock
                keyLocks.TryRemove(key, out object locker);
            }
        }

        return value as T;
    }
}
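A hypothetical call site might look like this (the key, the 600-second duration, and LoadProductsFromDb() are placeholder assumptions; the factory must return a reference type because of the class constraint):

// LoadProductsFromDb() runs at most once per expiry for this key, even under concurrent requests.
var products = HttpRuntime.Cache.GetOrAdd("products-all", 600, () => LoadProductsFromDb());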
Afterbrain answered 12/8, 2016 at 0:14 Comment(6)
keyLocks.TryRemove(key, out locker) <= what is the use of it? – Lockout
This is great. The whole point of locking the cache is to avoid duplicating the work done to get the value for that specific key. Locking the whole cache or even sections of it by class is silly. You want exactly this - a lock that says "I'm getting the value for <key>, everyone else just wait for me." The extension method is slick too. Two great ideas in one! This should be the answer that people find. THANKS. – Indispensable
@iMatoria, once there is something in the cache for that key, there is no point in keeping that lock object around or the entry in the dictionary of keys - it's a try-remove because the lock might have already been removed from the dictionary by another thread that came first - all the threads that got locked waiting on that key are now in that code section where they just get the value from the cache, but there is no longer a lock to remove. – Indispensable
I like this approach much better than the accepted answer. Small note however: you first use cache.Key, then further on you use HttpRuntime.Cache.Get. – Tambourine
@MindaugasTvaronavicius Good catch, you are correct; it's an edge case where T2 and T3 are executing the factory method concurrently, but only where T1 previously executed the factory and it returned null (so the value isn't cached). Otherwise T2 and T3 will just get the cached value concurrently (which should be safe). I guess the easy solution is to delete keyLocks.TryRemove(key, out locker), but then the ConcurrentDictionary could become a memory leak if a large number of different keys is used. Otherwise add some logic to count locks for a key before removing, perhaps using a semaphore? – Afterbrain
Good solution. The only caveat for newer developers is to be mindful of key collisions with return values of different types. You can try/catch around 'return value as T'. You would have the problem without this extension method too, so you might want to prefix the passed key with a unique value (a substring of a GUID, perhaps?). – Castellanos

Just to echo what Pavel said, I believe this is the most thread-safe way of writing it:

private T GetOrAddToCache<T>(string cacheKey, GenericObjectParamsDelegate<T> creator, params object[] creatorArgs) where T : class, new()
{
    T returnValue = HttpContext.Current.Cache[cacheKey] as T;
    if (returnValue == null)
    {
        lock (this)
        {
            returnValue = HttpContext.Current.Cache[cacheKey] as T;
            if (returnValue == null)
            {
                returnValue = creator(creatorArgs);
                if (returnValue == null)
                {
                    throw new Exception("Attempt to cache a null reference");
                }
                HttpContext.Current.Cache.Add(
                    cacheKey,
                    returnValue,
                    null,
                    System.Web.Caching.Cache.NoAbsoluteExpiration,
                    System.Web.Caching.Cache.NoSlidingExpiration,
                    CacheItemPriority.Normal,
                    null);
            }
        }
    }

    return returnValue;
}
Gracious answered 28/6, 2010 at 19:23 Comment(1)
`lock(this)` is bad. You should use a dedicated lock object that is not visible to users of your class. Suppose, down the road, someone decides to use the cache object to lock with. They'll be unaware that it's being used internally for locking purposes, which may lead to badness. – Amadis

Craig Shoemaker has made an excellent show on ASP.NET caching: http://polymorphicpodcast.com/shows/webperformance/

Gimbals answered 2/9, 2008 at 10:00 Comment(0)

I have come up with the following extension method:

private static readonly object _lock = new object();

public static TResult GetOrAdd<TResult>(this Cache cache, string key, Func<TResult> action, int duration = 300) {
    TResult result;
    var data = cache[key]; // Can't cast using as operator as TResult may be an int or bool

    if (data == null) {
        lock (_lock) {
            data = cache[key];

            if (data == null) {
                result = action();

                if (result == null)
                    return result;

                if (duration > 0)
                    cache.Insert(key, result, null, DateTime.UtcNow.AddSeconds(duration), TimeSpan.Zero);
            } else
                result = (TResult)data;
        }
    } else
        result = (TResult)data;

    return result;
}

I have used both @John Owen's and @user378380's answers. My solution allows you to store int and bool values in the cache as well.

Please correct me if there are any errors or if it can be written a little better.
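For illustration, here is a hypothetical call site (the key and CountCustomersFromDb() are placeholders), showing that value types such as int work too:

// Caches the int result for the default 300 seconds; the delegate only runs on a cache miss.
int customerCount = HttpRuntime.Cache.GetOrAdd("customer-count", () => CountCustomersFromDb());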

Dare answered 18/4, 2014 at 12:56 Comment(2)
That's a default cache length of 5 minutes (60 * 5 = 300 seconds). – Dare
Great work, but I see one issue: if you have multiple caches, they will all share the same lock. To make it more robust, use a dictionary to retrieve the lock matched to the given cache. – Greggrega

I recently saw a pattern called the Correct State Bag Access Pattern, which seems to touch on this.

I modified it a bit to be thread-safe.

http://weblogs.asp.net/craigshoemaker/archive/2008/08/28/asp-net-caching-and-performance.aspx

private static object _listLock = new object();

public List<Customer> List() {
    string cacheKey = "customers";
    // Customer is assumed here as the element type returned by DAL.ListCustomers()
    List<Customer> myList = Cache[cacheKey] as List<Customer>;
    if (myList == null) {
        lock (_listLock) {
            myList = Cache[cacheKey] as List<Customer>;
            if (myList == null) {
                myList = DAL.ListCustomers();
                Cache.Insert(cacheKey, myList, null, SiteConfig.CacheDuration, TimeSpan.Zero);
            }
        }
    }
    return myList;
}
Jus answered 2/9, 2008 at 17:29 Comment(3)
Couldn't two threads both get a true result for (myList==null)? Then, both threads make a call to DAL.ListCustomers() and insert the results into cache. – Averi
After the lock you need to check the cache again, not the local myList variable. – Delaryd
This was actually ok before your edit. No lock is needed if you use Insert to prevent exceptions, only if you want to make sure that DAL.ListCustomers is called once (though if the result is null, it'll be called every time). – Choral

I modified @user378380's code for more flexibility. Instead of returning TResult, it now returns object so that different types can be stored and returned, and it takes some extra parameters for flexibility. All credit for the idea goes to @user378380.

private static readonly object _lock = new object();

// If getOnly is true, only get the existing cache value without updating it. If the cached value is null,
// set it first by running the action method; so this can return either the old value or the action's result.
// If getOnly is false, update the old value with the action's result. If the cached value is null, set it
// first by running the action method; so this always returns the action's result.
// The oldValueReturned flag lets the calling code cast the returned object (if it is not null) to the
// appropriate type.

public static object GetOrAdd<TResult>(this Cache cache, string key, Func<TResult> action,
    DateTime absoluteExpireTime, TimeSpan slidingExpireTime, bool getOnly, out bool oldValueReturned)
{
    object result;
    var data = cache[key]; 

    if (data == null)
    {
        lock (_lock)
        {
            data = cache[key];

            if (data == null)
            {
                oldValueReturned = false;
                result = action();

                if (result == null)
                {                       
                    return result;
                }

                cache.Insert(key, result, null, absoluteExpireTime, slidingExpireTime);
            }
            else
            {
                if (getOnly)
                {
                    oldValueReturned = true;
                    result = data;
                }
                else
                {
                    oldValueReturned = false;
                    result = action();
                    if (result == null)
                    {                            
                        return result;
                    }

                    cache.Insert(key, result, null, absoluteExpireTime, slidingExpireTime);
                }
            }
        }
    }
    else
    {
        if(getOnly)
        {
            oldValueReturned = true;
            result = data;
        }
        else
        {
            oldValueReturned = false;
            result = action();
            if (result == null)
            {
                return result;
            }

            cache.Insert(key, result, null, absoluteExpireTime, slidingExpireTime);
        }            
    }

    return result;
}
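A hypothetical call site (the key, the expiration values, and BuildReportHtml() are placeholders), showing how getOnly and oldValueReturned are meant to be used:

bool oldValueReturned;
var cached = HttpRuntime.Cache.GetOrAdd(
    "report-html",                   // hypothetical cache key
    () => BuildReportHtml(),         // hypothetical expensive loader returning a string
    DateTime.Now.AddMinutes(10),     // absolute expiration
    Cache.NoSlidingExpiration,       // no sliding expiration
    true,                            // getOnly: prefer any existing cached value
    out oldValueReturned);

// oldValueReturned is true when an existing cache entry was returned and false when the
// loader was executed; cast the returned object back to the expected type either way.
var reportHtml = cached as string;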
Listlessness answered 23/10, 2014 at 7:28 Comment(0)

The accepted answer (which recommends reading outside of the lock) is very bad advice, and it has been implemented that way since 2008. It could work if the cache used a concurrent dictionary, but that itself takes a lock for reads.

Reading outside of the lock means that other threads could be modifying the cache in the middle of a read, which means the read could be inconsistent.

For example, depending on the implementation of the cache (probably a dictionary whose internals are unknown), the item could be checked and found in the cache at a certain index in the cache's underlying array; another thread could then modify the cache so that the items in the underlying array are no longer in the same order, and the actual read could end up coming from a different index/address.

Another scenario is that the read could be from an index that is now outside the bounds of the underlying array (because items were removed), so you can get exceptions.
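A minimal sketch of the fully locked variant this answer implies, where every cache read and write happens under the same lock (the key "someKey", LoadValue(), and the use of HttpRuntime.Cache are assumptions for illustration):

private static readonly object CacheLock = new object();

public string GetValueSafely()
{
    lock (CacheLock)
    {
        // Both the read and the write happen under the same lock,
        // so no other thread can be modifying the cache mid-read.
        var value = HttpRuntime.Cache["someKey"] as string;
        if (value == null)
        {
            value = LoadValue(); // hypothetical expensive lookup
            HttpRuntime.Cache["someKey"] = value;
        }
        return value;
    }
}

This trades the accepted answer's lock-free fast path for consistency, at the cost of serializing all reads of that cache entry.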

Vanish answered 16/1, 2023 at 10:28 Comment(0)
