Implementing LRU cache

In my last post I mentioned that checking whether a user is an administrator or not using an Active Directory query can be slow. That means we can’t just run that query every time; we have to cache the result.

When caching is involved, we have to consider a few things. When do we expire the data? How much memory are we going to use? How do we handle concurrency?

The first thing that pops to mind is using MemoryCache, now part of the .NET framework and easily accessible. Sadly, this is a heavyweight object: it creates its own threads to manage its state, which probably means we don’t want to use it for a fairly simple feature like this.
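
For contrast, a MemoryCache-based version would only be a few lines; the sketch below shows roughly what it might look like (the member names here are mine, purely for illustration, and not from the original post):

// Rough sketch only: what the MemoryCache route might look like.
// Requires a reference to System.Runtime.Caching; MemoryCache manages its own
// threads and timers behind the scenes, which is the cost we want to avoid.
private static readonly MemoryCache adminCache = new MemoryCache("AdminCache");

private bool IsAdministratorViaMemoryCache(WindowsIdentity windowsIdentity)
{
    var key = windowsIdentity.User.ToString();
    var cached = adminCache.Get(key);
    if (cached != null)
        return (bool)cached;

    var isAdmin = IsAdministratorNoCache(windowsIdentity.Name);
    adminCache.Set(key, isAdmin, DateTimeOffset.UtcNow.AddMinutes(15));
    return isAdmin;
}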

Instead, I implemented the following:

using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Security.Principal;
using System.Threading;

public class CachingAdminFinder
{
    private class CachedResult
    {
        public int Usage;          // how many times this entry has been read from the cache
        public DateTime Timestamp; // when the value was computed
        public bool Value;         // is the user an administrator?
    }
    private const int CacheMaxSize = 25;
    private static readonly TimeSpan MaxDuration = TimeSpan.FromMinutes(15);
    private readonly ConcurrentDictionary<SecurityIdentifier, CachedResult> cache =
        new ConcurrentDictionary<SecurityIdentifier, CachedResult>();
    public bool IsAdministrator(WindowsIdentity windowsIdentity)
    {
        if (windowsIdentity == null) throw new ArgumentNullException("windowsIdentity");
        if (windowsIdentity.User == null)
            throw new ArgumentException("Could not find user on the windowsIdentity", "windowsIdentity");
        CachedResult value;
        if (cache.TryGetValue(windowsIdentity.User, out value) && (DateTime.UtcNow - value.Timestamp) <= MaxDuration)
        {
            Interlocked.Increment(ref value.Usage);
            return value.Value;
        }
        bool isAdministratorNoCache;
        try
        {
            isAdministratorNoCache = IsAdministratorNoCache(windowsIdentity.Name);
        }
        catch (Exception e)
        {
            // log is the class's logger field; its declaration is omitted here, as in the original post
            log.WarnException("Could not determine whether user is admin or not, assuming not", e);
            return false;
        }
        var cachedResult = new CachedResult
            {
                Usage = value == null ? 1 : value.Usage + 1,
                Value = isAdministratorNoCache,
                Timestamp = DateTime.UtcNow
            };
        cache.AddOrUpdate(windowsIdentity.User, cachedResult, (_, __) => cachedResult);
        if (cache.Count > CacheMaxSize)
        {
            // trim: keep the CacheMaxSize most frequently used entries, evict the rest
            foreach (var source in cache
                .OrderByDescending(x => x.Value.Usage)
                .ThenBy(x => x.Value.Timestamp)
                .Skip(CacheMaxSize))
            {
                if (source.Key == windowsIdentity.User)
                    continue; // we don't want to remove the one we just added
                CachedResult ignored;
                cache.TryRemove(source.Key, out ignored);
            }
        }
        return isAdministratorNoCache;
    }
    private static bool IsAdministratorNoCache(string username)
    {
        // see previous post for the actual Active Directory check
        throw new NotImplementedException();
    }
}

Amusingly enough, properly handling the cache takes (much) more code than it takes to actually get the value.

We use ConcurrentDictionary as the backing store for our cache, and we enhance the value with usage & timestamp information. Those come in handy when the cache grows too big and needs to be trimmed: the most frequently used entries stay, and the rest are evicted.

Note that we also make sure to check the source again every 15 minutes or so, because there is nothing as annoying as “you have to restart the server for it to pick up the change”. We also handle the case where we can’t get this information for some reason: we log a warning and assume the user is not an administrator.
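
To put it all together, a call site would look something like this minimal usage sketch (the adminFinder field and SomeSecuredOperation method are hypothetical, just to show the shape of the usage):

// Usage sketch, assuming a single shared CachingAdminFinder instance per process.
private static readonly CachingAdminFinder adminFinder = new CachingAdminFinder();

public void SomeSecuredOperation()
{
    // WindowsIdentity.GetCurrent() gives us the identity of the calling user
    var identity = WindowsIdentity.GetCurrent();
    if (adminFinder.IsAdministrator(identity) == false)
        throw new UnauthorizedAccessException("This operation requires administrator rights");

    // repeated calls within 15 minutes for the same user are answered from the cache
}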

In practice, I doubt that we will ever hit the cache max size limit, but I wouldn’t have been able to live with myself without adding the check.

Reference: Implementing LRU cache from our NCG partner Oren Eini at the Ayende @ Rahien blog.
