Using the Caching Infrastructure in ASP.NET, continued

    In a previous post, I talked about how to use the caching infrastructure in ASP.NET to increase site performance. By adding a few lines of code, I was able to make the home page five times faster. This time let's squeeze out even more performance without resorting to hacks.

    As before, I am using the Mvc Music Store project.
    If you have not read the previous posts, start there to see how the home page was accelerated.

    All optimization so far has concerned the home page; now I will move on to the internal pages.

    Load test

    For verification, I ran a load test in Visual Studio with 25 “virtual users” and the following scenario:
    1) Request the home page
    2) Request a genre page (catalog)
    3) Request an album page (product)
    For accuracy, I made sure that more than one catalog/product page was requested, randomizing the requests across different pages.
    I also increased the share of new users to 80%, which is closer to real traffic.

    The result: 42 scenarios per second.

    Adding Caching - A Simple Approach

    In ASP.NET you can enable output caching with database dependencies by setting attributes.
    To do this, you need to follow a few simple steps:
    1. Enter caching parameters in web.config

    The sqlCacheDependency element defines a cache dependency on the database. The dependency checks for changes at the interval set by pollTime, in this case 1000 milliseconds (1 second).
    The outputCacheProfiles element defines profiles so that you do not have to repeat the same settings for different actions. It also lets you adjust caching without rebuilding the project.
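    The original config listing did not survive here; a minimal sketch of what such a web.config section could look like (the profile name "Catalog", the dependency name "MusicStore", and the duration are illustrative, chosen to match the rest of the post):

```xml
<system.web>
  <caching>
    <!-- Polling-based dependency: checks the change table every pollTime ms -->
    <sqlCacheDependency enabled="true" pollTime="1000">
      <databases>
        <!-- "MusicStore" is referenced below as MusicStore:TableName -->
        <add name="MusicStore" connectionStringName="MusicStoreEntities" />
      </databases>
    </sqlCacheDependency>
    <outputCacheSettings>
      <outputCacheProfiles>
        <!-- Profile used by [OutputCache(CacheProfile = "Catalog")] -->
        <add name="Catalog" duration="3600"
             sqlDependency="MusicStore:Genres;MusicStore:Albums"
             varyByParam="*" />
      </outputCacheProfiles>
    </outputCacheSettings>
  </caching>
</system.web>
```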

    2. Make changes to the database schema so that dependencies work

    To do this, run the following code at application startup (for example, in Application_Start):
    String connStr = System.Configuration.ConfigurationManager.ConnectionStrings["MusicStoreEntities"].ConnectionString;
    System.Web.Caching.SqlCacheDependencyAdmin.EnableTableForNotifications(connStr, "Genres");
    System.Web.Caching.SqlCacheDependencyAdmin.EnableTableForNotifications(connStr, "Albums");

    3. Add attributes

    [OutputCache(CacheProfile = "Catalog")]
    public ActionResult Browse(string genre)
    [OutputCache(CacheProfile = "Catalog")]
    public ActionResult Details(int id)

    Run the test again: 60 scenarios per second. In other words, a speedup of almost 50% in this case.

    Installing dependencies from code

    If you use Web API, you cannot use the caching attributes. In that case the SqlCacheDependency class will help. Using it is simple: pass the database name from web.config and the table name to the constructor. You can then use the SqlCacheDependency instance to set dependencies for local cache entries.

    This lets you cache just the navigation when caching entire pages is not worthwhile:
    public ActionResult GenreMenu()
    {
        var cacheKey = "Nav";
        var genres = this.HttpContext.Cache.Get(cacheKey);
        if (genres == null)
        {
            genres = storeDB.Genres.ToList();
            this.HttpContext.Cache.Insert(cacheKey, genres,
                new SqlCacheDependency("MusicStore", "Genres"));
        }
        return PartialView(genres);
    }

    SqlCacheDependency has another constructor that accepts a SqlCommand. This is a completely different change-tracking mechanism, built on query notifications from SQL Server Service Broker. I tried these notifications, but they do not work for all queries. Worse, if the query is “wrong”, no error is raised and the notification fires immediately after the dependency is created. Notifications are also very slow: by my measurements, they slow down writes to the tables by a factor of 8.
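    For completeness, here is a sketch of how the command-based constructor could be used (the query, cache key, and variable names are my own illustration, not from the original post; query notifications impose strict rules on the SQL: two-part table names, an explicit column list, no SELECT *):

```csharp
// Sketch: Service Broker query notifications via SqlCacheDependency(SqlCommand).
// SqlDependency.Start(connStr) must have been called once at application startup.
using (var conn = new SqlConnection(connStr))
using (var cmd = new SqlCommand("SELECT AlbumId, Title FROM dbo.Albums", conn))
{
    var dependency = new SqlCacheDependency(cmd);
    conn.Open();
    var titles = new List<string>();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            titles.Add(reader.GetString(1));
    }
    // The cache entry is evicted when SQL Server sends a notification.
    HttpRuntime.Cache.Insert("album-titles", titles, dependency);
}
```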

    On the other hand

    Database dependencies are not free. For them to work, triggers are created that fire on insert, update, and delete. These triggers record in a service table which tables were changed and when. A thread on the application side periodically reads this table and notifies the dependencies.

    If changes are infrequent, the trigger overhead is small. But if changes are frequent, cache efficiency drops. In the Mvc Music Store example, a change to any single album resets the cache for the entire catalog.

    What to do?

    If you stay within a single server, you can use the same approach that was used to cache the basket in the Mvc Music Store: store individual items or query results in the cache, and reset the cache on writes (more in an earlier post). With the right granularity of caching and invalidation, you can achieve high cache efficiency.

    But when scaling to multiple servers, this approach almost never works. For the basket it works only with client affinity, when the same client always hits the same server. Modern load balancers provide this, but for caching products, for example, client affinity no longer helps.

    A distributed cache will help us here.

    If you are already running more than one web server to handle requests, it is time to think about a distributed cache.
    One of the best options for today is Redis. It is available both on premises and in the Microsoft Azure cloud.

    To add Redis to an ASP.NET project, open the Package Manager Console and run a couple of commands (Redis-64 brings a local Redis server for development; StackExchange.Redis is the client library):
    Install-Package Redis-64
    Install-Package StackExchange.Redis

    Redis supports an excellent feature: so-called keyspace notifications. They let you find out that an item has changed, even if the change happened on another server. Note that keyspace notifications are disabled by default and must be enabled with the notify-keyspace-events setting.
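    A minimal way to turn them on, assuming a default local Redis (the KEA flag set enables all event classes on both the keyspace and keyevent channels):

```
# redis.conf (or redis-cli CONFIG SET notify-keyspace-events KEA at runtime)
notify-keyspace-events KEA
```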

    To integrate this feature into ASP.NET, I wrote a small class:
    class RedisCacheDependency : CacheDependency
    {
        public RedisCacheDependency(string key) : base()
        {
            // Fire the dependency when Redis reports any event for this key.
            Redis.Client.GetSubscriber().Subscribe(
                "__keyspace@0__:" + key,
                (channel, value) => this.NotifyDependencyChanged(this, EventArgs.Empty));
        }
    }

    This class implements a CacheDependency that is invalidated by Redis keyspace notifications.

    And now the client itself:
    public static class Redis
    {
        public static readonly ConnectionMultiplexer Client =
            ConnectionMultiplexer.Connect("localhost");

        public static CacheDependency CreateDependency(string key)
        {
            return new RedisCacheDependency(key);
        }

        public static T GetCached<T>(string key, Func<T> getter) where T : class
        {
            // 1. Try the local ASP.NET cache first.
            var localCache = HttpRuntime.Cache;
            var result = (T)localCache.Get(key);
            if (result != null) return result;

            // 2. Fall back to Redis, repopulating the local cache.
            var redisDb = Client.GetDatabase();
            var value = redisDb.StringGet(key);
            if (!value.IsNullOrEmpty)
            {
                result = Json.Decode<T>(value);
                localCache.Insert(key, result, CreateDependency(key));
                return result;
            }

            // 3. Compute the value and store it in both caches.
            result = getter();
            redisDb.StringSet(key, Json.Encode(result));
            localCache.Insert(key, result, CreateDependency(key));
            return result;
        }

        public static void DeleteKey(string key)
        {
            // Deleting the key also triggers the keyspace notification,
            // which flushes the local caches on all servers.
            var redisDb = Client.GetDatabase();
            redisDb.KeyDelete(key);
        }
    }
    The GetCached method also stores the result in the local ASP.NET cache. The local cache is very fast: a lookup takes nanoseconds, which is much faster than a remote request to Redis plus serialization and deserialization.

    Now I can bind the item in the Redis cache to the page cache:
    public ActionResult Browse(string genre)
    {
        var cacheKey = "catalog-" + genre;
        var genreModel = Redis.GetCached(cacheKey, () =>
            (from g in storeDB.Genres
             where g.Name == genre
             select new GenreBrowse
             {
                 Name = g.Name,
                 Albums = from a in g.Albums
                          select new AlbumSummary
                          {
                              Title = a.Title,
                              AlbumId = a.AlbumId,
                              AlbumArtUrl = a.AlbumArtUrl
                          }
             }).Single());
        this.Response.Cache.VaryByParams["genre"] = true;
        return View(genreModel);
    }

    The standard OutputCache attribute must be removed, otherwise the output cache will not react to your dependencies. If you wish, you can write your own ActionFilter for caching to avoid copy-pasting this code.

    To reset the cache, call Redis.DeleteKey in the methods that modify the data.
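    For example, an edit action could look roughly like this (the controller shape follows Mvc Music Store's StoreManagerController, but this exact method body is my sketch, not code from the original post):

```csharp
[HttpPost]
public ActionResult Edit(Album album)
{
    if (ModelState.IsValid)
    {
        storeDB.Entry(album).State = EntityState.Modified;
        storeDB.SaveChanges();

        // Invalidate the cached catalog page for this album's genre.
        // The keyspace notification then flushes every server's local cache.
        var genre = storeDB.Genres.Find(album.GenreId);
        Redis.DeleteKey("catalog-" + genre.Name);

        return RedirectToAction("Index");
    }
    return View(album);
}
```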

    A second load test produced 52 scenarios per second. This is less than without Redis, but performance will not drop noticeably as the number of records in the tables grows.

    What else can you do with Redis?

    In addition to manually placing data in the cache, you can use the NuGet packages Microsoft.Web.RedisSessionStateProvider and Microsoft.Web.RedisOutputCacheProvider to store session state and the output cache in Redis. Unfortunately, a custom OutputCacheProvider cannot use CacheDependency to flush the output cache.
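    Both providers are wired up in web.config; a minimal sketch for the session state provider, assuming a local Redis (the attribute values are illustrative):

```xml
<sessionState mode="Custom" customProvider="RedisSessionStateProvider">
  <providers>
    <add name="RedisSessionStateProvider"
         type="Microsoft.Web.Redis.RedisSessionStateProvider"
         host="localhost"
         port="6379"
         ssl="false" />
  </providers>
</sessionState>
```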


    ASP.NET offers many more caching options; in addition to the ones covered in this series of posts, there are also cache validation callbacks and dependencies on files and directories. But there are pitfalls I have not yet talked about. If you are interested in everything related to optimizing ASP.NET web applications, come to my seminar -

    All posts in the series

    Source code along with tests is available on GitHub -
