How To Reduce The Number Of SQL Calls Within Your Umbraco Pages

Recently, I upgraded a website to Umbraco.  Before launching it, I used MiniProfiler to check that the site's pages performed well.  If you haven't come across MiniProfiler yet, I suggest you read How To Debug Your Umbraco 7 Website.  In today's tutorial, I'm going to cover some of the lessons I learned with the Umbraco API and how I fixed the performance issues I found.

Umbraco Performance Issues

Use The Right API

When I started off, I looked at MiniProfiler and saw that my header alone was making 60 duplicate SQL calls.  The first thing you should do when working with Umbraco is make sure you're using the right API for the job.  In Umbraco you can access data in two different ways: with IContent and with IPublishedContent.

IContent allows read/write access to your Umbraco data.  You may also need to use IContent if you need to access your content from outside of an Umbraco context, e.g. from an API.

// IContent goes through the ContentService and queries the database
var contentService = ApplicationContext.Current.Services.ContentService;
IContent content = contentService.GetById(123);

IPublishedContent is the much faster way to access your Umbraco data in read-only mode.  It works against the published content cache rather than the database.

var umbracoHelper = new UmbracoHelper(UmbracoContext.Current);
IPublishedContent content = umbracoHelper.TypedContent(123);

Umbraco database calls are slow, so when you're displaying data on your website it is recommended to work with IPublishedContent rather than hitting the database on each request.
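
To make the difference concrete, here is a rough sketch comparing how you would read the same property with each API.  The node id (1234) and the "pageTitle" property alias are just placeholders for your own content.

// Sketch only: the node id (1234) and the "pageTitle" alias are placeholders.

// IContent: every call goes through the ContentService and hits the database.
var contentService = ApplicationContext.Current.Services.ContentService;
IContent dbContent = contentService.GetById(1234);
var titleFromDb = dbContent.GetValue<string>("pageTitle");

// IPublishedContent: read the same data from the published content cache instead.
var umbracoHelper = new UmbracoHelper(UmbracoContext.Current);
IPublishedContent cachedContent = umbracoHelper.TypedContent(1234);
var titleFromCache = cachedContent.GetPropertyValue<string>("pageTitle");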

GetDescendants() Vs GetChildren()

If you use IPublishedContent you will get rid of a number of the issues I've listed above; however, let's assume for this tutorial that you have to use IContent.  In my header menu, I had a second issue of redundant SQL queries caused by the API calls used to get the menu items.  The website used a mega-menu as the site's primary menu.  Mega-menus have top-level items and then a number of sub-items.  My first approach was to use GetChildren() to loop through the top-level items and then, if any of those items was set to display a mega-menu underneath it, call GetChildren() again to get the extra data.

As you can see from the MiniProfiler results, this was causing a lot of extra SQL calls.  The more SQL calls your webpage has to make, the slower it will be.  Instead of making a separate SQL call for each level of the menu, it is a lot more efficient in this instance to use GetDescendants() and get all of the items in one SQL hit.

As you can see from the comparison, I saved over 50 SQL queries per request just by using the API in a slightly different way.  Instead of calling GetChildren() numerous times and repeating a lot of the same SQL, I could call GetDescendants() once and then filter the data in code, as shown in the sketch below.
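
Here is a rough sketch of the before and after.  It assumes you're stuck with IContent and the ContentService as discussed above; homeId and the "showMegaMenu" property alias are placeholders for your own structure.

// Sketch only: homeId and the "showMegaMenu" alias are placeholders.

// Before: one SQL call for the top-level items, then another call
// for every item that needs a mega-menu under it.
var topLevelItems = contentService.GetChildren(homeId);
foreach (var item in topLevelItems)
{
    if (item.GetValue<bool>("showMegaMenu"))
    {
        var subItems = contentService.GetChildren(item.Id); // extra SQL hit each time
    }
}

// After: one SQL call for the whole branch, then build the menu in memory.
var allMenuItems = contentService.GetDescendants(homeId).ToList();
var topLevel = allMenuItems.Where(x => x.ParentId == homeId);
// Sub-items for any top-level item can be found with another Where() on ParentId,
// without ever going back to the database.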

Caching

Now that I'd refactored my code to remove over 51 queries, I was still stuck with the 9 calls on first page load.  To get around this I simply put all the data into the cache for an hour.  This means that once an hour a page has to do a few more SQL calls than needed, but 99% of the traffic will get really quick page loads.  The code to read objects back from the cache is really simple:

        // Requires: using System.Collections; using System.Runtime.Caching;
        private static readonly ObjectCache _cache = MemoryCache.Default;

        public IEnumerable GetPostsFromCache()
        {
            if (!_cache.Contains("posts"))
                return null;

            return _cache.Get("posts") as IEnumerable;
        }

The code to add items into the cache would look like this:

        public bool StoreContent(IEnumerable contentItems)
        {
            var cacheKey = "posts";

            // Cache the content items themselves (not the key) for one hour
            return _cache.Add(cacheKey, contentItems, new CacheItemPolicy { AbsoluteExpiration = DateTime.Now.AddHours(1) });
        }

Takeaway

Using these three techniques, I got the 100 SQL queries on first page load down to about 20, simply by using the API in a more efficient manner.  As I hope you can see, using the wrong API, or the wrong method, can add a lot of lag to your page load time.  When it comes to performance, everyone, no matter how experienced, needs to bust out a tool like MiniProfiler to check they have done things as optimally as they should have.  When you try to make your pages faster, there are usually one or two pain points.  In my example it was the header that was making the page load slowly.  Simply by using the right API, and calling it in the most efficient manner, I reduced 60 unneeded SQL calls down to 5.

Five calls still seemed quite wasteful, so I put my menu into the cache as well, so on any other page load those duplicate SQL calls were removed.  When it comes to performance there are loads of different ways to make your pages quicker.  The aim of this tutorial was to guide you through the process of fine-tuning your pages.  When it comes to Umbraco, make sure you are using IPublishedContent!


