In this tutorial, you will learn a few SEO tips that you can implement within your Episerver-powered website to help you rank number one in the search engines 🥇🥇🥇. You can build the best website in the world; however, if it gets no traffic, why bother? One way to ensure a website gets visitors is to make it rank highly in search engines. As a developer, you might not be writing copy; however, you can aid in this quest by ensuring that your website has the correct meta-data and that a web spider can crawl the site easily. Designing your Episerver-powered CMS so that it empowers content editors to manage page meta-data correctly will improve your search engine results page (SERP) ranking and increase the number of visitors to your website. In today's guide, I'm going to cover some best-practice techniques you should be implementing in your Episerver website to ensure this.

Global Properties and an SEO Tab: One of the most powerful features of Episerver is the ability to completely customize it. Unlike some other CMS solutions, Episerver leaves it up to you to decide what page types to build and how to structure them. To allow content editors to add SEO details, you will likely need to create a base page type that contains the global meta-data properties you want to appear on every page in your website. To start thinking about SEO, let's look at the Alloy sample SEO tab:

episerver_seo_guide_1

On any Episerver project, you will want to implement something similar to the screenshot above. I recommend that you group any SEO meta-data properties you add to your base page type within a tab called SEO. After creating the base type, you will need to ensure all your content pages inherit from it. One idea I've been toying with is building a Dojo SERP preview, which would render a little preview of how the page looks in Google; however, I haven't had time to look into that yet. If you do, let me know!

META Properties: Content editors will need the ability to edit a page's META tag properties, so which ones do you need to add to the base type? In recent years, a few of the traditional HTML meta-tags have become less important than they used to be; however, it is still good practice to enable them on each page. The main META properties you should include support for are keywords, title (for the SERP), disable indexing (for robots), no follow (for robots) and a description (for the SERP). When adding these properties to your base type, make sure title, description and keywords are mandatory fields; otherwise, editors will forget to add them. The META HTML that your page renders should look roughly like this:
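A rough sketch of the rendered markup (the tag values are placeholders, and the robots tag would only be rendered when an editor has ticked the relevant options):

```html
<head>
    <title>Page title shown in the SERP</title>
    <meta name="description" content="Short summary shown in the SERP" />
    <meta name="keywords" content="episerver, seo, cms" />
    <!-- Rendered only when disable indexing / no follow is enabled by an editor -->
    <meta name="robots" content="noindex, nofollow" />
</head>
```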

The code to define these properties on a page type would look like this:
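As a sketch (the class and property names are my own; the attributes are standard Episerver data annotations, and `GroupName` places each property in the SEO tab):

```csharp
using System.ComponentModel.DataAnnotations;
using EPiServer.Core;
using EPiServer.DataAnnotations;

public abstract class BasePage : PageData
{
    [Required]
    [CultureSpecific]
    [Display(Name = "Meta title", GroupName = "SEO", Order = 100)]
    public virtual string MetaTitle { get; set; }

    [Required]
    [CultureSpecific]
    [Display(Name = "Meta description", GroupName = "SEO", Order = 200)]
    public virtual string MetaDescription { get; set; }

    [Required]
    [CultureSpecific]
    [Display(Name = "Meta keywords", GroupName = "SEO", Order = 300)]
    public virtual string MetaKeywords { get; set; }

    [Display(Name = "Disable indexing", GroupName = "SEO", Order = 400)]
    public virtual bool DisableIndexing { get; set; }

    [Display(Name = "No follow", GroupName = "SEO", Order = 500)]
    public virtual bool NoFollow { get; set; }
}
```

Marking the title, description and keywords properties as `[Required]` is what stops editors from publishing a page without them.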

I always find it useful to implement the SEO properties with an interface. Adding an interface to your base type can help with unit testing. It takes an extra few seconds of effort to implement, so why not? My interface would look like this:
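A minimal sketch (the interface and member names mirror the hypothetical properties above; your base page type would then implement it, and views or controllers can depend on the interface rather than a concrete page type):

```csharp
public interface ISeoProperties
{
    string MetaTitle { get; set; }
    string MetaDescription { get; set; }
    string MetaKeywords { get; set; }
    bool DisableIndexing { get; set; }
    bool NoFollow { get; set; }
}
```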

Robots.txt: The purpose of the robots.txt file is to tell search engine web crawlers how to process your website for indexing. Robots.txt is a file that needs to exist within your project, and it needs to live in the root folder. On a plain .NET website, maintaining the contents of the robots.txt file is usually a developer task; not so in an Episerver site! When we work with CMS platforms, the aim is to make everything editable, including robots.txt 🔥🔥🔥
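For reference, a typical robots.txt for an Episerver site might look something like this (the paths and domain are examples only, not a recommendation for your project):

```
User-agent: *
Disallow: /episerver/
Sitemap: https://www.example.com/sitemap.xml
```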

To save developers from having to maintain the robots.txt file, you can install a plug-in within your site and give the power to content editors. David Knipe, back in the day, wrote a handy little admin plug-in that allows content editors to update their robots.txt within the Episerver admin. This plug-in has been upgraded ever since and, as of writing, it still works with the latest version of Episerver. The plug-in is called 'POSSIBLE.RobotsTxtHandler' and you can download the source code from here. You can also install the package via Episerver's NuGet feed:
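Assuming the package id matches the name above, installing from the Package Manager Console would look like this:

```
Install-Package POSSIBLE.RobotsTxtHandler
```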

episerver_seo_guide_3

After installing the Robots.txt plug-in, if you load up your Episerver website and visit the admin section, you will see a new entry in the left-hand navigation menu:

episerver_seo_guide_2

From this screen, you can now manage the robots.txt file 💥💥💥

XML Sitemap: You should also implement an XML sitemap on your website to help web crawlers index your site easily. Luckily, with Episerver there are a few third-party plug-ins you can use off the shelf that will automatically create this file and keep it up to date. Over the years, I have tended to use the sitemap plug-in built by Geta, available here. There are two steps to using it. First, you will need to configure what content you want to be included in your sitemap. Second, you need to enable a scheduled task that runs on a regular basis to ensure the sitemap is kept up to date.

episerver_seo_guide_4

To configure the plug-in, load the Episerver admin section. From the admin UI, you will see a new section called 'Search engine sitemap settings'.

episerver_seo_guide_7

From the settings page, you will need to create a new sitemap. Configuring the plug-in is fairly straightforward and all the instructions are on the page. I usually create a sitemap that indexes everything, so I tend to leave all the options blank. After saving the settings, you will need to enable the scheduled task:

episerver_seo_guide_6

On the admin page, in the Scheduled Jobs section, click on the 'Generate search engine sitemaps' job. Click the 'Start Manually' button and your sitemap will be generated:

episerver_seo_guide_8

After triggering the scheduled task, when you type www.website.com/sitemap.xml into a browser, you should see a sitemap.xml file 💥
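The generated file follows the standard sitemap protocol, so it should look roughly like this (URLs and values here are illustrative, not the plug-in's exact output):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```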

Redirects and Rewrites

Ensuring Single URLs: When launching a new site you need to think about URLs. A search engine will penalise your site if it thinks you have multiple URLs pointing to the same page. It is very easy to accidentally create multiple URLs. How? Assuming your website's domain is a.com, a search engine will categorize both www.a.com and a.com as duplicates. One of the most overlooked SEO mistakes a lot of companies make is forgetting to decide on a www or non-www URL strategy. Google will see this as duplicate content and can penalize your search rankings. On any new project, you will need to create a redirect to ensure that only one version of each URL can be accessed. I've written a more in-depth article, Setting Up Episerver To Always Use WWW Links, to teach people how to sort this mess out.
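One common way to enforce this (assuming the IIS URL Rewrite module is installed, and using the hypothetical a.com domain from above) is a canonical-host rule in web.config:

```xml
<!-- web.config: redirect the non-www host to www with a permanent (301) redirect -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Redirect to www" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^a\.com$" />
        </conditions>
        <action type="Redirect" url="https://www.a.com/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```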

301 Redirects: When you launch a new site with new content, your URL structure will likely change. This means you will need to map the old URL structure to the new one; otherwise, Google will simply remove your old pages from its index and you will need to build your SEO up again. There are several ways to deal with 404 errors and 301 redirects. My recommended approach for developers is via the web.config and Microsoft's IIS plug-in called URL Rewrite; however, it is also a good idea to allow content editors to manage redirects as well.
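For the web.config route, URL Rewrite supports rewrite maps, which keep a list of old-to-new mappings in one place. A sketch (the URLs are placeholders):

```xml
<!-- web.config: 301-redirect old URLs to their new locations via a rewrite map -->
<rewrite>
  <rewriteMaps>
    <rewriteMap name="Redirects">
      <add key="/old-page" value="/new-page" />
      <add key="/old-section/old-article" value="/new-section/new-article" />
    </rewriteMap>
  </rewriteMaps>
  <rules>
    <rule name="Redirect map" stopProcessing="true">
      <match url=".*" />
      <conditions>
        <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)" />
      </conditions>
      <action type="Redirect" url="{C:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```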

Like robots.txt and the sitemap, we can use a free third-party plug-in, the BVN redirect module, to allow content editors to manage redirects in the CMS. Installing the BVN handler has become a lot easier over the last few years, as it's been bundled up in a NuGet package:
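Assuming the package id matches the handler's name, installing from the Package Manager Console would look like this:

```
Install-Package BVN.404Handler
```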

episerver_seo_guide_10

The most useful feature this plug-in provides is 301 redirect management. Creating a 301 redirect in the CMS using this plug-in is done via a dashboard gadget.

episerver_seo_guide_9

Creating a redirect is simple: in the 'Custom Redirects' tab, add the old URL and the URL you want the 301 to point to, click 'Add' and off you go. From previous project experience, I would say it's good practice to audit this list once every six months. If the number of redirects gets too large, it can affect site performance a little. In most cases, after six months you can delete any old redirects, as the search engines will have updated their indexes.

404 Pages: If someone mistypes a URL, you should make sure they do not see an error page. 404 pages can also be managed with the BVN plug-in. I wrote a comprehensive guide to installing and using the handler a few years ago in, Installing BVN.404Handler for an MVC Project. As time has passed and the plug-in has improved, some of that information is a little outdated. BVN now provides an MVC attribute out of the box, so the third-party NuGet package is now obsolete!


The aim of this guide/checklist is to help teams enable good SEO practices on their Episerver projects. I know it looks like a lot of work; however, after you've done it a few times it's pretty simple and takes less than 30 minutes to implement in total. If leveraged correctly, these tips can help boost the number of visitors to your digital home. Happy Coding 🤘