Sydney Episerver Meetup

We have 2 cool sessions planned for this meetup:

“Episerver Insight Preview”  (Marcus Babajews/Damien Dias)

•  Episerver Insight Sneak Preview from a Tech perspective

“Tag your content using Azure Machine Learning” (Aria Zanganeh)

•  Using cognitive services to automatically tag Episerver Content

There will be pizza and drinks during the evening.

If you want to join the party, please click on

We hope you can make it!

What is CDN and how it fits to Episerver solution (Part 2) – Setup cloudflare

Welcome to the second post of the CDN series. In this series, I’m describing how a CDN works, showing with an example how it can integrate with Episerver, and finally tackling some caching challenges that CDN and Episerver have together. In this post, we are going to integrate a sample CDN with one of our existing websites. One of the CDNs Episerver uses in their managed service, DXC, is Cloudflare. I am not affiliated with this company; I chose it because it may be used by some DXC clients. Cloudflare is a good, stable service, with a very simple dashboard and fast, helpful support. So let’s dig into it!

  1. I created a sample Episerver site and hosted it in
  2. Register on Cloudflare – it is free (select the free personal plan; no credit card required)
  3. Now you need to provide your domain name(s) – my domain name in this case would be – it may take some time to scan the domain

  4. Now you can see the list of DNS records that the domain scan fetched. In my case none of the items in the list was relevant, so I removed all of them and added one record for test5:

    Cloudflare proposed that I change my name servers to Cloudflare’s, which I haven’t done. What I did instead was simply remove all items and add my “test5” record with its associated IP address!

  5. Our original image URL was “” so now, with our latest Cloudflare configuration, we can access “test5” using “”! So without changing name servers, we can test our application. Remember, in a real-world scenario, if you need to leverage safe DNS and geo-location DNS, you need to use the CDN’s domain name service!
  6. The next step is to add a record to the IIS website bindings:

  7. And, obviously, add the new URL to Episerver:

  8. Navigate to the Cloudflare URL and you should see the CDN version of your site!


  1. How do I know the site is using the cache?
    1. In Chrome, navigate to the site with developer tools open -> Network
    2. Wait until the site is fully loaded
    3. Find one of your images that comes from Episerver:

    4. You can see “CF-Cache-Status: HIT” in the HTTP headers. That means the image is coming from the CDN (more information)
  2. What happens if I delete my asset file from Episerver?
    1. If you are using the CDN URL, the CDN will still serve the image. That means even if you delete the image from your web server, the CDN keeps the file until the TTL expires (more information)
  3. How do I know whether HTML is cached in the CDN or not?
    1. The HTTP headers are where to find out:

    2. As you can see, “CF-Cache-Status: HIT” is missing
    3. If you want to cache HTML as well, you can refer here.
  4. If I’m going to use a CDN, what if my image or video (or any asset, even the site’s HTML) is live and I need to update it?
    1. This is our next post!
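The header check used in questions 1 and 3 can be wrapped in a small helper. This is an illustrative sketch, not code from the post; the only assumption is Cloudflare’s documented “CF-Cache-Status” response header:

```csharp
using System;
using System.Collections.Generic;

public static class CdnCacheCheck
{
    // Returns true when the response headers indicate the asset was
    // served from Cloudflare's edge cache ("CF-Cache-Status: HIT").
    public static bool IsServedFromCdnCache(IDictionary<string, string> responseHeaders)
    {
        return responseHeaders.TryGetValue("CF-Cache-Status", out var status)
               && string.Equals(status, "HIT", StringComparison.OrdinalIgnoreCase);
    }
}
```

The same check works for HTML responses: when the header is absent (as in question 3), the helper reports a non-cached response.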

In our next post, we will look into how to INVALIDATE the CDN cache from Episerver. We are going to build a NuGet package specifically for the Cloudflare CDN that invalidates cached content when content is updated in Episerver. This improves site performance by caching content as much as possible while still letting web editors ALWAYS publish the latest content. Finally, we plan to integrate visitor groups with Cloudflare Page Rules! This will be an amazing journey; the next post will come soon 🙂

What is CDN and how it fits to Episerver solution (Part 1)

In this series of posts, I’m going to describe what a CDN is and how it works with Episerver, then introduce a challenge involving the CDN cache and Episerver and how we can tackle it. In this first post we will dig into the CDN concept and understand what problem it tries to solve. CDN stands for “Content Delivery Network” (or “Content Distribution Network”): a network that delivers content to end users. The source of the content is the web server, and the CDN delivers its output to the user. The challenges a CDN tries to solve are:

  1. Take the load of site assets (e.g. images, videos, JS, CSS, …) off the web server and cache them. Usually, on the first call, the CDN servers fetch the assets from the web server and cache them. From then on the cached version is used, and the CDN takes the load off the web server
  2. A CDN, as the name says, has a network of MANY servers in different regions, called “edge servers”. These servers improve content delivery by serving each end user from a server in their own region. This helps the website owner avoid paying for MANY web servers in different regions; they pay much less for a cheap CDN than for expensive compute time (or virtual machines)
  3. Depending on the CDN you get different features, but they usually come with security features: the CDN will handle “DDoS attacks“ and provide PCI compliance, a web application firewall, and SSL.
  4. Most CDNs support content optimization. For example, they can minify JS and CSS on the fly, or optimize images based on the client device (e.g. on a small mobile screen the image quality is not that important, but the size of the image can consume all of your mobile bandwidth).
  5. They can give you good analytics about traffic and how to optimize content delivery to give the best experience to end users.
  6. Compared to the compute power we pay for on a web server, a CDN is quite cheap.
  7. Most CDN providers now support SPDY and HTTP/2, which means you can deliver content HEAPS faster to your end users.
  8. Some CDNs support video streaming, so if you serve many videos or even live video, you can rely on the CDN to provide high bandwidth in different regions and to tackle the complications of video STREAMING on different end-user devices.

The list above covers common CDN features, but CDNs are not limited to it. There are MANY more features from different providers that are not listed, but the list above covers my main concerns about a CDN, and any extra feature can help improve the user experience on your site.
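The first point above — fetch from the origin once, then serve from cache until the TTL expires — can be sketched as a tiny simulation. This is illustrative only; a real edge cache is far more involved:

```csharp
using System;
using System.Collections.Generic;

// A toy edge cache: the first request for a URL goes to the origin,
// later requests are served from cache until the TTL expires.
public class EdgeCache
{
    private readonly Func<string, string> fetchFromOrigin;
    private readonly TimeSpan ttl;
    private readonly Dictionary<string, (string Body, DateTime CachedAt)> cache = new();

    public int OriginHits { get; private set; }

    public EdgeCache(Func<string, string> fetchFromOrigin, TimeSpan ttl)
    {
        this.fetchFromOrigin = fetchFromOrigin;
        this.ttl = ttl;
    }

    public string Get(string url, DateTime now)
    {
        if (cache.TryGetValue(url, out var entry) && now - entry.CachedAt < ttl)
            return entry.Body;                      // cache HIT: origin untouched

        OriginHits++;                               // cache MISS: go to origin
        var body = fetchFromOrigin(url);
        cache[url] = (body, now);
        return body;
    }
}
```

Notice that only the first request (and the first request after the TTL expires) reaches the origin; everything in between is served from the edge, which is exactly the load reduction point 1 describes.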

In the next part we are going to use Cloudflare and an Episerver site to show you how it works.

Episerver is named a leader among WCM vendors – What that means!

Recently Forrester announced that Adobe, Acquia, and Episerver are the new leaders in WCM (Web Content Management). So what does that mean, and how did Episerver make it?

  1. Support cloud
  2. Modularized components
  3. API Support
  4. Micro-service based
  5. All-in-one platform for CMS and eCommerce

Episerver does not have a strong API and is not built on a micro-service architecture, but it is working heavily on cloud support. I think Episerver is doing well, but it needs to work more on the points below to be able to maintain its current position:

  1. Try to make the application more modular
  2. Move toward supporting a micro-service architecture
  3. Accelerate the work on analytics and personalization
  4. Introduce the Episerver platform more to the public (e.g. graduate developers, key people in enterprises, …)
  5. Keep pushing the cloud concept

I’m really happy to be part of Episerver, and I hope to help the community improve this awesome product.

How to activate Episerver performance counter and use it

One of the good tools for performance monitoring is “Performance Counters”. They work well out of the box with Episerver CMS. Setting them up is quite simple: you need to change web.config and add “enablePerformanceCounters” to “episerver -> applicationSettings”:

<applicationSettings enablePerformanceCounters="True" />

Remember to remove this in production. In the Episerver.dll I’m using there is a bug where the proper category is not created, so if you want you can create the category manually:

            // One-off: creates the counter category that this Episerver.dll
            // version fails to create. Note the collection must actually
            // contain the counters before it is passed to Create().
            var counterData = new CounterCreationDataCollection(new[]
            {
                new CounterCreationData("Data Factory Reads/Sec", "The number of page objects delivered from the Data Factory API per second", PerformanceCounterType.RateOfCountsPerSecond32),
                new CounterCreationData("Data Factory Listings/Sec", "The number of EPiServer page listings delivered from the Data Factory API per second", PerformanceCounterType.RateOfCountsPerSecond32),
                new CounterCreationData("Data Factory Cache Hit Ratio", "The cache hit ratio for Data Factory API reads and listings. 100% is normal.", PerformanceCounterType.RawFraction),
                new CounterCreationData("Data Factory Cache Hit Ratio Total", "Internal counter that only is the base counter.", PerformanceCounterType.RawBase),
                new CounterCreationData("Authentication Requests/Sec", "The number of user authentications per second.", PerformanceCounterType.RateOfCountsPerSecond32),
                new CounterCreationData("Database Connections/Sec", "The number of database connections that EPiServer opens in the Data Access Layer.", PerformanceCounterType.RateOfCountsPerSecond32)
            });

            System.Diagnostics.PerformanceCounterCategory.Create("EPiServer CMS 7", "EPiServer Performance Counters", PerformanceCounterCategoryType.MultiInstance, counterData);

I put this in “StartPageController”; it is a one-off and you need to remove it afterwards. After the code has run, open “Performance Monitor”:



Then select “Performance Monitor” from the left-hand nav and click the green plus:


Select “EPiServer CMS 7” from the top list, select “_Total” from the bottom list, click “Add”, and press the “OK” button. Now you can see the performance counters:



Rebuild Episerver DB to use MongoDB!!! What do you think?

I’ve recently been doing some small work with MongoDB and realized that a document-based NoSQL DB is a really good fit for a CMS! So let’s talk about it.

Pros:

  1. Entities are stored as BSON (the brother of JSON!). That means your content properties will be persisted in their current structure in MongoDB.
  2. MongoDB can scale out: if your site has heaps of content and heaps of traffic, instead of the complexity of scaling SQL Server you can easily scale the application horizontally.
  3. It is FREE!
  4. A CMS nowadays is tightly coupled to web analytics, and MongoDB is designed to store and process this kind of data! That said, analytics alone is not a reason to move the whole DB to MongoDB, but if we use SQL Server and also need MongoDB for analytics, it adds complexity to the project, and getting a new developer up to speed on the project becomes really hard!
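To illustrate the first point, a page and its properties could be persisted as a single document. This is a hypothetical shape, not how Episerver actually stores content; the field names are made up for illustration:

```json
{
  "_id": "a-content-guid",
  "contentType": "ArticlePage",
  "name": "Hello MongoDB",
  "language": "en",
  "properties": {
    "MainBody": "<p>Document databases map naturally to CMS content.</p>",
    "Tags": ["cms", "nosql"]
  }
}
```

The nested "properties" object keeps the content’s structure intact, with no object-relational mapping layer in between.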

Cons:

  1. Currently Episerver is built on top of SQL Server, and I think changing that may take some work!
  2. Many vendors have built plugins, and they would all need to learn MongoDB and change their packages!
  3. All developers need to learn MongoDB! (To me that’s a pro, but for the business it’s not!)
  4. Technically, in my experience, SQL Server is not a bottleneck for Episerver (thanks to the awesome cache)! But in my opinion the next generation of CMS is more dynamic: based on the user’s taste it needs to transform the content presentation to what the user is looking for. So very soon it could become a problem!


I’d really like to hear your ideas! Let me know what you think!

RobotsTxtHandler project – How it works (part 3)


  1. Create a new “Class Library” project.
  2. Add these NuGet packages to your project (note that you need the feed URL of the Episerver NuGet server -> Link): EPiServer.CMS.UI.Core
  3. Entry point: we need to define a normal MVC controller and decorate it with the “GuiPlugIn” attribute:
        [GuiPlugIn(
            Area = EPiServer.PlugIn.PlugInArea.AdminMenu,
            DisplayName = "Robots.txt Handler",
            Description = "Robots.txt Handler",
            Url = Const.ModuleUrlBase + Const.Separator + Const.ModuleAdminController + Const.Separator + Const.ModuleAdminIndexAction,
            RequiredAccess = AccessLevel.Administer)]
        public class AdminController : Controller

    As you can see, within the attribute properties you can define the name, the access level, and the base URL the module uses to route the URL correctly to the controller. Another important bit is the “Area” property! This tells Episerver WHERE to show the plugin’s link. You can see the full list and descriptions here

  4. When a user clicks the plugin link in “AdminMenu”, we want to show the user a list of the available “robots.txt” handlers for each site. To achieve this, let’s add “Index”:
           public ActionResult Index()
           {
               var robotTxts = robotsTxtRepository.All().ToList();
               var model = new AdminViewModel
               {
                   Sites = GetSitesList(robotTxts.Select(a => a.SiteId)),
                   AvailableRobotTxts = robotTxts.Select(a => new AvailableRobotTxt { Id = a.Id.ExternalId, Name = siteList.Single(s => s.Id.ToString() == a.SiteId).Name })
               };
               return PluginPartialView(Const.ModuleAdminIndexAction, model);
           }

    And, more than obviously, we need to define the ViewModel and repository classes. You can see them on GitHub

  5. We need to add “Index.cshtml” to the project. Let’s create the folder “\Views\Admin” and create “Index.cshtml” under “Admin”:
    @model AdminViewModel
    @{  Layout = "Layout.cshtml"; }
    @if (Model.AvailableRobotTxts != null && Model.AvailableRobotTxts.Any())
    {
        foreach (var robotTxt in Model.AvailableRobotTxts)
        {
            <span><a href="@Url.Content(string.Format("{0}/{1}/{2}/{3}", Const.ModuleUrlBase, Const.ModuleAdminController, Const.ModuleAdminEditAction, robotTxt.Id))">Edit</a></span>
            <span><a href="@Url.Content(string.Format("{0}/{1}/{2}/{3}", Const.ModuleUrlBase, Const.ModuleAdminController, Const.ModuleAdminDeleteAction, robotTxt.Id))">Delete</a></span>
        }
    }
    @if (Model.Sites != null && Model.Sites.Any())
    {
        using (Html.BeginForm())
        {
            @Html.DropDownList("SelectedSite", Model.Sites)
            @Html.TextAreaFor(a => a.RobotText, new { @rows = 20 })
            <input type="submit" value="Submit" />
        }
    }


  6. Create “module.config”. This file instructs Episerver how the route segments should work and registers the plugin DLL:
    <?xml version="1.0" encoding="utf-8"?>
    <module loadFromBin="false" productName="Zanganeh RobotsTxtHandler">
        <assemblies>
            <add assembly="Zanganeh.RobotsTxtHandler" />
        </assemblies>
        <route url="{moduleArea}/{controller}/{action}/{id}">
            <defaults>
                <add key="moduleArea" value="Zanganeh.RobotsTxtHandler" />
                <add key="controller" value="" />
                <add key="action" value="" />
                <add key="id" value="" />
            </defaults>
        </route>
    </module>
  7. Create a web.config under the “Views” folder, so the Razor templates work fine:
    <?xml version="1.0"?>
    <configuration>
      <configSections>
        <sectionGroup name="system.web.webPages.razor" type="System.Web.WebPages.Razor.Configuration.RazorWebSectionGroup, System.Web.WebPages.Razor, Version=, Culture=neutral, PublicKeyToken=31BF3856AD364E35">
          <section name="host" type="System.Web.WebPages.Razor.Configuration.HostSection, System.Web.WebPages.Razor, Version=, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" />
          <section name="pages" type="System.Web.WebPages.Razor.Configuration.RazorPagesSection, System.Web.WebPages.Razor, Version=, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" />
        </sectionGroup>
      </configSections>
      <system.web.webPages.razor>
        <host factoryType="System.Web.Mvc.MvcWebRazorHostFactory, System.Web.Mvc, Version=, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
        <pages pageBaseType="System.Web.Mvc.WebViewPage">
          <namespaces>
            <add namespace="System.Web.Mvc" />
            <add namespace="System.Web.Mvc.Ajax" />
            <add namespace="System.Web.Mvc.Html" />
            <add namespace="System.Web.Routing" />
            <add namespace="EPiServer.Framework.Web.Mvc.Html" />
            <add namespace="Zanganeh.RobotsTxtHandler" />
            <add namespace="Zanganeh.RobotsTxtHandler.ViewModel" />
          </namespaces>
        </pages>
      </system.web.webPages.razor>
      <system.web>
        <compilation>
          <assemblies>
            <add assembly="Zanganeh.RobotsTxtHandler" />
          </assemblies>
        </compilation>
      </system.web>
    </configuration>
  8. Create a new “nuspec” file to instruct the NuGet package manager to copy the “Views” files into the proper area:
    <?xml version="1.0" encoding="utf-8"?>
    <package xmlns="">
        <metadata xmlns="">
            <tags>Robots EPiServer</tags>
        </metadata>
        <files>
            <file src="module.config" target="content\modules\Zanganeh.RobotsTxtHandler\module.config" />
            <file src="Views\web.config" target="content\modules\Zanganeh.RobotsTxtHandler\Views\web.config" />
            <file src="Views\Admin\Index.cshtml" target="content\modules\Zanganeh.RobotsTxtHandler\Views\Admin\Index.cshtml" />
        </files>
    </package>


  9. I’m using “MyGet” (you can take a look using the link) to build my project, generate the NuGet package, and host it.

You can fork the GitHub repo and push it to your own MyGet to see how it works. Valdis Iljuconoks raised a good point about why not extend Link. I started working on this project for learning purposes; one of our clients was asking for this feature, and because it is simple I picked it up to try out Episerver plugins.

Azure Cognitive Text Analytics key phraser used to Tag an Episerver content

I wanted to play with “Cognitive Services” and thought Episerver would be the best playground for it. I wanted the following functionality:

  1. Mark the “Content” properties I want as “Taggable” content.
  2. Use the Azure Cognitive Text Analytics key phrase API to detect tags for content
  3. Save the key phrases into the “Tag” property of the content

Quite simple. To achieve this we need to:

  1. Sign up for the “Free” Azure Cognitive tier as below (taken from here):
    1. Navigate to Cognitive Services in the Azure Portal and ensure Text Analytics is selected as the ‘API type’.
    2. Select a plan. You may select the free tier for 5,000 transactions/month. As it is a free plan, you will not be charged for using the service. You will need to log in to your Azure subscription.
    3. Complete the other fields and create your account.
    4. After you sign up for Text Analytics, find your API key. Copy the primary key, as you will need it when using the API services.
  2. Write code that gathers all content on publishing, as below
    string apiUrl = "";
    string apiSubscriptionID = "value from the Azure Cognitive API";
    string language = "en";
    IEnumerable<string> contents = new string[] { "first content to detect", "second content to detect" };
    using (var webClient = new WebClient())
    {
        webClient.Headers.Add("Ocp-Apim-Subscription-Key", apiSubscriptionID);
        webClient.Headers.Add(HttpRequestHeader.ContentType, "application/json");
        webClient.Headers.Add(HttpRequestHeader.Accept, "application/json");
        var document = new Request { Documents = contents.Select((item, index) => new Content { Id = index, Language = language, Text = item }).ToArray() };
        var requestContent = JsonConvert.SerializeObject(document);
        var result = webClient.UploadString(apiUrl, requestContent);
        return JsonConvert.DeserializeObject<Response>(result).Documents.SelectMany(a => a.keyPhrases);
    }

    As you can see, you need the “API Subscription ID”, which you can get from the Azure Portal, the same place as the API URL.

    As you can see, the concept is quite simple: you pass your “Content(s)” and it returns the key phrases.

  3. Now what you need to do is change your page, block, … (content) to inherit “ITageablePage” and apply the “[TextAnalysisRequire]” attribute to any content type property you want to include in the text analysis. A sample is below:
        [ContentType(
            GUID = "17583DCD-3C11-49DD-A66D-0DEF0DD601FC",
            GroupName = Global.GroupNames.Products)]
        public class ProductPage : PageData, ITageablePage
        {
            [TextAnalysisRequire]
            [Display(
                GroupName = SystemTabNames.Content,
                Order = 310)]
            public virtual XhtmlString MainBody { get; set; }

            public virtual string Tags { get; set; }
        }

You are now all done! When you publish a “ProductPage” instance, it will automatically read the “MainBody” property, strip out the HTML, pass the text to the Azure Text Analytics service, ask for “Key Phrases”, and store them in the “Tags” property! Cool and easy, yeah!
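For reference, the Request/Response DTOs used in the WebClient snippet earlier might look like the minimal sketch below. These are assumptions based on the Text Analytics JSON payload, not the exact classes from the project (see the linked code for those):

```csharp
using System;

// Hypothetical DTOs mirroring the Text Analytics key-phrase payload.
public class Content
{
    public int Id { get; set; }
    public string Language { get; set; }
    public string Text { get; set; }
}

public class Request
{
    // One entry per piece of content to analyze.
    public Content[] Documents { get; set; }
}

public class AnalyzedDocument
{
    public int Id { get; set; }
    // Lower-case name kept to match the snippet's a.keyPhrases access.
    public string[] keyPhrases { get; set; }
}

public class Response
{
    public AnalyzedDocument[] Documents { get; set; }
}
```

With these in place, the SerializeObject/DeserializeObject calls in the snippet round-trip cleanly between the request body and the key phrases.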

This can become a game changer for the concept of a CMS: you can now leverage “Personalization” and “Tagged Content” to present to your customers exactly what they are looking for. I’m going to write a roadmap for this and make it work with the new personalization concept. I will publish this as a NuGet package!

You can find the code here

Please bear in mind the API is still a “Preview” version!

RobotsTxtHandler project – How it works (part 2)

You need an Episerver CMS project (CMS >= 10.0.2). Use the Episerver NuGet feed (check here if you want) and search for Zanganeh.RobotsTxtHandler, or run the command below directly in the “Package Manager Console”:

PM> Install-Package Zanganeh.RobotsTxtHandler

Then rebuild and run the project. Then go to {siteurl}/EPiServer/CMS/Admin/Default.aspx and in the left-hand menu click “Tools -> Robots.txt Handler”:


You can select the current site from the dropdown and enter the text you want to serve as the “robots.txt” file for that site.

When you save, you can see the list:



And if you go to http://{siteurl}/robots.txt you can see what you entered in the textbox:



You can delete or edit the robots.txt value using the same list:


The idea behind the scenes is quite simple. For each site we store the robots.txt data in the Dynamic Data Store, and an HTTP handler extracts the robots.txt associated with the current site and serves its content. In the next post I will describe the Episerver add-on GUI.
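The per-site lookup at the heart of that handler can be sketched like this. It is a simplified stand-in, not the package’s actual code: a dictionary plays the role of the Dynamic Data Store, and the host name identifies the site:

```csharp
using System;
using System.Collections.Generic;

public static class RobotsTxtStore
{
    // Fallback served when a site has no robots.txt stored (allow everything).
    private const string Default = "User-agent: *\nDisallow:";

    // Picks the robots.txt content for the requested host, falling back
    // to the default when nothing is stored for that site.
    public static string ForHost(IDictionary<string, string> store, string host)
    {
        return store.TryGetValue(host, out var content) ? content : Default;
    }
}
```

In the real add-on the dictionary is replaced by Dynamic Data Store queries keyed on the current SiteDefinition, but the shape of the lookup is the same.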

IContentRepository.GetItems – how it works under the hood!

I’ve been told IContentRepository.GetItems is heaps better, and when I asked people why, they told me it does a bulk DB load. I just want to clarify this. You pass a list of ContentReference (IEnumerable&lt;ContentReference&gt;) and it loads the IContent for each item. This looks like a bulk DB load, but when you take a look at the code, the bulk load comes from the MEMORY CACHE; otherwise, each item is loaded individually:

class EPiServer.Core.ContentProvider:

    protected virtual IEnumerable<IContent> LoadContents(IList<ContentReference> contentReferences, ILanguageSelector selector)
    {
      IList<IContent> contentList = (IList<IContent>) new List<IContent>();
      foreach (ContentReference contentReference in (IEnumerable<ContentReference>) contentReferences)
      {
        IContent content = this.Load(contentReference, selector);
        if (content != null)
          contentList.Add(content);
      }
      return (IEnumerable<IContent>) contentList;
    }

So please keep this in mind as a performance consideration.
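To make the point concrete, here is a toy model of that behavior — illustrative only, not Episerver’s actual code: items found in the memory cache cost nothing extra, while every cache miss becomes its own individual load.

```csharp
using System;
using System.Collections.Generic;

public class ContentLoader
{
    private readonly Dictionary<int, string> memoryCache;

    // Counts how many items had to be loaded individually (cache misses).
    public int IndividualLoads { get; private set; }

    public ContentLoader(Dictionary<int, string> memoryCache)
    {
        this.memoryCache = memoryCache;
    }

    // Mirrors the shape of LoadContents above: cached items are served
    // from memory; each miss triggers its own load.
    public IEnumerable<string> GetItems(IEnumerable<int> references)
    {
        var result = new List<string>();
        foreach (var reference in references)
        {
            if (memoryCache.TryGetValue(reference, out var cached))
            {
                result.Add(cached);
            }
            else
            {
                IndividualLoads++;              // one round-trip per miss
                result.Add("content-" + reference);
            }
        }
        return result;
    }
}
```

With a warm cache the batch is essentially free; with a cold cache you pay one load per reference, which is why GetItems on its own is not a bulk DB optimization.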