Thanks for coming by to read the third post in my series on KPI frameworks. In my first post I discussed why you need a KPI framework, and in the previous post I covered the steps necessary for generating a Top-Down KPI framework. In this post I provide an overview of the alternatives to the Top-Down model, discuss some issues that people commonly encounter when setting up these frameworks, and provide pointers to further reading on the subject.
This post is split into the following sections:
- Alternative models
- Common pitfalls
- Further reading
1. Alternative models
While most KPI frameworks capture much the same information, they can be structured in different ways based on personal preference. The three main alternatives to the Top-Down model are as follows:
1.1 Lifecycle-based models
Lifecycle-based frameworks are structured around the different stages of a visitor’s relationship with a website. Consequently, goals are bucketed by these stages, rather than being derived from explicit business objectives as is the case with Top-Down models. There are multiple varieties in this category, including Xavier Blanc’s REAN model and Dave Chaffey’s RACE model, which are often named according to the way the different visitor interactions are modelled.
For example, in the case of RACE, the stages are:
The definitions provided below are largely taken from http://www.smartinsights.com/digital-marketing-strategy/race-a-practical-framework-to-improve-your-digital-marketing/.
1.1.1 Reach
“Refers to building awareness of a brand … in order to build traffic by driving visits to different web presences”.
1.1.2 Act
“Act is short for Interact. It’s about persuading site visitors or prospects to take the next step, the next Action on their journey when they initially reach your site or social network presence”.
1.1.3 Convert
This is where users reach the “conversion point”, e.g. a sale for an e-commerce company.
1.1.4 Engage
This stage is about “building customer relationships over time through multiple interactions using different paid, owned and earned media touchpoints such as your site, social presence, email and direct interactions to boost customer lifetime value”.
Here is an example RACE framework.
While this does not have the segments and targets we see in the Top-Down model, these can (and should) be included in an accompanying document. Dave Chaffey manages this in a formal way with the “SOSTAC®” planning model (http://www.smartinsights.com/digital-marketing-strategy/sostac-model/).
1.2 Persona-based models
Persona-based frameworks are, as the name suggests, based on personas, or “fictional characters created to represent different user types within a targeted demographic, attitude, and/or behaviour set that might use a site, brand or product in a similar way” (http://en.wikipedia.org/wiki/Marketing_persona). Persona-based planning is commonly used in offline marketing and can provide an easy starting point for digital measurement. The process for creating a persona-based framework is as follows (credit to Lynchpin):
Step 1: Define key site personas (this can be done via research/workshops) by determining why people might be on the site.
Step 2: Define how each persona may be identified online (i.e. what the behavioural segment looks like).
Step 3: Define what a successful visit looks like for each persona, based on the visitor’s objectives. This involves setting KPIs for each persona.
Step 4: Use data to verify that those personas exist in the first place. This can be done either by directly asking your visitors, e.g. via onsite surveys or email, or by determining whether the given personas show onsite behaviour which is in keeping with the way they have been categorised.
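To make Steps 2 and 3 more concrete, the output of the process can be captured in a simple data structure that maps each persona to the behavioural signals used to identify it and the KPI targets that define a successful visit. The sketch below is purely illustrative – the persona names, segment rules and targets are invented, not taken from any real framework.

```python
# Hypothetical sketch of Steps 2-3: each persona is mapped to the
# behavioural segment used to identify it (Step 2) and the KPI targets
# that define a successful visit (Step 3). All names and numbers are
# invented examples.
personas = {
    "researcher": {
        "segment": "viewed 3+ content pages, no basket activity",
        "kpi_targets": {"newsletter_signup_rate": 0.05, "return_visit_rate": 0.30},
    },
    "ready_to_buy": {
        "segment": "landed on a product page from a transactional search term",
        "kpi_targets": {"add_to_basket_rate": 0.25, "conversion_rate": 0.04},
    },
}

def persona_on_target(persona, observed):
    """Compare observed segment-level metrics against the persona's KPI targets."""
    targets = personas[persona]["kpi_targets"]
    return all(observed.get(kpi, 0.0) >= target for kpi, target in targets.items())

# Example check for the "researcher" persona against observed metrics.
print(persona_on_target("researcher",
                        {"newsletter_signup_rate": 0.06, "return_visit_rate": 0.35}))  # True
```

Keeping the definitions in one place like this also makes Step 4 easier, since the behavioural segment for each persona is explicit and can be checked against real data.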
1.3 Composite models
Composite models combine different frameworks with the goal of ensuring completeness, e.g. Top-Down plus persona-based, or a RACE framework for each persona.
2. Common Pitfalls
Successfully implementing KPI frameworks can be tricky. In this section I explore some of the most common issues and offer some solutions.
These common issues are: too many unknowns and data trust issues; KPIs applied at the wrong level of reporting; neglecting to gain buy-in; not knowing what you can and can’t measure; and mixing strategic KPIs with tactical metrics.
2.1 Too many unknowns and data trust issues
If a company is relatively new to structured approaches to digital measurement, there may be tracking limitations which prevent analysts from seeing enough of the user journey to assign credible values to each conversion type. This makes it difficult to justify decisions which are based on numbers rather than “expert opinion”.
Related to this issue is a lack of “data trust” – where tracking has been installed, but there are doubts about the accuracy of the data, either because of technical issues with the analytics program/implementation, or because of the assumptions which have been built into it. Attribution is a common area where such doubts arise. This again has the potential to turn people away from data-based decisions entirely.
The answer in both situations is the same – provide a roadmap that addresses the issues causing the problems or doubts, and allows a robust framework to be established. Using an interim model, with caveats that cover these issues, is sensible where possible. It should be noted that analytics data is never 100% accurate – it is intended to provide insight into trends and allow for approximate comparison. Do not become bogged down in recalculations if the data is marginally out. For further information on this subject, see Avinash Kaushik’s post “Data Quality Sucks, Let’s Just Get Over It” (http://www.kaushik.net/avinash/data-quality-sucks-lets-just-get-over-it/).
2.2 KPIs applied at the wrong level of reporting
Certain metrics are useless at an aggregate level. One prime example is bounce rate. If a visitor bounces after having read the latest blog post, it would be hard to say for sure whether it was a “bad” bounce. However, if a visitor arrived on transactional keywords, landed on a product page and then bounced before adding the item to their shopping cart, it is clear that this is not the ideal outcome. Here, context is critical to interpretation.
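One way to preserve that context is to report bounce rate segmented by landing-page type rather than as a single site-wide figure. The sketch below illustrates the idea on a hypothetical visit-level export; the page types and data are invented for illustration.

```python
from collections import defaultdict

# Hypothetical visit-level data: (landing_page_type, bounced).
# The page types and values are invented for illustration.
visits = [
    ("blog", True), ("blog", True), ("blog", False),
    ("product", True), ("product", False), ("product", False),
]

totals = defaultdict(lambda: [0, 0])  # page type -> [bounces, visits]
for page_type, bounced in visits:
    totals[page_type][0] += int(bounced)
    totals[page_type][1] += 1

for page_type, (bounces, count) in totals.items():
    print(f"{page_type}: bounce rate {bounces / count:.0%}")
# A 67% bounce rate on blog landings may be perfectly acceptable;
# the same figure on product landings would be a cause for concern.
```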
Similarly, trying to apply high-level metrics to low-level activity does not work. For example, attempting to determine the ROI of an individual tweet is a waste of resource as it offers little insight. This is because there are so many influencing factors that ROI measurement becomes futile. A far better approach is to analyse ROI at campaign level as this substantially lessens the issues around attribution.
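To make the arithmetic concrete, a campaign-level ROI calculation is simply attributed revenue less cost, divided by cost. The campaign names and figures below are invented for illustration.

```python
# Campaign-level ROI: (attributed revenue - cost) / cost.
# Names and figures are invented examples.
campaigns = {
    "spring_sale": {"cost": 5_000.0, "revenue": 12_500.0},
    "brand_awareness": {"cost": 8_000.0, "revenue": 9_200.0},
}

for name, c in campaigns.items():
    roi = (c["revenue"] - c["cost"]) / c["cost"]
    print(f"{name}: ROI {roi:.0%}")
# Attribution noise that would swamp an individual tweet largely
# averages out when revenue is aggregated at the campaign level.
```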
To summarise: it is important to consider the scope of measures, and how meaningful they are, when including them in reports.
2.3 Neglecting to gain buy-in
It is absolutely critical that the key stakeholders involved in a project understand the importance of the KPI framework, and are given the opportunity to feed into it. If an individual independently creates a framework without gathering the appropriate feedback, it can often result in resistance from the people it affects – even if the framework accurately captures all of the necessary objectives/goals/KPIs. This is because people want to be able to contribute to or influence the areas in which they operate, particularly something as important as how results are recorded and performance is measured. Not facilitating this can result in an emotional response.
2.4 Not knowing what you can/can’t measure
There are limits to what you can and can’t measure, especially with standard analytics installations. For example, with a standard, out-of-the-box Google Analytics implementation it is not possible to measure time on site for visitors who bounce. This is because the metric is calculated from the timestamps recorded each time a visitor views a new page; as bouncing visitors generate no second pageview, time on site cannot be calculated. Consequently, it is useful to have a technical analyst on hand who can report on whether the desired KPIs are immediately measurable and, if not, whether it would be possible to customise tracking.
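The calculation described above can be sketched as follows. This is a simplified model of the logic, not Google Analytics’ actual implementation: time on site is derived from the gap between pageview timestamps within a visit, so a single-pageview (bounced) visit has no measurable duration.

```python
from datetime import datetime

def time_on_site(pageview_timestamps):
    """Simplified model: time on site from the pageview timestamps of one visit.

    With only one pageview (a bounce) there is no second timestamp,
    so no duration can be calculated (most tools report this as 0).
    """
    if len(pageview_timestamps) < 2:
        return None
    ordered = sorted(pageview_timestamps)
    return (ordered[-1] - ordered[0]).total_seconds()

# A bounced visit: one pageview only, so no duration.
print(time_on_site([datetime(2014, 5, 1, 10, 0, 0)]))  # None
# A two-page visit: 45 seconds between the pageviews.
print(time_on_site([datetime(2014, 5, 1, 10, 0, 0),
                    datetime(2014, 5, 1, 10, 0, 45)]))  # 45.0
```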
2.5 Mixing strategic KPIs and tactical metrics
Similar to point 2.2, it is important not to mix strategic KPIs (e.g. ROI) with tactical metrics (e.g. Google PageRank). Tactical metrics are only of use to the extent that they contribute to strategic KPIs; PageRank, for example, is only useful to the extent that it helps garner organic search traffic. Consequently, tactical metrics do not belong in the KPI framework.
3. Further reading
RACE planning framework by Dave Chaffey
Cult of Analytics by Steve Jackson
Web Analytics 2.0: The Art of Online Accountability and Science of Customer Centricity by Avinash Kaushik
Google Analytics by Justin Cutroni
Personas: The Art and Science of Understanding the Person Behind the Visit by Will King
Yana, our resident analytics guru, is ready to answer any analytics question you can throw at her. If there’s anything you’ve ever wanted to know about your website, how specific pages are performing, or how to enhance your tracking to generate valuable insight, this is your perfect opportunity…
Contact DAC today to find out more!