Improving your BI capability from levels 1/2 to levels 3/4
Background
Key performance indicators (KPIs) are not the recent innovation you might assume – emperors of China’s Wei Dynasty were rating the performance of official family members around 230 AD. In 13th-century Venice, merchants tracked the success of expeditions by comparing the cost of the goods they bought with the price achieved at sale, and return on investment was born.
In 1912, a DuPont explosives salesman called Donaldson Brown invented the formula for return on equity in an internal efficiency report. During the 1930s, French managers formalised the tableau de bord, although the concept did not gain adoption outside of the country because of the language barrier.
It was the use of key performance indicators by General Motors during the 1930s and by GE during the 1980s which really embedded KPIs as a core management tool. The formalisation of performance metrics by Dr Robert Kaplan and David Norton in the Balanced Scorecard in 1992/3 extended the range and impact of business intelligence beyond simple activity reporting. Even techniques that gained notoriety saw widespread adoption: up to 60% of Fortune 500 companies were still using the “stack and rank” method of grading workers in 2012.
If today’s line of business (LOB) managers and C-suite executives demand dashboards, reports and KPIs, it is because decades of managerial culture and business school teaching have persuaded them it is the right thing to do. At issue, however, is the nature of the metrics being used, their volume and relevance to the business, how those metrics are defined, where they are created and how they are delivered.
This whitepaper considers some of these issues in the context of how to enhance the development and delivery of KPIs by the business intelligence (BI) or data and analytics office (DAO). Specifically, it considers how to improve CARBON™ capability scores on these questions:
- How centralised is the generation of BI reporting in your organisation?
- How does BI reporting get delivered into the organisation’s top-level leadership team (such as your senior executive team or board)?
- How is your BI reporting function aligned with internal reporting demands?
- At what level does your BI leader report into the organisation?
At its heart is the issue of how to ensure KPIs are genuinely driving the business and supporting decision-making. As one DataIQ Leader put it, senior executives risk defaulting to assumptions about root causes, such as when retailers see a downward trend in sales within a category. Understanding the real reasons for changes in the numbers matters because, as they said, “it’s not always the weather.”
Who measures – Lines of business, business intelligence, data and analytics?
As management consultancy guru Peter Drucker said, “what gets measured gets managed”. One direct consequence of this view is that managers want metrics for the things they are managing – and lots of them. Organisations that are only at Level 1 (“Aware”) or Level 2 (“Repeatable”) in their business intelligence are likely to have a huge range of KPIs operating within the business.
The scale of this problem was outlined by Eddie Edwards, former head of BI at British Gas, at the DataIQ Summit in 2015. He noted that, “we had 20,000 KPIs, thousands of reports, shadow MI and 1,500 people requesting data.” Not surprisingly, this was identified as unsustainable and so became the focus for a management information transformation project. KPIs were centralised into the MI function, standards were set, outputs standardised and thousands of metrics reduced to a few hundred.
The maturity curve defined within CARBON™ identifies the next step into Level 3 (“Defined”) as requiring an appropriate level of reporting adopted by each LOB, while Level 4 (“Managed”) introduces an agreed set of standards. The final step into Level 5 (“Optimised”) sees the centralisation of BI into a centre of excellence. This is not necessarily appropriate or desirable for every organisation, especially where resources are limited. But some important dimensions of KPIs need to be moved away from LOBs and into BI specialist areas (or the data and analytics function – see below).
BI transformations are often the starting point for the datafication of an organisation.
As an example of how KPIs can represent a blind spot and area of low maturity in otherwise high-performing organisations, an international bank discovered that its trading floor had no metrics in place, only its basic P&L. This meant it was unable to recognise which types of trade or customer were delivering profit as opposed to cost into the business.
BI (or MI) transformations are often the starting point for the datafication of an organisation for reasons set out below. For example, one membership organisation with a substantial information and research division is currently working to get heads of business functions to define their KPIs as part of a three-year plan to develop effective business intelligence that helps the organisation make big decisions. The ultimate goal will be to deliver self-service reporting to LOBs. Establishing stakeholders’ needs and helping them to get more value out of the incumbent BI software tool is also part of the plan.
This type of project is often the result of changes in personnel at the highest level of the organisation. At one licensed gaming organisation, for example, the arrival of a new CEO (combined with turnover of 80% of the C-suite in the last 18 months) is driving a transformation of KPIs with the goal of establishing a set of simplified executive metrics. The challenge is to create simple measures which do not trigger knee-jerk reactions. There is also a desire to move away from volume-based metrics towards those based around customers.
None of this implies that LOBs should be excluded from the process of KPI development and delivery. Indeed, they should continue to own key metrics and, as far as possible, self-serve access to these as well as deeper exploration of the data which underpins them. However, a critical step in maturity is to migrate the following away from LOBs and into BI or D&A teams:
- data governance – ensuring data lineage and quality, as well as acting as gatekeeper;
- standards and standardisation – working with LOBs to define KPIs to common standards.
Data governance and standards for key performance indicators
One of the prime motivations for the introduction of a chief data officer (CDO) and/or the creation of a data and analytics office (DAO) is often the desire to ensure alignment of reports and numbers being relied on within business intelligence. Establishing a “Ministry of Truth” in this way underpins the status of the CDO and DAO and also provides a line to value from the investment.
Regulatory demands can also make this step unavoidable. For example, the General Data Protection Regulation has ensured the risk factors around holding personal information have gained C-suite attention, leading to a stronger focus on how this asset is managed and funded. The Basel Committee on Banking Supervision’s BCBS 239 (Principles for effective risk data aggregation and risk reporting) directly led to one investment bank recognising that it needed a CDO.
With this individual and team in place, a greater emphasis on data quality naturally follows (see CARBON guidance: Improving BI data quality). Once standards are being set for data quality, it becomes logical to establish similar standards for KPIs.
KPIs need to be genuine drivers of the business – indicators that identify actions which can be taken to achieve improvements. It is not the job of the CDO to provide KPIs for every business area – it is to ensure those metrics are relevant. The board needs confidence in the numbers it is being given by LOBs – the CDO is there to ensure the data being used is reliable, accurate and complete.
“If this number changes, what could you do?”
KPIs have to help make business decisions. But ownership of numbers is political – for example, finance may own revenue metrics, even though the base numbers flow through the data office and on to the board. Having a single version of the truth which can be relied on creates a need to take ownership and control into one place. The DAO or LOB should be the place for this to happen, but not IT. LOBs have the best understanding of how their area of the business operates – the DAO sees end-to-end how the whole organisation functions. The goal is then to provide self-service access and tools for day-to-day reports to keep this workload off the data and analytics team.
Defining KPIs should involve close engagement between the DAO and the LOB. At one national grocery chain, there is a two-year project to review and develop its BI which involves helping business functions to define key metrics using the simple question, “if this number changes, what could you do?” This process requires putting in place data stewards to own the definitions, such as customer, profit, etc.
Although business functions are likely to have definitions for KPIs readily to hand, care should be taken not to simply accept legacy metrics just because they have always been there. One media group found that its legacy metrics had focused on page views and ad views – its transition is towards audience revenues and customer lifetime value. Some functions may even resist the use of KPIs: an educational establishment faces particular challenges because defining appropriate metrics is complex and many stakeholders do not welcome measurement of their activities, even though metrics are necessary for reporting to external bodies, for example on student numbers.
Existing metrics that are considered business-critical can equally turn out to lack real meaning once the DAO explores causation for those KPIs more deeply. In the case of a peer-to-peer lending platform, sales was pursuing KPIs such as length of call, whereas it emerged that the most significant measure was the time window between a business submitting an application online and receiving a call to process it. This had a greater impact on conversion rate to closed loans than the activities which were being measured. The DAO also identified that, if the first target window was missed, the next best option was at the same time the next day, rather than the same day at another time. As its chief analytics officer noted: “Habit matters.”
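To illustrate what this kind of exploratory analysis might look like in practice, the sketch below buckets applications by how quickly the first call followed online submission and compares conversion rates across those windows. It is a minimal sketch only: the column names (submitted_at, first_call_at, converted) and the window thresholds are assumptions for illustration, not details taken from the platform itself.

```python
# Hypothetical sketch: relate time-to-first-call to loan conversion rate.
# Column names (submitted_at, first_call_at, converted) are illustrative only.
import pandas as pd

def conversion_by_callback_window(applications: pd.DataFrame) -> pd.DataFrame:
    """Bucket applications by how quickly the first call followed submission
    and compare conversion rates across those buckets."""
    df = applications.copy()
    df["hours_to_first_call"] = (
        df["first_call_at"] - df["submitted_at"]
    ).dt.total_seconds() / 3600

    # Candidate windows to test; the real thresholds would come from the data.
    bins = [0, 1, 4, 24, 48, float("inf")]
    labels = ["<1h", "1-4h", "4-24h", "24-48h", ">48h"]
    df["callback_window"] = pd.cut(df["hours_to_first_call"], bins=bins, labels=labels)

    return (
        df.groupby("callback_window", observed=True)["converted"]
        .agg(applications="count", conversion_rate="mean")
        .reset_index()
    )
```

A table of this shape makes it easy to see whether a fast callback window outperforms the metrics the sales team was previously chasing, such as length of call.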
Establishing the suite of KPIs which are to be delivered will involve careful engagement with stakeholders across the organisation and detailed management of source data, definitions and outputs. While internal business managers are well placed to provide an understanding of what can or should be measured, there are also useful external reference points: Bernard Marr’s “Key performance indicators: The 75 measures every manager needs to know” is one such resource.
Maturity in BI should also see a migration of responsibility for reporting towards the CDO and DAO and away from LOBs. This is a lagging development: Level 3 organisations still have a BI leader at mid-management level. By Level 4, the reporting line is to a C-suite individual, although not necessarily the most appropriate one. Only at Level 5 does the BI leader become a direct report at C-level.
Delivering the right KPIs to the right parts of the business
For any BI team or DAO, there will always be a tension between the demand from LOBs for routine reporting and their own desire to produce more forward-looking insight. This has an impact on the sustainability of investment into the office as well as its workload, since the business may rely on routine metrics but seldom sees producing them as adding value.
An example of the difficulties that can arise is the peer-to-peer lending platform. At its heart, it has a simple BI requirement which is deliberately restricted to 10 to 12 top-level KPIs, with a further 50 metrics in operation competing for a place in the C-suite reporting pack. The challenge facing its DAO is that the business has grown by 80% in the last year, yet data is still being managed in Excel. The breadth of metrics being developed is also expanding, for example with a new requirement for econometric modelling of the impact of the brand’s above-the-line advertising on its demand funnel.
Two approaches can typically help to resolve this kind of problem – automation and layering.
Automation is already well understood as a way to offload routine reporting by creating self-service access to baseline information and dashboard user interfaces. It flows naturally from the single-version-of-the-truth, data quality and data management initiatives which CDOs are already deploying.
Automated KPIs can also be used to generate engagement metrics for dashboards to show how much they are being used, although these can require more effort than building the dashboards in the first place. Any metrics which are not used for three months should be retired.
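A minimal sketch of how the three-month retirement rule might be automated is shown below, assuming a hypothetical dashboard usage log with metric_id and viewed_at fields; neither the schema nor the function name comes from the whitepaper.

```python
# Hypothetical sketch: flag dashboard metrics for retirement when they have not
# been viewed for three months. The usage-log schema (metric_id, viewed_at) is
# an assumption, not taken from the whitepaper.
from datetime import datetime, timedelta, timezone

def metrics_to_retire(usage_log: list[dict], all_metric_ids: set[str],
                      today: datetime | None = None) -> set[str]:
    """Return the metric IDs with no recorded views in the last 90 days."""
    today = today or datetime.now(timezone.utc)
    cutoff = today - timedelta(days=90)

    recently_used = {
        event["metric_id"]
        for event in usage_log
        if event["viewed_at"] >= cutoff
    }
    # Anything in the catalogue but absent from recent usage is a retirement candidate.
    return all_metric_ids - recently_used
```

Running a check of this kind on a schedule keeps the KPI estate from drifting back towards the thousands of unused reports described earlier.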
Layering is an emerging tactic which addresses many of the operational and workload issues outlined above. In this approach, KPIs are developed in tiers (a sketch of how such a tiered catalogue might be organised follows the list below):
- C-level – a combination of top-level metrics and bespoke analytics gives the C-suite the trends and insights needed for executive decision-making;
- Core – routine metrics of day-to-day importance to the LOBs which are automated and delivered through self-service tools as much as possible;
- Exploratory – where business problems are being identified through red flags in KPIs, the DAO undertakes data exploration and modelling to identify root causes.
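One way to picture the layering tactic is as a KPI catalogue in which every metric carries an owner and a tier, and the tier determines how it is delivered. The sketch below is illustrative only: the tier names follow the list above, but the example KPIs, owners and delivery channels are assumptions, not taken from any of the organisations described here.

```python
# Hypothetical sketch of a tiered KPI catalogue. Tier names follow the layering
# approach described above; the KPI examples and delivery channels are illustrative.
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    C_LEVEL = "c-level"          # executive pack, curated by the DAO
    CORE = "core"                # automated, self-service to the owning LOB
    EXPLORATORY = "exploratory"  # ad hoc DAO analysis triggered by red flags

@dataclass
class KPI:
    name: str
    owner: str   # the LOB (or the DAO) that owns the metric
    tier: Tier

def delivery_channel(kpi: KPI) -> str:
    """Route each KPI to a delivery mechanism according to its tier."""
    if kpi.tier is Tier.C_LEVEL:
        return "executive reporting pack"
    if kpi.tier is Tier.CORE:
        return "self-service dashboard"
    return "DAO exploratory analysis"

catalogue = [
    KPI("conversion rate to closed loans", owner="sales", tier=Tier.C_LEVEL),
    KPI("daily application volume", owner="marketing", tier=Tier.CORE),
    KPI("callback-window impact on conversion", owner="DAO", tier=Tier.EXPLORATORY),
]

for kpi in catalogue:
    print(f"{kpi.name}: {delivery_channel(kpi)}")
```

Keeping tier and ownership alongside each metric makes it explicit which KPIs belong in the C-suite pack, which can be automated, and which warrant deeper DAO investigation.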
Both of these approaches require the DAO to be provisioned with full access to organisational data and to possess the appropriate data engineering, analytical and development resources. Typically, these capabilities are not found until an organisation has reached at least Level 3 in CARBON terms.
Challenges – politics and personalities
Addressing KPIs in the way outlined above is, of course, an ideal view. It is based on the idea that the DAO is an independent function, so it can hold the business to account through metrics that are credible, rather than political.
As part of this, the optimal practice is business partnering where the DAO works with LOBs to define KPIs, IT provisions appropriate data, which in turn is governed by the data office and distributed through self-service and data visualisation tools. The CDO takes board-level responsibility for the way these KPIs are produced, with the LOBs owning them.
Managers may dislike the spotlight KPIs shine on their true performance.
In reality, this ideal is more challenging to realise. Revising KPIs in line with standards can encounter resistance, not least because managers may dislike the spotlight it shines on their true performance. One solution is to adopt rigorous discussion of metrics within a “no blame” culture – the peer-to-peer lending platform holds a regular forum for senior leaders to examine those top-level metrics forensically. While often uncomfortable, the fact that it is not done for negative purposes leads to more effective decisions.
KPIs at LOB level can also sometimes conflict with those used strategically for the business as a whole. For example, marketing may track the volume of applicants it drives to an online portal, whereas the more important KPI for the organisation is the conversion rate and value of those applicants, which may be determined by the source they come from. Tracking these fault lines in the logic or connective tissue of KPIs can be an important aspect of the exploratory analysis carried out by the DAO.
Finally, giving the DAO responsibility for BI and KPIs is not always an easy fit with the culture and personality type of analytics practitioners. As one gaming platform put it, “analysts find it hard to stand up to sales”, even when the numbers are on their side. It can be a weakness to believe that the truth which those numbers contain will carry the day – human behaviour is not always rational in this way. After all, even when other factors such as pricing, stock levels or competitor activity lead to a decline in sales, many retailers still prefer to believe it is down to the weather.