About
Business analytics projects fail by not having an impact. The last mile problem of business analytics is the observation that in most projects the data-driven insights fail to translate into action within the company.
This piece is an introduction to the last mile problem, complete with a collection of analytics project failure modes and some pointers on how to finish strong instead.
Introduction
Business analysis is a business function focused on improving the enterprise. Business analytics is the craft of applying data analytics in this domain.
The purpose of business analytics is to study and make use of all data available to a business in order to upgrade the company's products and services, and to influence operations within the company. Business analytics plays an important role in helping the company grow and serve its customers better. It is the engine of transformative change within a business, and the driver of institutional learning.
Businesses today have mountains of data, and many have looked to business analytics for creative ways of mining the hillsides for riches. All kinds of pilot projects and elaborate data storage schemes and AI initiatives and whatnot have come and gone, but most organisations still struggle with putting their data to good use. Enterprises large and small have tried to turn their data into action, but few have succeeded.
For all our data science, dashboards and data-driven insights, most businesses still fail to see any return on their investment. Data and analytics projects can fail in a variety of ways, but they tend to have something in common as well: things fall apart towards the end of the project. Analytics projects fail at the human interface, where the engineering challenges transform into people challenges.
For example, once an analytics project has produced results, these need to be presented to management and the company at large. Sometimes results suggest changes in products and services, sometimes results point at inefficiencies in workflows or flaws in company culture. It's not just about informing people. For analytics projects to have a tangible return, the company needs to evolve and adapt and learn. Making all that happen requires skills that are quite different from engineering and data analysis.
This is the essence of the last mile problem of business analytics: How can data-driven insights be turned into transformative action in an effective way?
Falling Short
There are countless ways in which things can go wrong, countless ways in which projects can fail. This is known as the Anna Karenina principle: a happy result means that you have avoided all of the pitfalls.
Most analytics projects in most companies fail, and they tend to fall apart towards the end. Things always start harmlessly enough, and for a while everything seems to be going just fine. In the beginning data is carefully collected, processed, and organised. Models and dashboards find their shape. Somebody decides on the questions and prepares the answers. Perhaps there's a great reveal and then ... nothing happens. Nobody cares.
Analytics work often culminates in something to look at. In modern business intelligence, this is typically a self-service dashboard. Sometimes the outcome is an action plan. With the rise of machine learning initiatives, this could be a computational model — hopefully packaged in some useful way.
Regardless of the output, most projects fail to have an impact within the organisation. It's just that nothing happens, even when the project was well structured. The team may have hit every objective, there are dashboards and reports and models and powerful query systems ready to go, but after a while nobody is getting significant value out of any of it.
People shuffle about, key people get anxious and some leave — they wanted to have an impact! The team tries again, maybe the tech stack gets an upgrade. Things slow down. Maybe some fresh faces join in and a new data initiative gets going. Something compels organisations to try and try again. More often than not the result is more of the same.
Months go by, even years go by. The company keeps investing in data and analytics. "We have to keep innovating," the management will say, "or our competitors will pull ahead!" Eventually there is a review and nobody can point at any tangible returns, certainly not in proportion to the investment.
This is how McKinsey frames the last mile problem of analytics. The real challenge of business analytics is not the data, or the advanced tech, or the sophistication of the analytical methods. It's all about impact. The mission must be to deliver "the right insights to the right people at the right time in a way that informs their decision making to drive better business outcomes."
Chris Brahm of Bain & Company agrees: "Enterprises need to prioritise the last mile of their process as they approach data analytics to achieve great results." Whether it's a frontline worker, manager, or even a machine, the struggle is with the last mile, the gap between great analytic output and actual changed behaviour that creates value in the enterprise. When analytics is done right, you begin with that last mile in mind. You make sure the last mile is paved smoothly to results.
The thing is, when it comes to turning insights into action, most business analytics solutions leave it all to the user. The vital task of communicating insights and driving action falls to the analyst and the data science professional, who may or may not have the necessary skills to present their findings in a compelling way.
Business analytics tools, old pipelines and new alike, generally fail to support businesses in carrying data-driven insights across the last mile, into the kind of action that has an impact on the bottom line.
How Analytics Projects Fail
Let's take a closer look at some analytics project failure modes. Below is a list of things everyone can do to make sure their data-driven insights never cross the last mile of analytics delivery.
Naturally, all of these are to be avoided. Somehow. The descriptions offer some perspective.
- Focus on building algorithms before identifying appropriate use cases for the output.
  At Oliver Wyman, Lierow et al. argue that having the right data and toolsets is only half the story. The technical elegance of the solutions does not matter if no tangible value is created for the business.
- Isolate analytics and data-driven insights from the rest of the business.
  Bisson et al. at McKinsey find that analytics-based decision making must be embedded into corporate culture. Scaling analytics is about building an environment where workers are encouraged to embrace analytics as an essential tool for challenging established thinking and for augmenting judgement.
- Ignore the cultural impact.
  Murli Buluswar of AIG believes that adopting analytics and evolving a culture of learning is mostly about imagination and inertia. Driving changes in company culture need not come at a great cost.
- Start with the data.
  Starting analytics initiatives by looking at what data is available and seeing where it can be applied limits the impact that analysts can make. Bisson et al. argue that companies should work in the opposite direction. Businesses should start by identifying the decision-making processes that could be improved and then work backward to determine what type of data insights are required to influence these decisions and how the company can supply them. "The last mile should be the starting point of the analytics journey."
- Preach to the choir.
  Changing company culture requires a shared, strong commitment from all levels of management. Turning business analytics from a collection of isolated vanity projects into an engine of transformation requires that everybody is on board. Data-driven decision making cannot be dismissed either as the latest fashion of the executive suite or as some renegade effort too controversial for senior management. Those who introduce data-driven business analytics into a company need to win support from everyone.
- Fire and forget.
  In their collection of analytics programme failings, Fleming et al. from McKinsey warn executives about complacency. Making a few big moves — in hiring talent and investing in data infrastructure, for example — is not enough to leave the main analytics challenges behind. Scaling analytics into a function with impact takes a persistent effort.
- Play it safe.
  Having an impact on the bottom line means taking some chances. Successful analytics programmes prioritise top decision-making and high-value processes. Analytics projects can fail even when they go as planned, if the aim was too low and as a result the return on investment remains low as well.
- Play it loose.
  Fleming et al. argue that analytics use cases should be prioritised based on feasibility and impact. The first analytics projects should have clearly defined goals so that they can deliver value within their first year. A minimal sketch of this kind of prioritisation follows after this list.
- Focus solely on results.
  Bisson et al. emphasise that scaling analytics is about more than just the insight pipeline. Companies should have a clear data strategy and strong data governance functions: a comprehensive plan for how data is procured, acquired, stored, managed and accessed. Similarly, on the other end of the pipe, there should be clearly defined decision-making rights, roles and accountability.
  👉 With many analytics initiatives, there are also ethical, social, and regulatory concerns to address.
- Don't bother with steering.
  Top management has to have enough understanding of advanced analytics to identify valuable initiatives. Beyond individual projects, every data-driven business should have a long-term vision for the role analytics will play. Fleming et al. recommend setting up a series of workshops to coach the executive team in the essentials and to undo any lingering misconceptions.
- Forget the existing business.
  "Put business science before data science," write Brahm and Sherer. "A company’s advanced analytics goals should reflect the company’s broader aims, allowing it to amplify its most profitable products, services and processes."
- Think short term.
  Skip establishing a vision for the analytics programme. "Analytics should have its own strategic direction," say Fleming et al. Every business should have a strategy for how they can generate value with analytics beyond pilot projects. Tackling the company’s analytics opportunities in an unstructured way undermines the overall effort — a missed opportunity.
- Leave analytics roles vague and undefined.
  Organisations need a variety of analytics talent with well-defined roles: business skills, tech skills, analytics skills. Fleming et al. suggest that analytics talent is best understood as a tapestry of skill sets and roles. Capabilities and roles overlap — some regularly, others depending on the project. One should aim to build a skills inventory to match people with project requirements, and then train and hire externally to fill vacancies.
  👉 Be sure to check out this interactive McKinsey diagram on analytics roles.
- Leave the analytics function disorganised.
  Fleming et al. believe that hybrid operational models work best for most businesses. Centralised, far-removed analytics teams and poorly coordinated analytics pockets are both less effective than a centrally coordinated, but operationally distributed, analytics effort.
- Don't bother with performance metrics.
  Analytics programmes are often substantial investments, but equally often get to play by rather different rules to many other programmes. Fleming et al. insist on measuring the quantitative impact that the analytics efforts are providing. Companies should invest in a performance management framework and metrics tracking. It should be possible to evaluate analytics projects by their financial impact, perhaps every quarter. Understanding how analytics initiatives deliver value helps with overall resource allocation.
- Ignore the results.
  Bradley Fisher of KPMG writes that many executives are not able to trust the analytics produced by their companies. Two thirds of CEOs have ignored data-driven insights because they have contradicted their own intuition or experience. "This sense of suspicion risks hurting good decision-making," Fisher writes. The right way to gain confidence in data is to build the necessary frameworks of checks and balances to ensure that data is trustworthy.
- Brute force analytics.
  "When it comes to generating value from analytics, following the right approach is just as important as having the right data," writes Robert Holston at EY. Businesses should not neglect the 'softer' capabilities needed to use analytics effectively. When it comes to applying analytics results, automated processes often require a human to make a business decision or change a business process. Organisations that work towards removing behavioural barriers will see a greater impact from analytics.
- Focus on narrow data sets.
  Data likes company. The greatest analytics insights are built from data sets that integrate different kinds of data from multiple sources — internal and external. Businesses must be nimble when it comes to setting up new data sources, procuring third-party data, and turning data assets into easily consumed analytics insight.
- Focus solely on the executive function.
  Data-driven businesses deploy analytics in the front lines as well as in the executive suite. The objective should be to make analytics-driven decisions possible at every level of the organisation. Analytics and analytics professionals should be integrated into every key area of the business. Holston, at EY: "Decision-makers must be able to access analytics-driven insights whenever and wherever they need them. Increasingly, that won’t be when they’re sitting in front of a workstation."
- Hire for maximum hype.
  "Companies focus on investing in the technology and talent associated with advanced analytics, without thinking through the broader changes they will need to make to fully deploy analytics and get favourable results," write Brahm and Sherer for Bain. Many rush to invest in the latest analytics software and infrastructure vendors and hire data scientists, but the ultimate winners will align these investments with their strategic and organisational needs in ways that lead to action and results.
- Ignore barriers to adoption.
  Organisational inertia blocks analytics efforts in many companies. Businesses should anticipate and plan for potential barriers to the adoption of the output of analytics investments. In other words, analytics should be designed with the last mile in mind. As critical design choices are made, the analytics team should consider how end users might respond to the results.
- Hoard as much data as possible.
  Brahm and Sherer of Bain warn against spending too many resources on mindless data collection. "Find the shortest path from insight to action. More data isn’t necessarily better. In fact, it’s often the opposite." Businesses should focus on the yield they can get from their existing data, complemented by the ability to generate insights quickly on demand.
- Insist on a perfect platform.
  "The savviest companies don’t wait until they have the perfect analytics solution in place," write Brahm and Sherer. Analytics pilots should just jump right in and try new approaches on real customers and processes, even if the tools are barely viable. Early, continuous feedback will lead to a better final result. Furthermore, the value in analytics comes from the action, not the plan.
In the News
Some headlines from consulting, market research, and industry news.
Recall that in management consulting the business model is to identify the problem, and then sell solutions and services that treat the symptoms, never the cause. Market research is about charging for access and exposure; the trends and rankings used to do this may be real or imaginary. News, of course, is a game of surprising headlines.
- "70% of enterprises view advanced analytics as a critical strategic priority, but only 10% actually believe they're achieving anywhere near the full potential value of those analytics." (Bain & Company)
- "Hopes are high, the investments substantial; but the results have been inconsistent at best." (Bain & Company)
- "Only 8% of global companies have been able to achieve their targeted business outcomes from their investments in digital technology. Said another way, more than 90% of companies are still struggling to deliver on the promise of a technology-enabled business model." (Bain & Company)
- "A 80-90% failure rate for data-related projects shows we collectively haven’t yet figured out how to run the entire analytics race." (Forbes)
- "In the 2019 NewVantage Partners survey, 77% of companies found business adoption of data initiatives was a major challenge. Executives who participated in this survey noted people and process (95%) represented the bulk of the challenge — not technology." (Forbes; New Vantage Survey (pdf))
- "Through 2020, 80% of AI projects will remain alchemy, run by wizards whose talents will not scale in the organisation. Through 2022, only 20% of analytic insights will deliver business outcomes." (Gartner)
- "Why do 87% of data science projects never make it into production?" (VentureBeat)
- "Over half of CEOs saying that acting with agility is the new currency of business but almost 80% of CIOs saying their digital strategy is only moderately effective, or worse." (KPMG)
- "It is no wonder 85% of today’s AI projects are unsuccessful." (Booz Allen)
- "Despite the explosion of big data in recent years, less than 0.5% of data is being used to help guide decision making." (Booz Allen)
- "The end result [of a bad data culture] is that valuable information is withheld from decision makers. Research has shown almost 33% of decisions are made without good data or information." (Booz Allen)
- "We found that barely 10% of companies were managing to grow revenue from digital initiatives to more than 5% of group revenue." (PwC)
- "Gartner, a research company, predicted that by 2017 almost 60% of big-data projects would fail to go beyond piloting and experimentation. Wired magazine noted that almost 70% of enterprise project money is spent on aggregating, storing, and optimising data before a single penny of value is created." (Harvard Business School)
- "According to Gartner, through 2022, only 20% of analytic insights will deliver business outcomes." (HBR)
- "Gartner estimates D&A projects falter 60% of the time. Why? We’ve observed that it’s often because they are not supported by the right organisational structure and talent and are not aligned with the business strategy." (HBR)
- "Less than 25% of organisations feel that their data and analytics maturity has reached a level where it has actually optimised business outcomes, according to International Data Corporation (IDC)." (HBR)
- "67% of surveyed companies had no 'well-defined criteria to measure the success' of big data investments." (HBR)
- "Fewer than half of analytics programs met initial return-on-investment (ROI) goals." (HBR)
- "A 2019 study found that 40% of organisations that make significant investments in AI do not report business gains." (MIT Sloan)
- "According to Gartner, over 90% of deployed data lakes will become useless as they are overwhelmed with information assets captured for uncertain use cases." (Datanami)
- "In order to compete in the age of the customer, your data strategy must be top-flight. The vast majority of firms come up short in this regard, with fewer than 10% of firms sustaining competitive advantage through mature systems of insight." (Forrester)
- "91% report that improving the use of data insights in decision making is challenging for their organisations." (Forrester)
- "Firms continue to invest in data, people and technology, but in 2017, data and analytics pros reported basing fewer business decisions on data (45%) than in 2016 (49%)." (Forbes)
- "86 percent of executives say their organisations have been at best only somewhat effective at meeting the primary objective of their data and analytics programs, including more than one-quarter who say they’ve been ineffective." (McKinsey & Company)
- "Just 35 percent of executives surveyed said they had a high level of trust in their own organisation's use of data analytics while 25 percent admitted they either have limited levels of trust in, or actively distrust, the data they receive. [..] 67 percent of CEOs say they have ignored the insights provided by data analysis or computer-driven models in the last 3 years, because they have contradicted their own intuition or experience." (KPMG)
- "84 percent of businesses fail at digital transformation, never seeing clear outcomes or a solid business case." (Kearney)
- "47 percent [of surveyed companies] still don’t have a clear, quantifiable analytics business case, which makes securing the investments needed to move forward difficult." (Kearney)
- "Only 39 percent of respondents say their company has a strong cultural orientation to data-driven insights and decision-making, and a similarly low percentage (37 percent) feel employees in their company are aware of the importance of data analytics." (Deloitte)
Finishing Strong
"The last mile is where the emphasis pivots from technology to people."
The standard model for business analytics is a straight line. Data is first collected and prepared for analysis, then explored and modelled every which way, until finally packaged and deployed as a data product. The stages before deployment take up lots of resources — both human and machine — but it is all for nothing, if data doesn't translate into action and business value in the end.
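As a rough illustration (not taken from any of the cited sources), the standard model can be sketched as a one-way pipeline like the toy example below. The stage names and the tiny churn calculation are made up; the point is only the shape: data flows straight through, and deployment is where the pipeline stops.

```python
# A hypothetical skeleton of the "standard model": a straight line from raw
# data to a deployed data product, with action left entirely to the end user.

def collect():
    # Pull raw data from source systems (hard-coded here for illustration).
    return [{"customer": "A", "churned": False}, {"customer": "B", "churned": True}]

def prepare(raw):
    # Clean and reshape the data for analysis.
    return [row for row in raw if row.get("customer")]

def explore_and_model(rows):
    # Explore the data, then fit some model or summary statistic.
    churn_rate = sum(r["churned"] for r in rows) / len(rows)
    return {"churn_rate": churn_rate}

def deploy(model):
    # Package the result as a dashboard, report, or API endpoint.
    print(f"Dashboard says: churn rate is {model['churn_rate']:.0%}")
    # ...and here the pipeline ends. Whether anyone acts on the number
    # is left to whoever happens to look at the dashboard.

deploy(explore_and_model(prepare(collect())))
```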
Author and analytics hero Brent Dykes was one of the first to realise that data initiatives fail in their final stage, the last mile of business analytics. Writing for Forbes, Dykes argues against the standard model: under pressure to deliver, people don't want to spend time exploring dashboards and data stores.
Analytics apps, with their shiny user interfaces, may well hold valuable information, but they are often too much to fold into existing workflows. Dashboards and interactive tools aren't always easy enough to use to become part of the everyday routines of, for example, a customer service team. The data languishes on dashboards, and the insight generator is never fully adopted.
Dykes argues that the solution is to focus on the last mile, and to finish strong. He offers two key principles:
1. Prioritise the last mile of the analytics journey **and work backward.**
2. Adoption, almost by itself, determines the success of your data initiatives. **Ensure people are running downhill, not uphill.**
"[Companies] should start by identifying the decision-making processes they could improve to generate additional value in the context of the company’s business strategy and then work backward to determine what type of data insights are required to influence these decisions and how the company can supply them."
The thinking is that by starting with the end in mind, the last mile, businesses find the necessary focus for their initiatives. The objective after all is to help people with their actual problems. The users and the usage should inform the design of data initiatives. Analytics should make it easier for people to make good decisions.
Framing the problem backwards highlights the importance of thinking about impact. Crossing the last mile is not easy, so it has to be worthwhile. Businesses should identify specific value-generating decisions upfront, and then gather the data to support those decisions. Starting with the data leads to irrelevant results and uncertain impact.
Finally, the last mile requires a holistic approach. It's about people much more than it is about data or technology. The last mile is about communication, usability, workflows, accountability, and so on. All of these things have to be aligned with the mission.
Conclusion
Business analytics projects fail by not having an impact. This can happen for a variety of reasons, but interestingly most analytics projects tend to fall apart towards the end of their run. Projects fall short of their intended impact because the hard-won data-driven insights fail to translate into action within the company. This is the last mile problem of business analytics.
Solving the problem begins with a shared understanding that the last mile is less of an engineering problem, and more of a people problem. Adoption determines the success of analytics initiatives, and the winning approach is to start with a plan for that adoption. Analytics projects should begin with the last mile.
The business analytics industry is currently trying to figure out solutions to the last mile problem. Some companies believe in educating their users, others believe in augmented analytics.
Perhaps the most exciting approach to the last mile problem of business analytics is actually an old idea: stories. Data storytelling is the craft of communicating data-driven insights in a narrative form.
With data stories, the thinking is that communication is the real issue with the last mile. Dashboards and reports fail to engage decision makers and other analytics consumers, because they demand too much from the user. Packaging the same insights into narrative form helps with engagement. Engagement leads to understanding, and understanding leads to action. Clear communication and compelling stories inspire change within the company.
I'll take a closer look at data storytelling in a future post.