The hype around "Big Data" has led to the growing availability of large, nearly comprehensive data sets to executives who previously had little access to this type of information. In many cases their lack of experience has led to an unquestioned assumption that any and all metrics are useful and meaningful. Motivated by an organizational desire to turn data into a narrative, often one that justifies a business strategy or an ad spend, they end up relying on incomplete, inconsistent, or incorrect information. This is less a technical or statistical problem than a cultural one, driven by several key factors:
The organization lacks the proper filtering mechanisms.
Effectively separating the signal from the noise is both a technological and a human challenge. The refrain we usually hear is that we are “drowning in data,” but that is the wrong metaphor. The truth for most companies is actually closer to the reverse: their data flow is more like a faucet turned very low. The business leaders who might offer real insight see just drips and drops of data that typically arrive via cumbersome reporting. The availability of data is hindered by two factors. The first is that business analysts who can dive deeply into the data are typically in short supply, because they are both difficult to hire and frequently seen as cost centers. The second, related issue is that the analysts who do exist tend to be isolated in centralized groups, like finance or business operations, rather than distributed among business units. These isolated analysts tend to lack the experience and context that someone embedded in a business unit would gain. So data becomes a periodic reporting task performed by a few, rather than a real-time glimpse into the business for those in a position to actually improve business outcomes.
The result is blind faith in the data and analysts. It’s not uncommon, for example, for marketers to take the advertising attribution models they are presented with at face value, because that is what the “quants” have told them. This isn’t because the marketing department is stupid; far from it. Rather, the issue is that some of the data used in attribution modeling are so technical and arcane that marketing simply doesn’t know how to question the underlying assumptions, and the quants don’t have enough experience in marketing to properly explain them. Being able to ask the right questions would give the organization much deeper insight into the relative accountability of specific marketing channels.
Data is used to CYA.
Too often, the “data as gospel” mindset is used to avoid professional risk. Too many people are culturally indoctrinated to use data as a way to cover their proverbial ass. There needs to be a philosophical shift towards using data as a form of guidance rather than a post-hoc justification for decisions. This won’t be easy. For one thing, it will require people to chart a course that runs against their own self-interest if the data demands it.
Reporting culture is onerous and overly dense.
Good data reporting is fluid and relatively light on presentation. At many companies, though, executives use the end-of-quarter data crunch to reinforce a pre-existing narrative with senior management. Because of the data reporting structure, this process consumes a great deal of time and energy. Ultimately, it is a waste of resources. Rather than spending their time optimizing the business and moving forward, executives spend it looking backward, using data to justify past decisions.
Moving from n=all to n=few.
So how can business leaders overcome the data trap? By developing and reinforcing a strong Key Performance Indicator (KPI) structure that isolates and emphasizes a few essential metrics aligned with organizational needs. Too often, marketers falsely equate metrics in general with KPIs. The two are not the same. In fact, there should be a hierarchy of business management indicators. This structure allows an organization to optimize its business goals and analyze performance on multiple levels, ranging from macro, C-suite level questions to micro, operational issues.
The top level of measurement is Business Goals: why does this initiative exist? For most businesses, this question will relate to revenue, but not always. Sometimes it can be a goal as abstract as driving brand awareness or as concrete as decreasing call center volume. Beneath the overall Business Goal are Objectives—the specific strategies used to accomplish the larger goal. Under Objectives are the KPIs that measure progress against the Objectives. Finally, at the lowest level are individual Metrics—the data that constitute the KPIs.
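The four-level hierarchy is easy to picture as nested data. Here is a minimal sketch in Python; the goal, objective, KPI, and metric names are illustrative placeholders, not a standard analytics schema:

```python
# A hypothetical measurement plan modeling the hierarchy:
# Business Goal -> Objectives -> KPIs -> Metrics.
measurement_plan = {
    "business_goal": "Increase revenue",
    "objectives": [
        {
            "objective": "Sell more display advertising inventory",
            "kpis": [
                {
                    "kpi": "page views",  # proxy for available inventory
                    "metrics": [
                        "unique visitors",
                        "pages per visit",
                        "return visits",
                    ],
                }
            ],
        }
    ],
}

# Walking the hierarchy top-down answers "what rolls up to what":
for obj in measurement_plan["objectives"]:
    for kpi in obj["kpis"]:
        print(f'{kpi["kpi"]} -> {obj["objective"]}'
              f' -> {measurement_plan["business_goal"]}')
```

The point of the structure is that every Metric can be traced upward to a Business Goal; a number that cannot be traced this way is noise, not a KPI.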
Here's a basic example of how it works. Consider a media company with the common Business Goal of increasing revenue. An Objective is to sell more display advertising inventory. A KPI for this is the rise or fall in available inventory, or its proxy, the rise or fall of page views. The media company also could look at individual Metrics, such as unique visitors, pages per visit, or return visits. If the KPI—page views—is falling, there are three Metrics to analyze to correct the problem. Say that pages per visit and return visits are both up but unique visitors are down; that means marketing activities like SEM probably need to be better optimized. But if unique visitors are up and pages per visit are down, that means something different: perhaps the CMS's referral tool needs to be improved.
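The diagnostic reasoning above can be sketched as a simple decision rule. This is a hypothetical illustration, assuming the inputs are directional changes (e.g. week-over-week deltas) in the three underlying Metrics; the function name and return strings are invented for the example:

```python
def diagnose_falling_page_views(unique_visitors_delta: float,
                                pages_per_visit_delta: float,
                                return_visits_delta: float) -> str:
    """Suggest where to look when the page-views KPI is falling,
    given directional changes in its three underlying Metrics."""
    if (unique_visitors_delta < 0
            and pages_per_visit_delta >= 0
            and return_visits_delta >= 0):
        # Fewer people arriving, but those who come engage fine:
        # acquisition channels (e.g. SEM) likely need optimization.
        return "optimize acquisition (SEM)"
    if unique_visitors_delta >= 0 and pages_per_visit_delta < 0:
        # Visitors arrive but don't click deeper: an on-site issue,
        # such as the CMS referral tool.
        return "improve on-site engagement (CMS referral tool)"
    # No single Metric explains the drop.
    return "investigate further"

print(diagnose_falling_page_views(-0.05, 0.02, 0.03))  # acquisition case
print(diagnose_falling_page_views(0.04, -0.06, 0.01))  # engagement case
```

The code is trivial on purpose: the value is not in the logic but in the fact that the organization agreed, in advance, which Metrics explain which KPI movements.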
A good KPI structure is the heart of the “test and optimize” cycle. In this cycle, people aren’t reporting out every quarter for “the big reveal.” Instead, they’re running the business based on data. Because they operate according to the analytics they report, the big reveal loses its impact; they’re optimizing as they go. This cycle empowers managers to make decisions based on the KPIs they’ve agreed to. Data as a form of CYA changes a bit, too. Sure, people can still blame the data for leading them the wrong way, but at least everyone has agreed on the path.
It is very easy to say, “I want to run a data-driven business,” because the benefits of using data to manage an organization are clear. It has become all too common, however, for organizations to make significant investments in “big data” only to see them fail. Sometimes expectations were too high. Sometimes the technology implementation was broken. For the most part, though, it’s because the organization didn’t change its culture along with its technology. A data-driven culture is about people, not machines. It’s about people making decisions based on data—smart decisions based on KPIs, operating through a reliable planning structure. And it’s about saving the storytelling for selling to customers instead of the executive board.