Machine Learning Times
This excerpt is from VentureBeat. To view the whole article click here.  

Data scientists to CEOs: You can’t handle the truth

Too many big data initiatives fail because companies, top to bottom, aren’t committed to the truth in analytics. Let me explain.

In January 2015, the Economist Intelligence Unit (EIU) and Teradata (full disclosure: also my employer) released the results of a major study aimed at identifying how businesses that are successful at being data-driven differ from those that are not.

Among its many findings, there were some particularly troubling, “code red” results revealing that CEOs seem to have a rosier view of their company’s analytics efforts than directors, managers, analysts, and data scientists. For example, the EIU found that CEOs are more likely to think that employees extract relevant insights from the company’s data – 38 percent of them hold this belief, compared to 24 percent of all respondents and only 19 percent of senior vice presidents, vice presidents, and directors. Similarly, 43 percent of CEOs think relevant data are captured and made available in real time, compared to 29 percent of all respondents.

So why is there such a disconnect? It turns out the answer is much more human than the size of a company’s data coffers, or the technology stockpiled to analyze it. Big data initiatives fall down at the feet of biases, bad assumptions, and the failure, or fear, of letting the data speak for itself. As insights make their way up the corporate ladder, from the data scientist to the CEO, the truth in analytics is lost along the way. And this leads to a cumulative effect of unintended consequences.

Communicate the Known-Unknowns to Your CEO

Take the idea of known risks, for example. In analytics, you always have to make some assumptions because the data hardly ever paints a complete picture. So, you have to identify and rank those risks to understand what might happen when assumptions go wrong. In some cases, the risks aren’t tied to big consequences. But in other cases, they can be devastating.
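The ranking step above can be sketched as a simple risk register that scores each assumption by how likely it is to fail and how badly things go if it does. The assumptions, probabilities, and severity scores below are purely illustrative, not from the article:

```python
# Hypothetical risk register: rank modeling assumptions by expected impact.
# Every name and number here is an illustrative assumption.
assumptions = [
    # (assumption, probability it fails, severity if it fails on a 1-10 scale)
    ("home prices keep rising", 0.10, 10),
    ("churn rate stays flat",   0.30,  3),
    ("data feed stays current", 0.20,  4),
]

# Expected impact = probability of failure x severity; sort worst-first.
ranked = sorted(assumptions, key=lambda a: a[1] * a[2], reverse=True)

for name, p_fail, severity in ranked:
    print(f"{name:26s} expected impact = {p_fail * severity:.1f}")
```

Note that an unlikely-but-devastating assumption (like ever-rising home prices) can still rank first, which is exactly why it belongs on the table in front of the CEO.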

Look at the stock market crash of 2008. A whole host of people made a simple and logical assumption that home prices would only go up. But most analysts didn’t experiment enough with what would happen if prices actually fell. Well, now we know what would happen: it was almost a global calamity. The people investing in the pre-crisis housing bubble were working on an assumption that was deeply flawed on many levels. And very few people considered, or realized, the risk until it was too late.

The same thing happens, at generally smaller scales, in businesses. The CEO doesn’t have a clear view of risk. It is up to the data scientists, business analysts and their managers to make the CEO well aware of the risk in assumptions. The CEO needs to understand that there is a critical, level 1 risk in assumptions – in the housing example, if prices were to go down, this whole thing falls apart. Even if that risk is unlikely, at least it is on the table. Many people are uncomfortable discussing such negatives with senior executives and many senior executives don’t like to hear it. But to succeed, everyone must get past that hurdle.

Get Past the Culture of Fear of the Truth

Then there is the fear of the truth, with a bit of cognitive bias thrown in. For example, salespeople asked for their forecast, even armed with data on historical performance and current pipeline, are often unsure whether they will hit their number. But, typically, they’ll tell the VP of sales they will hit their forecasts – unless, of course, a miss is very apparent. They share the information they’re expected to share, and withhold any acknowledgement that the numbers are malleable.

The problem arises in the aggregate: The VP gets a rosy picture from five salespeople on her team, even though they all have serious doubts, so she puts that assumption in and the data rolls up to the CEO, or CFO. In reality, the metric is underpinned by a huge amount of doubt. The truth is buried under the fear of losing one’s job and the cultural expectation that the goal will be met. Failure is not an option. However, while it is likely several of the salespeople will manage to hit their number, the chance that they all will is small. This makes the VP’s figures even more unrealistic than the initial estimates.
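The compounding effect described above is easy to quantify. As a minimal sketch, suppose each of the five salespeople independently has a 70 percent chance of hitting their number (both the 70 percent figure and the independence assumption are illustrative, not from the article):

```python
# Each rep is individually likely to hit, yet the chance that ALL hit is small.
p_individual = 0.70   # assumed probability one rep hits their number
n_reps = 5            # reps reporting to the VP

# Assuming independence, multiply the individual probabilities together.
p_all_hit = p_individual ** n_reps
print(f"P(one rep hits):  {p_individual:.0%}")
print(f"P(all {n_reps} reps hit): {p_all_hit:.1%}")  # roughly 17%
```

So a roll-up that implicitly assumes everyone hits is betting on a roughly one-in-six outcome, which is why the aggregate forecast ends up even more unrealistic than any single rep’s estimate.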

So what happens? Everyone is shocked when the company misses its forecast. This is an example of where people sugarcoat a little at the low end, and the cumulative effect leads to the business incorrectly forecasting company-wide results.

Don’t Underestimate the Future of the Truth

Another common problem is underestimating, or simply not considering, the confidence level in the analytics results that the CEO is being fed. Maybe we are comfortable with the data and the assumptions, we’ve asked the right questions and we’ve taken the risks into consideration, but we haven’t assessed the confidence level of our predictions. This gets into classic model assessment techniques in analytics. Is the forecast plus-or-minus 1 percent or 20 percent? If it is critical to increase sales by 5 percent and the model predicts 10 percent sales growth within plus-or-minus 5 percent, then we’re probably fine. But if the model predicts 10 percent sales growth plus or minus 15 percent, then we might be closing up shop at the end of the year if we aren’t careful.
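The interval check in the paragraph above can be sketched in a few lines: a 10 percent point forecast only clears a 5 percent survival target if the bottom of the prediction interval clears it too. The numbers come from the example in the text; the threshold logic is a simplifying sketch:

```python
# A point forecast means little without its interval: compare the worst case
# inside the prediction interval against the growth target the business needs.
target = 0.05      # growth needed to stay viable (from the example)
forecast = 0.10    # model's point prediction of sales growth

for margin in (0.05, 0.15):          # plus-or-minus 5% vs plus-or-minus 15%
    worst_case = forecast - margin   # bottom of the prediction interval
    verdict = "probably fine" if worst_case >= target else "at risk"
    print(f"±{margin:.0%} interval: worst case {worst_case:+.0%} -> {verdict}")
```

With the tight interval, the worst case still meets the target; with the wide one, the same 10 percent headline forecast admits a 5 percent decline, which is the difference the CEO never sees if only the point estimate rolls up.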

Bill Franks is Chief Analytics Officer at Teradata, where he provides insight on trends in the analytics and big data space. He is author of the book Taming The Big Data Tidal Wave and most recently published his second book, The Analytics Revolution. He is an active speaker and a faculty member of the International Institute for Analytics. Find him at www.bill-franks.com.
