Machine Learning Times

Prediction: Was I right?
By: Neil Mason, SVP, Customer Engagement, iJento
Originally published at clickz.com

 

“Prediction Will Be Hot”: Now Time to Validate the Forecast

 

Toward the end of last year I suggested, predicted even, that prediction was going to be hot in our space in 2013. As we approach the halfway point of the year, it’s not a bad time to check that forecast and see where we are. After all, part of the discipline of forecasting is validating your forecasts and adjusting your assumptions or model as needed. The world doesn’t stand still, things change, and so part of any predictive analytical process is a constant re-evaluation of the assumptions that make up your forecast.
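To make that loop concrete, here is a minimal sketch in Python of what such a validation step might look like. The data, the 10 percent tolerance, and the simple moving-average model are all hypothetical, purely for illustration:

```python
# Hypothetical validation loop (illustrative data and model, not from the
# article): compare the forecasts we made against the actuals that have
# since arrived, and re-fit if the error has drifted too far.

def mape(actuals, forecasts):
    """Mean absolute percentage error across paired observations."""
    return sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

def moving_average_forecast(history, window=3, horizon=6):
    """Naive model: project the mean of the last `window` observations forward."""
    level = sum(history[-window:]) / window
    return [level] * horizon

history = [120, 125, 131, 128, 140, 138]      # observed monthly values to year end
forecasts = moving_average_forecast(history)  # forecast made at the start of the year
actuals = [142, 139, 151, 148, 155, 160]      # what the first half actually delivered

error = mape(actuals, forecasts)
print(f"MAPE so far: {error:.1%}")

# The "constant re-evaluation" step: if the error exceeds a tolerance chosen
# up front, revisit the assumptions and re-fit on the fuller history.
if error > 0.10:
    forecasts = moving_average_forecast(history + actuals)
```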

My forecast was very much a subjective or judgmental one. It was based on observations of trends and a few signals. It wasn’t scientifically based in the sense of being built upon vast amounts of data from which I forecast some particular variable. So I don’t have an econometric model to hand; I just have a number of checkpoints against which I can check my assertion. My data points mainly concern how much the subject is being talked about and how much people actually appear to be doing it.

I guess I was being what you might call a “pundit.”

If we look at how much people are talking about it, then one proxy measure could be the amount of material being published on the subject. A search on Amazon for the term “predictive analytics” throws up a large number of books. On the first page of the search results, over 60 percent of the books have been published in the past 12 months, so there’s clearly a growing body of new material being created on the subject, presumably driven by increased demand. I’m currently working my way through two of them: “Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die” by Eric Siegel and “The Signal and the Noise” by Nate Silver.

Nate Silver famously called the 2012 U.S. presidential election right in every single state. What’s interesting about Silver’s book is the subtext around the psychology of prediction and forecasting, particularly what he calls the “prediction paradox.” This broadly says that the more we appreciate uncertainty, the better we can become at predicting future outcomes. The subtitle of my version of the book is “The Art and Science of Prediction,” and that makes sense to me. Forecasting and prediction are as much about the use of judgement as they are about the use of science, and we have to build an appreciation of uncertainty into our forecasting processes. Overconfidence in forecasts can have catastrophic consequences.
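One practical way to build that appreciation of uncertainty into a forecasting process is to report a prediction interval rather than a single number, and then track how often the actuals land inside it. A minimal sketch, assuming normally distributed period-to-period moves and using made-up data:

```python
# Hypothetical sketch of building uncertainty into a forecast: publish an
# interval, not just a point estimate, then check whether reality lands
# inside it. The normal-interval assumption and the data are illustrative.
import statistics

history = [120, 125, 131, 128, 140, 138, 142, 139, 151, 148]
steps = [history[i] - history[i - 1] for i in range(1, len(history))]
sigma = statistics.stdev(steps)              # volatility of period-to-period moves

point = history[-1]                          # naive point forecast: last value persists
low, high = point - 1.96 * sigma, point + 1.96 * sigma  # ~95% interval if moves are normal

print(f"Forecast: {point} (95% interval: {low:.0f} to {high:.0f})")

# Calibration is the antidote to overconfidence: a well-calibrated 95%
# interval should contain the actual outcome roughly 19 times out of 20.
actual = 155
print("Actual inside interval:", low <= actual <= high)
```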

The other signal is the number of case studies beginning to emerge in our space. From the conferences I’ve been to this year, I can see more organizations talking about what they are doing with predictive analytics and forecasting approaches. One of the hot themes is the fusion of traditional marketing mix modelling approaches with digital direct response techniques to understand campaign attribution and ROI. We’re also seeing systems modelling approaches being used to forecast the impact of the launch of new digital products. This isn’t to say that this hasn’t been going on for a while, but the fact that we’re seeing more of these kinds of examples being published shows that it’s now becoming more mainstream.

My last signal is the availability of skills and expertise in the space. My example here is from my own search for talent to join our business. Over the past six months I’ve been looking for people to join our team on both sides of the Atlantic. The brief I sent the recruiters was to find people with a good mix of digital analytics and predictive analytics/data-mining skills. I half expected to drive the recruiters nuts with this brief, but in the U.S. I soon started to receive a steady stream of good-looking resumes. An interesting pattern was that a significant chunk of them came from people who had used predictive techniques in the “offline” world and had then moved into the digital analytics space. It’s rare to see people who started out in digital analytics and have since gone on to acquire these skills. That’s something for employers to think about.

In the U.K., I’m still driving the recruiters nuts.

So there we have it. I think I just about called it right. But let’s not treat our growing ability to predict future outcomes as a panacea for all our problems. We need to incorporate checks and balances into our business processes to ensure that we don’t become overconfident in, or overreliant on, our predictive analytical technologies. As Silver points out in his book, pundits are generally wrong.

