Asked to name a big data company, many of us would say Google, Facebook or eBay. But for old-school giants such as General Electric Co. and Macy’s Inc., big data is fast becoming as central to their business models as jet engines and women’s apparel. That’s the conclusion Tom Davenport, President’s Chair and distinguished professor of information technology and management at Babson College and co-founder of the International Institute for Analytics, came to after he and Jill Dyché, vice president of best practices at SAS Institute Inc., looked at what happened when 20 traditional companies adopted big data.
“Many of them [were already] leading adopters of ‘small data’ analytics,” Davenport said during a presentation in Cambridge, Mass., hosted by Dennis Ringland and the Data Science Meetup Group. Davenport’s research led him to believe the industry has entered a new era, one he’s calling Analytics 3.0.
Today, with the help of big data, companies such as Macy’s and Caesars Entertainment Corp. are making the same business decisions (what price to charge, what merchandise to offer), only faster. Companies such as Citigroup Inc. are investing in open source big data technologies such as Apache Hadoop to make the same business decisions, only cheaper. Some organizations are using more data to better serve the customer; United Healthcare Services Inc., for example, is searching for speech patterns in its call center data to uncover potential triggers for customer churn.
Others are using big data to build new products and services. “GE is a fascinating example,” Davenport said. The company is putting sensors in gas turbines, jet engines, MRIs and anything it calls “things that spin” to determine when the machines will need to be serviced. “GE makes 50% of its income off of services for these devices, and they are transforming their services business with all of this data,” Davenport said. The sensors from one gas turbine alone create more data per day than Twitter does in a week.
What do these businesses have in common? According to Davenport, the old guard is blending Analytics 1.0 (traditional business intelligence) with Analytics 2.0 (big data and data science) in one environment. That blending of the two capabilities is what defines the 3.0 era.
“Everyone is in the data products game,” Davenport said.
When she feels brave, SAS’s Dyché asks CIOs if they’re managing their data as a corporate asset. “This has become a platitude in the big data industry,” she said during the Data Science Meetup Group presentation.
It’s a question that tends to get brushed off with a quick thumbs-up, but Dyché doesn’t let the conversation end there. She grills CIOs on five questions. “You get two points if you answer ‘yes.’ You get zero points if you answer ‘no.’ And you get minus-one point if you answer ‘I don’t know,’” she said.
Curious? Here are her five questions:
1. Are you giving your data resources comparable to those you give other corporate assets? “The answer is always no. And I’m not saying invest the same amount of money in your big data as you’re investing in your fleet of trucks,” Dyché said. “But if you’re managing your data as an asset, you’re making investments in it.”
2. Are you dedicating technology to data that is comparable to the technology dedicated to other corporate assets? “I’m not talking about databases or even Hadoop. What I’m talking about is things like data integration tools, data profiling tools, data quality tools, metadata,” she said. “Are you investing in those enrichment technologies to make your data better and more meaningful?”
3. Are you allocating funding to data, just as you would for other corporate assets? “If data is truly a corporate asset, it deserves its own budget,” Dyché said.
4. Do you measure the cost of poor, missing or inaccurate data? “Sometimes we get a yes to that question, believe it or not, but [the cost is measured] only after something has failed,” she said.
5. Do you understand the “opportunity cost” of not delivering timely and relevant data to your business? “That’s the conversation we need to have with the executives,” because data has become integral to most businesses, she said. Companies that don’t invest in data and the infrastructure that supports it “may actually fail in some of our strategic initiatives as well.”
So, CIOs, how did you score?
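For the arithmetically inclined, Dyché’s rubric is easy to tally. Here is a minimal, purely illustrative sketch in Python; the function name and answer labels are our own, not part of her presentation.

```python
# Dyché's rubric: +2 for "yes", 0 for "no", -1 for "I don't know".
SCORES = {"yes": 2, "no": 0, "i don't know": -1}

def data_asset_score(answers):
    """Total the score for answers to the five data-as-an-asset questions."""
    return sum(SCORES[a.strip().lower()] for a in answers)

# A CIO answering yes to three questions, no to one and
# "I don't know" to one scores 2 + 2 + 2 + 0 - 1 = 5 of a possible 10.
print(data_asset_score(["yes", "yes", "yes", "no", "I don't know"]))
```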
One of the characteristics of big data is that it’s big. Really big. And growing. So much so, there’s a discussion underway on what to call the soon-to-be-largest quantities of data. Yottabyte, the equivalent of 1,000 zettabytes, is currently the largest unit of measurement for digital information, and it also marks the end of the scale.
“We’re literally about to run out of the metric system,” Andrew McAfee, principal research scientist at the Center for Digital Business in the MIT Sloan School of Management, said during a Harvard Business Review webinar. “As you can imagine, the digital geeks are aware of this problem.” (Sidebar: The term geeks is used with the utmost respect, he clarified.)
What comes after yottabyte? Both “ninabyte” and “brontobyte” have been proposed to describe 1,000 yottabytes, but McAfee said the leading candidate is “hellabyte.” Hella as in the Northern Californian slang for “a lot.” Think this is a joke? The term has been formally proposed as an addition to the International System of Units.
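To see why the scale runs out at yotta, it helps to write out the metric ladder. This short Python snippet (our own illustration, not from the webinar) prints the standard decimal prefixes, each a factor of 1,000 above the last, ending at yotta (10^24):

```python
# The SI decimal prefixes for bytes, kilo through yotta.
# Each step multiplies by 1,000; yotta (10^24) is the last named prefix.
PREFIXES = ["kilo", "mega", "giga", "tera", "peta", "exa", "zetta", "yotta"]

for i, p in enumerate(PREFIXES, start=1):
    print(f"1 {p}byte = 10^{3 * i} bytes")
```

Run it and the last line reads `1 yottabyte = 10^24 bytes`: the end of the officially named scale, pending “hellabyte” or one of its rivals.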