Data Lakes: The Future of Data Warehousing?

Originally published in InsideBigData, August 2, 2019.

The term Big Data has been around since 2005, but what does it actually mean? Exactly how big is big? We create data every second, across every industry and from myriad devices: computers, industrial sensors, weather balloons and countless other sources. According to the Data Never Sleeps report, a quintillion bytes of data are generated each minute, and the forecast is that data will only keep growing at an unprecedented rate.

We have also come to realize just how important data really is. Some liken its value to something as precious to our existence as water or oil, although those aren't really valid comparisons. Water supplies can fall and petroleum stores can be depleted, but data isn't going anywhere. It only continues to grow, not just in volume but in variety and velocity. Thankfully, over the past decade, data storage has become cheaper, faster and more easily available, so where to store all this information is no longer the biggest concern. Industries working with IoT and faster payments now push data through at very high speed, and that data is constantly changing shape.

In essence, all this gives rise to a “data demon.” Our data has become so complex that normal techniques for harnessing it often fail, keeping us from realizing data’s full potential.
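To make that concrete, here is a minimal sketch (not from the article) of the schema-on-read approach a data lake takes, using PySpark; the lake path and field names (s3://example-lake/raw/events/, device_id, event_time) are illustrative assumptions. Raw events are stored untouched, and structure is imposed only when the data is queried.

    # A minimal schema-on-read sketch; the path and field names are
    # illustrative assumptions, not from the article.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("data-lake-sketch").getOrCreate()

    # Raw JSON events landed by different devices can carry different fields;
    # Spark infers a superset schema at read time instead of rejecting records.
    events = spark.read.json("s3://example-lake/raw/events/")

    # Impose structure only for the question at hand:
    # daily event counts per device.
    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_time"))
        .groupBy("device_id", "event_date")
        .count()
    )

    daily_counts.show()

The point of the sketch is the design choice: a warehouse with a fixed schema would force every new field through a schema change, whereas the lake absorbs the changing shape of the data and defers modelling to read time.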

Most organizations currently treat data as a cost center. Each time a data project is spun off, there is an "expense" attached to it. It's contradictory: on the one hand, we proclaim that data is our most valuable asset, while on the other we treat it as a liability. It's time to change that perception, especially when it comes to banks. The volumes of data financial institutions hold can be used to create tremendous value. Note that I'm not talking about "selling the data," but leveraging it more effectively to provide crisp analytics that deliver knowledge and drive better business decisions.

So what's stopping organizations from converting data from an expense into an asset? The technology and talent exist; it's the thinking that is lacking.

To continue reading this article, click here.
