The term ‘big data’ has become ubiquitous as it has permeated nearly every industry over the last ten years. While definitions abound, its fundamental implications remain constant: it represents a cross-functional focus on leveraging exponentially growing volumes of data to increase operational performance and ROI.
Though it hit the mainstream only recently, big data has been a mainstay of brick-and-mortar retail – where data is created every time a consumer makes a purchase – for years. In fact, retailers have long looked to big data as the key to competitive advantage, particularly as competition has intensified and margins have compressed.
In the early days of big data (think 1993), only a select few major retailers were tapping into the potential of data analytics at scale. These top-tier national retailers began using data insights to control inventory and optimise store and warehouse operational efficiency, eventually passing these cost savings on to the consumer.
In turn, this put margin pressure on the entire industry and led to more competition on price, further increasing the use of big data analytics as second- and third-tier players adopted it to stay competitive.
These days, many retailers are using sophisticated analytics that leverage all types of data, including loyalty, structured POS and transaction history, in an attempt to better understand their customers and deliver more personalised shopping experiences – with the ultimate goal of improving their bottom lines.
But while big data continues to provide many benefits to brick-and-mortar retailers, many in the industry are also becoming familiar with its limitations.
This is particularly true, and perhaps most critical, in the $300 billion retail trade promotions space. Despite the use of sophisticated data analytics, more than half of all in-store promotions fail to deliver a significant ROI, with many actually losing money for the manufacturer.
One of the main reasons for this comes down to the use (or misuse) of post-event analysis or trade promotion optimisation (TPO) solutions – systems originally designed to measure the results of promotional events.
These systems enable retailers and CPGs to apply econometric regression to sort through large volumes of aggregated data. These insights are then used to plan promotional calendars.
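To make the idea concrete, here is a minimal sketch of the kind of regression such a system performs. The data, model form and variable names are all hypothetical – a real TPO system would control for store, season, weather and many other factors – but the mechanics are the same: fit a demand curve to past promotional results, then use it to plan the calendar.

```python
import math

# Hypothetical weekly observations for one SKU: (discount depth, units sold).
# A real post-event analysis would control for store, season, weather, etc.
data = [(0.00, 100), (0.10, 118), (0.20, 142), (0.25, 155), (0.30, 171), (0.40, 205)]

# Simple log-linear demand model: ln(units) = a + b * discount,
# fitted here by ordinary least squares.
n = len(data)
xs = [d for d, _ in data]
ys = [math.log(u) for _, u in data]
x_bar = sum(xs) / n
y_bar = sum(ys) / n
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sum(
    (x - x_bar) ** 2 for x in xs
)
a = y_bar - b * x_bar

# Predicted sales lift at a 25% discount relative to no promotion.
lift = math.exp(b * 0.25)
print(f"elasticity b = {b:.2f}, predicted lift at 25% off = {lift:.2f}x")
```

Note what the model sees: only a discount depth and a sales figure. Everything else about the promotion has already been aggregated away – which is exactly the limitation the rest of this article describes.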
While this approach works fairly well for its intended purpose – determining the performance of your promotional spend after the fact – it falls short when it comes to discovering new promotions to run next. As a result, promotional calendars look the same as they did last year (and the year before that).
Ironically, a downside to big data is the sheer volume that’s collected. With the number of data points directly collected and otherwise acquired by a modern retailer on a daily basis, it can be difficult to break through the noise and determine which data sets are most relevant for identifying future consumer buying behavior.
To get above the noise, these systems aggregate data from various sources to control for numerous external factors, such as differences in weather, geography and competition. As a result, many valuable insights are lost as data is aggregated.
For example, promotional demand models treat all economically equivalent offer structures the same; ‘£1 off a £4 item’ and ‘Buy 3 Get 1 Free’ are grouped with ‘25% off’ as the same promotion. The result is that a retailer may only know how a particular discount level performed, with no insight into how variations in offer structure may have impacted performance. This presents a major challenge to developing optimal promotions, as offer structure can vary consumer response by as much as 200%.
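The arithmetic behind that grouping is easy to see in code. The sketch below (offer shapes and field names are illustrative, not any particular vendor's schema) normalises three differently framed offers to an effective discount depth – the single number a depth-keyed demand model would see:

```python
# Illustration of the aggregation problem: three differently framed offers
# all normalise to the same 25% discount depth, so a model keyed on depth
# cannot tell them apart.

def discount_depth(offer):
    """Return the effective fractional discount for a few common offer shapes."""
    kind = offer["kind"]
    if kind == "amount_off":      # e.g. £1 off a £4 item
        return offer["amount"] / offer["price"]
    if kind == "percent_off":     # e.g. 25% off
        return offer["percent"] / 100
    if kind == "multibuy_free":   # e.g. Buy 3 Get 1 Free -> pay for 3 of 4
        buy, free = offer["buy"], offer["free"]
        return free / (buy + free)
    raise ValueError(f"unknown offer kind: {kind}")

offers = [
    {"kind": "amount_off", "amount": 1.0, "price": 4.0},
    {"kind": "percent_off", "percent": 25},
    {"kind": "multibuy_free", "buy": 3, "free": 1},
]

depths = [discount_depth(o) for o in offers]
print(depths)  # all three collapse to 0.25 - the framing information is lost
```

Once all three offers map to 0.25, any difference in how consumers actually responded to them is invisible downstream – which is why the claimed 200% variation in response by offer structure never surfaces in the aggregated data.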
In the dark
Complicating things further is the fact that today’s systems simply weren’t designed to capture the level of detail required to understand how differences in the way an offer is framed affect its effectiveness.
Thanks to advances in behavioral economics, we now know that consumers respond to far more than just price when making purchase decisions – they use social, cognitive and behavioral cues to sort through the overwhelming amount of information they are confronted with on a daily basis.
Unfortunately, retailers are often in the dark about the impact of leading text, calls-to-action and artwork – each of which can have an outsize influence on consumer response.
In many cases, this level of granularity is simply not captured by most retail systems. For the few retailers that have invested in robust infrastructure, there are still difficulties. Even sophisticated models struggle to decipher critical nuances that can make or break large promotional campaigns.
Quite possibly the biggest limitation of big data’s application to retail promotions is its inability to support offer innovation. No matter the scale and volume, using data that describes past transactions to determine go-forward strategy inherently limits insights to only what can be culled from that data.
In other words, the data can only tell you about what’s already been tried in the past, and is effectively blind to the entire universe of what hasn’t (but might work better).
This means that possibilities for novel offer structures, cross-merchandising, discount levels and so on can’t be found in that data. Furthermore, as you continue to optimize in a backwards-looking mode using transaction data, the resulting set of promotions on your calendar will continue to converge, and the data will become increasingly homogeneous. This explains why certain brands only run one or two types of promotions – they simply have no other data points to look at.
By: Ben Rossi
Originally published at www.information-age.com