By: Greta Roberts, Conference Chair, Predictive Analytics World for Workforce 2017
In anticipation of his upcoming Predictive Analytics World for Workforce conference presentation, The Joy of Text: Building Actionable Models with Perceptions, we interviewed Andrew Marritt, Founder and CEO at OrganizationView GmbH. View the Q-and-A below to see how Andrew Marritt has incorporated predictive analytics into the workforce of OrganizationView GmbH. Also, glimpse what’s in store for the new PAW Workforce conference, May 14-18, 2017.
Q: How is a specific line of business / business unit using your predictive decisions? How is your product deployed into operations?
A: We have an employee feedback tool called Workometry which gives executives the ability to ask open questions on any topic to their employees. We process the many tens of thousands of open-text responses, usually in multiple languages, to identify issues and the segments to target with action. We think of it less as a survey and more as scaling qualitative methods such as interviews. As well as for typical employee surveys, it’s being used by executives to ask key questions on a range of topics, from post-merger integration and improving customer experience to understanding issues in supply chains.
The tool is also used by People Analytics teams to bring perception data into their models. In a typical model pipeline, Workometry automates much of the work, and the most advanced teams then include the metadata we produce in models built for other purposes. Often clients ask us to build those models.
Our view is that it’s better to guide executives to where they need to take action rather than just present data. We use probabilistic and predictive models to do this.
Q: If HR were 100% ready and the data were available, what would your boldest data science creations do?
A: I like to think many of the boldest, cleverest data science innovations should be invisible, or at least the user shouldn’t need to know that you’re using ML. When we presented at SwissText last year, one of the Google team was talking about how they ‘curate’ information to answer key questions in search these days. The understanding of natural-language sentences behind this is extensive, but as users we just get the answers we want.
Q: When do you think businesses will be ready for "black box" workforce predictive methods, such as Random Forests or Neural Networks?
A: I think they are. Personality tests are for the most part black boxes and most in HR are comfortable with them. The issue is that in most decisions you have to optimize against multiple variables. The issue isn’t really the model but the loss function. Whilst we can be rational and define what is best for the company — and the company can probably afford to be rational — the manager often has different goals. Because everyone’s loss function differs they need to be able to interpret the model to make effective decisions.
If the loss function is simple and one which is widely understood it’s easier to use black box methods.
Q: Do you have suggestions for data scientists trying to explain the complexity of their work, to those solving workforce challenges?
A: We do a lot of work teaching empirical decision making to HR managers to enable them to have better conversations with the data scientists. This is a huge help but currently quite rare. We’re also working with graphic designers and data journalists to help identify and tell stories. One of the statements I frequently use is that the most important part of communicating with data is the communication, not the data.
Q: What is one specific way in which predictive analytics is actively driving decisions?
A: One simple way that we use is to predict variables at a group level, and then compare that with the actual values. This can be used to ensure that managers don’t chase after ‘issues’ that are probably at a natural level. For example, we’ll often build a predictive model to estimate the level of engagement or eNPS for a team, based on the demographics of the people in the team. We can then guide managers to groups which differ significantly from the expectation. These are usually groups where another variable, one that wasn’t in the model, is causing the difference. Those alternative factors are usually the ones you need to address.
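The expected-vs-actual approach described above can be sketched in a few lines. This is a minimal illustration, not Workometry’s actual model: the team features, the eNPS numbers, and the 1.5-standard-deviation flagging threshold are all invented for the example.

```python
import numpy as np

# Hypothetical team-level data: each row is one team.
# Features: average tenure (years), share of remote workers, average age.
X = np.array([
    [2.0, 0.1, 29],
    [5.5, 0.4, 38],
    [3.1, 0.2, 33],
    [7.0, 0.6, 41],
    [1.5, 0.1, 27],
    [4.2, 0.3, 35],
])
# Observed eNPS per team (illustrative numbers only).
y = np.array([12.0, 25.0, 18.0, 30.0, 10.0, -5.0])

# Fit a simple linear model of eNPS on demographics (intercept included).
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
expected = A @ coef

# Flag teams whose actual score deviates strongly from expectation;
# these are the groups worth a closer qualitative look.
residuals = y - expected
threshold = 1.5 * residuals.std()
flagged = np.where(np.abs(residuals) > threshold)[0]
```

The point is not the model itself but the residuals: a team far from its demographic expectation is likely affected by something the model doesn’t capture, which is exactly where a manager should investigate.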
Q: How does business culture, including HR, need to evolve to accept the full promise of predictive workforce?
A: I think the thing that people struggle with most is thinking probabilistically. Weather forecasters have struggled with this for a long time. As analysts, if we over-promise I suspect that we’ll build disillusionment. I wouldn’t be surprised to see this in 2017 in HR. Predictive modelling isn’t about making decisions that are right — it’s about ensuring those decisions are less wrong, or optimizing the impact of those decisions.
Q: Do you have specific business results you can report?
A: Sure. With a big financial services firm we built an attrition model for one of their country businesses. We included perception data and a reasonably sophisticated loss function. What we found was that the way to optimize the impact of attrition wasn’t to minimize the attrition. There were significantly different drivers of attrition affecting different groups of the population. In fact, some of the factors that would probably have done most to reduce the overall level of attrition would likely have increased the attrition of some of the most valuable segments. What the model, aligned to the loss function, let us do was say ‘you need to address these issues in these teams,’ meaning HR could be very specific and targeted in their interventions.
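The distinction between minimizing attrition and minimizing its impact can be made concrete with a toy loss function. All the numbers below are invented for illustration; they are not the client’s data, and the real loss function would be considerably more sophisticated:

```python
import copy

# Hypothetical segments: n = headcount, p_leave = predicted attrition
# probability, cost = business impact of losing one person in the segment.
base = {
    "high_value": {"n": 20,  "p_leave": 0.10, "cost": 300_000},
    "mid_value":  {"n": 100, "p_leave": 0.15, "cost": 60_000},
    "low_value":  {"n": 300, "p_leave": 0.25, "cost": 20_000},
}

def expected_headcount_loss(segments):
    # Plain attrition objective: expected number of leavers.
    return sum(s["n"] * s["p_leave"] for s in segments.values())

def expected_cost_loss(segments):
    # Impact-weighted objective: expected cost of the leavers.
    return sum(s["n"] * s["p_leave"] * s["cost"] for s in segments.values())

# A hypothetical blanket intervention: it cuts attrition in the large
# low-value segment but nudges up attrition in the high-value one.
after = copy.deepcopy(base)
after["low_value"]["p_leave"] = 0.20
after["high_value"]["p_leave"] = 0.18

# Under this intervention, headcount attrition falls while the
# impact-weighted loss rises: the two objectives pull in opposite
# directions, which is the finding described above.
```

This is why aligning the model to an explicit loss function matters: the “best” intervention depends entirely on which quantity you are actually trying to minimize.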
Don't miss Andrew’s conference presentation, The Joy of Text: Building Actionable Models with Perceptions, at PAW Workforce, on Wednesday, May 17, 2017, from 11:30 am to 12:15 pm. Click here to register for attendance.