Machine Learning Times

Are You Practicing “Bad Data Science” with your Pre-Hire Talent Assessments?


Talent Analytics uses data gathered from our own proprietary talent assessments as an input variable to predict hiring success, pre-hire. We treat this dataset like any other dataset in our predictive work: we carefully analyze it for a strong (or weak) correlation with actual job performance. Our theory? If there is no correlation between the data gathered via this method and performance, our clients should stop using it. Continuing without proof of success would be a little like a doctor who "knows" a certain medication doesn't work but keeps encouraging patients to use it. Malpractice, at the very least.

Like all great predictive solutions, we use the most current predictive analytics methods any top data scientist would use, with any dataset, to determine whether there are strong patterns in human attributes that predict either lasting in a role or achieving some kind of KPI performance: sales results, calls per hour, balanced cash drawers, customer satisfaction scores, error rates and the like.

We use methodologies that include training datasets, validation datasets and extensive cross-validation, all of which lead to the highest level of rigor: criterion validation of our talent assessment. Criterion validation establishes the correlation between specific assessment characteristics and specific performance in the role.
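As a rough illustration of the cross-validation idea (not the firm's actual pipeline), the sketch below fits a simple linear model on training folds and correlates its predictions with held-out performance values. The function name and the simulated candidate data are hypothetical:

```python
import numpy as np

def cross_validated_correlation(scores, performance, k=5, seed=0):
    """Estimate the out-of-sample correlation between an assessment
    score and a performance KPI using k-fold cross-validation.

    Each fold fits a slope/intercept on the training folds only, then
    correlates the resulting predictions with the held-out KPI values,
    so the estimate is not inflated by in-sample overfitting."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(scores)), k)
    corrs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        slope, intercept = np.polyfit(scores[train], performance[train], 1)
        preds = slope * scores[test] + intercept
        corrs.append(np.corrcoef(preds, performance[test])[0, 1])
    return float(np.mean(corrs))

# Hypothetical data: 200 candidates whose assessment score is weakly
# related to a performance KPI, plus noise.
rng = np.random.default_rng(42)
scores = rng.normal(50, 10, 200)
kpi = 0.4 * scores + rng.normal(0, 10, 200)
print(round(cross_validated_correlation(scores, kpi), 2))
```

A real criterion validation would go further (significance testing, multiple KPIs, periodic re-validation), but the held-out-fold structure is the key safeguard.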

If your business can show that your talent assessments measurably improve your hiring success, then it clearly makes sense to continue using them. If you can't, what's the point?


I am stunned by how few businesses (or assessment vendors) take the time to analyze their talent assessment dataset to see if it provides any positive or negative value.

We recently evaluated another vendor's solution to see if it accurately predicted customer service scores, pre-hire, for bank tellers. It was predictive, but negatively so: when their assessment said someone would deliver great customer service and flagged them as a "hire," the new hires actually ended up with low customer service scores. We analyzed the predictions, the individual assessment scores and the actual customer service scores new hires received after they were hired.

How do you know if you're practicing good data science with your talent assessments?

  1. Ask your talent assessment vendor for access to the raw assessment scores so you can analyze how their scores compare to the actual performance they are "predicting."
  2. Ask your talent assessment vendor whether their assessments are criterion validated. If so, how often? If not, ask them how they know the assessments work.
  3. Once you have the raw talent assessment scores, ask your workforce analytics team to check whether there is a correlation between any of the scores and length of time in a role or performance KPIs.
  4. If you can show that nothing positive is being predicted, stop using the assessments immediately.
  5. If you'd like some assistance with pre-hire testing, look for a solution that is criterion validated and uses modern data science to demonstrate its usefulness.
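Step 3 of the checklist can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (the function name, the 0.1 correlation threshold and the simulated bank-teller data are all assumptions for the example, not a vendor's method); a real validation would also test statistical significance:

```python
import numpy as np

def validity_check(raw_scores, kpi, threshold=0.1):
    """Correlate raw assessment scores with a performance KPI and
    report whether the assessment appears to add value.

    Returns the Pearson r and a plain-language verdict. The threshold
    below which r counts as "no signal" is illustrative only."""
    r = float(np.corrcoef(raw_scores, kpi)[0, 1])
    if r <= -threshold:
        verdict = "negatively predictive - stop using the assessment"
    elif r < threshold:
        verdict = "no meaningful signal - stop using the assessment"
    else:
        verdict = "positively predictive - worth keeping and re-validating"
    return r, verdict

# Hypothetical bank-teller scenario like the one described above:
# higher assessment scores go with *lower* post-hire service ratings.
rng = np.random.default_rng(1)
assessment = rng.normal(70, 8, 150)
service_scores = 100 - 0.5 * assessment + rng.normal(0, 5, 150)
r, verdict = validity_check(assessment, service_scores)
print(f"r = {r:.2f}: {verdict}")
```

Run against data like the bank-teller case described earlier, a check this simple is enough to surface a negatively predictive assessment before it does more damage.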

Pre-hire assessments can be a powerful dataset for learning more about your job candidates. Used as part of a responsible data science initiative, they can often predict the probability of someone lasting in a role or achieving very specific KPIs. Used irresponsibly, they introduce bias, waste time and, worst of all, impose a significant cost on your organization, both in the fees you pay to use them and in the bottom-performing employees they help you hire.

To learn more about successful predictive pre-hire projects, visit Talent Analytics at or +1-617-864-7474.

About the Author

Greta Roberts is an acknowledged influencer in the field of predictive workforce analytics. Since co-founding Talent Analytics in 2001, she has established Talent Analytics, Corp. as a globally recognized leader in predicting an individual's business performance, pre-hire and post-hire.

She has led the firm to use predictive analytics to solve line-of-business challenges, making Talent Analytics one of the only firms in the world predicting business outcomes. Examples include predicting someone's probability of making their sales quota, processing a certain number of calls, or making errors, and the like.

Greta leads the company in developing predictive solutions that can be easily deployed into employee operations by teams without a background in analytics, statistics or math. This strategy has led to the development of Talent Analytics' award-winning predictive cloud platform, Advisor.

In addition to being a contributing author to numerous predictive analytics books, she is regularly invited to comment in the media and to speak at high-end predictive analytics and business events around the world. In recognition of her commitment and leadership, Greta was elected, and continues to serve as, Chair of Predictive Analytics World for Workforce, an innovative annual predictive analytics event dedicated to solving workforce challenges. She is an instructor on predictive analytics for HR and workforce at UC Irvine, a faculty member with the International Institute for Analytics (IIA), and a member of the INFORMS Analytics Certification Board.

Follow Greta on Twitter: @gretaroberts
