Predictive Analytics Times
The Legal and Ethical Implications of Using AI in Hiring

Originally published in Harvard Business Review, April 25, 2019

Digital innovations and advances in AI have produced a range of novel talent identification and assessment tools. Many of these technologies promise to help organizations improve their ability to find the right person for the right job, and screen out the wrong people for the wrong jobs, faster and cheaper than ever before.

These tools put unprecedented power in the hands of organizations to pursue data-based human capital decisions. They also have the potential to democratize feedback, giving millions of job candidates data-driven insights on their strengths, development needs, and potential career and organizational fit. In particular, we have seen the rapid growth (and corresponding venture capital investment) in game-based assessments, bots for scraping social media postings, linguistic analysis of candidates' writing samples, and video-based interviews that utilize algorithms to analyze speech content, tone of voice, emotional states, nonverbal behaviors, and temperamental clues.

While these novel tools are disrupting the recruitment and assessment space, they leave many as-yet-unanswered questions about their accuracy, and about the ethical, legal, and privacy implications they introduce. This is especially true when compared with longstanding psychometric assessments such as the NEO-PI-R, the Wonderlic Test, Raven's Progressive Matrices, or the Hogan Personality Inventory, which have been scientifically derived and carefully validated against relevant jobs, identifying reliable associations between applicants' scores and their subsequent job performance (with the evidence published in independent, trustworthy, scholarly journals). Recently, there has even been interest and concern in the U.S. Senate about whether new technologies (specifically, facial analysis technologies) might have negative implications for equal opportunity among job candidates.

In this article, we focus on the potential repercussions of new technologies on the privacy of job candidates, as well as the implications for candidates’ protections under the Americans with Disabilities Act and other federal and state employment laws. Employers recognize that they can’t or shouldn’t ask candidates about their family status or political orientation, or whether they are pregnant, straight, gay, sad, lonely, depressed, physically or mentally ill, drinking too much, abusing drugs, or sleeping too little. However, new technologies may already be able to discern many of these factors indirectly and without proper (or even any) consent.

To continue reading this article, click here.

About the Authors

Ben Dattner is an executive coach and organizational development consultant, and the founder of New York City–based Dattner Consulting, LLC. You can follow him on Twitter at @bendattner.

 

Tomas Chamorro-Premuzic is the Chief Talent Scientist at ManpowerGroup, a professor of business psychology at University College London and at Columbia University, and an associate at Harvard’s Entrepreneurial Finance Lab. He’s the author of Why Do So Many Incompetent Men Become Leaders? (And How to Fix It). Find him on Twitter: @drtcp or at www.drtomas.com. 

Richard Buchband is Senior Vice President, General Counsel and Secretary at ManpowerGroup, with a background in corporate and securities law. He is a member of the New York Stock Exchange Listed Company Advisory Board, and a frequent speaker on topics of board-level governance, public company issues, workplace trends, and ethics and compliance.

Lucinda Schettler is a Senior Attorney for ManpowerGroup, specializing in employment law in the U.S. Lucinda focuses on the legal issues implicated in the ever-changing world of work, including AI and the use of technology. She strongly believes in ManpowerGroup’s approach of “Doing Well by Doing Good.”
