When Amazon put together a team in 2014 to work on a new recruitment engine, it had high hopes. The experimental solution used artificial intelligence to rate candidates’ resumes and identify top talent. Shortly after the solution was tested, however, the team found that the system was not rating candidates in a gender-neutral way. Like any machine learning system, the algorithm relied on training with historical data. Unfortunately, that pre-existing real-world data contained patterns of gender bias, which the algorithm eventually absorbed into its own behavior.
Amazon’s recruiting engine was trained to evaluate candidates by observing patterns in resumes submitted to the company over a 10-year period. Unsurprisingly, most applicants were men, a reflection of the gender gap across the tech industry. As a result, the recruiting engine taught itself that male candidates were preferable. It penalized resumes that contained identifiable gender information: for instance, if the engine came across a term indicating that a candidate was part of a “women’s basketball team,” it assigned that resume a lower rating.
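The mechanism is easy to reproduce in miniature. The toy sketch below (a hypothetical illustration, not Amazon’s actual system; the data, token names, and scoring rule are all invented for this example) trains a naive token-weighting model on a synthetic, male-skewed history of hiring outcomes. Because a gendered token co-occurs with fewer past hires purely due to the skew, the model assigns it a negative weight:

```python
# Hypothetical illustration of how skewed training data produces a biased
# model. Resumes are reduced to token sets, labeled hired (1) / rejected (0).
from collections import Counter

# Synthetic historical pool: mostly male applicants, so the token "womens"
# co-occurs with fewer hires because of the skew, not because of job fit.
history = (
    [({"python", "sql"}, 1)] * 8 +
    [({"python", "sql", "womens"}, 1)] * 1 +
    [({"python", "womens"}, 0)] * 4 +
    [({"sql"}, 0)] * 3
)

def token_weights(data):
    """Score each token by its hire rate minus the overall hire rate."""
    base = sum(label for _, label in data) / len(data)
    hires, totals = Counter(), Counter()
    for tokens, label in data:
        for t in tokens:
            totals[t] += 1
            hires[t] += label
    return {t: hires[t] / totals[t] - base for t in totals}

weights = token_weights(history)
# The model has "learned" to penalize the gendered token -- a bias
# inherited entirely from the skewed historical outcomes.
print(weights["womens"] < 0)   # → True: resumes with this token score lower
```

Nothing in the scoring rule mentions gender; the bias enters solely through the historical labels, which is exactly why auditing training data matters more than auditing the algorithm itself.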
Unfortunately, this is not the first instance of an AI program displaying signs of inherent bias. Remember Microsoft’s Tay chatbot that littered Twitter with racism and profanity? The old adage “garbage in, garbage out” still holds: feeding intelligence systems incomplete or inaccurate data without safeguards remains a major threat to building an equitable world of work.
“It all comes down to what kind of data AI is using to make hiring recommendations,” says Caitlin McGregor, speaking exclusively to HR Technologist. McGregor is the Co-founder and CEO at Plum, an I/O psychology-inspired AI solution designed to counteract human bias. “It’s always been the standard to evaluate candidates based on skills and knowledge — think degrees and years of experience — which can trigger biases when a hiring manager sees a resume that name-drops Harvard or a prestigious unpaid internship. These qualifications tend to point to privilege, not necessarily job fit. So, when AI-based hiring solutions rely on skills and knowledge, such as resume and social media scraping tools, it’s just perpetuating the same biases, but at a larger scale.”
She continues: “The foundation of talents is a combination of traits and competencies that can be captured by measuring applicants’ personality, problem-solving ability, and social intelligence. Decades of Industrial/Organizational Psychology research not only proves that talents are four times better at predicting future success than skills and knowledge, but they’re a whole lot less biased, too.”
HR leaders like Caitlin have good reason to be critical of traditional hiring practices that perpetuate cognitive biases. “I think generally talent professionals want to evaluate candidates on more than just a piece of paper, but they just don’t know how,” she says. “The first step is admitting our unquestioned method of using resumes as the first step in the hiring process needs to be scrutinized. AI can help — but if we’re truly going to commit to moving beyond resumes to make the hiring process less biased and more predictive, that means we also have to move beyond AI that simply automates resume keyword matching.”
Also read: Artificial Intelligence: Coming Soon To An Office Near You. Or Is It?
The real opportunity AI presents in recruitment is bringing scalability and automation to a practice like Industrial/Organizational Psychology, which once relied on (often costly) consulting services. “The predictability and objectivity of talent data can now be democratized and made available to all, not just Fortune 500 companies,” opines Caitlin.
A common misconception is that AI simply automates established practices; the development of sophisticated AI programs, however, now enables solutions to go beyond automating repetitive tasks to tackle complex problems that exceed the limits of human cognition. Caitlin believes, “this is the kind of AI that can go beyond simple resume screening and actually make more objective, predictive decisions — when fed the right data.”
Will AI replace human recruiters?
While the adoption of black-box solutions is on the rise across industries, the fear that such solutions will replace humans is unwarranted. AI can serve up recommendations based on pattern recognition or candidate matching; selling the job to the candidate, however, and building a relationship with them, will ultimately still depend on a human recruiter.
“As cliché as it sounds, AI can truly make the hiring process more ‘human,’ by eliminating menial and repetitive tasks to allow hiring professionals to focus on human relations,” says Caitlin.
As the recruiting function embraces a more objective, data-driven approach, AI will enable recruiters to make faster and smarter hiring decisions, not make those decisions for them.
Also read: Employee Experience, Jobs and Skills: How AI will Impact HR
Choosing the right AI recruiting solution
At a time when nearly every vendor in the HR technology space claims to have integrated artificial intelligence into its workflows, how do you assess AI solutions that meet your hiring needs?
Caitlin shares three key considerations that HR leaders must take into account before zeroing in on an AI recruiting solution.
The first consideration is scalability. “AI in hiring is meant to act as a solution to expensive, inefficient consulting services and hiring team pipelines. If the AI you’re using doesn’t save you time, money, and resources, then it’s not doing its job. An AI product should also be able to grow with your company as it grows. If it’s not a long-term solution, then the technology is falling short of its purpose,” says Caitlin.
The second point is consistency. Caitlin’s advice to HR leaders looking to acquire AI recruitment solutions is to ensure that the AI can accurately qualify candidates across all organizational functions. The solution must be capable of assessing candidates for engineering roles or mid-management roles just as easily as it assesses candidates for sales roles.
Also read: The Role of Artificial Intelligence in Recruitment
The third and most important parameter is the kind of data the AI solution uses for assessments. “The bulk of hiring AI solutions on the market are using data that is scraped online. Most hiring AI solutions, therefore, are using the same data sets! You don’t want to look up one day and realize your entire office is made up of white men named Jared who went to Ivy Leagues, played lacrosse, and read Harry Potter (this is an actual example I heard about at a Society for Industrial-Organizational Psychology conference)! You want to look up and see a team composed of people who possess the qualities most important to your company. It’s what the saying ‘garbage in, garbage out’ means — if your AI is relying on useless data, you’re going to get useless results. Because AI isn’t magic. To put yourself in a position to build a diverse team, it’s important to look at talent acquisition AI solutions that create and synthesize objective, predictive, and new data,” says Caitlin.
When assessing the efficacy of AI recruiting solutions, it all boils down to the data that is fed into the system. Amazon’s recruitment engine stands as testament to this fact.
In conclusion, AI should be viewed as an opportunity for, not an inhibitor of, social equality. After all, it is easier to remove biases from algorithms than from humans, so ultimately AI has the potential to build a fair, diverse, and equitable world of work.