Recruiting is a high-stress, high-turnover industry. To combat recruiter burnout in an age in which applying for jobs can be as simple as a swipe, companies are increasingly turning to talent acquisition automation. Research has found that 80% of recruiters and nearly all Fortune 500 companies use technology to sort and evaluate resumes, and a growing number are adopting AI-powered talent acquisition software to intelligently automate their work.
Many early AI-powered talent acquisition algorithms were fundamentally flawed, producing talent pools that replicated biased hiring practices and generating plenty of bad press. Fortunately, hiring algorithms have come a long way and can now be more effective and less biased than human recruiters.
But despite the promise AI-powered talent solutions hold for ending workplace bias, New York City legislators have introduced a bill to amend anti-discrimination rules for an age of automation and algorithms. The bill centers on two requirements: companies must disclose their use of the software, and they must perform annual audits to ensure the software does not discriminate against candidates.
While both requirements are well intentioned, they simultaneously do too little and too much to ensure that these algorithms put an end to recruiting and talent bias.
Disclosing the use of hiring technology would only be meaningful if the companies also gave candidates insight into how it operates so that they can better prepare. Failing to do so would likely create more resentment and litigation. Asking companies to audit their recruiting results is also problematic as it places an undue burden and liability on companies to evaluate and defend results delivered through automated systems. These results might be further skewed by incorporating previous hiring decisions.
Rather than implementing legislation that needlessly complicates the matter and discourages companies from adopting technologies designed to make them more efficient, here are some recommendations for companies that strive to make data-driven, unbiased talent acquisition and talent management decisions.
Choose the Right Technology
Not all talent acquisition and talent management software is created equal. When evaluating talent acquisition technologies, companies should do their due diligence and ask providers how their algorithms ensure unbiased results. One way providers accomplish this is by using publicly available data to assess job candidates. A technology that evaluates candidates based on a company's past hiring patterns will replicate and amplify any existing bias. Instead, companies should look for a technology that gathers, cleans, and analyzes public talent data and turns it into talent intelligence.
Additionally, companies can look for recruiting technologies with features and fail-safes that prevent bias from creeping back into the process, such as the ability to view profiles stripped of photos and other indicators that could invite discrimination.
Hire for Skills, Not Titles
Bias doesn’t just happen in the hiring process; it persists throughout an employee’s tenure at a company, and for many people it follows them throughout their careers. A joint study by McKinsey & Co. and LeanIn.org found that Black women are 40% less likely to be promoted and less likely to be supported by their managers. As a result, they are less likely to hold the titles a company is hiring for, even when they have the skills the role requires.
A shift to skills-driven hiring can also help companies future-proof themselves by developing the right skill sets and building a more agile workforce.
Set Diversity Benchmarks
The benefits of a diverse workforce are numerous and well documented, and yet most companies still struggle to achieve one. In addition to using intelligent automation to keep their pipelines as unbiased as possible, recruiters and managers need to educate themselves about unconscious biases and how they can come into play during the final stages of hiring. A Harvard Business Review article detailed how a candidate pool needs to be composed of at least 50% diverse candidates for any diverse candidate to be hired.
Disclose Both Technology Use and Expectations
In addition to disclosing the use of hiring technology, companies should give applicants clearer guidelines on how their information will be assessed and how they can meet those requirements. An excellent place to start is telling candidates how to optimize their resumes for automated systems. If the system uses publicly available information, companies should advise applicants to keep their online profiles up to date.
Use Technology Appropriately
A 2014 Harvard Business Review article pointed out that algorithms do a better job predicting employee success than humans, but that doesn’t mean that algorithms should control every part of the recruiting process. AI-powered talent acquisition software employs predictive analytics to determine who would be most likely to thrive in a given role and advance up the career ladder, but at the end of the day, the people you hire have to work with the humans at the company, not the machines.
The most successful companies, and employees, aren’t those who turn over their work to the machines, but those who use intelligent automation to maximize their impact.
The end of bias won’t come from regulating talent acquisition technologies. If anything, regulation will slow down innovation and its potential to resolve long-standing inequalities in the modern workforce. Ninety-six percent of CEOs have indicated that diversity, equity, and inclusion are a top priority for them, and legislators would be better served spending their time empowering companies and candidates alike to embrace technology and prepare for the future of work.