Recently a customer shared their concern that Talent Intelligence could be biased, referencing the 2018 news about Amazon’s sexist recruiting tool. The truth is that early recruiting AI often was biased, and in some cases it merely hid or shifted human bias rather than reducing or eliminating it. There are many stories of AI-powered recruiting solutions that “learned” human bias and had to be scrapped because they overwhelmingly favored male candidates.
In her analysis of AI and recruiting, “Can AI Solve the Diversity Problem?”, Kimberly Houser examined these failures and showed that they stemmed from inappropriate training data, such as building a model primarily from male resumés. Amazon trained its AI on its own historical recruiting data, and because that data carried traces of human bias, the resulting AI was biased too. The reason is simple: all humans have unconscious biases, and many also have conscious ones.
It was this misuse of AI that inspired Censia’s mission: to design an AI that eradicates human bias and empowers hiring and recruiting managers to measure people by their merit and character. Censia is built with fairness-aware machine learning and follows stringent compliance thresholds to ensure its AI is unbiased, allowing it to help companies improve OFCCP compliance.
The Censia Talent Intelligence Data Platform is the engine of the company: machine learning that goes far beyond keyword matching to derive real insight into hundreds of professional characteristics across hundreds of millions of professionals. Keyword search requires a human to input assumptions about what the best candidates look like, which often means an Ivy League pedigree, experience at Fortune 500 companies, and so on. Talent Intelligence such as Censia’s replaces that human input with a multi-dimensional analysis that accounts for hundreds of factors, accomplishing in seconds what would take the human mind years.
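The difference between keyword gating and a multi-dimensional analysis can be pictured in a few lines of code. This is a minimal sketch, not Censia’s actual model: the factor names, weights, and scores below are hypothetical illustrations of the general technique.

```python
# Hypothetical sketch: keyword matching vs. a multi-dimensional score.
# Factor names and weights are illustrative only, not Censia's model.

def keyword_match(resume_text: str, keywords: list[str]) -> bool:
    """A keyword system passes a candidate only on exact term hits."""
    text = resume_text.lower()
    return all(kw.lower() in text for kw in keywords)

def multi_dimensional_score(candidate: dict, weights: dict) -> float:
    """Combine many normalized factors (0-1) into one weighted score,
    rather than gating on the presence of exact keywords."""
    return sum(weights[f] * candidate.get(f, 0.0) for f in weights)

weights = {"skill_overlap": 0.4, "career_trajectory": 0.3,
           "role_similarity": 0.2, "industry_experience": 0.1}
candidate = {"skill_overlap": 0.9, "career_trajectory": 0.8,
             "role_similarity": 0.7, "industry_experience": 0.5}
score = multi_dimensional_score(candidate, weights)  # ≈ 0.79
```

A keyword system returns a hard yes/no on exact terms, while the weighted score still ranks a strong candidate highly even if one factor is weak or a specific keyword is missing.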
Censia’s sourcing and screening solutions are powered by one of the largest professional data platforms in the world, automating the process of identifying and engaging talent in the most efficient and fair way possible. They significantly improve OFCCP compliance by eliminating unconscious affinity bias, expanding the talent pool to reach more diverse talent, and resurfacing previous applicants who may have been overlooked due to unconscious bias.
Here are just a few of the ways that Censia mitigates bias and increases diversity:
- Censia’s Talent Intelligence employs holistic, multi-dimensional candidate models that use predictive analysis to determine which candidates are most likely to succeed in a role. It does this by creating clusters of skills, career trajectories, company information, and hundreds of other data points while excluding factors related to OFCCP-protected statuses.
- Censia does not build any algorithms on customer data; instead, it has anonymized hundreds of millions of candidate profiles from public sources to develop its algorithms. All bias indicators are stripped from every candidate in the platform before the algorithms are trained, preventing bias factors from being introduced and amplified.
- Censia uses Ethical AI known as “Fairness Aware Machine Learning.”
- Censia does not use keywords for matching and recommendations, which allows it to uncover qualified candidates’ profiles where keyword systems fail. For example, research shows that women list 40% fewer skills on their resumés and profiles than their male peers, so any keyword search system would surface fewer women in recruiting pipelines simply because their profiles contain fewer of the searched-for terms. In contrast, Censia builds skill similarity analysis, which surfaces qualified candidates even when their profiles or resumés lack the exact keywords. This is how Censia uncovers qualified diverse talent that would otherwise be overlooked.
- Censia removes all bias indicators, whether they appear in job descriptions, resumés, profiles, or a recruiter’s actions, when creating recommendations and ranked candidate slates.
- After creating an initial slate, Censia allows companies to filter by candidates’ self-proclaimed diversity indicators (e.g., gender, veteran status) so they can fill their diversity gaps.
- Censia offers an anonymous mode, which removes bias identifiers such as gender, ethnicity, age, race, and sexual orientation from a candidate’s profile or resumé.
- Censia removes all bias indicators from customer job descriptions when converting them into holistic, unbiased people models, which are then used to build candidate slates and recommendations of qualified candidates.
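The bias-indicator stripping and anonymous mode described in the list above can be pictured as a denylist filter applied to profile fields before anything is scored or displayed. This is a hedged sketch; the field names are hypothetical, not Censia’s actual schema.

```python
# Hypothetical sketch of an "anonymous mode": drop fields that could
# act as bias indicators before a profile is scored or displayed.
# Field names are illustrative only, not Censia's actual schema.

BIAS_INDICATOR_FIELDS = {
    "name", "gender", "ethnicity", "race", "age",
    "date_of_birth", "sexual_orientation", "photo_url",
}

def anonymize(profile: dict) -> dict:
    """Return a copy of the profile with bias-indicator fields removed."""
    return {k: v for k, v in profile.items() if k not in BIAS_INDICATOR_FIELDS}

profile = {"name": "A. Example", "gender": "F", "age": 34,
           "skills": ["python", "sql"], "years_experience": 9}
clean = anonymize(profile)
# clean retains only the job-relevant fields "skills" and "years_experience"
```

Because the filter runs before ranking, the downstream model never sees the removed fields, so they cannot influence the slate.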
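The skill-similarity idea from the list above, matching candidates whose profiles lack the exact search keyword, can be sketched as a lookup into clusters of related skills. The cluster names and contents below are hypothetical illustrations, not Censia’s actual data.

```python
# Hypothetical sketch of skill-similarity matching: a candidate whose
# profile lists only "pandas" still matches a search for "data analysis"
# because both terms fall in the same skill cluster.
# Cluster contents are illustrative only.

SKILL_CLUSTERS = {
    "data_analysis": {"data analysis", "pandas", "sql", "statistics"},
    "frontend": {"react", "css", "javascript", "ui development"},
}

def expand(skills: set[str]) -> set[str]:
    """Expand a skill set with every cluster any of its skills belongs to."""
    expanded = set(skills)
    for members in SKILL_CLUSTERS.values():
        if skills & members:
            expanded |= members
    return expanded

def similarity_match(candidate_skills: set[str], required: set[str]) -> bool:
    """Match if the expanded skill sets overlap, even without exact keywords."""
    return bool(expand(candidate_skills) & expand(required))

# A profile listing only "pandas" matches a "data analysis" requirement:
similarity_match({"pandas"}, {"data analysis"})  # True
```

An exact-keyword system would miss this candidate entirely; the cluster expansion is what lets a sparsely written profile surface alongside fully keyworded ones.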