Combating AI Bias Requires Diverse Hiring


Math and numbers can’t naturally overcome implicit bias without active involvement from those who can see the blind spots.

Despite the general opinion that artificial intelligence (AI) runs on numbers and is therefore inherently unbiased, AI has displayed distinctly biased behavior in the past few years. A landmark government study showed that facial recognition systems struggle with non-white faces. Amazon’s experimental recruiting engine penalized resumes from women. In many of these cases, the problem was blind spots in the data.

No one purposefully trains AI to discriminate, but our implicit bias often creates patterns that we don’t notice, and machines certainly do. In Amazon’s case, the model was trained on ten years of resumes from a field already dominated by men. In the facial recognition software, the training data simply contained more white faces.

See also: Approaching AI and Ethics with Eyes Wide Open

Diversity closes blind spots in AI

Technology is a highly homogeneous field. This leaves gaps in thinking and in how data is digested, gaps that can lead to the kind of high-profile discrimination cases seen in AI lately. Math and numbers can’t naturally overcome implicit bias without active involvement from those who can see the blind spots.

So, what can companies and research institutions do? Hiring itself is already fraught with bias, continuing to favor white male candidates. With the addition of AI-driven resume screening, those imbalances could become even more entrenched.

New hiring practices could prioritize diversity

Luckily, many companies have opted for a different approach. By using project-based hiring instead of pure resume screening, companies can evaluate candidates against skills-based benchmarks and find talent outside the traditional hiring pool.

These assessments have candidates answering business questions and tackling business problems in a real-world setting, either through take-home exercises or during the interview itself. This bypasses resume screening and gets at the subtler skills required to work in the tech field.

Companies are also paying attention to how candidates ask questions and communicate results, two soft skills not visible on a traditional resume. The results could tip the scales toward the diversity the tech field needs to develop genuinely objective AI.

Companies that don’t adapt to less discriminatory hiring practices may continue to suffer embarrassing, high-profile AI missteps, while those that do may go on to better, more responsible development.


About Elizabeth Wallace

Elizabeth Wallace is a Nashville-based freelance writer with a soft spot for data science and AI and a background in linguistics. She spent 13 years teaching language in higher ed and now helps startups and other organizations explain, clearly, what it is they do.
