Algorithms risk perpetuating bias in hiring. How can employers use them to make hiring more inclusive?
Emily Peiffer

Identifying the right talent, especially in a tight labor market, is an ongoing challenge for businesses, taking up significant money, time, and resources. To more efficiently sift through a large number of applications, many employers are turning to hiring algorithms that reduce the role of human decisionmaking.  

But what if those algorithms end up inadvertently excluding people? That was the case with Amazon, which built an artificial intelligence–based hiring tool to screen résumés and suggest the best candidates. The company discovered that, because the algorithm was based on patterns of previous applicants (which included a larger share of men), it developed a bias against women and ranked women applicants lower. Amazon said the rankings didn’t influence hiring decisions and ended the project.

In the following conversation, Molly Scott, a senior research associate at the Urban Institute, and Jenny R. Yang, former commissioner of the US Equal Employment Opportunity Commission and nonresident fellow at the Urban Institute, explore potential pros and cons of the growing role of predictive analytics in hiring, as well as ways to ensure more inclusive hiring.

Why are more employers using algorithms in hiring?

Yang: Employers are turning to predictive analytics to help them make better and faster hiring decisions. When done well, using algorithms in hiring has the potential to reduce bias that may come from subjective human decisions, such as quick résumé scans and interviews based on intuition. As the labor market tightens, more employers are also using data analytics to scour for candidates in high-demand fields who may not be actively looking for a job but whom the data suggest might be open to one.

Scott: With the rise in online job boards over the last decade, companies that used to get 20 résumés for one position now get 2,000. A lot of companies don’t have the capacity to individually look at every application that comes in. Employers need a more sophisticated way of screening because there are so many more people to choose from.

What potential negatives can arise from using algorithms in hiring?

Yang: There is a significant risk that predictive hiring models may build in existing patterns of discrimination. So, even though an algorithm may seem to be objective, the information that humans have selected for consideration may incorporate biases that will screen out certain groups that have been historically underrepresented in a workplace.

If an algorithm is fed data points about a company’s top performers, it will produce a profile and then predict who will be a successful candidate based on their similarity to that profile. The algorithm is matching the characteristics of people rather than factors causally linked with job performance, such as ability or skills. When the data used to train the algorithm aren’t diverse, the model may build in barriers to underrepresented groups, including candidates who could perform the job as well, or better, but have a very different profile from the current top performers.
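To make that mechanism concrete, here is a minimal sketch of this kind of profile matching. The features, numbers, and cosine-similarity scoring are illustrative assumptions, not any vendor’s actual system; the point is only that a candidate with identical measured ability can score far lower simply because past hires looked different.

```python
# Illustrative sketch only: a toy "profile matching" screener with
# hypothetical features and numbers, not any vendor's actual system.
import numpy as np

# Columns: [skills_test_score (0-1), elite_school (0/1), resume_keyword (0/1)]
# Past top performers happen to share background traits (columns 2 and 3)
# that have no causal link to job performance.
top_performers = np.array([
    [0.90, 1, 1],
    [0.85, 1, 1],
    [0.80, 1, 0],
])

# The "ideal" profile is just the average past top performer, so it
# inherits whatever is common in that group, proxies included.
profile = top_performers.mean(axis=0)

def similarity(candidate: np.ndarray) -> float:
    """Cosine similarity between a candidate vector and the learned profile."""
    return float(
        candidate @ profile
        / (np.linalg.norm(candidate) * np.linalg.norm(profile))
    )

# Two candidates with the same skills test score but different backgrounds.
looks_like_past_hires = np.array([0.90, 1, 1])
different_background = np.array([0.90, 0, 0])

print(similarity(looks_like_past_hires))  # ~0.99: ranked near the top
print(similarity(different_background))   # ~0.58: screened out, same skills
```

Both candidates have identical skills scores; the gap in their rankings comes entirely from background traits the model absorbed from a nondiverse set of past hires.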

How can companies ensure their hiring algorithms don’t perpetuate bias?

Scott: There’s a bright side to all of this in the sense that there is real data that companies can analyze to identify their own system’s biases. In the old-school model of individual decisionmaking and offline applications, that was often too difficult or costly for companies to do. In fact, many companies would say they’ve moved toward automating the screening process precisely to overcome issues of individual bias.
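As one hedged illustration of what that analysis could look like, the sketch below computes selection rates by group from a hypothetical decision log and applies the EEOC’s four-fifths (80 percent) rule of thumb as a rough flag for adverse impact. The log format and group labels are assumptions for illustration; this is a screen for internal review, not a compliance tool or legal advice.

```python
# Hypothetical audit sketch: compute selection rates by group from a log of
# algorithmic screening decisions and flag groups that fall below the EEOC's
# four-fifths (80 percent) rule of thumb. Data and group labels are made up.
from collections import defaultdict

# Each entry: (applicant group, did the algorithm advance them?)
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [advanced, total]
for group, advanced in decisions:
    counts[group][0] += int(advanced)
    counts[group][1] += 1

rates = {group: adv / total for group, (adv, total) in counts.items()}
highest_rate = max(rates.values())

for group, rate in sorted(rates.items()):
    impact_ratio = rate / highest_rate
    flag = "REVIEW" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} [{flag}]")
```

On this made-up log, group_b is advanced at a third of group_a’s rate, well below the 0.8 threshold, which would prompt a closer look at what the screener is keying on.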

Yang: Employers need to be sure they understand how hiring algorithms are actually working and who is being screened out and why. There has been a “black box” problem where the results reached by artificial intelligence often can’t be explained. Recently, companies like IBM have been building in mechanisms to track algorithms’ decisionmaking, so they can go back and identify which choices an algorithm made and why, which could expose algorithmic bias. If employers want to use algorithms and AI for hiring decisions, they must understand those algorithms and incorporate safeguards to avoid potential discrimination.
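As a hedged sketch of what such tracking might involve (the feature names, weights, and logging format here are hypothetical, not IBM’s actual tooling), a simple linear screener can record each feature’s contribution to every decision, so an auditor can later reconstruct why a candidate was screened out and check whether the deciding feature is a proxy for a protected trait.

```python
# Hypothetical "glass box" logging sketch: a simple linear screener that
# records each feature's contribution to every decision, so reviewers can
# later reconstruct why a candidate was screened out.
FEATURES = ["skills_test", "years_experience", "elite_school"]
WEIGHTS = [0.6, 0.3, 0.1]  # hypothetical learned weights
THRESHOLD = 0.5            # hypothetical cutoff for advancing a candidate

def score_and_log(candidate_id: str, values: list) -> bool:
    """Score one candidate and log a per-feature breakdown of the decision."""
    contributions = {f: w * v for f, w, v in zip(FEATURES, WEIGHTS, values)}
    total = sum(contributions.values())
    advanced = total >= THRESHOLD
    # Persisting this record is what lets an auditor later ask: which
    # feature drove this decision, and is it a proxy for a protected trait?
    print({
        "candidate": candidate_id,
        "contributions": contributions,
        "score": round(total, 3),
        "advanced": advanced,
    })
    return advanced

score_and_log("c-101", [0.9, 0.5, 0.0])  # strong skills, no elite degree
score_and_log("c-102", [0.4, 0.3, 1.0])  # weaker skills, elite degree
```

The design choice that matters is persistence: because every decision is stored with its per-feature breakdown, a later review can trace any rejection back to the specific inputs that caused it rather than confronting an unexplainable score.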

What are other ways that employers can be more inclusive in their hiring practices?

Scott: Employers have traditionally used credentials as a flag for both hard and soft skills, in addition to requiring X years of experience. But with more people than ever holding associate’s and bachelor’s degrees, credentials may not work so well as a screening mechanism, and employers may be tempted to ratchet up educational requirements or make them more and more specific. People of different races or ethnicities, genders, classes, et cetera don’t all have equal access to credentials and job placement, so hiring based on those requirements alone could leave out qualified workers who haven’t had the same opportunities.

Alternatively, moving toward skills-based hiring could make the process more inclusive, help employers filter candidates more effectively, and ensure businesses find the best talent. If you can show you’re capable of performing the job, you don’t necessarily need a specific degree or prior experience that exactly matches.

What do we still need to figure out about skills-based hiring, and how can algorithms help?

Scott: It’s hard to assess skills. Employers are trying to figure out how to automate and validate matching on skills rather than on credentials or exact work experience. There have been some social enterprise efforts to explore these solutions, such as Opportunity@Work, SkillSmart, and other platforms that are trying to help employers identify candidates more effectively and to help people looking for work signal their skills more effectively. The goal is a win-win: hiring that is more inclusive for candidates and that gets the employer a better match.

Yang: Using data analytics in hiring has the potential to expand the applicant pool, helping employers identify talent by measuring candidates’ ability and potential to excel at a task, as opposed to knowledge, which is highly correlated with socioeconomic status. By reevaluating traditional criteria, such as reliance on graduation from an elite university or knowledge-based screens, employers can often hire from a more diverse pool of high-performing candidates.
