AI in HR software can provide several benefits, but you must ensure it treats people fairly and equitably. Learn the steps to combat bias in your HR automation.

The National Bureau of Economic Research (NBER) in the United States recently sent out 40,000 fictional job applications with similar qualifications. Researchers found significant evidence of bias based on both race and age: resumes with African-American-sounding names received fewer responses, and 50-year-old applicants received fewer responses than 30-year-olds.
Researchers were unable to tell whether the bias was intentional or unconscious, but it did rise to the level of what they term a systematic pattern or practice of discrimination as defined by the Equal Employment Opportunity Commission (EEOC).
The study didn’t state that HR automation software was used in the screening process, but it’s a good bet: 99% of Fortune 500 companies deploy some form of Applicant Tracking System (ATS) to screen applicants.
The EEOC announced an initiative in October 2021 to study the problem of bias in AI hiring software. Bias in AI and algorithmic hiring is a big enough concern that New York City passed a law banning employers from using AI in recruiting and hiring unless the tools have passed an independent audit for bias; the law goes into effect in January 2023. Illinois recently enacted the Artificial Intelligence Video Interview Act, which governs how employers use AI tools to evaluate video interviews.
Ways that AI solutions can create bias
HR automation can significantly improve productivity. The right HR software can reduce manual input, streamline workflows, and help companies comply with laws and regulations. AI can play a big role in anticipating and solving problems, and machine learning can optimize results as it recognizes patterns and learns your preferences.
However, AI models can also duplicate biases found in the real world if data scientists are not careful about the way they train these models.
Amazon reportedly scrapped its AI screening tool after years of development when the company found it favored male applicants. Trained on the resumes of current employees to find clues that would predict how potential hires would perform, the model developed a preference for men over women and began surfacing fewer female candidates with similar backgrounds.
The unintended consequence of bias
Even unintentional bias can significantly hinder diversity and inclusion. It can perpetuate systematic discrimination even when there is no intent to do so. Over time, bias can skew a company’s culture.
It can also have long-term societal effects.
“If we don’t see male kindergarten teachers or female engineers, we don’t naturally associate men and women with those jobs, and we apply different standards,” said Iris Bohnet, co-director of the Women and Public Policy Program at the Harvard Kennedy School. Not only can this shape how recruiting and hiring are approached, but it can also discourage students from pursuing certain roles.
It can also skew AI models.
How to combat bias in your HR automation
Combatting bias in your HR automation starts by recognizing the potential for discrimination and taking proactive steps to ensure you are treating people fairly and equitably. Here are a few ways you can reduce bias and unintended consequences when using AI.
Make sure your keywords are gender-neutral
When using an ATS, it’s important to ensure your sorting process uses gender-neutral language. Experts warn that gender-coded language can create bias. The Employers Council suggests these gender-coded words can skew the hiring process:
- Male-coded: Aggressive, decisive, fearless
- Female-coded: Collaborative, dependable, honest
In job ads, this can impact the type of applicants you get. In training AI models, it can lead to bias.
If you’re unsure about whether particular words have a gender bias, the Massachusetts Institute of Technology (MIT) has a free gender-decoding tool you can use.
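If you want to automate this kind of check, the sketch below shows one way to flag gender-coded words in job-ad text. The word lists contain only the examples cited above and are illustrative; in practice you would plug in a fuller, vetted list such as the one behind the decoding tool mentioned earlier.

```python
# A minimal sketch of scanning job-ad text for gender-coded words.
# The word lists are illustrative only -- substitute a vetted list in practice.
import re

MALE_CODED = {"aggressive", "decisive", "fearless"}
FEMALE_CODED = {"collaborative", "dependable", "honest"}

def find_coded_words(text: str) -> dict:
    """Return any gender-coded words found in a job ad."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {
        "male_coded": sorted(words & MALE_CODED),
        "female_coded": sorted(words & FEMALE_CODED),
    }

ad = "We want an aggressive, decisive self-starter who is also collaborative."
print(find_coded_words(ad))
# {'male_coded': ['aggressive', 'decisive'], 'female_coded': ['collaborative']}
```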
Perform audits
The NYC law requires independent audits of vendors that provide AI tools for hiring. There’s no reason you can’t do the same. Ask any vendor you work with to provide documentation demonstrating that their HR software doesn’t show bias.
Consider blind auditions
On the TV show The Voice, judges have their backs turned to contestants as they audition, which allows them to assess talent based on someone’s voice rather than their appearance. These blind auditions help remove bias and keep the focus solely on a candidate’s talent.
You can do the same thing in the screening process by removing names and pictures from resumes.
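As a rough illustration, here is a minimal sketch of that idea: stripping identifying fields from a candidate record before reviewers or a screening model see it. The field names are hypothetical and would need to match your own ATS export.

```python
# A minimal sketch of "blind" screening: remove identifying fields from a
# candidate record before it reaches reviewers or a screening model.
# Field names here are hypothetical -- adapt them to your ATS export.

IDENTIFYING_FIELDS = {"name", "photo_url", "email", "phone", "address"}

def redact_candidate(record: dict) -> dict:
    """Return a copy of the record with identifying fields removed."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

candidate = {
    "name": "Jane Doe",
    "photo_url": "https://example.com/jane.jpg",
    "years_experience": 7,
    "skills": ["Python", "SQL"],
    "education": "BS Computer Science",
}
print(redact_candidate(candidate))
# {'years_experience': 7, 'skills': ['Python', 'SQL'], 'education': 'BS Computer Science'}
```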
Don’t remove the human element
Even if you use AI in your HR automation, make sure it is just one of the tools informing your hiring decisions. AI can augment the process, but it shouldn’t be the only basis for screening or hiring choices.
Consider using checklists for HR automation that document and guide your process.
Standardize processes
Structured interviews also help reduce unconscious bias. Asking each candidate the same set of questions keeps the focus on the answers rather than other factors, such as ethnicity or appearance. Applying this strategy to all of your screening and hiring processes can provide a more level playing field for all job candidates.
Training
While it’s important to make sure you properly train any AI models to eliminate bias, it’s also crucial that you train your employees and hiring managers to recognize signs of bias. Everyone should be on guard to prevent discrimination in the workplace at every level.
Continue monitoring
Once your process is in place, continue to monitor it for bias, just as you would when assessing your diversity and inclusion goals with hiring managers. Look for patterns that point to bias, discrimination, or unintended consequences.
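One simple monitoring check is to compare selection rates across groups, for example against the four-fifths (80%) rule of thumb long used as an indicator of adverse impact. The sketch below uses made-up numbers purely for illustration; real monitoring would pull counts from your ATS and involve legal and HR review.

```python
# A minimal sketch of one monitoring check: comparing selection rates across
# groups using the four-fifths (80%) rule of thumb for adverse impact.
# The counts below are illustrative, not real data.

def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

groups = {
    "group_a": {"applicants": 200, "selected": 60},
    "group_b": {"applicants": 180, "selected": 30},
}

rates = {g: selection_rate(d["selected"], d["applicants"]) for g, d in groups.items()}
highest = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest
    flag = "REVIEW" if impact_ratio < 0.8 else "ok"
    print(f"{group}: rate={rate:.2f}, impact ratio={impact_ratio:.2f} -> {flag}")
# group_a: rate=0.30, impact ratio=1.00 -> ok
# group_b: rate=0.17, impact ratio=0.56 -> REVIEW
```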
Eliminating bias makes good business sense
Diversity and inclusion in the workplace aren’t just lofty goals. A diverse employee base, especially in leadership roles, provides diversity of thought and approach.
A study by the Boston Consulting Group (BCG) showed that companies with above-average diversity scores had nearly twice as many innovative ideas as those scoring below average. These innovations accounted for nearly half of the revenue in the companies studied.
The companies with higher diversity also had better financial performance, with margins averaging 9 points higher than those of companies with lower levels of diversity.