Adverse impact is a term that comes up primarily in discussions of recruiting, but it can also influence advertising (both commercial and job advertising), business practices, reputation building, and every other department in a business. It’s essential to be aware of adverse impact, as well as the ways you can mitigate it.
So what, specifically, is it?
The Definition of Adverse Impact
A good, concise definition of “adverse impact” comes from Mighty Recruiter:
“An adverse impact results from employer practices that seem to be neutral, but that disproportionately and negatively affect protected groups such as women and minorities. An adverse impact can occur at any stage of the employment process, including hiring, training, performance reviews, promotions, and layoffs. A practical means of measuring if an adverse impact exists is to evaluate whether a group’s selection rate falls below 80 percent of the group with the highest selection rate. For example, if you give a hiring test for job applicants, and the pass rate of a protected group is 80 percent of the pass rate of the group with the highest selection rate, the hiring test may hurt that protected group.”
In other words, adverse impact is closely tied to discrimination. However, unlike intentional discrimination and bigotry, or even unintentional prejudice and inherent bias, adverse impact is not outwardly discriminatory. Adverse impact is also known as discriminatory impact or disparate impact.
This can crop up in many places, and it may not be obvious. Some bias may also not be discriminatory, depending on the context. Some examples of discriminatory adverse impact include:
- Including physical lifting requirements in postings for jobs whose duties involve no lifting, such as office jobs. This requirement discriminates against disabled applicants.
- Machine learning or “AI” systems that filter resumes and opaquely screen out applicants whose education histories come from foreign schools not in the system’s database of valid schools. This creates an adverse impact on foreign applicants and may not be evident to anyone who isn’t double-checking the results of the software’s filtering.
- Your company issues a “general intelligence test” for applicants that asks questions unrelated to the job but tied to a specific culture or region. This practice discriminates against anyone not part of that culture or region.
The foundational court case that defines adverse impact is Griggs v. Duke Power, decided in 1971.
Per HireVue:
“In the landmark case, Griggs v. Duke Power, Willie Griggs and twelve other African-American employees of Duke Power sued their employer, alleging that the general intelligence test Duke used as a screening tool unfairly impacted African American applicants.
These are the passing rates from Duke’s general intelligence test: Whites: 58%, African Americans: 6%.
The Supreme Court ruled that if pre-employment tests had a disparate impact on protected groups (such as women and ethnic minorities), the organization requiring the test must prove that the test is “reasonably related” to the duties performed on the job.”
Adverse impact is bias and discrimination, whether it’s part of a business process, a software system, or institutional habits. The 80% rule in the definition above, also known as the four-fifths rule, is the general standard by which employment procedures are judged.
How to Avoid Adverse Impact
Preventing adverse impact means being aware that it exists, understanding how it is measured, analyzing your existing practices to look for it, and implementing new processes to replace those that prove to include it.
Measuring adverse impact means codifying applicants’ acceptance and rejection rates based on their demographics, particularly those related to a protected class. Measure whites versus minorities, locals versus foreign applicants, men versus women, and so on. According to the Equal Employment Opportunity Commission (EEOC), these are the protected classes that you must be aware of:
“Applicants, employees, and former employees are protected from employment discrimination based on race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age (40 or older), disability, and genetic information (including family medical history).
Applicants, employees, and former employees are also protected from retaliation (punishment) for filing a charge or complaint of discrimination, participating in a discrimination investigation or lawsuit, or opposing discrimination (for example, threatening to file a charge or complaint of discrimination).”
Measure the acceptance rates, success rates, termination rates, and other relevant metrics per group, and compare them using the four-fifths rule. Bear in mind that raw numbers of people hired are not necessarily representative. Here’s an example, again from HireVue:
“Let’s say an organization is looking to fill 25 open positions in its local call center. Five hundred men and 1000 women apply. Of those applicants, ten men and 15 women are hired. In this situation, the selection rate for men is 2%, while the selection rate for women is 1.5%. Dividing 1.5 by two, we get 75%: below the cutoff. Even though the organization hired more women overall, the women were still adversely impacted.”
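The four-fifths calculation in the example above can be sketched in a few lines of Python. This is a minimal illustration of the arithmetic, not a compliance tool; the function name and 0.8 threshold come from the rule as described in this article:

```python
def adverse_impact_ratio(selected_a, applied_a, selected_b, applied_b):
    """Return each group's selection rate and the impact ratio
    (the lower selection rate divided by the higher one)."""
    rate_a = selected_a / applied_a
    rate_b = selected_b / applied_b
    ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
    return rate_a, rate_b, ratio

# HireVue's call-center example: 10 of 500 men and 15 of 1000 women hired
men_rate, women_rate, ratio = adverse_impact_ratio(10, 500, 15, 1000)
print(f"Men: {men_rate:.1%}, Women: {women_rate:.1%}, ratio: {ratio:.0%}")
# -> Men: 2.0%, Women: 1.5%, ratio: 75%

# A ratio below 80% fails the four-fifths rule
if ratio < 0.8:
    print("Potential adverse impact: investigate further")
```

Running the same function over every pairing of groups (men versus women, each minority group versus the group with the highest selection rate, and so on) gives you a quick first-pass screen of your hiring data.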
It’s also worth mentioning that the four-fifths rule is merely a start. Disproportionate measurements over a short time, or with small numbers of open roles and applicants, are not necessarily a sign of adverse impact. The disparity must also be statistically significant. Other tests, such as a Z-test or Fisher’s exact test, can be used to judge adverse impact.
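As a sketch of what such a significance check looks like, here is a pure-Python two-sided Fisher’s exact test applied to the call-center numbers from the example above. This is an illustration under the assumption of fixed margins, not legal or statistical advice; for real compliance analysis, use a vetted statistics library or consult a statistician:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the probabilities of all tables (with the same row and column
    totals) that are as unlikely as or less likely than the observed one.
    """
    n = a + b + c + d
    row1 = a + b          # total in group 1
    col1 = a + c          # total selected across both groups

    def hypergeom_p(x):
        # Probability that exactly x of group 1 are selected,
        # given fixed margins (hypergeometric distribution)
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_observed = hypergeom_p(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    return sum(hypergeom_p(x) for x in range(lo, hi + 1)
               if hypergeom_p(x) <= p_observed + 1e-12)

# HireVue call-center numbers: 10/500 men hired, 15/1000 women hired
p = fisher_exact_two_sided(10, 490, 15, 985)
print(f"Fisher's exact p-value: {p:.3f}")
```

Conventionally, a p-value below 0.05 suggests the disparity is unlikely to be chance. With only 25 hires in this example, the gap between a 2% and a 1.5% selection rate may well not reach significance, which is exactly why the four-fifths ratio alone is merely a start.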
To judge adverse impact, you need raw data. This raw data needs to include demographic information about protected characteristics, so job applications can ask for this information, with the disclaimer that it is not used in the filtering or judgment of candidates.
Job analysis is a crucial part of monitoring for adverse impact. A job analysis should look at a given role and distill it down to the core components. What does the job require? What specific activities make up the job? What particular qualities does an employee need to perform the job? What is the environment of the job?
Codifying this is a vital strategy for many reasons, including creating accurate, effective job postings. We often point out that a job posting should only list requirements necessary to begin the job; you should consider leaving out requirements that are “nice to have” and not necessarily a firm condition.
A minimal list of requirements is also helpful in avoiding adverse impact. Many “requirements” may be discriminatory, even if they aren’t intended to be. Remember, as well, that reasonable accommodation must be made for protected individuals and categories.
For example, an office job should not list physical requirements such as heavy lifting, because lifting is not a reasonable component of the job. Sure, employees might occasionally be required to move reams of paper or heavy binders, but accommodations can be made. The same requirement is relevant for a job as a roofer, though, where the physical inability to work on a roof legitimately disqualifies an applicant.
Job posting analysis is related. To determine whether or not you have an adverse impact in your workplace, you’ll need to gather the necessary data. However, you can also look at your existing job postings to see if there is an adverse impact inherent in them. For example, listing “Requires 4-7 years of experience” can be discriminatory. It excludes applicants with less experience (who are often younger) and those with more than seven years of experience (who are generally older), a veiled form of age discrimination.
You should also examine your job postings and your hiring process for consistency and unbiased judgment. A common technique to help remove bias from the interview and hiring process is an objective interview scorecard, which is formulated for specific roles and judges only the qualities and skills necessary to perform those roles and nothing else. Subjective “impressions” and other judgments can be a source of unconscious bias and, thus, adverse impact.
Analyzing algorithms is also an essential part of bias analysis. Algorithms may seem objective – algorithms are code, after all – but code is only as unbiased as the people who write it. Time and again, examples of bias come up in technology. It can be as blatant as facial recognition software that flags minorities as greater security threats, or as hidden as light-reactive hand dryers that fail to work on individuals with darker skin.
The Harvard Business Review has an excellent overview of how algorithms can create or amplify bias in many situations, and it is well worth a read.
Recruit from diverse sources. Even simple choices, such as which recruiters you work with, which sites you use to promote your job listings, and how you geo-target your job postings and advertising, can all be sources of bias and adverse impact.
“If your recruitment team is not diverse, then that makes it far easier for unconscious bias to overtake the process. If you want to minimize adverse impact, your hiring team needs to be as diverse as your applicants. As the hiring manager, you will surely be on the interview panel. You need to select two people quite different from yourself to make up the rest of the panel. This strategy will offer a diverse range of views on the potential job candidates you are interviewing and can help you to hire a more diverse range of people.”
Remember, as well, that diversity and inclusion are an ongoing process. Your hiring process may never be perfect. Your goal should be “good enough” with the addition of “continually improving.” Keep an eye on data, keep an eye on changing laws and cultural mores, and make adjustments as necessary. Even something as simple as your word choice in a job posting can have a significant impact.
Dealing with Adverse Impact
Say, for the sake of argument, that you’ve examined your company policies and discovered significant adverse impact. What steps should you take?
First, it is good to have discovered it before it becomes a significant legal issue. People may already have been tangibly hurt by the adverse impact, and they are within their rights to bring legal action against your company for it. Discovering adverse impact and taking steps to rectify it prevents this from becoming a bigger problem further down the line.
The primary thing you need to do is adjust your practices and remove the discriminatory elements, whether they appear in job postings, the hiring and interview process, onboarding, metric reviews, terminations, performance reviews, or anywhere else within your company. This process can include adjusting the wording in a job posting, changing software filters, or revising the entire hiring process from the ground up.
If no lawsuits come your way and you have removed your discriminatory processes, you may proceed with caution. Continue monitoring your new processes’ impact and keep adjusting them to remove bias.
As mentioned above, some forms of “discrimination” are context-sensitive. An individual unable to work on a roof cannot be hired as a roofer, and that requirement is defensible against a discrimination claim. Developing a basis of legal defensibility for your hiring practices is essential for preemptively protecting your company from legal challenges.
You may also consider talking to lawyers who specialize in discrimination and employment law. They can assist you in protecting yourself, revising your processes, and uncovering hidden biases that may not be obvious.
There will always be an ongoing discussion about the impact of adverse impact, the presumption of equitable application and hiring processes, and how bias affects company progress. One thing is undeniable, however. A more diverse, inclusive, and robust workforce is better for productivity, profitability, effectiveness, and success within a business. Study after study bears this out.
If your company is not actively working to remove bias present in hiring, review, termination, and other business policies, you can do better. I hope this guide has been helpful and a step in the right direction!
Did this article shed some light on adverse impact, and did it help you rethink some of your company processes? Are you struggling with adverse impact in your industry or hiring practices, or have you been a victim of it yourself? I’d love to get a conversation started on this subject and hear your thoughts! Please share with us in the comments section below. I reply to every comment here and would be happy to hear from you.
Andrew Greenberg’s roots in recruiting date back to 1996. He has experience both on the agency-side and corporate-side of the staffing business, with a focus in the financial services space at companies like Bloomberg and UBS. He also has core experience with information technology staffing, and has worked for major software companies such as SAP Business Objects and IBM/Informix Software. To get in touch with Andrew, you can reach him by email or by phone at (800) 797-6160.