Legal Risks Associated with Hiring and How Testing Can Help
Recruitment can be a tricky game at the best of times, but organisations need to take the legal risks associated with hiring into consideration when designing their recruitment process and choosing the right tools.
In this article, we will look at the legal risks associated with hiring and examine some of the issues that can arise, from the obscure to the downright obvious. We have all heard horror stories of organisations being fined or sued for extraordinary sums due to failures in their process. The worst instances involve bias around race, age, and gender, where hiring practices have either asked candidates to state these characteristics or discriminated against them directly using technology.
INSTANCES OF LEGAL ACTION
We all know (or should know) the basic questions to avoid.
Questions to Avoid:
- Asking if candidates have children
- Asking how old candidates are
- Asking about ethnicity
But the legal risks associated with hiring can be much more complex. There is a plethora of ways organisations can come under fire for their process, but for the purposes of this article we have broken these down into two types of discrimination: human and technological.
Amazon famously came under fire after hiring managers scoured candidates’ social media and were found to have discriminated on the grounds of race and sexuality. Facebook, McDonald’s, Pinterest, and many other organisations have faced legal action for similar reasons.
Let’s start with the human factor and look at some ways to mitigate these risks within your process. Below are the most common types of bias in recruitment, each showing how recruiters can project their own biases onto candidates:
- Resume Screening: Recruiters may unconsciously favour resumes with names, educational institutions, or experiences that align with their own backgrounds, leading to a preference for candidates who resemble them.
- Interview Bias: Interviewers may exhibit bias based on a candidate’s appearance, tone of voice, or non-verbal cues, which can lead to unfair judgments unrelated to the candidate’s qualifications.
- Stereotyping: Preconceived notions about gender, race, age, or other characteristics can lead to stereotyping, where certain groups are perceived as a better fit for specific roles, even if this isn’t the case.
- Affinity Bias: Recruiters might show a preference for candidates who share their interests or backgrounds, assuming that such commonalities indicate a better cultural fit.
- Confirmation Bias: Recruiters may focus on information that confirms their initial impressions and disregard data that contradicts those impressions, leading to biased decision-making.
- Halo and Horns Effects: A single positive or negative trait can disproportionately influence the overall evaluation of a candidate, causing either an overly positive or negative assessment.
- Groupthink: When hiring decisions involve multiple people, group dynamics can result in conformity to the dominant opinion, which might be influenced by bias.
- Unconscious Bias: Many biases operate on a subconscious level, making it challenging to recognise and address them without deliberate effort.
- Availability Bias: Recruiters may rely on readily available or memorable information rather than a complete picture of the candidate, skewing their evaluation.
Some of these factors carry heavy legal risks and should not be ignored. Using assessments in your recruitment process provides a proven, legally defensible, science-backed approach: psychometric testing offers a reliable method for recruitment because human bias is removed from the decision-making process to varying degrees.
Psychometric testing promotes diversity and inclusivity by relying on objective and fair data. Traditional hiring methods, such as relying solely on resumes or unstructured interviews, can be influenced by unconscious biases. By leveraging psychometric assessments, organisations can reduce bias and ensure a more equitable and diverse selection process, leading to a more inclusive workforce.
In 2022 a lawsuit was filed against iTutor for age discrimination in hiring. The software iTutor used to collect applications was programmed to automatically reject female applicants older than 55 and male applicants older than 60. iTutor ultimately paid $365,000 to the more than 200 rejected applicants.
A well-known video interviewing technology is currently facing a class action lawsuit for illegally capturing biometric data.
In the examples above, the organisations used tools that were new and had not been tested before being rolled out. This resulted in candidates taking legal action against both the organisations and the suppliers of the screening tools. When adopting new tools, it is crucial to check their validity and confirm that they do not discriminate against different types of candidates.
When it comes to the utilisation of recruitment technology, there’s a significant concern surrounding its potential to inadvertently foster discrimination in the hiring process. Numerous pitfalls await unwary employers who embrace these technological tools. Let’s explore a few examples:
- Keyword-Scanning Software and Algorithms for Resumes: Imagine software that scans job applicants’ resumes for specific keywords. On the surface, this seems like an efficient way to filter through applicants, but there’s a hidden danger. If the algorithms are unintentionally programmed with a bias towards certain gender-specific terms, it could result in favouring candidates of a particular sex or gender. This seemingly harmless tool could inadvertently perpetuate gender-based discrimination.
- Facial and Voice-Recognition Technology: Some recruitment technology employs facial and voice-recognition technology to evaluate job applicants based on their speaking patterns and facial expressions. While this may aim to identify strong candidates, it could inadvertently penalise individuals with accents, unique speaking styles, or disabilities affecting their speech. This could lead to unlawful discrimination based on factors such as race, colour, national origin, or disability.
- Personality Self-Assessment Tests: Self-assessment tests designed to evaluate a candidate’s personality and suitability for a specific role can be problematic. For instance, a candidate with severe depression may answer questions in a way that reflects their condition, potentially resulting in their disqualification. This would be unjust discrimination, particularly if their responses are influenced by a recognised mental impairment, as defined by the Americans with Disabilities Act (ADA).
- Chatbots for Basic Qualification Questions: Chatbots or virtual assistants are often used to pose basic qualification questions to job applicants. Consider a scenario where a job applicant, when asked if they were ever fired from a job “for cause,” responds truthfully with a “yes.” However, they aren’t provided with the opportunity to clarify that their termination was later deemed illegal due to age-related discrimination. In this case, the chatbot’s rigid approach may lead to unfair disqualification based on incomplete information.
- Pre-Employment Computer Tests: Pre-employment tests, when administered without careful consideration, may inadvertently disadvantage certain applicants due to their protected traits. The test itself might not be discriminatory, but the way it is administered or interpreted could result in illegal discrimination against individuals with specific characteristics.
While these recruitment technologies may appear harmless at first glance, it’s crucial for employers to be aware of the potential pitfalls and discriminatory outcomes they can create. To ensure a fair and inclusive hiring process, it’s essential to carefully evaluate and customise these technologies to mitigate the risk of discrimination and bias.
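One practical safeguard the examples above point to is routinely auditing any screening tool’s outcomes for adverse impact. A common yardstick in US practice is the EEOC’s “four-fifths rule”: if any group’s selection rate falls below 80% of the highest group’s rate, the tool is generally treated as showing evidence of adverse impact. The sketch below uses hypothetical numbers and group labels purely for illustration:

```python
# Minimal adverse-impact audit using the EEOC four-fifths (80%) rule.
# All figures below are hypothetical, for illustration only.

def selection_rate(selected, applicants):
    """Fraction of applicants from a group who passed the screening tool."""
    return selected / applicants

def four_fifths_check(rates):
    """Return the groups whose selection rate is below 80% of the best rate."""
    best = max(rates.values())
    return {group: rate for group, rate in rates.items() if rate < 0.8 * best}

# Hypothetical screening outcomes by applicant group
rates = {
    "group_a": selection_rate(48, 100),  # 48% selected
    "group_b": selection_rate(30, 100),  # 30% selected
}

flagged = four_fifths_check(rates)
print(flagged)  # group_b is flagged: 0.30 is below 0.8 * 0.48 = 0.384
```

Running a check like this on every screening tool, before and after rollout, gives you documented evidence that the tool was monitored for discriminatory outcomes.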
Your Recruitment Process
So humans can show bias, and so can technology. What does that mean for your process?
Psychometric assessments have been used for decades to gather data on candidates and provide recruiters with unbiased, valid data that allows them to make smarter and faster hiring decisions. Assessment methods and psychometric tests should be grounded in valid and reliable research, ensuring that the tests are credible and accurately measure what they claim to assess.
Validity refers to how accurately the test evaluates or measures a skill or trait. It represents the relationship between a test and the quality it measures and, thus, is a critical factor in pre-employment testing.
There are many types of psychometric assessments, and many factors to consider when choosing recruitment selection methods: candidate care, efficiency, cost, and so on. With so much to weigh up, it is easy to lose sight of the most important factor: which assessment methods are the most predictive of job performance, and therefore the most valid?
The Validity Scale
The validity scale is a timely reminder of the importance of selecting valid assessment tools that will help you make the most effective recruitment decisions possible. The higher an assessment method sits on the scale, the more valid it is, and psychometric assessments are far better at predicting on-the-job performance than traditional methods.
Learn more about assessment validity in our blog Psychometric Assessments: Can They Really Predict Job Performance?
GET IN TOUCH
If you want to talk to one of our experts about avoiding the legal pitfalls associated with the recruitment process in your organisation, get in touch with our team here, or call 03 9040 1700 to learn more.