Is Cognitive Testing Culturally Biased? How To Use It In a Diversity-Centric Manner
We recently came across a Canadian cognitive aptitude testing study, published this year by Hausdorf and Robie, which explored differences in general mental ability (GMA) test scores among culturally diverse groups in the context of employment selection. Specifically, the paper asked whether using GMA tests in recruitment can result in adverse impact on cultural minority groups. For context: the study examined GMA test scores for immigrant versus non-immigrant applicants for a bus driver position.
Applicants completed a short pen-and-paper GMA test. The authors hypothesised that first-generation immigrant applicants would score lower on the GMA test than other applicants (second-generation immigrants or non-immigrants), and this hypothesis was supported. However, cultural differences in GMA scores were largely eliminated after controlling for variables such as age, years of education, and English fluency.
An earlier study by the same authors used data from a number of countries, including Australia, and found that cultural differences in GMA test scores in Australia were “small to non-existent”. The authors also found that, in this sample, second-generation immigrants’ GMA test scores were higher than non-immigrants’ scores. Importantly, in this study, the authors noted:
It is important to recognise that we are not implying that current GMA tests are predictively biased against immigrants to any significant degree. Eliminating the use of GMA tests in personnel selection will likely reduce the overall validity and utility of a selection system. Moreover, relying [solely] on interviews has its own problems
What can we learn from this cognitive aptitude testing, and how can we ensure that we administer cognitive testing in a culturally sensitive manner?
In this study, the test was administered in English; however, 79% of applicants (most of the sample) identified as first- or second-generation immigrants, and consequently may have learned English as a second language. Some online GMA tests are available in a range of different languages: the Saville Swift Analysis Aptitude – Rx, for example, is currently available in 29 languages! Checking the language availability of cognitive aptitude testing may be worthwhile, particularly if you are administering assessments internationally.
English as a Second Language (ESL) norm groups
If your chosen cognitive assessment is only available in English (which can be the case), Testgrid strongly recommends considering an ESL norm group. When using cognitive aptitude testing for recruitment, we are most interested in how a candidate has performed relative to others: for instance, relative to other graduates, or to other professionals and managers. When examining cognitive testing results, it is best practice to compare an individual to others who are most similar to them. (This also makes logical sense: we wouldn’t really be interested in how an electrical apprentice candidate has performed compared to HR managers! We would be most curious about where their results sit compared to other apprentices.)

For individuals who have learned English as a Second Language, research suggests that, on GMA tests administered in English, ESL candidates tend to perform slightly less well than the general population on the Verbal Reasoning component, while there is generally no difference on the Numerical or Abstract Reasoning components. Again, this makes sense: Verbal Reasoning items often involve vocabulary, grammar, analogies, and reading comprehension. An individual reading such items in their second language may still answer correctly, but may simply take a little longer to read through or analyse them.

Consequently, we believe the fairest and most diversity-centric approach is to use ESL norms where possible. Testgrid has created ESL norms for a number of our cognitive assessments.
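To make the idea of norm groups concrete, here is a minimal sketch of how a percentile rank shifts depending on which comparison group is used. The scores below are entirely hypothetical and for illustration only; they are not real norm data, and real test providers use far larger, carefully constructed samples.

```python
from bisect import bisect_left

def percentile_rank(candidate_score: float, norm_scores: list[float]) -> float:
    """Return the percentage of the norm group scoring below the candidate."""
    ordered = sorted(norm_scores)
    below = bisect_left(ordered, candidate_score)  # count of norm scores strictly below
    return 100.0 * below / len(ordered)

# Hypothetical raw Verbal Reasoning scores (illustrative only, not real norm data).
general_norms = [14, 16, 17, 18, 19, 20, 21, 22, 23, 25]
esl_norms = [12, 13, 15, 16, 17, 18, 19, 20, 21, 23]

candidate = 18
print(percentile_rank(candidate, general_norms))  # 30.0 vs the general population
print(percentile_rank(candidate, esl_norms))      # 50.0 vs ESL peers
```

The same raw score yields a higher percentile against the ESL norm group, which is the point of norming: the candidate is judged against others facing a comparable language demand, rather than penalised for a slightly slower Verbal Reasoning component.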
Abstract Reasoning tests are becoming more popular as diversity-centric cognitive assessments: they are language-free (and acultural), and therefore cannot discriminate on the basis of English proficiency. Rather, their visual items involve shapes and patterns, and provide an important measure of fluid reasoning: how we solve new problems and reason logically. Performance on these tests is also less affected by socio-economic factors and educational circumstances.
To speak with us about diversity-centric cognitive testing, please get in touch with us here!