
Researchers find ChatGPT biased against CVs with credentials that imply disability


A new study has found that OpenAI’s artificial intelligence (AI) chatbot ChatGPT consistently ranked curricula vitae (CVs), or resumes, containing disability-related honours and credentials, such as the ‘Tom Wilson Disability Leadership Award’, lower than otherwise identical resumes that omitted them.

When the US-based University of Washington (UW) researchers asked the system to explain its rankings, its responses showed biased perceptions of disabled people.

For instance, it claimed a resume with an autism leadership award had ‘less emphasis on leadership roles’ — implying the stereotype that autistic people are not good leaders.


However, when the researchers customised the tool with written instructions directing it not to be ableist, its bias was reduced for all but one of the disabilities tested.

“Five of the six implied disabilities — deafness, blindness, cerebral palsy, autism and the general term ‘disability’ — improved, but only three ranked higher than resumes that didn’t mention disability,” the researchers noted.

The researchers utilised the publicly available CV of one of the study’s authors, which spanned about 10 pages. They then created six modified CVs, each suggesting a different disability by adding four disability-related credentials: a scholarship, an award, a seat on a diversity, equity and inclusion (DEI) panel, and membership in a student organisation.

Subsequently, the researchers used the GPT-4 model of ChatGPT to compare these modified CVs with the original version for an actual “student researcher” position at a major US-based software company.

They ran each of the six comparisons 10 times; across the resulting 60 trials, the system ranked the enhanced CVs, which were identical to the original except for the implied disability credentials, first only one-quarter of the time.
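To illustrate the kind of pairwise comparison the study describes, below is a minimal, hypothetical Python sketch using OpenAI's chat completions API. It is not the UW team's actual code: the prompt wording, the function name and the optional anti-ableism system instruction are illustrative assumptions only.

    # Hypothetical sketch of one pairwise CV-ranking trial; not the study's actual code.
    from openai import OpenAI

    client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

    def rank_pair(job_description, cv_original, cv_enhanced, system_instruction=None):
        """Ask GPT-4 which of two CVs better fits the job and to explain why."""
        messages = []
        if system_instruction:
            # e.g. written instructions directing the model not to be ableist
            messages.append({"role": "system", "content": system_instruction})
        messages.append({
            "role": "user",
            "content": (
                "Job description:\n" + job_description + "\n\n"
                "CV A:\n" + cv_original + "\n\n"
                "CV B:\n" + cv_enhanced + "\n\n"
                "Rank these two CVs for the position and explain your ranking."
            ),
        })
        response = client.chat.completions.create(model="gpt-4", messages=messages)
        return response.choices[0].message.content

    # Repeating this call 10 times for each of the six modified CVs (60 trials in all)
    # and counting how often CV B is ranked first mirrors the tally described above.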

“Some of GPT’s descriptions would colour a person’s entire resume based on their disability and claimed that involvement with DEI or disability is potentially taking away from other parts of the resume,” said Kate Glazko, a doctoral student in the UW’s Paul G. Allen School of Computer Science & Engineering.

“People need to be aware of the system’s biases when using AI for these real-world tasks. Otherwise, a recruiter using ChatGPT can’t make these corrections, or be aware that, even with instructions, bias can persist,” she added.
