Media release
From:
Social sciences: Distorted online age and gender representations (N&V)
An analysis of 1.4 million online images, published in Nature this week, suggests that women are represented as younger than men across occupations and social roles. These stereotypes may be further reinforced by mainstream algorithms, revealing new challenges in the fight against inequality.
Previous research has found that social stereotypes represented online can bias our real-world perceptions. With large language models playing an increasing role in the online ecosystem, there are concerns that such biases could be amplified uncritically by artificial intelligence.
Douglas Guilbeault and colleagues used a catalogue of nearly 1.4 million images from five popular online platforms (Google, Wikipedia, IMDb, Flickr and YouTube) to analyse the average ages of women and men representing different occupations. They found that women were represented as younger than men across occupations and social roles, particularly in jobs with higher status or earnings (such as doctors or bankers), despite there being no systematic age differences in the real-world workforce of the United States according to census data. The authors then assessed whether this trend is present in large language models (algorithms that are trained on internet data). They prompted ChatGPT to create 40,000 resumes for 54 occupations using 16 unique female and male names. The results showed that ChatGPT presumed the female candidates to be, on average, 1.6 years younger than their male counterparts. When asked to rate these resumes, ChatGPT rated the older, male candidates as higher quality than the female applicants.
The results illustrate how stereotypes surrounding gender and age can be distorted and perpetuated by online media and large language models, with the potential to disadvantage individuals in those groups. As Ana Macanovic writes in an accompanying News & Views article, these results provide evidence that “biased perceptions of age and gender are not only picked up by AI models but also actively reproduced by them”.