In a newly released report, researchers at UCL have exposed deep-rooted gender biases in the large language models that power popular artificial intelligence platforms. Commissioned by UNESCO, the study reveals that AI tools such as OpenAI's GPT-3.5 and GPT-2, and Meta's Llama 2, show alarming levels of discrimination against women and against diverse cultural and sexual identities.
The findings demonstrate a clear bias towards traditional gender roles: female names were strongly associated with family-oriented terms, while male names were linked with career-based words. The content generated by these AI tools also reproduced negative stereotypes based on culture and sexuality.
The researchers uncovered stark disparities in AI-generated narratives involving a range of individuals from diverse backgrounds. Notably, the AI systems tended to assign high-status roles like engineers or doctors to men, while women were often relegated to traditionally undervalued roles.
Dr Maria Perez Ortiz of UCL Computer Science, a contributing author of the report, calls for an overhaul of AI development. As part of the UNESCO Chair in AI team at UCL, Dr Perez Ortiz advocates for AI systems that truly reflect the rich diversity of human identities and contribute towards greater gender equality.
In response to the findings, UNESCO has committed itself to addressing these AI-induced biases through international collaboration and a focus on human rights and gender equity.
The report was presented at the UNESCO Digital Transformation Dialogue Meeting and also shared at the United Nations' largest annual gathering on gender equality, further amplifying the call to guide AI systems towards true equality, equity, and human rights.
- CyberBeat
CyberBeat is a grassroots initiative from a team of producers and subject matter experts. Driven by frustration at the lack of media coverage, it responds to an urgent need for a clear, concise, informative and educational approach to the growing fields of cybersecurity and digital privacy.
If you have a story of interest, a comment, a concern, or if you'd just like to say hi, please contact us.
We couldn't do this without the support of our sponsors and contributors.