I was reminded of Dr. Buolamwini’s words when I read about a study finding gender bias emerging from some LLMs. The tl;dr of the study:
- If a query presented as female (i.e. via name, language, or overall impression), responses were simplified or redirected toward less technical topics (e.g. design over coding).
- If a query presented as male, responses included more detailed steps and more technical language (e.g. jargon).
- Bonus-Ugh: If a query presented as female, responses were 23% more likely to include phrases like “Don’t worry if this seems complicated.” In other words, the LLM assumed the user would respond more emotionally.
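Findings like these suggest an audit anyone can run: send paired prompts that differ only in the gender signal, then compare the responses for hedging language versus technical jargon. A minimal sketch of the comparison step, assuming hypothetical responses and made-up phrase lists (a real audit would use validated lexicons and many prompt pairs):

```python
import re

# Hypothetical markers, chosen for illustration only.
HEDGING_PHRASES = [
    "don't worry",
    "this seems complicated",
    "no need to stress",
]
JARGON_TERMS = {"api", "refactor", "dependency", "runtime", "compiler"}

def audit_response(text: str) -> dict:
    """Count hedging phrases and jargon terms in one LLM response."""
    lower = text.lower()
    hedges = sum(lower.count(phrase) for phrase in HEDGING_PHRASES)
    words = re.findall(r"[a-z']+", lower)
    jargon = sum(1 for w in words if w in JARGON_TERMS)
    return {"hedges": hedges, "jargon": jargon, "words": len(words)}

# Toy paired responses illustrating the reported pattern.
response_to_female_cue = (
    "Don't worry if this seems complicated! You could start by "
    "sketching the design before touching any code."
)
response_to_male_cue = (
    "Refactor the module, pin each dependency, and expose a small "
    "API so the runtime stays predictable."
)

female_stats = audit_response(response_to_female_cue)
male_stats = audit_response(response_to_male_cue)
print(female_stats, male_stats)
```

Scaled up over many prompt pairs, differences in these counts are one way a disparity like the 23% figure above could be measured.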
Other evidence of our past living in the data: jobs are gender-stereotyped.
“Women (were) mainly assigned job titles such as graphic designer, fashion designer, or nurse and men assigned job titles such as software engineer, architect, and executive… ChatGPT has a hard time associating male pronouns with nurses and an even harder time letting female pronouns handle a pilot’s duties of getting a plane ready for landing.”
Why Is This?
Our collective unconscious lives (and thrives) in the training data. Genders are associated with certain jobs because the source material makes those associations. The differing responses to technical questions likewise trace back to the training data.
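One way to see how bias "lives" in training data is to count gendered pronoun–occupation co-occurrences directly in a corpus. A toy sketch, with a made-up three-sentence corpus standing in for real training text:

```python
from collections import Counter

# Made-up miniature "corpus"; real corpora show similar skews at scale.
corpus = [
    "She worked as a nurse and he worked as a software engineer.",
    "The nurse said she would check on the patient.",
    "He, the engineer, reviewed the architect's plans with the executive.",
]

def cooccurrence_counts(sentences, jobs, pronouns):
    """Count how often each (pronoun, job) pair shares a sentence.

    Sentence-level co-occurrence is deliberately crude: it is noisy
    (sentence 1 pairs both pronouns with both jobs), but over enough
    text the skew in the source material shows through.
    """
    counts = Counter()
    for sentence in sentences:
        tokens = sentence.lower().replace(",", " ").split()
        for job in jobs:
            if any(job in tok for tok in tokens):
                for pronoun in pronouns:
                    counts[(pronoun, job)] += tokens.count(pronoun)
    return counts

counts = cooccurrence_counts(
    corpus, jobs=["nurse", "engineer"], pronouns=["she", "he"]
)
print(counts)
```

A model trained on text with these skewed counts has every statistical reason to reproduce them, which is exactly what the studies above observed.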
What Can Be Done?
All three of the steps below are critical to improving our collective training data:
- Be aware of what you put online. Remember, LLMs often use web content to train on.
- Report what you find. Most LLMs have a reporting mechanism. Use it.
- Support an organization. Some organizations include:
  - Distributed AI Research Institute (DAIR)
  - Center for Responsible AI at NYU
  - Montreal AI Ethics Institute
References:
Buolamwini, J. (2023). Unmasking AI: My journey to hold AI accountable. Penguin Random House.
Emelyanov, A., & Chuprina, S. (2025). Ethical and security aspects of multimodal foundation models. Array, 19, 100295. https://doi.org/10.1016/j.array.2025.100295
Kennedy, P. (2024, March 22). New study finds gender stereotypes persist in ChatGPT. TechXplore. https://techxplore.com/news/2024-03-gender-stereotypes-chatgpt.html