Does AI bias exist? What a “scientist” image taught me
- Rishika Aggarwal
- Aug 25

When I asked an AI tool to generate an image of a scientist, it gave me a man in a lab coat. No hesitation, no question. Just a male scientist. I didn’t specify a gender; I just said “scientist.” That was the first red flag.
But it didn’t stop there. A few days later, I asked the same AI to generate an image of a person based on their hobbies: coding, badminton, guitar and drawing. Again, it gave me a man. I hadn’t mentioned gender at all. Still, the tool defaulted to “he.” That moment revealed the bias built into these systems.
These aren’t random glitches. They’re reminders that bias in AI isn’t science fiction. It’s actually built into the systems we use every day. When an AI makes these assumptions, it’s not thinking. It’s just reflecting the data it was trained on. If most of its training examples show men as scientists or men as default figures, then that’s what it repeats. Not because it “believes” it, but because that’s what it’s seen the most.
I’m a student building my own chatbot (a simple one, really). Mine doesn’t generate content or draw pictures; it just answers questions by matching text patterns. But even in a system like mine, I’ve had to think about which examples I include. What questions are students most likely to ask? I’m still working out how to make sure the bot doesn’t ignore questions just because they’re phrased differently; there’s a small sketch of what I mean below. Bias isn’t always visible. It’s sly.
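To show what I mean, here’s a minimal sketch of that kind of pattern matching. It isn’t my actual bot, and the patterns and answers are made up for illustration, but notice how two students asking the same thing in different words get treated very differently:

```python
# A minimal sketch of a pattern-matching chatbot.
# The patterns and answers below are hypothetical, not my real bot's data.

RESPONSES = {
    ("when is the exam", "exam date", "exam schedule"):
        "The exam timetable is on the notice board.",
    ("library hours", "when does the library open"):
        "The library is open 9am to 6pm on weekdays.",
}

def answer(question: str) -> str:
    q = question.lower()
    for patterns, reply in RESPONSES.items():
        # Match if any listed phrasing appears in the question.
        if any(p in q for p in patterns):
            return reply
    # Every phrasing I forgot to list lands here.
    # This fallback is where the quiet bias hides.
    return "Sorry, I don't understand that yet."

print(answer("When is the exam?"))                      # matched
print(answer("Could you tell me the exam timetable?"))  # ignored
```

The bot isn’t choosing to ignore the second student; it simply never saw that phrasing in its examples. It’s the same mechanism as the image generator defaulting to a male scientist, just at a much smaller scale: the system repeats whatever its examples cover and goes blank on everything else.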
Gender bias in AI doesn’t just affect how tools respond; it shapes how we see ourselves. If an AI can’t imagine a woman (or any other gender, for that matter) as a scientist, what message does that send to someone who wants to become one? If it assumes every user is male, who gets erased?
The problem isn’t just technical — it’s human. And that’s actually good news. Because if people built these systems, people can make them better. The first step? Asking better questions.
Like: Why did it assume that? Who gets to be the default? And how do we change that?