AI bias is usually discussed in terms of algorithms: skewed datasets, flawed outputs, and stereotypes baked into models. But new research points to another, more subtle problem: who gets to use AI in the first place. According to a recent report by Lean In, women are less likely than men to use AI tools at work, and even when they do, they’re less likely to get recognition or support for it.
The numbers paint a clear picture. Men are more likely to use AI regularly (33% vs 27%), more likely to have ever used it at work, and significantly more likely to be encouraged by managers to adopt it. And it’s not just about access, but also about perception. Women are more likely to worry about the risks of AI, question its accuracy, and even fear being judged for using it, including concerns that it might be seen as “cheating.”
Why this matters more than it seems
This gap could compound fast. AI is quickly becoming a core workplace skill, and early adoption often translates into better opportunities. If one group is consistently using it less, or getting less credit for it, that gap can grow into real career disadvantages over time. And this isn’t happening in isolation. Broader research already shows women are underrepresented in tech and AI roles, meaning they’re not just using these tools less; they’re also less involved in building them.

What makes this interesting is how familiar it feels. This isn’t a new kind of bias; it’s an old one showing up in a new space. The same patterns seen in workplaces for decades, including less recognition, less encouragement, and more scrutiny, are now playing out in how AI is adopted and used.
Same bias, new tech?
As AI becomes a core workplace skill, even small gaps like this can snowball into missed opportunities, slower career growth, and less representation in shaping the technology itself. Because if the people using AI aren’t equally represented, the future it builds won’t be either.
