
The Sexism of AI and STEM Expectations


In an oddly telling article, the BBC reports that significantly fewer women than men report using AI tools like ChatGPT. I say the article is oddly telling because it seems to be reinforcing sexist stereotypes while trying to report on how cultural expectations might cause women to be more reluctant to be caught using AI.

According to the article, 54% of men but only 35% of women surveyed reported using AI tools in their daily work. The article quotes a few women with what appear to be solid reasons for being leery of AI tools: their business is built on quality and they don’t think the tools can produce said quality; their business depends on a reputation for honesty and they don’t think people will find AI-derived work honest; their work must be accurate and the tools have been inaccurate in their experience. The interviews are anecdotal, of course, but these are all perfectly reasonable justifications for not wanting to use imitative AI tools.

As is one of the broader potential reasons. Women are generally held to a higher competency standard than men. Society is more likely to allow men to use AI tools because their competency is less likely to be called into question. Women, for good reason, are concerned that if they use AI tools they will be seen as using them not as aids but because they are not competent enough to do the work without help. All fair enough. But the article also has some rather odd notions about AI, women, and STEM.

The article posits that women don’t have the STEM skills needed to use AI and, because science fiction tends not to be marketed towards women, are not as likely to be attracted to using new technology. Both of those points show how sexist assumptions distort our view of reality.

First, while it is true that women are criminally underrepresented in STEM fields, they are not underrepresented in the use of technology. Women drove the creation of internet culture, for example, and have always been users of modern technology. It is not really even fair to say, at least recently, that women have been outpaced by men when it comes to adopting technology.

More importantly, using AI tools does not in any way require STEM-specific skills. The use of imitative AI tools requires appropriate prompting — the use of logic and natural language to get them to produce whatever imitation you want. Those are not STEM skills — those are language skills. Just because those language skills are applied to a piece of tech does not make them STEM skills. And women, as a class, generally do better with language-based skills than men do — so they, in theory, should face no skill-driven impediment to using imitative AI.

The article is a good example of how insidious sexism, especially sexism around technology, is. It takes a real difference between how men and women are approaching a piece of new technology and falls back on the simplest, hoariest cliches instead. The article gets real women to give actual reasons and explores at least one interesting, culturally based explanation, but it doesn’t seriously explore the reasons the women provide and pivots immediately to cliched “men are from STEM, women are from humanities” nonsense. It squandered an opportunity to help readers understand real issues and instead reinforced a flawed understanding of how women and tech interact.

In a small way, this article makes the reader dimmer than they would have been otherwise. And more importantly, it is one more on the pile of bad reasoning that encourages women to think that they don’t belong in the tech world. By misstating the skills required around AI and by dismissing women’s actual concerns about imitative AI, the BBC subtly furthers the alienation of women from tech.

If you want to know why women don’t use AI, or why they generally find themselves driven away from STEM, it might just be because gatekeepers like the BBC constantly dismiss their concerns and tell them that they aren’t welcome, in big ways and small.
