A short one in honor of Labor Day — I will not be doing much labor in honor of the day.
An interesting post from Bluesky:
Kid has an English assignment, where school has kids first submit essay to an “AI checker.” Kid did not use AI. AI checker says the use of the word “devoid” magically turned essay into 18% AI written. Changing “devoid” makes it drop to 0%. We’re spending time “un-AI-ing” an essay that has no AI.
This is probably more widespread than we think. According to EdWeek, 68% of middle and high school teachers used imitative AI detectors in 2023. And all of them are likely to produce effects like the one above.
Imitative AI is a median-chasing tool. Remember, it has no conception of reality, no model of the world, so it can have no opinion on quality. It merely calculates what the next word should be, based on how often that word has appeared in similar contexts in its training data. And since its training data is a significant portion of all publicly available text on the internet, it is going to tend toward certain phrasing and a generally median level of writing. AI detectors, then, are going to reinforce that tendency.
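To make that concrete, here is a toy sketch in Python, assuming nothing about any real model’s internals: a bigram counter that always emits the most common next word it saw in training. Real systems use neural networks over long contexts, but the pull toward the statistically typical word is the same.

```python
from collections import Counter, defaultdict

# Toy illustration of next-word prediction: count which word follows
# which in a tiny training corpus, then always emit the most common
# follower. Real models use neural networks over long contexts, but
# the lean toward the statistically likely word is the same idea.

training_text = (
    "the room was empty . the room was quiet . "
    "the house was empty . the house was devoid of furniture ."
)

follow_counts = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follow_counts[current_word][next_word] += 1

def most_likely_next(word):
    """Return the most frequent follower of `word` seen in training."""
    followers = follow_counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

# "was" is followed by "empty" twice, "quiet" once, and "devoid" once,
# so the median-chasing pick is always "empty".
print(most_likely_next("was"))  # -> empty
```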
AI detectors have to find markers that stand out, markers that let them say with some confidence, “this is indicative of AI.” Since most imitative AI writing is generic, what stands out are specific vocabulary quirks, words the models statistically overuse, and those quirks tend to be either really bad phrasing or uncommon words. And uncommon often means a richer vocabulary. Hence “devoid” being driven out of essays, dumbing them down, so they do not look like cheating to a detector.
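The failure mode in the anecdote is easy to caricature. Below is a deliberately naive detector sketch; the marker-word list is hypothetical, and real commercial detectors use statistical models rather than a lookup table, but the effect on a student with an uncommon vocabulary is similar.

```python
# A deliberately naive sketch of the detector failure mode described
# above. The marker-word list is hypothetical; real detectors use
# statistical models, but the effect on uncommon vocabulary is similar.

AI_MARKER_WORDS = {"devoid", "delve", "tapestry", "multifaceted"}

def naive_ai_score(essay: str) -> float:
    """Return the fraction of words matching the hypothetical marker list."""
    words = [w.strip(".,;:!?\"'").lower() for w in essay.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in AI_MARKER_WORDS)
    return hits / len(words)

# One perfectly human word is enough to move the score off zero.
essay = "The room was devoid of furniture when the movers finished."
print(f"{naive_ai_score(essay):.0%} AI")  # swap out "devoid" and it reads 0%
```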
This has always been something of a problem. I had teachers who occasionally questioned my vocabulary in middle school (I was a voracious reader and, for example, knew what “epitome” meant earlier than the usual student, even if I could not pronounce it), and my own sons had to run their work through a plagiarism detector. But teachers who are told by a machine that a paper is AI generated are more likely to believe the determination, because humans mistakenly put more faith in machines than in human expertise. Companies could prevent this, of course, but they do not want to. It is harder for them to make money if people can easily see what is and is not human-created.
And so, we end up with teachers trying to prevent kids from using imitative AI tools to do their writing. Because, as I have mentioned before, writing is thinking. Teachers assign essays at least in part to help teach kids how to think, how to make an argument. Unfortunately, the need to protect those assignments from imitative AI is more likely to dumb down kids’ work, especially those who are precocious readers and have the slightly mispronounced vocabulary to match.
AI in education is a mess, and this is part of the reason why. The tools’ function often runs counter to good educational practices. But because the companies refuse to help us know what is and is not imitative AI, the protections put in place can also run counter to good educational practices. Imitative AI harms students coming and going.