The Washington Post highlights that Alexa is telling lies about the 2020 election. If you ask it about election fraud, it will lie to you and tell you that the election was stolen and that there was unprecedented fraud. It does this because it draws on conservative and alt-right sources such as random Substack newsletters and Rumble, the place conservatives go when YouTube’s conspiracy theories are too tame for them. This is especially pernicious because studies have shown that people trust voice assistants more than text due to their human-like attributes, and because voice assistants give a single answer rather than a list of results. But more importantly, I think this presages a world in which misinformation wants to be free and truth wants to be paid for.
Creating misinformation is easy. I wrote on Monday about how much easier it is to write fantasy informed by history than to write historical fiction. The principle is the same. Getting things right takes time and effort. Making things seem right takes much less research and leans much more heavily on imagination. Search systems like Alexa depend upon being able to parse information, and if the information they have access to is primarily incorrect, doing so accurately becomes far more difficult without human intervention and constant upkeep. And since misinformation is both easy to produce in large quantities (more so now that large language models like OpenAI’s can do it for you at a quality good enough to fool another machine) and will be produced in large quantities because it serves an agenda, systems like Alexa will either be overrun with lies or require constant human intervention to be useful.
And humans cost money. Money that these companies do not want to pay.
The future, then, is likely to be awash in assistants that lie to us between reminding us that the pie in the oven is done and that it’s time to go pick up the kids from soccer practice. The problem is that people generally don’t like being lied to. Alexa’s lies are newsworthy because people, especially the people who have the disposable income required to buy a toy like Alexa, are not interested in having 2020 election lies told to them. This presents a problem for Amazon and companies like it: spend money to improve a product that already struggles to make money, or let it wither and fade away.
Or, more likely, at least try to sell a premium product. Add five bucks a month to your Prime subscription and Amazon will promise that your Alexa queries will be faster and more accurate. The promise of voice assistants is real, and I think something like this will probably be tried, at least. The problems should be obvious. People who can afford not to be lied to will get the truth. People who don’t have the disposable income will get the lies.
This will be especially troublesome in the realm of public policy. Imagine these kinds of systems in schools or as government aids. How likely is it that governments will be allowed to pay for the version that tells children the truth? Never mind the culture war nonsense around defining the truth — we generally don’t pay for healthy meals for public school kids. In what universe are we going to pay someone to tell them the truth?
Even outside voice assistants, as news organizations, government services, and internet providers, for example, try to save money by turning over simple fact-based articles and searches to these kinds of agents, we run the very real risk of further bifurcating the reality this country experiences. In the reality that those with disposable income can afford, the truth has more sway. In the other, misinformation and damaging lies are allowed to control the discourse. Our democracy is already suffering from the effects of right-wing epistemic closure. Segregating the truth entirely behind paywalls will only supercharge those tendencies.
Alexa may be more than just the world’s most privacy-invading egg timer; it may also be the harbinger of a very depressing future.