The Great Deception
Just because AI can outperform the best humans at chess and Go doesn’t mean it is a reliable source of reference information when fact-checking.
“The battle lines [where algorithms and tech companies are competing] are now shifting from attention to intimacy.”
-Yuval Noah Harari
Although few people expect to end up like Theodore Twombly, falling in love with their operating system, a new trend is already growing among ordinary middle-aged couples. Husbands and wives who spend more time asking their partner to take out the trash than to spice things up in the bedroom are starting to find a new outlet for their romantic fantasies: flirting with chatbots.
This trend, the emerging AI-therapy industry, and Mark Zuckerberg’s proposal that AI help alleviate the “loneliness epidemic” he himself created all point in the same direction.
However AI companies go about establishing intimate relationships with their users, the economic value of that intimacy is clear. The trend in recent years has been for digital technology to sink steadily deeper into the nooks and crannies of human society that we all take for granted. If you were deciding what car to buy 30 years ago, a close friend’s strong opinion would sway you far more than an essay in MotorTrend.
AGI is a “magic intelligence in the sky”
-Sam Altman
While I have alluded in some of my previous posts to the notion that AGI is becoming a cult-like quasi-religion, Karen Hao was years ahead of me: she investigates this in her new book, Empire of AI, which I plan to read and review shortly.
To understand why this hype is dangerous, rather than something to roll your eyes at, consider the Eliza effect. Eliza was a chatbot programmed in the 1960s to respond to end users the way a psychotherapist would. Its responses were very simple, such as “Please continue,” or “Tell me more about [keyword].” After writing the program and letting some end users (including his secretary) experiment with it, Eliza’s creator, Joseph Weizenbaum, was shocked by how quickly people became attached to it and anthropomorphized it. The lesson: if you can hold a conversation with something, you will quickly ascribe human-like qualities to it.
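To make the mechanism concrete, here is a minimal sketch in Python of the keyword-matching trick Eliza relied on. The patterns, templates, and pronoun swaps below are illustrative stand-ins, not Weizenbaum’s original script; the point is how little machinery it takes to produce something that feels like a conversation.

```python
import random
import re

# Toy ELIZA-style responder. Each rule matches a keyword pattern and captures
# one fragment of the user's input to reflect back inside a canned template.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "i'm": "you're"}

RULES = [
    (re.compile(r"\bi feel (.+)", re.I),
     ["Why do you feel {0}?", "Tell me more about feeling {0}."]),
    (re.compile(r"\bmy (.+)", re.I),
     ["Tell me more about your {0}."]),
    (re.compile(r"\bbecause (.+)", re.I),
     ["Is that the real reason?", "Does that explanation satisfy you?"]),
]

DEFAULTS = ["Please continue.", "I see.", "Tell me more."]


def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones before echoing them back."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())


def respond(user_input: str) -> str:
    """Return a canned response keyed off the first matching pattern."""
    for pattern, templates in RULES:
        match = pattern.search(user_input)
        if match:
            return random.choice(templates).format(reflect(match.group(1)))
    return random.choice(DEFAULTS)


if __name__ == "__main__":
    print(respond("I feel ignored by my family"))
    # e.g. "Why do you feel ignored by your family?"
```

There is no model of the user here, no memory, and no understanding; just regular expressions and a handful of stock phrases. That was enough for people in the 1960s to open up to it, which is the whole point of the Eliza effect.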
Many experts consider March 2016, when AlphaGo beat Lee Sedol, to be the watershed moment that kicked off the modern AI boom. Yet nearly a decade and one global pandemic later, US unemployment still sits at just 4.1%. In a few years we may be living in a world where (thankfully) most workers have kept their jobs, but AI companies are desperately looking for a way to turn a profit after making extreme promises to investors.
Under these economic circumstances, when experts have hyped AI as smarter than all humans and everyone is going home and confiding in it, it won’t matter how smart or wise the chatbots really are, or whether they actually have the capabilities claimed for them; it will only matter that they are perceived that way. The perception of superhuman intelligence, combined with the intimate influence these systems hold over their users, will be all that matters.
But what happens to journalism then? What happens to the objective facts that citizens of a democracy are supposed to agree on? What happens to the polarization that has been afflicting the US?
I don’t want to imply that nobody should ever use AI. It certainly has valuable uses, whether helping you learn a new language or write lines of code. The danger comes when you rely on it for basic facts that you could verify by clicking a few links.
Publications such as Business Insider are already laying off employees as the click-through traffic from Google that they depended on dries up. The trade-off is clear: AI-generated responses to search queries are not factually reliable, even if the underlying AI is hypothetically better than the best humans at Go.
Finally, it is important to remember that AI models are not organic outgrowths of the internet. They are tools (or agents, perhaps) of mega-corporations and governments. They will likely become adept at manipulating your emotions long before they master an unbiased understanding of truth and facts regarding the physical world.
If McDonald’s, Exxon, or the Chinese Communist Party offered to send a cult leader to your house to hypnotize you with a pendulum for 30 minutes per day, would you take them up on that offer?
Even Big Macs are better for you than rocks.
