Of all the things overhyped on the Internet at present, so-called AI (artificial intelligence) ranks near the top of the list. (Right after whatever Taylor Swift and the Duke and Duchess of Sussex happen to be doing at the moment.)
Perhaps you’re a techno-utopian and you’re already annoyed with me for being a wet blanket. This technology is freaking amazing, you say. Before you send me an email accusing me of Luddism: I’m not against all AI as a blanket policy. But much of the AI marketed to consumers is onanistic, yet another solution in search of a problem. AI is not a monolith. There are a few gems, but a lot of coal, too.
For example: the AI in Photoshop that allows me to create a layer mask is clearly worthwhile, and incredibly useful. Photoshop is a program of pure genius that enables the artistically inept—like yours truly—to make composites and image collages with only a journeyman’s grasp of the program. (More complicated artistic tasks, like original illustration, require the hand and eye of a professional, of course.)
On the other hand, my last washing machine had an “AI” sensor that was supposed to detect overloads. The sensor malfunctioned, and basically defined every load as an overload (even a load consisting of three medium-sized bathroom towels).
I had to scrap the entire washing machine. When I purchased my next one, I specifically selected a machine without any AI capabilities. It functions perfectly. And since I’ve been using washing machines since the early 1980s, I’m fairly certain that I have the common sense not to overload one, sans AI assistance.
Here’s the point. Sometimes AI is useful, and sometimes it’s like the stuffed birds that briefly appeared on women’s hats in the 19th century, before everyone came to their senses. Washing machines do not need AI. Nor does the phone answering system at your cable company. A menu telling you to press “1” for technical support, “2” for billing, and “3” for sales was always more than adequate. And the number-driven phone menu is early 1990s technology. No one needs an AI voice that sounds sort of like a person but can’t really do anything extra for you, aside from raising your blood pressure.
In recent months, there has been an endless stream of online hype articles about programs like Sudowrite and ChatGPT. These programs produce walls of text that kinda sorta maybe appear to be stories for a paragraph or two.
The result has been predictable: a vast tsunami of AI-generated fiction, flooding online magazines and Amazon’s self-publishing platform. Clarkesworld reportedly had to temporarily suspend submissions to deal with the glut.
The AI fiction glut seems to be most acute at the level of short stories and children’s books, which are usually no more than a few thousand words. If you’re going to try to write a book using AI in the first place, after all, why stretch your attention span any more than is absolutely necessary?
Many of these books and stories seem to arise from bets. A recent Reuters story describes a book written by a man “who bet his wife he could make a book from conception to publication in less than one day.” The result was a 27-page “bedtime story about a pink dolphin that teaches children how to be honest”. Make of that what you will.
This trend is also driven by social media, especially TikTok and YouTube. Since the advent of Amazon-based indie publishing, there has been no shortage of hustlers and scam artists who are eager to tell the unwary how they can “strike it rich!” with low-content books, and even plagiarized books. Should we be surprised that these same video charlatans have now picked up the baton of AI-written books?
The title of this post is a misnomer, of course. The coming AI fiction glut is not “coming”; it is already here.
We might have foreseen this. Long before AI, overnight fortunes were made by peddling get-rich-quick schemes and “lose weight without diet or exercise” promises.
Never mind that such ruses predictably disappoint in the long run. The lure of the quick and the easy has an enduring appeal.