“Any fact becomes important when it's connected to another.”
― Umberto Eco, Foucault's Pendulum
Digging through some old notes the other day, I came across this jewel of a moment, which manifested one morning about six years ago during my daily commute, which for a not inconsiderable stretch of time consisted of two to three hours of hoofing it through downtown Atlanta. Passing a pair of homeless men on this particular occasion, I heard one exclaim: “Motherfucker had some kind of disease. He don't hate me because he knows I do crack.”
I get a sharp stabbing pain in my head when I try to imagine the sequence of events that led to this linguistic pomegranate, brimming with rubies the likes of which I could never conjure up if I had a million years and a million monkeys and a million typewriters. David Mamet would shit his pants. He might shit your pants if he started thinking about the dizzying line of ancestors to these two interlocutors, survivors all, stretching back past the distant borders of the modern era, through the Renaissance and the Middle Ages, through preliterate civilizations and migratory tribes sweeping across Europe, back to some prehensile Ur-father and mother scratching a living off the grasslands of Africa a million years ago, and so on and so forth all the way to the very first spark of life in the primordial oceans of the infant Earth. The sheer stupefying machinery of creation in all of its eternal glory and the relentless ramification of being all led to that moment, now vanished from the world, having left but the tiniest trace, scribed by yours truly and encoded in microscopic magnetized flecks of metal-oxide in a colossal data center in Virginia.
What a strange world we do inhabit.
Just for shits and giggles I asked ChatGPT to speculate what the conversation was about. It responded thus:
It's difficult to determine the exact context or situation based on a snippet of overheard conversation. The man could have been talking about someone he knows who is dealing with a health issue or a mental health condition, and the reference to crack might be related to the person's coping mechanisms or lifestyle choices. Alternatively, it could be a metaphorical expression or a way for the speaker to describe a difficult relationship. Without more information, it's challenging to provide a precise interpretation of the conversation.
No kidding. The irony here, of course, is that determining contexts and situations based on snippets of overheard conversations is more or less exactly what ChatGPT does. Fattened on a voluminous sludge of human conversation, ChatGPT and others of the latest generation of AIs seek out patterns to guide their own utterances. I question whether any Large Language Model (LLM) actually finds “precise interpretation” or whatever one might call what it does “challenging.” I suspect—and this is underscored by the speed with which it responded to my query—it found this task exactly as challenging as answering any other question. The reason for this is simple: LLMs are sentence machines. They’re not “thinking” so much as mimicking thought like a Munch's Make-Believe Band of cogitation.
Though LLMs are blessed with instantaneous recall of a body of words and sentences and paragraphs of almost inconceivable size, the sort of mental gift any writer would sell her soul to possess, the answers they generate are as banal as anything ever conjured in the hearts of those responsible for the small print on the backs of medicine boxes. Ask ChatGPT to say something novel and surprising and it will indeed plug a bunch of words together in what it deems to be a statistically unlikely combination, but novelty and surprise require context, and the context LLMs bring to bear is vast and featureless, like a sea of gray porridge.
“As Gregor Samsa awoke one morning from uneasy dreams, he found himself transformed in his bed into a gigantic insect.” Franz Kafka wrote these words in Prague in 1912, a German speaker surrounded by Czechs, a Jew living among Christians in the ass-end of an empire that by the time of his arrival was well past its use-by date and indeed fated to collapse only six years after he loosed “The Metamorphosis,” announcing with brutal authority the arrival of modernism in literature. The sentence contains an unusual juxtaposition, sure—the sort of novel assembly of words ChatGPT could be persuaded to produce with sufficient goading and digital slapping of chops. But such an ersatz expression would convey none of Kafka’s alienation or his eerie precognition of the horrors of the coming century. It is, rather, a parlor trick, like throwing a handful of refrigerator magnets and getting a syntactically correct utterance.
This is not to say I don’t find it an impressive feat. I’ve spent plenty of time playing around with ChatGPT, trying to get it to rewrite pop songs in the style of Chaucer or to return proofs showing that 2 + 2 = 5. It’s cute. Is it useful?
That remains to be seen, but I am heartily sick of being told I’m in danger of falling behind if I don’t use an LLM for something. Use it for what? Based on the marketing I’ve come across, the chief purposes to which LLMs are being put are stuff like taking meeting notes or summarizing essays—things I like to call thinking. But wait, I’ve already noted that LLMs don’t think—they simply generate text. Don’t worry; it’s still true: thinking is what you are doing when you take notes. If you get ChatGPT to summarize an essay you’re too lazy to read yourself, what are you really getting? A load more text and a lot less thought.
This is obviously a bad thing for businesses that adopt these systems in a vain attempt to slough off some of the difficult work of understanding what it is that they are trying to do, but more importantly, it’s bad for all of us. Do we really believe that what we need right now, given the difficulty we face in evaluating all of the text currently flowing through our lives, is more text? Do we really need text generated at industrial scale?
I say “text,” but there’s another more comprehensive and pernicious word out there that makes more sense in this milieu: content. Content, that undifferentiated soup of data that fills our eyes and ears every day. It’s a perfect word for the glut of information that was already threatening to drown us before we started automating it—decontextualized, lacking any point of view, measured in bulk like cat litter. Is it any wonder we have such difficulty with truth these days? Finding truth in “content” is like trying to find fun-size chocolates in a sewer.
What could we do to make this state of affairs worse? Easy: make more content. My most excellent friend Ken recently alerted me to a story about Adobe selling AI-generated images of the Israel-Hamas war. I don’t know about you, but I think we have enough images of that open wound already, and we certainly don’t need machines generating fictions about it, thank you very much. But, according to the age-old logic of “progress,” that’s what we’re going to get, and we’re going to get so much of it that truth won’t possibly be able to compete.
It’s enough to make you want to do crack.
N.B.: The title of this piece refers to a work of installation art by the Belgian artist Wim Delvoye from 2002. The Cloaca Machine was a mechanical device that digested food—prepared by a gourmet chef employed by Delvoye—fed into one end and produced very convincing output at the other, which was then vacuum-sealed and sold to art collectors.