In 1959, Truman Capote appeared on a television show with Norman Mailer, who praised the Beat Generation writers, especially Jack Kerouac. Exasperated, Capote famously replied: “That’s not writing; that’s just typing.” Contemporary writers dismiss the growing language capabilities of machine learning and artificial intelligence in exactly the same way: their output could not possibly contain the vital spark. The dismissal is somewhat justified. Machines have found use mainly in utility communications (chatbots), glorified stenography (Otter.ai) and basic, sub-literary translation (Google Translate).
But through this summer, amid the stasis of the lockdown, one machine has been training hard at the billion-dollar OpenAI lab in San Francisco, backed by Elon Musk among others: GPT-3. This week, The Guardian commissioned GPT-3 to write an op-ed, and it produced an eminently readable opinion article, like the ones you see on this spread. It begins by playing harmless, assuring the human race that it would never try to harm or eliminate its makers, but ends with what appears to be a veiled threat. That variance reflects the amorality of technology. GPT-3 can write an article, a poem, a novel or a newspaper report. It can style it with CSS and publish it to the web. It can even write code in computer languages like Python to run on a web server. On the other hand, it can also be trained to churn out spam, lies, social engineering content and malware at nearly the speed of light, and pose a very real threat to civilisation.
But while editorialists may fear for their jobs, editors remain secure. GPT-3 actually produced five op-eds, from which human editors picked out sections to assemble the final version. The human factor is crucial: it dictated the flow of the article, and the slightly threatening ending suggests that some editors enjoy frightening readers.