Chatbot vs. Writer: Vauhini Vara on the Perils and Possibilities of Artificial Intelligence
In Conversation with Whitney Terrell and V.V. Ganeshananthan on Fiction/Non/Fiction
Novelist and journalist Vauhini Vara joins V.V. Ganeshananthan and Whitney Terrell to discuss how ChatGPT, the artificial intelligence chatbot developed by OpenAI, may—or may not—impact publishing, education, journalism, and the humanities in general. Vara explains differences between ChatGPT and another OpenAI tool, GPT-3, which she used as a way into writing about the death of her sister, a topic she had previously found unapproachable. She reads from the resulting essay, “Ghosts,” which was published by The Believer and anthologized in Best American Essays 2022.
Check out video excerpts from our interviews at LitHub’s Virtual Book Channel, Fiction/Non/Fiction’s YouTube Channel, and our website. This podcast is produced by Anne Kniggendorf.
*
From the episode:
V.V. Ganeshananthan: Can you talk, for our listeners, about how GPT-3 operates? And if you’ve used ChatGPT—which is the free bot that was released in late November—can you talk about how those two things compare?
Vauhini Vara: Yes. OpenAI, this lab in Silicon Valley—co-founded by Elon Musk, actually—developed this algorithm that can deal with text. So this algorithm is trained on all of this text available in the world, stuff like… well, we don’t know exactly. But stuff like self-published novels and random information from the internet and chat conversations that so-called “trainers” have created to help these algorithms understand how to work. Basically, these algorithms are shown a bunch of text in order to understand how text functions.
GPT-3 is a version of what OpenAI created a couple of years ago. The way I used it was through something called the Playground, created by OpenAI. It was not open to the public like ChatGPT is—you could use it by invitation. So I got an invitation to use this thing called the Playground. And the way it worked was that you could type some text and press a button, and that text would be completed for you; the idea is that it would be completed in kind of a similar style. ChatGPT came out more than a year later, just recently.
And that is a different product, a different tool that’s based on some of the same modeling. It’s trained in a similar way, but it has further training that teaches it specifically how to have chatbot-style conversations. So you can’t actually use it exactly the same way I used the Playground to generate text; you can ask it to do that, but it’s not going to be as effective at that as the other tool was for me.
VVG: We were talking about this a little bit at the top of the show—I signed up for a ChatGPT account last night and was trying to get it to complete text for me, and then I realized that’s not what it does. So then I started asking it increasingly specific questions. It was interesting, because in a weird way… what I was doing unintentionally, I realized when I looked back at your essay, was kind of the same thing that you were doing. I was giving it more and more information, because my questions were wrong.
At first, I wanted to write the top-of-the-show banter using ChatGPT, so I was trying to banter with it. And it was like, “I do not do that.” And then I was like, “I want to do a show about artificial intelligence,” and it renamed us “The Pen and the Machine.” That’s our new name if every episode were about this. Then I was like, “No, I want to write banter for Fiction/Non/Fiction.”
VV: But did it eventually work?
VVG: It kind of eventually worked. But Whitney and I ended up sounding, in this banter, like… congenial robots. Just very… “and for whomever might be interested in the world of intelligence and literature, sail forward with us into this unknown!” You know, it’s very… as you said when you were talking about “Ghosts” on, I think, NPR, “If I gave it a cliche, it gave me a cliche back.” I felt like my experience was mirroring that. Anyway, let’s talk about “Ghosts,” because that was sort of what happened.
Whitney Terrell: Well, I wanted to first say that I thought that essay was beautiful. I’m sorry that your sister died. I thought it was an amazing way of expressing grief. We’re going to talk about the technology of the essay, but I thought the emotional part of the essay was the part that really landed with readers. We just did an episode about my old friend Russell Banks, who died. Death is difficult to deal with. I thought that it was an excellent essay about death and loss.
But when you were writing it, as Sugi was saying, you kept feeding more and more specific prompts into GPT-3. And your final essay features nine mini-stories, each with more of your own writing and less from artificial intelligence. It’s a really beautiful and strange progression. As you write in the essay, inconsistencies and untruths appear. But it started off with you writing just a line about your sister’s death, which you hadn’t written before that. When you wrote that first line, what did you expect the chatbot to do, and what did it really do? And how did this response change what you were able to do next?
VV: So I didn’t know that I was writing an essay, I should say, to start. I had access to this technology, and as I was playing around with it, I started to understand that what this technology was promising to do was to help people write what they couldn’t figure out how to write on their own. And so I started thinking about what it was that I had never figured out how to write on my own.
There are a number of things, but for me, the most profound thing, probably, is the death of my sister. It’s something that continues to be hard for me to write about, it continues to be hard for me to talk about. I had my writer-mind turned on less than my person-mind, right?
When I went into Playground and wrote a line, just saying that my sister had been diagnosed with this form of cancer, I left it there and allowed GPT-3 to continue. The text that it provided was about somebody—not me, a fictional character, really—whose sister had gotten sick and then got better and everything was great. We were all fine and happy in the end. And reading that was really difficult, actually. I read it and was like, “Oh, God, that’s not what happened to me. This isn’t right at all.”
And so—again, less as a writer and more as just a person—I was like, “Okay, I think I need to say more in order for this algorithm to ‘understand’ where I’m trying to go.” And so, gradually, I told it more and more, and I realized as I did—I didn’t know it from the outset—that because these algorithms are built in such a way that they can respond to what we give them, the more I gave this algorithm, the more it was able to essentially mimic my voice and even my emotions, the way I was really feeling. And when it started to do that, when it started to do the latter, it was really spooky for me. I mean, it was quite accurate often… Sorry. It still makes me emotional. There were times when the algorithm was able to make statements, write things that I thought were really profound characterizations of what my experience was like.
Selected Readings:
• The Immortal King Rao
• Ghosts
Others:
• Review: ‘The Immortal King Rao,’ by Vauhini Vara – The New York Times
• Meet GPT-3. It Has Learned to Code (and Blog and Argue). – The New York Times
• Is This the Start of an AI Takeover? – The Atlantic
• AI Has Come to Save the Arts from Themselves – The Washington Post
• AI Reveals the Most Human Parts of Writing – WIRED
• Don’t Ban ChatGPT in Schools. Teach With It. – The New York Times
• Microsoft Bets Big on the Creator of ChatGPT in Race to Dominate A.I. – The New York Times
• ChatGPT: Optimizing Language Models for Dialogue
• Fiction/Non/Fiction Season 6, Episode 16: In Memory of Russell Banks: Rick Moody on an Iconic Writer’s Life, Work and Legacy
• W.B. Yeats’ Autoscript
• ‘Even the spirits get a say’: A Look Into James Merrill’s Ouija Poems by Harriet Staff
• Exquisite Corpse poetry
• Will Artificial Intelligence Kill College Writing? Online programs can churn out decent papers on the cheap. What now? – The Chronicle of Higher Education
• Fiction/Non/Fiction Season 3, Episode 25: No Innocents Abroad: Scott Anderson and Andrew Altschul on the CIA and US Provocateurs in Foreign Politics