ChatGPT Will Make Writers Lazy
Chatbots are a quantum leap from writing aids in the past, like a thesaurus, word processing, and spell check.
My parents were in the tech business, so I always had computers growing up. In the late 1980s, we had a Commodore 64, the original Macintosh (which was released the same year as my birth), and a pre-Windows PC running DOS, among other devices. I learned to type at a very young age, and that is a proficiency I keep to this day. (My parents thought it would be fun if I typed an electronic journal, like Doogie Howser.) I quickly realized that typing was, in many ways, superior to handwriting. One of my biggest frustrations with handwriting was the editing process. I would write a draft, review it for errors, and then have to rewrite the entire document from scratch--only to start over yet again if there were more errors. Even in elementary school, this iterative process struck me as a waste of time. With word processing, that entire process became unnecessary. I could make any necessary edits, and just click "print" again.
In 1995 or so, I had a disagreement with my fifth-grade teacher. He required us to write all of our papers by hand. I wanted to type my papers. He replied (and I'm paraphrasing three decades later), "When you grow up, you won't be able to type everything, so you will need to learn how to handwrite." I grumbled and acquiesced. Three decades later, my beloved teacher has been proven wrong. I almost never write by hand. I rarely touch a pen, other than to sign a receipt or write a greeting card.
Still, I acknowledge that my dependency on word processing had consequences. My penmanship was not very good. And because I typed from such an early age, it never became important to improve it. On my report card, penmanship was consistently my lowest grade--"Needs Improvement" on the old New York City public school rubric. I at least learned how to write in cursive--even if my penmanship was poor. And I can still read cursive script--a useful skill when reviewing archival documents. Likewise, I grew up with spell check. I never had to perfect my spelling, because I could always turn to that useful tool. (Readers of this blog can attest to my poor spelling.) Grammar check came around later, but--to this day--it is not very useful. And I came of age just as the internet was booming. Plus, we had the Microsoft Encarta electronic encyclopedia. Information was usually a few clicks away. My seventh-grade teacher criticized me for using only online resources for my research paper. Sensing a pattern? I never really needed to use actual books for research. To this day, I cannot find my way around a library. And so on.
Because of modern technology, I never developed skills that were essential in the past. I suppose I have managed okay, but I recognize these deficiencies.
This long wind-up brings me to the point of this post: I worry that ChatGPT, and similar AI tools, will make writers far too lazy. The impetus for this post was an Opinion column by Farhad Manjoo in the New York Times. He explains that ChatGPT has changed the way he performs his job as a writer.
First, Manjoo uses ChatGPT for "wordfinding."
Where [ChatGPT] does really help, though, is in digging up that perfect word or phrase you're having trouble summoning. . . I've spent many painful minutes of my life scouring my mind for the right word. ChatGPT is making that problem a thing of the past.
Selecting the right word is indeed one of the hardest parts of writing. For some high-profile op-eds, I can spend an hour thinking about the right word for an opening sentence. I consider so many factors: the length of the word; what sounds it makes (alliteration sometimes works, sometimes doesn't); whether I've used that word elsewhere in the piece; whether the word makes the sentence choppy or lets it flow; whether the word will be understood by my audience; and so on. Maybe ChatGPT can internalize all of those considerations. I'm skeptical. But relying on ChatGPT to find words will weaken our own ability to find them. Chatbots can only repeat what has been said before; AI cannot generate new ideas (yet, at least). The very nature of this technology means literary innovation will be stifled--even at the level of word selection.
Second, Manjoo uses ChatGPT to develop transitions:
Take the problem of transitions — you've written two sections of an article and you're struggling to write a paragraph taking the reader from one part to the other. Now you can plug both sections into ChatGPT and ask for its thoughts. ChatGPT's proposed transition probably won't be great, but even bad ideas can help in overcoming a block.
Transitions are extremely important. The best writers can guide a reader from the beginning to the end without the reader even realizing they are on a ride. And for people who skim, a transition can signal whether a paragraph needs to be read or can be skipped. Transitions must be smooth but instructive. Again, delegating this task to a chatbot will weaken the author's ability to connect with the reader and to make the reading experience inviting.
Third, Manjoo explains that ChatGPT can take the role of a breaking-news reporter.
When big, complicated news stories break — a court ruling, an earnings report, a politician's financial disclosure forms — editors and reporters often have to quickly determine the gist of the news to figure out how to cover it. ChatGPT excels at this sort of boiling down: Give it a long document and it will pull out big themes instantly and seemingly reliably.
This capability can be very useful for legal decisions. When an opinion comes out, editors immediately want to know what happened.
Carlson [an editor] used it this way when Donald Trump was indicted: He gave ChatGPT the charging documents and asked it for a 300-word summary. "I want a reporter to read the whole indictment and understand it extremely well," he told me, but in the moment of breaking news, Carlson just wanted the big picture. "It did it, and it was helpful," he said.
Back in the day, I would do an "instant analysis" of court decisions. As I read through an opinion, I would update the post; they were works in progress. Over time, I adopted a new policy: I (generally) will not write about a judicial decision until I have read it front to back. Reporters, alas, do not have that luxury. They are always on deadline and need to produce content right away. And we know reporters have bollixed that task before. For example, a few journalists completely blew NFIB v. Sebelius in 2012. So in this regard, ChatGPT provides a very valuable function. But again, a skillset is lost. Being able to quickly digest a document is very, very useful. And longtime Court watchers may spot things in an opinion, on a quick glance, that a chatbot will not. Lyle Denniston has recalled that he used to dictate an article by phone to his newsroom in Baltimore. If more journalists turn to ChatGPT, the quality of breaking-news analysis will ultimately suffer.
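For readers curious what the workflow Carlson describes actually looks like, here is a rough sketch, purely for illustration. The model name, the prompt wording, the file name, and the use of the OpenAI Python library are my own assumptions; nothing in Manjoo's column says how the newsroom did it.

```python
# Hypothetical sketch: feed a long filing to a chatbot and ask for a ~300-word
# summary. Model, prompt, and file name are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarize_filing(text: str, word_limit: int = 300) -> str:
    """Ask the model for a plain-English summary of a court document."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any capable chat model would do
        messages=[
            {"role": "system",
             "content": "You summarize legal documents for a general audience."},
            {"role": "user",
             "content": f"Summarize the following document in about "
                        f"{word_limit} words:\n\n{text}"},
        ],
    )
    return response.choices[0].message.content


# Usage (hypothetical file name):
# with open("indictment.txt") as f:
#     print(summarize_filing(f.read()))
```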
Manjoo lists other possible uses for ChatGPT:
There are so many other ways I can imagine ChatGPT being used in the news business: An editor could call on it to generate headline ideas. An audio producer could ask it for interview questions for a podcast guest. A reporter approaching a new topic might ask it to suggest five experts to talk to, to get up to speed.
I love writing headlines! I think carefully about length, concision, possible puns, and so on. Writing questions for a guest is a deeply personal task. Indeed, scripting questions will weaken the ability to generate follow-up questions on the fly. And developing a good network of sources depends on building human relationships. Offloading that task to a chatbot will weaken, in the long run, the connection between interviewers and interviewees.
Manjoo compares ChatGPT to an ever-present editor:
"As a writer I like getting an idea from an editor to rewrite till it's mine," Carlson told me. ChatGPT functions as that editor — your always available, spitballing friend.
I don't think this analogy works. Generally, an editor only sees a piece after the author has put all of the necessary effort into the writing. I don't send my editors a half-baked essay, asking which words I can use. Treating ChatGPT as an always-available editor bypasses the usual pre-submission labor. Simply put, it allows the author to slack off and avoid the difficult process of writing.
***
People often ask me how I can write so much. My response is fairly consistent: writing is like a muscle that you have to exercise. The more you write, the quicker you can write. Eliminating the need to pick out words, or develop transitions, or digest new material will weaken writers.
I apologize if I sound like a Luddite. I am in the education business, and worry about the writing skills of students today. What will the writing skills of students look like in a decade when chatbots become fully integrated into pedagogy? From a meta perspective, I worry that the next generation will lose many of the essential skills needed to write on their own. We will become dependent on chatbots.
I'll admit, I was curious about this new technology for a bit, but I never actually used it for my own work. Chatbots are a quantum leap from innovations in the past, like a thesaurus, word processing, and spell check. But it's not for me. I take a lot of pride in my writing. I will not let students or research assistants generate prose, in large part to avoid becoming lazy. But I may be an outlier. This technology will be too appealing for writers and professors. Perhaps, at first, they'll use it to develop ideas, to brainstorm, or to help refine writing. But over time, there will be less and less original content. Every time writers load the app, they increase their dependency on this technology.