The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
ChatGPT Will Make Writers Lazy
Chatbots are a quantum leap from writing aids in the past, like a thesaurus, word processing, and spell check.
My parents were in the tech business, so I always had computers growing up. In the late 1980s, we had a Commodore 64, the original Macintosh (which was released the same year as my birth), and a pre-Windows PC running DOS, among other devices. I learned to type at a very young age, and that is a proficiency I keep to this day. (My parents thought it would be fun if I typed an electronic journal, like Doogie Howser.) I quickly realized how typing was, in many ways, superior to handwriting. One of my biggest frustrations with handwriting was the editing process. I would write a draft, then review it for errors, then have to rewrite the entire document from scratch, only to rewrite it yet again if there were more errors. Even in elementary school, this iterative process struck me as a waste of time. With word processing, that entire process became unnecessary. I could make any necessary edits, and just click "print" again.
In 1995 or so, I had a disagreement with my fifth grade teacher. He required us to write all of our papers by hand. I wanted to type my papers. He replied (and I'm paraphrasing), "When you grow up, you won't be able to type everything, so you will need to learn how to handwrite." I grumbled, and acquiesced. Three decades later, my beloved teacher has been proven wrong. I almost never handwrite. I rarely touch a pen, other than to sign a receipt or write a greeting card.
Still, I acknowledge my dependency on word processing had consequences. My penmanship was not very good. And because I typed from such an early age, it never became important to improve my penmanship. On my report card, penmanship was consistently my lowest grade--"Needs Improvement" on the old New York City public school rubric. I at least learned how to write in cursive--even if my penmanship was poor. And, I can still read cursive script--a useful skill when reviewing archival documents. Likewise, I grew up with spellcheck. I never had to perfect my spelling, because I could always turn to that useful tool. (Readers of this blog will attest to my poor spelling). Grammar check came around later, but--to this day--it is not very useful. And I came of age just as the internet was booming. Plus we had Microsoft Encarta electronic encyclopedias. Information was usually a few clicks away. My seventh grade teacher criticized me for only using online resources for my research paper. Sensing a pattern? I never really needed to use actual books for research. To this day, I cannot find my way around a library. And so on.
Because of modern technology, I never developed skills that were essential in the past. I suppose I have managed okay, but I recognize these deficiencies.
This long-wind up brings me to the point of this post: I worry that ChatGPT, and similar AI, will make writers far too lazy. The impetus of this post was an Opinion column by Farhad Manjoo in the New York Times. He explains that ChatGPT has changed the way he performs his job as a writer.
First, Manjoo uses ChatGPT for "wordfinding."
Where [ChatGPT] does really help, though, is in digging up that perfect word or phrase you're having trouble summoning. . . I've spent many painful minutes of my life scouring my mind for the right word. ChatGPT is making that problem a thing of the past.
Selecting the right word is indeed one of the hardest parts of writing. For some high-profile op-eds, I can spend an hour thinking about the right word for an opening sentence. I consider so many factors: the length of the word; what sounds it makes (alliteration sometimes works, sometimes doesn't); whether I've used that word elsewhere in the piece; whether the word makes the sentence choppy or flows nicely; whether the word will be understood by my audience; and so on. Maybe ChatGPT can internalize all of those functions. I'm skeptical. But relying on ChatGPT to find words will weaken our ability to find words. Chatbots can only repeat what has been said before; AI cannot generate new ideas (yet, at least). The very nature of this technology means literary innovation will be stifled--even with word selection.
Second, Manjoo uses ChatGPT to develop transitions:
Take the problem of transitions — you've written two sections of an article and you're struggling to write a paragraph taking the reader from one part to the other. Now you can plug both sections into ChatGPT and ask for its thoughts. ChatGPT's proposed transition probably won't be great, but even bad ideas can help in overcoming a block.
Transitions are extremely important. The best writers can guide a reader from the beginning to the end without the reader even realizing they are on a ride. And, for people who skim, a transition can indicate whether a paragraph needs to be read or skipped. Transitions must be smooth, but instructive. Again, delegating this task to a chatbot will weaken the author's ability to connect with the reader and to make the reading experience inviting.
Third, Manjoo explains that ChatGPT can take the role of a breaking-news reporter.
When big, complicated news stories break — a court ruling, an earnings report, a politician's financial disclosure forms — editors and reporters often have to quickly determine the gist of the news to figure out how to cover it. ChatGPT excels at this sort of boiling down: Give it a long document and it will pull out big themes instantly and seemingly reliably.
This capability can be very useful for legal decisions. When an opinion comes out, editors immediately want to know what happened.
Carlson [an editor] used it this way when Donald Trump was indicted: He gave ChatGPT the charging documents and asked it for a 300-word summary. "I want a reporter to read the whole indictment and understand it extremely well," he told me, but in the moment of breaking news, Carlson just wanted the big picture. "It did it, and it was helpful," he said.
Back in the day, I would do an "instant analysis" of court decisions. As I read through an opinion, I would update the post. They would be works in progress. Over time, I adopted a new policy: I (generally) will not write about a judicial decision till I have read it front to back. Reporters, alas, do not have that luxury. They are always on deadline, and need to produce content right away. And we know reporters have bollixed that task before. For example, a few journalists completely blew NFIB v. Sebelius in 2012. So in this regard, ChatGPT provides a very valuable function. But again, a skillset is lost. Being able to quickly digest a document is very, very useful. And longtime Court watchers may spot things in an opinion, on quick glance, that a chatbot will not. Lyle Denniston has recalled that he used to dictate an article by phone to his newsroom in Baltimore. If more journalists turn to ChatGPT, the quality of breaking news analysis will ultimately suffer.
Manjoo lists other possible uses for ChatGPT:
There are so many other ways I can imagine ChatGPT being used in the news business: An editor could call on it to generate headline ideas. An audio producer could ask it for interview questions for a podcast guest. A reporter approaching a new topic might ask it to suggest five experts to talk to, to get up to speed.
I love writing headlines! I think carefully about length, concision, possible puns, and so on. Writing questions for a guest is a deeply personal task. Indeed, scripting questions will weaken the ability to generate follow-up questions on the fly. And developing a good network of sources relies on developing human relations. Offloading that task to a chatbot will weaken, in the long run, the connections between interviewers and interviewees.
Manjoo compares ChatGPT to an ever-present editor:
"As a writer I like getting an idea from an editor to rewrite till it's mine," Carlson told me. ChatGPT functions as that editor — your always available, spitballing friend.
I don't think this analogy works. Generally, an editor only sees a piece after the author has put all of the necessary effort into the writing. I don't send my editors a half-baked essay, asking which words I can use. Treating ChatGPT as an ever-present editor bypasses the usual pre-submission labor. Simply put, it allows the author to slack off, and avoid the difficult process of writing.
***
People often ask me how I can write so much. My response is fairly consistent: writing is like a muscle, which you have to exercise. The more you write, the quicker you can write. Eliminating the need to pick out words, or develop transitions, or digest new material, will weaken writers.
I apologize if I sound like a Luddite. I am in the education business, and worry about the writing skills of students today. What will the writing skills of students look like in a decade when chatbots become fully integrated into pedagogy? From a meta perspective, I worry that the next generation will lose many of the essential skills needed to write on their own. We will become dependent on chatbots.
I'll admit, I was curious about this new technology for a bit, but I never actually used it for my own work. Chatbots are a quantum leap from innovations in the past, like a thesaurus, word processing, and spell check. But it's not for me. I take a lot of pride in my writing. I will not let students or research assistants generate prose, in large part, to avoid becoming lazy. But I may be an outlier. This technology will be too appealing for writers, and professors. Perhaps, at first, they'll use it to develop ideas, to brainstorm, or to help refine their writing. But over time, there will be less and less original content. Every time writers load the app, they increase their dependency on this technology.
Editor's Note: We invite comments and request that they be civil and on-topic. We do not moderate or assume any responsibility for comments, which are owned by the readers who post them. Comments do not represent the views of Reason.com or Reason Foundation. We reserve the right to delete any comment for any reason at any time. Comments may only be edited within 5 minutes of posting. Report abuses.
Sure, just like TV coming along made playwrights lazy.
It's going to be a new subdiscipline in the medium, with different skills required. Coming out to pre-hate it is pretty silly.
I was too lazy to take the time to explain in detail why that seems like a strained analogy, so I just asked ChatGPT and skillfully pasted in its answer:
Yeah, I'm saying TV changed the medium of writing in some areas, because it was new. So too with AI.
I'm saying nothing specific to text versus visual, only that this is a new thing and will be used for some writing and not others, and projecting and crying about tech changes to the medium is something that has happened before, and made no one lazy.
I don't see that analogy. More like arguing that the advent of multiple law clerks made judges lazy. In actuality, what that did (along with technology like typewriters and then computers) was enable judges to issue more and (much) longer opinions. And that is neither an unmitigated good nor bad thing.
I don't think it at all likely AI will be used by all writers, to do all writing tasks. It's a new style of writing, and will show up in some instances and not others.
Nobody needs writing skills anymore.
Just cut and paste the dictated narrative and have a cup of coffee.
end of snark; serious question begins:
Should 'writers' have to cite GPT when it is used for any (or all) of an article?
If they're using GPT text, they need to list it as a co-author on the article byline.
If they're using ideas or organization GPT suggested, thank GPT in the acknowledgments section.
If they're citing facts GPT found for them, no. We don't cite Google or research assistants in articles; we cite the original sources Google or the human assistant found for us.
Indeed, until you actually consult ChatGPT's cited sources for what it claims are facts, you have no idea if they even exist. So a cite to ChatGPT would (should!) actually have negative credibility.
Those automatic pin-setters will ruin Bowling!!!!!!!!!!!!!
My "Penmanship's" always sucked (try writing left handed sometime: you smudge and can't see what you already wrote, or you have to arc your forearm/wrist around unnaturally like some James Bond Villain). Anesthesia records finally have gone electronic for the most part (oddly the few paper records are in the "Reverend" Sandusky's "Better" Blue Cities).
Learned to type in 7th grade, no computers (does playing "Pong" count?) but Female/Male ratio was like 30:2, damn homo teacher put us in the front row.
Frank
By putting you in the front row, he was doing you a favor. All the girls stared at you all day, so many would like you, and you'd have a wide choice.
I am in the education business, and worry about the writing skills of students today...From a meta perspective, I worry that the next generation will lose many of the essential skills needed to write on their own. We will become dependent on chatbots.
I am in the business world, have hiring/firing authority, and I will simply say the writing skills of candidates for entry-level consultant positions has declined. I speak from the perspective of 20+ years in mgmt, and evaluating different candidates over that time period. The critical thinking and reasoning part of writing is not what it once was for job candidates. This is what I am seeing today.
Before you laugh at the bolded part, think about it. We could well become dependent on chatbot/AI technology, and that has pretty big implications for how we live our lives. Why do I need a lawyer when I can have a miniature HAL 9000 AI unit on the desk next to me making the argument on my behalf? Is that really so far away, considering the exponential nature of AI technology?
"I am in the business world, have hiring/firing authority, and I will simply say the writing skills of candidates for entry-level consultant positions has declined."
Why doesn’t US business actively support education reform?
Is it because you have all decided you can always get enough workers who grew up and went to primary school in Asia? Fear of unions and government? Something else?
Every single person in the US supports education reform, even the teachers' unions. We just disagree on a few details about the implementation.
You don’t have to take every question as an invitation to engage in a bullshitting competition.
I said "actively support". Most businesses are completely silent on K-12 education. The question is why.
No, it's because education reform is really hard. And they'll just spend the money on DEI cause that's easier, and they'll claim that's the root of the problem that needs to be addressed first.
You got school districts like Baltimore that are spending 16k per student per year. That's 320k per year for a 20 student class, and not a single one of them can read or do math at grade level.
Business supporting education reform is a joke; politicians and education officials will just say "support more funding for us and we'll do the rest," and pour the money down a rathole that's already flooded with money.
You’re saying businesses have given up on American K-12 education. That’s probably a big part of the answer.
Many (most?) people have given up on it — even parents have given up in general, moving to a few select areas where the schools are still ok and leaving the rest to rot.
Businesses are going to need more than a few select employees in the future though. A family can find a way to escape the systemic failures for 2 kids. Businesses need hundreds or thousands of workers.
“I am in the business world, have hiring/firing authority, and I will simply say the writing skills of candidates for entry-level consultant positions has declined. I speak from the perspective of 20+ years in mgmt, and evaluating different candidates over that time period. The critical thinking and reasoning part of writing is not what it once was for job candidates. This is what I am seeing today.”
First, a caveat that I give in my teacher education courses. To keep the math simple, let's say you interview 100 applicants a year and the economy has been stable over the past 20 years, so you get a constant 1,000 applications a year. (I.e., there aren't recessions when you'll get better qualified candidates who have been laid off.)
The first year, there is a 100% chance that one of those candidates will have the best writing you have ever seen — as it’s the *only* applicant writing you have ever seen. The next year there is a 50% chance as your population sample is now 200. Twenty years out, there is a very small chance that you will see someone’s writing that is better than all the applicants you have seen in the past.
And that's if you are objective, which you can never be. There's Joe down the hallway who will probably get your job when you retire — you are thinking of the memo he wrote last week and not what he wrote 20 years ago when you hired him. And then there's Mary who gave up a promising career to "do the mommy thing" and you're remembering the card she sent you about the medieval castle she'd toured last month and not what she'd written when you hired her.
And you don’t remember the writing of all the people you’ve fired over the years. This is human nature.
That said, writing HAS gotten worse over the past 30 years, and there are two reasons for it.
First is politics and the decision, circa 1990, to use college freshman writing courses to promote social justice politics. This distracted from the basic goal of teaching writing, and with grades being the goal, one could merely mirror the approved values instead of learning to express them well.
You don't learn critical thinking & reasoning when both have already been done for you and you are instead graded on your ability to parrot them back.
Second is what has happened to our college English departments over the past 40 years, and the introduction of critical theory. Instead of teaching Shakespeare as Shakespeare, it’s now being taught as White Male oppression, or how Shakespeare was actually gay, or how the shape of his stage affects his plays. Without going too deeply into the weeds, you have professors who can correctly use a compound-complex sentence trying to teach that to students who can’t manage agreement between subject and verb.
We’ve replaced diagramming sentences with “write like you talk”, with understandably disastrous results. And then there are the pronouns — a singular object is either “he”, “she”, or “it.” It comes from the Latin, and if you are a “they”, you are schizophrenic because you believe that you are multiple people.
We don't teach the basic five paragraph essay anymore because it is too Eurocentric or something — I've given up trying to understand this stuff…
It’s going to take business saying “enough of this foolishness — teach basic writing if you want us to hire your grads” to end this. And as to AI, you still are going to need the competent human to review what the AI has written. Ask legal counsel why this will be a very good idea…
"…you still are going to need the competent human to review what the AI has written. Ask legal counsel why this will be a very good idea…"
Just like you need someone to review human-generated content, for the same reasons.
It’s completely fine (awesome, in fact) if it only replaces 30-75% of the human efforts.
There is some evidence handwriting helps with writing, especially in younger children, who may miss a window of learning.
It's not even that -- handwriting involves a different part of the brain than typing, and there is considerable evidence that taking handwritten notes will enable one to learn material better.
"I at least learned how to write in cursive–even if my penmanship was poor. And, I can still read cursive script–a useful skill when reviewing archival documents."
The latter is most important -- imagine the consequences of a society where archival documents are unreadable.
Software will be able to read them and reformat them. It’s not an issue.
I don't trust software -- it's why Biblical scholars learn Hebrew and Aramaic. It's why Muslims learn Arabic -- there is no authorized translation of the Koran.
And now the typing horse displays his ignorance of the difference between language and writing.
"there is no authorized translation of the Koran"
Absolutely untrue. Just like with other 'bibles', anyone can claim their version is 'authorised'.
https://www.masjidtucson.org/quran/noframes/
The 'authorised' Christian bible is just a translation that was approved by an egotistical (and syphilitic) king of England.
While I can’t speak for the typing horse (of course of course), dollars to donuts what was meant is that practicing Muslims are expected to read the Quran in Arabic, as they believe that any translation from Arabic removes the “holiness” of the original text. Thus, there is no authorized translation of the Quran, for which a Muslim may eschew the original Arabic.
I would say that you should know that, but I don’t expect that much from davedave the dumbdumb.
Good for you. The rest of us have stuff to get done.
If we cannot read about the past, we are doomed to relive it without being aware we are repeating it?
"imagine the consequences of a society where archival documents are unreadable."
The typing horse once again demonstrates there are no limits to his ignorance. It has been a problem since the very first records were created. It's why we have archivists.
Yeah, it certainly does. When I was in college I always took notes in class and almost never reviewed them. I didn't have to: by taking notes I had already absorbed the information, at least deeply enough that I would still know it through the final.
Lazy reader complains about lazy writing...
Writing stuff uses different pathways in the brain than saying stuff. In Organic Chemistry/Anatomy/Biochemistry/Flight training I'd write or draw out the reactions, formulas, and emergency procedures, and say them at the same time. Amazing how many of today's doctors wouldn't recognize a Chorda Tympani if you slapped them in the face with one (OK, the Chorda Tympani is pretty small).
Frank "Yes, I am talking to myself as a matter of fact"
Different PARTS of the brain as well.
There are two skill sets, one old, one new, arising out of these AIs. The old one is the ability to edit other people's material - and ChatGPT definitely needs its output to be edited. Few people have that skill, though most are aware of it.
The second skill set is the ability to spec the question you ask it. The more relevant detail you supply in a question, the better the output, as with Google searches — I am sure we all have friends and family who seem unable even to run a Google search properly. (I have a friend who complained that when she was looking for a lawyer in Arizona, Google was useless as it had 39 million results when she entered "Arizona" and "lawyer" as the two search terms.)
IMO, learning to write well, or competently, anyway, helps with another critical skill - reading comprehension.
That seems to me to have declined significantly. I hope ChatGPT won't make it worse.
How many times have you sent someone a question via email and gotten back a wholly non-responsive answer?
"How many times have you sent someone a question via email and gotten back a wholly non-responsive answer?"
I prefer strawberry milkshakes, myself.
I prefer strawberry milkshakes, myself.
Does your milkshake, in fact, bring all the boys to the yard?
I suspect, but have no way of confirming, that the evolution towards shorter forms of written communication, often engaged with while distracted by multiple other stimuli - e.g., texting, tweeting, etc. - has taught people to skim-read. It definitely happens in my experience that I get responses from people where it seems like they're not responding to what I say, but to keywords I've used, somehow.
Effectively communicating with these people often requires direct, short queries and responses.
That is likely at least partly true. Another factor relates to the actual hardware display.
Many people only read the part of an email that appears on their phone screen, or on the preview screen in the email program. Consequently they reply only to that part of the message.
I have done this so many times that I think it is some sort of mental/physiological mechanism, rather than just laziness. Something tells you that you are done reading, and you don't think to scroll down to see if there is more to the message.
10 yo JB - I understand that new technologies lead to the development of new skills rather than making people lazy
40 (?) yo JB - I don't understand that new technologies lead to the development of new skills rather than making people lazy
I use ChatGPT to improve my writing. For example, sometimes I write things in an awkward way. Asking ChatGPT for suggestions on how to rewrite it not only improves that particular writing, but enables me to recognize new patterns of expression I can incorporate into my writing.
There is an old saying about how I would have used fewer words, if only I had the time. With ChatGPT, you do have the time.
My suggestion for Blackman would be to ask ChatGPT for help shortening his writing, just as I sometimes do. If you can express complex ideas in fewer words, that can not only save time for the reader, but lead to writing that is more impactful.
Jesus Christ.
Josh, you are not a good writer, you are not a good reader, and you are not a good editor. You have nothing worthwhile to say on this topic.
I mean, just look at this bullshit.
Four paragraphs of personal storytelling, to lead into this graf:
Is this what you consider to be a "good transition"? Is there a reason you retained an introduction that you yourself describe as a "long-wind [sic] up"? This paragraph itself stop-starts to the putative "transition": you say, "this long-wind up" brings you to the point of the post, but also the impetus of the post is some NYT op-ed.
God.
This post evidently must not rank, then, since you open a piece about how "ChatGPT will make writers lazy" with the following sentence: "My parents were in the tech business, so I always had computers growing up." So the fuck what? Were you hoping to demonstrate by your very first sentence that you, yourself, are already an abysmally lazy writer, who risks becoming even lazier, were you to start using ChatGPT?
This must explain why reading your posts feels like a death march.
Yes, we've seen the quality of your analysis. ChatGPT might help, actually.
Is this guy for real?
No, people ask you why you write so much. We all understand that it's perfectly easy to bloviate at length on any passing concern, given sufficient free time (of which you evidently have too much). But we also recognize that so little of what you say has merit. Your analyses are shoddy, half your writing about the Court is scurrilous, virtually everything you write is twice as long as it needs to be and relies unnecessarily on personal anecdotes.
You could have spent twice as long working on this post, written half as much, and produced a much better post. But you don't do that. You don't stop to think, you don't stop to edit. You just vomit out whatever's on your mind, hit post, and pat yourself on the back, like the VC and Newsweek are your own personal blog.
Jesus Christ, Josh.
"No, people ask you why you write so much"
Arf. Bit unfair, though. When you have a skill you're really, really proud of acquiring, something no-one ever thought you'd be able to do, you want to show it off.
He started off the sentence with something that is probably true, though.
"People often ask me how I can write"
Is the answer that he had such a high intake of lead as a child that he can sharpen his finger and use it as a pencil?
Josh isn't prolific because he's a good writer, or because he's proud of being a good writer. He's prolific because he recognizes that it's his desired path to career success. Write frequently, write everywhere, write at length - get yourself noticed, cited, promoted. It's the MTG effect. Doesn't matter what you say.
Take half the stuff he writes here - hell, more than half. Most of it's underbaked. Maybe some of it will play well, some of it he'll develop or "refine" later; he builds on that stuff. But he leaves everything else to disappear in obscurity, like food scraps becoming mulch. I've probably forgotten more of what Josh has said than I remember of what Jonathan, Will, or Orin have said - but their writing has actually improved my thinking or understanding, while Josh's writing just wastes my time.
Good writers take their time, and publish only what's survived rounds of revision.
I didn't say he's a _good_ writer. Like the dancing bear, the wonder is that he does it at all.
You are being very unfair expecting actual discernment from Blackman, let alone in comparing him to MTG's deliberate and cynical manipulations. He isn't doing it on purpose; he is just dimmer than a candle in the rain.
A: It's graphite, not lead.
B: A lead chip the size of your fingernail, if ingested, would be lethal.
Just sayin...
As usual, the typing horse can only display his ignorance. Graphite is called lead in that context for historical reasons, hence the joke.
And no, swallowing a 'lead chip' obviously won't kill you. An actual lethal dosage of lead, if it is entirely absorbed, would be about 35-40 grams for a typical adult (whereas a fingernail sized 'chip' would be perhaps a tenth of that at most) - but of course swallowing a piece of lead leads to shitting out an insignificantly smaller piece of lead fairly soon after.
Dr. Ed 2, legend in his own mind,
love to see what he thinks would happen if you swallowed Mercury from a Thermometer (nothing, elemental Mercury is non-toxic, not that I go around swallowing elemental Mercury). Ish-yew came up when a Corpse-man swallowed a glob of elemental Mercury as a goof, one of the other Docs freaked out, calling 911, until I looked it up in the "Poisindex" (in the days before AlGore invented the Internets we would actually look up stuff in these inventions they called "Books")
Somehow, most people think elemental Mercury is just a little less deadly than Plutonium (don't swallow elemental Plutonium)
Frank
Dude. Go touch grass or something. These nearly daily repetitive, angry rants are pointless beyond demonstrating the degree to which you're letting him live rent-free in your head.
Here, let me tell you my life story, as a segue into explaining why I do what I do...
Did you also notice the telltale sign of narcissism in his post?
Take note of how many times he referred to himself.
I blah blah blah and I blah blah. I blabbity blah. I blah.
Pathetic.
Get over yourself Simon, that's perfectly fine writing, especially for a blog post.
"You could have spent twice as long working on this post, written half as much, and produced a much better post. But you don’t do that. You don’t stop to think, you don’t stop to edit. You just vomit out whatever’s on your mind, hit post, and pat yourself on the back, like the VC and Newsweek are your own personal blog."
Physician heal thyself.
Anyone who's published as many books, op-eds, and law review articles as Josh hardly needs your tips.
And the whole point of blog writing is to get it down quickly with minimal editing, and move on.
“Anyone who’s published as many books and OpEds and Law review Journals as Josh hardly needs your tips.”
You are confusing quantity with quality.
I am unsurprised that someone like you would mistakenly defend the self-centered bloviating Blackman offers up as insight.
"Because of modern technology, I never developed skills that were essential in the past."
Like concision?
People facing competition always get lazy, right?
"Because of modern technology, I never developed skills that were essential in the past."
Umm, pretty sure they had Dorks before modern technology, and why does every generation think they're the first to enjoy "technology" ?? There was a greater % of the population who flew their own airplane in the 1950's than today (lots of WW2 Vets, cheap gas, and private airplanes didn't cost as much as a late model Porsche) Of course why fly an actual airplane that can crash, and costs money, when you can just get MS Flight Simulator....
Josh Blackman, Irina Manta, Eugene Volokh.
Quite a crew.
They're only missing Shemp
An industry of craftsmen telling themselves no one will want mass-produced versions of their work.
Look forward to a future where undistinguished writing and reasoning have a lot less value and only the truly excellent can command premium salaries. The rest will be outclassed by machines. It will be a huge win for the general public.
Good point.
Prof. Blackman, I don’t use the word “hero” very often. But you are the greatest hero in American history.
"Over time, I adopted a new policy: I (generally) will not write about a judicial decision till I have read it front to back."
Lots of times I do it in a bar with the lights turned off, but I get the drift.
Socrates wrote some 2,400 years ago that writing would destroy memory and weaken the mind.
Odd how we're still having basically the same conversation. Was Socrates right? Should we get rid of not just ChatGPT, not just typewriters, but writing entirely? If not, what makes this different?
"Just because there's a quote with my name on it on the Internet, doesn't mean I said it." /Socrates
The Margrave of Azilia : ” … Internet, doesn’t mean I said it./Socrates….”
In the dialog Phaedrus, Socrates (Plato) tells a long story about the Egyptian King Thamus and his interrogation of the god Thoth (for some reason spelled Theuth). At the end of the speech, his dialog companion – the eponymous Phaedrus – sneers “you easily make up stories of Egypt or any country you please.”
Clearly Phaedrus was used to the ways of the internet! However, Socrates’ story & the following back and forth discussion all concerned Socrates’ (Plato's) belief that writing destroys memory and weakens the mind – just like Rossami says.
You can look it up…..
Just a thought, as someone who is not currently in the business of interviewing, vetting, or otherwise passing judgment on new hires (but once was):
Give the candidate an empty fountain pen and a bottle of ink, and ask him or her to write a sentence describing why they want the job at hand.
If you play silly games at interview, all the candidates you actually want to employ will say some variant on 'sorry, seems like this job isn't a good fit for me' and leave.
I'll tell you more: ChatGPT will make not only writers lazy, but people in other professions too. For example, I'm a designer, and I used AI to generate quantum images for one project. It was easy and fast; the main thing is knowing the right prompts, and the rest is up to you. With such capabilities, in the future some professions will disappear, or if not the professions themselves, then the human element in them.
I fully agree. Chatbots are certainly something incredible, but they both benefit workers and threaten their jobs, though that is mostly a concern for the future, so let's not dwell on the bad for now. For example, I provide a video transcription service, and with the advent of AI my work has become twice as easy; I have cut the time it takes by almost half. It's just a miracle. You should have seen how happy I was at that moment, and still am. Nonetheless, chatbots, and AI in general, will take away some work, but the benefits they bring are incalculable.
Yes, you're right.
So cool topic
chatgpt deutsch is truly a breakthrough in many fields. I believe that within the next 5 years, AI will be included in many more fields.