AI Can Do Paperwork Doctors Hate
With help from artificial intelligence, doctors can focus on patients.


Everyone dreads doctor's appointments. After navigating a maze of scheduling hurdles, patients find themselves waiting 26 days on average to see a physician in major cities. The whole ordeal culminates with patients twiddling their thumbs in office waiting rooms for 20 minutes (on a good day) before getting to see the doctor for 10 minutes (on a good day).
Luckily, artificial intelligence is already starting to solve some of these logistical headaches to improve both a doctor's productivity and a patient's experience. Companies such as Nabla and Glass Health have created products that utilize AI to help doctors give the type of care they know patients want and deserve.
The burden of medical record keeping has always fallen to physicians. When electronic health record (EHR) and electronic medical record (EMR) systems started popping up in the 1960s and 1970s, they were too expensive for most practicing physicians to adopt. As the U.S. entered the digital age, abandoning handwritten medical records made sense, but the electronic record-keeping systems in place were far from perfect. That did not stop the federal government from effectively forcing doctors to adopt these systems in 2009 by tying increased Medicare and Medicaid reimbursements to practices that demonstrated "meaningful use" of electronic record-keeping systems.
Now physicians dedicate a significant portion of their day, exceeding four hours on average, to managing and updating EHRs. To meet these documentation demands, doctors often resort to late-night "pajama time" to finalize EHR updates long after their workday has ended. Jeffrey Singer, a surgeon and senior fellow at the Cato Institute, has argued that EHRs are a distraction for doctors, hindering their ability to effectively care for patients.
Many physicians even cite EHRs as the most stressful aspect of their job. It's no surprise EHR documentation has also been linked to physician burnout. This excessive administrative burden significantly hinders the quality of patient care. A doctor's time is crucial and limited, and record keeping is stealing valuable minutes away from face-to-face patient interactions.
To get doctors to "enjoy care again," Nabla, a digital health startup, in March 2023 launched Copilot, an ambient AI scribe for physicians. The adoption process for doctors is simple: Just download the Copilot extension for the Google Chrome web browser, and the AI will do the heavy lifting in the background. It will work with the computer's built-in microphone to take clinical notes during patient visits and will automatically update the patient's health records.
Copilot fully integrates with EHRs and also makes it possible for doctors to share straightforward, distilled notes with patients after visits—which means the days of trying to remember a doctor's instructions may soon be behind us.
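The note-structuring step can be sketched in miniature. The toy script below buckets sentences from a visit transcript into note sections using keyword rules; it is an illustrative stand-in only, since Nabla's actual product uses fine-tuned language models rather than keyword matching, and the section names and keywords here are invented for the example.

```python
import re

# Invented section names and trigger phrases -- illustration only.
SECTION_KEYWORDS = {
    "symptoms": ["pain", "cough", "fever", "tired"],
    "medications": ["prescribe", "refill", "dose", "mg"],
    "follow_up": ["come back", "follow up", "schedule"],
}

def draft_note(transcript: str) -> dict:
    """Bucket each sentence of the transcript into note sections."""
    note = {section: [] for section in SECTION_KEYWORDS}
    for sentence in re.split(r"(?<=[.?!])\s+", transcript.strip()):
        lowered = sentence.lower()
        for section, keywords in SECTION_KEYWORDS.items():
            if any(k in lowered for k in keywords):
                note[section].append(sentence)
    return note

if __name__ == "__main__":
    visit = ("I've had a dry cough for two weeks. "
             "Let's refill your inhaler at the same dose. "
             "Come back in a month if it hasn't cleared up.")
    for section, lines in draft_note(visit).items():
        print(section, "->", lines)
```

Even in this crude form, the design point is visible: the doctor talks normally, and the software does the sorting afterward rather than interrupting the visit.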
Information security concerns about the app are minimal, since Copilot does not store audio, transcripts, or notes and is compliant with the Health Insurance Portability and Accountability Act (HIPAA) and the European Union's General Data Protection Regulation (GDPR). The company claims that the fine-tuned models provide highly accurate outputs, with only 5 percent of the notes requiring adjustments.
The company also promises that Copilot saves doctors, on average, two hours a day on clinical documentation. In a little over a year, more than 30,000 practitioners have adopted it. Nabla's CEO, Alex LeBrun, formerly at Meta, tells Reason that after 4 million encounters, not once has a patient stopped their doctor from using Copilot. In fact, many patients are pleased to learn that their doctors are staying up to date with the latest innovations in health care.
Glass Health, co-founded in 2021 by Dereck Paul and Graham Ramsey, a former engineer at Modern Fertility, was originally created as a "notebook physicians can use to store, organize and share their approaches for diagnosing and treating conditions." The company has now pivoted to embrace the generative AI revolution.
Glass Health feeds its large language model (LLM) thousands of peer-reviewed studies and evidence-based clinical guidelines, enabling the AI to draft treatment plans and suggest diagnoses. The distinction between an AI supporting clinical decisions and one making clinical decisions is crucial. The latter scenario opens an ethical can of worms and would likely trigger stricter regulations from the Food and Drug Administration.
Artificial intelligence thrives when it can track patterns and apply algorithms. Not every patient diagnosed with Crohn's disease is anemic. Not every diabetic experiences swelling of the feet. AI could potentially overlook crucial details that would change a patient's treatment plan. As Singer says, "the danger is that patients vary," highlighting why doctors "should not surrender decision making to the robot." AI's role in medical decision making will continue to evolve, but for now it's only augmenting the expertise of health care practitioners.
The global AI health care market was valued at over $20 billion in 2023 and is projected to be worth $187 billion by 2030. As more sophisticated LLMs come online, AI's role in health care will continue to grow beyond just making paperwork processes easier. It's already being tested in a variety of advanced ways such as reading X-rays, screening for skin cancer, and analyzing patterns in a patient's medical history to predict potential health risks.
AI may still be a long way from replacing doctors entirely, but in the near future, it can help eliminate the worst part of the job for many doctors—and let them get back to building relationships with their patients.
Most of what primary care physicians do could be replaced by an app. Let's take the next step with this.
AI represents a significant change to human communication at what is already a time of unprecedented change, the internet.
While the benefits are touted, the risks of corruption by the many bad actors around us go largely unnoticed and played down as conspiracies to be ridiculed when they are. This is not by accident.
Just as the founders recognized that inalienable rights, those which can’t be taken away or sold, are necessary for the security of a free state, they didn’t foresee the internet.
Now, more than ever, with AI imminent we need to update the constitution to recognize that the internet is a public place where all our inalienable rights apply. Like free speech and privacy.
If we don’t update the constitution, AI will be used against us as we, the victims of it, can’t even imagine, and like you can bet the corrupt already have.
If the provider is just a decision tree and prescriber, AI will be able to do that. Possibly better.
I saw something that was saying the future of primary care is a nurse to do the in-person work (hook you up to things), and a robot does the rest. MDs/NPs will need to rework what they do if they want to stay relevant for primary care.
Perhaps the baby boomer groomer types could put down the Pizza Butt, Chunkin Donuts, and Milky Weighs while integrating daily exercise into their lives so regular visits to the doctor can potentially be reduced or eliminated. Or they can continue to provide little self control like J Diddy, demand more socialization of healthcare costs to help further subsidize their destructive lifestyle, and extend that auto loan to facilitate upsizing the vehicle again to manage the increasing girth.
Wake me when Skynet gets creative enough to write up police reports for LEO.
With skynet taking over healthcare and intentionally killing a few humans here and there (not enough to get noticed) – it will still be a better doctor than the average human with 6 years of university education.
Omni Consumer Products on line one...
Republican State AIs can interview pregnant gals to see if they are "dead enough already" to get past government efforts to deny them individual choice and cut off access to government-licensed physicians.
There's a failure mode here. If your bureaucratic paperwork is rarely read by humans, then having it also filled out by non-humans creates an opportunity to keep adding more and more paperwork that humans neither write nor read. The real answer should be identifying the needless paperwork and eliminating it.
This is the kind of thinking that red flag laws are created for.
lol
When a doctor sees 30 patients a day, there's no substitute for good record keeping.
And there's no substitute for humans.
We've all had our nav systems refuse to understand where we're trying to go. What happens when AI miscodes a diagnosis or procedure?
And you're 100% right about people not reading it. I've seen this at my own job where we developed simple excel macros to summarize notes and share them. The result is that no one on the team, or upstream of the team, ever read them. They just got reformatted and stuck on a hard drive.
People are too busy to think, and then we wonder why we all live in a Dilbert world.
A new feature in EHRs is AI passive listening. It listens to the conversation and writes the note. What people don't get, and this could be a patient safety issue, is that the provider or someone has to proofread and correct any errors in the note. Providers may not take the time to do that part. Same with coders and billers, they must double check the AI's work.
Like templates and macros?
If you are referring to the need to proofread. Yes.
Unless the note-taking AI is a lot better than, say, the Amazon AI, I would definitely agree that it's a patient-safety issue.
Being too busy to think is a compounding problem, because e.g. the too-busy software engineers just task an AI to take notes for the doctor, and the doctors adopt that practice because it's a button push. Instead of doing the harder task and bigger investment of automating the 'post processing' of those notes into the chart and EHRs. Like when stores and warehouses started using barcode scanners instead of manual inventory. For all the computers and electronic medical devices you see in doctor's offices and hospitals, the medical profession is way behind a lot of other, lower-brow industries in availing itself of automation.
Ah! Starting with government licensing of physicians? I recall seeing something about that in a libertarian party platform BEFORE the LP replaced all other spoiler parties with 4 million votes.
Already here, at least in the Medicare world.
I get a "visit summary" as I leave the office, pretending to be a recap of all the things discussed in the visit. It is all boilerplate based on the diagnostic codes entered by the PA who follows the prompts of the laptop before the doctor even shows up.
The doctor looks at my A1C and says "this is still OK, no need to change the insulin". The report says we discussed low calorie and low carb diets and the importance of exercise. It results in 5 or 6 paragraphs of prose about a three-minute chat.
But hey, it's what I expect from 52 years of paying premiums by force.
I'm not on Medicare, and I get the same fictionalized visit summary from the clinic I go to.
IIRC, the visit summary was required as part of Meaningful Use. They have been around for a while now.
"The report says we discussed low calorie and low carb diets and the importance of exercise. It results in 5 or 6 paragraphs of prose about a three-minute chat."
Sometimes macros, or what I call canned free text statements, can be configured. Click one button, and it adds 5 paragraphs. Did they really discuss that? It's easy to check boxes to meet federal guidelines whether they did it or not. The ability to use templates and macros makes it easy to add things to the note that weren't actually done.
My AI assisted intake transcription was so screwed up that I learned that I had been an orphan since the age of 4 and my mother was deaf. In fact my grandmother died when I was 4 and my mother wore hearing aids once she hit her 80's.
Glad it wasn't some important technical information that I would not have been able to pick up.
For the last time, Dr., I do not need treatment for Vitiligo. I've always been white.
Yeah, but how do you identify?
Generative AI Doctor: What seems to be the problem?
Me: I think I'm suffering from acute Oscillococcinum poisoning.
Generative AI Doctor: What?!
A simple expert system can replace most primary care docs.
The fundamental problem is that docs are incentivized to see too many patients. They get paid by the office visit and procedure, and they are the rate-limiting step for revenue.
So the office/hospital/partnership/themselves constantly push for more revenue. AI will just increase the number of patients they try to see in a day.
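The "decision tree and prescriber" idea floated in this thread can be sketched as a tiny rule-based expert system: ordered rules, first match wins. The rules and thresholds below are invented for illustration and are not medical guidance; real clinical decision support is vastly more involved, which is the point the skeptics here are making.

```python
# A toy rule-based triage "expert system" -- illustration only, not medical advice.
# Rules are checked in order; the first matching rule's recommendation is returned.
RULES = [
    (lambda s: s.get("temp_f", 98.6) >= 103.0, "urgent: same-day visit"),
    (lambda s: s.get("temp_f", 98.6) >= 100.4 and s.get("days", 0) >= 3,
     "routine: schedule appointment"),
    (lambda s: True, "self-care: rest and fluids"),  # catch-all default
]

def triage(symptoms: dict) -> str:
    """Return the recommendation of the first matching rule."""
    for condition, recommendation in RULES:
        if condition(symptoms):
            return recommendation
    return "no recommendation"
```

The brittleness is also visible: any patient whose situation falls outside the hand-written rules gets the catch-all answer, which is exactly the "patients vary" objection.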
Not entirely.
Some Medicare plans just pay the medical facility a flat amount per month.
The excuse was that it will incentivize the doctors to keep you healthy.
What it does is make them want to get you out of the office as fast as possible before you can bring up something that needs treatment. Things that used to get a two-month follow up now get a six-month follow up, etc.
Does that apply to coroners too?
How good is this AI at lying under oath to help republicans ban everything except gin and tobacco?
Better Idea: Stop requiring doctors to do so much paperwork.
Making you do paperwork is what the government is good at.
How will we know what the health outcomes are? Or was the last ten years something that happened to other people?
Anything not worth writing is not worth reading. If AI can do it, it will either be boilerplate, expansion of numbers into words (A1C is higher than the target for this age demographic), or useless doggerel.
If this is the case, then we should reduce the amount of paperwork required since it clearly is not being used.
Yup, and the inverse as well. If it’s worth reading, then it needs to be accurate, which means the doctor still needs to proofread it. Letting a predictive text parser guess what the correct next word is going to be is too great a margin of error.
So if doctors are having to double check everything the ai is saying, how much time is truly being saved?
A description of the non-canonical origin of the Butlerian Jihad: The parents were informed that the malformed child had been aborted, but … she was able to verify the processes of her gestation and came to believe that the child's death was unnecessary. Studying records provided … she discovered that the thinking machine hospital director instituted a program of unjustified abortions.
Maybe rather than letting AI keep the records and eventually make medical decisions (yes, it WILL happen), it would be better for governments to get out of trying to manage healthcare?
I stopped going to hospitals/medical centers after COVID anyway. They're not in the business of medicine anymore.
I know a doctor who has earned my trust. I suggest you find the same. If he points me to a radiologist, or an anesthesiologist, or an internist, or an oncologist, or even a friggin' dentist - I'll take his word over Rando Clinic Visit. I'll 2ndO anything I'm told by him.
Medical community burned their currency after the scamdemic. Doctors, nurses, EMTs, techs - they're all forfeit in 2024. Now it's all about who you know and can trust. I'd rather be marginally less comfortable laid up in my living room with OTC tylenol and a trusted friend M.D. checking in, than in a hospital bed these days.