Bar Associations Threaten Pro-Se Litigant, Aided by AI, with UPL Suits

I warned about this risk nearly a decade ago.


Flash back to 2013. New firms began to use machine learning to mine trends and insights from legal databases. Their output looked an awful lot like legal advice, even if it was generated by algorithms. At the time, I worried that these firms might inadvertently run afoul of laws barring the unauthorized practice of law (UPL). Over lunch, I warned an executive of one of the leading firms about these risks. He acknowledged my concern and said he would have a memo prepared. Who knows what came of it. I sketched some of these concerns in a short article titled Robot, Esq., in a book chapter, and in a post titled "The Looming Ethical Issues for Legal Analytics." Here is a snippet:

The fourth issue, and the other elephant in the room, is Unauthorized Practice of Law (UPL). Reading graphs to offer advice on how a case should settle, or where it should transfer to, is at its heart the practice of law. That an algorithm spit it out doesn't really matter. Non-lawyers, or even lawyers not working for a law firm, are unable to give this type of advice. Data analytics firms should tread carefully about handing out this type of personalized advice outside the context of the attorney-client relationship.

Though, for the time being, I'm not too worried about this final issue. The vast majority of the UPL problems are obviated when a law firm, or a general counsel, serves as an intermediary between a data analytics firm and a client (non-lawyer). As long as a lawyer somewhere in the pipeline independently reviews the data analytics recommendations, and blesses them, I don't see any of these as significant problems (though bad advice may result in a malpractice suit). I'm working on another paper that analyzes the law of paralegals (this is actually a thing), and what kinds of legal tasks can be delegated to paralegals under the supervision of a lawyer.

But, when data analytics firms try to expand to serve consumers directly–like LegalZoom–we hit this problem hard. When there is no lawyer in the pipeline, things get difficult very quickly.

Flash forward to the present day. ChatGPT and similar AI tools now directly help pro se litigants litigate their cases. Consider the best-laid plans of Joshua Browder, who created a system to challenge traffic tickets.

A British man who planned to have a "robot lawyer" help a defendant fight a traffic ticket has dropped the effort after receiving threats of possible prosecution and jail time.

Joshua Browder, the CEO of the New York-based startup DoNotPay, created a way for people contesting traffic tickets to use arguments in court generated by artificial intelligence.

Here's how it was supposed to work: The person challenging a speeding ticket would wear smart glasses that both record court proceedings and dictate responses into the defendant's ear from a small speaker. The system was powered by a few leading AI text generators, including ChatGPT and DaVinci.

The first-ever AI-powered legal defense was set to take place in California on Feb. 22, but not anymore.

This strategy did not go well. Apparently, Browder was threatened with UPL prosecution.

As word got out, an uneasy buzz began to swirl among various state bar officials, according to Browder. He says angry letters began to pour in.

"Multiple state bar associations have threatened us," Browder said. "One even said a referral to the district attorney's office and prosecution and prison time would be possible."

In particular, Browder said one state bar official noted that the unauthorized practice of law is a misdemeanor in some states punishable by up to six months in county jail.

Lawyers are very good at using cartels to clamp down on competition. Legal tech firms, beware.