By now most lawyers are familiar with the story of DoNotPay, the company that promised to provide a traffic offense defendant with a “robot lawyer” to fight the ticket in court. The plan went like this: The defendant would be assisted at trial by generative artificial intelligence tools (e.g., ChatGPT and DaVinci) that monitored courtroom proceedings and, based on these inputs, transmitted legal arguments back to the defendant via a speaker embedded in smart glasses.
But the cyber-showdown never transpired because, according to DoNotPay, the company abandoned its “robot lawyer” plan after several state bar organizations threatened to bring legal actions accusing DoNotPay of engaging in the unauthorized practice of law. (A quick review of the company’s website suggests that DoNotPay has since back-burnered computer-assisted legal representation in favor of providing consumer education on a wide array of law-related topics.)
Artificial intelligence in the delivery of legal services remains an ethical problem child. Artificial intelligence can lawfully supply legal information, but not legal advice.
When technology providers offer legal advice directly to consumers, they run the risk of being accused of the unauthorized practice of law (UPL). UPL is a crime in many states. The better AI gets, the more likely it is that AI vendors will find themselves in the crosshairs of UPL enforcement activities.
On the other hand, when lawyers offer legal advice (and legal services) to clients using the very same technology, the main issues are (1) whether the lawyer is using the technology competently, and (2) whether the lawyer has made reasonable efforts to ensure that the technology is being used in a way that is “compatible with the professional obligations of the lawyer.” The latter ethical obligation is established by Rule 5.3 (Responsibilities Regarding Nonlawyer Assistance) of the ABA Model Rules of Professional Conduct, although the model rule clearly is directed toward nonlawyer persons, not computer algorithms.
For now, the law seems to be that generative artificial intelligence can be used to provide legal services only when the AI’s output is filtered through the professional judgment of a licensed attorney. And those licensed attorneys are responsible for all errors arising from the use of AI.
This seems to be the position adopted by The Florida Bar in a proposed ethics opinion on generative AI. Florida state bar officials approved the use of generative AI in legal practice but cautioned that lawyers should “carefully consider what functions may ethically be delegated” to AI tools.
“A lawyer may not delegate to generative AI any act that could constitute the practice of law such as the negotiation of claims or any other function that requires a lawyer’s personal judgment and participation,” state bar officials wrote.
Elsewhere in the opinion, the drafters observed that “a lawyer must always review the work product of a generative AI” in the same way that the lawyer would review the work of nonlawyer assistants such as paralegals.
With this view of the law in mind, it’s worth considering how much – if any – of the legal work that goes into preparing for and conducting a deposition can be ethically delegated to artificial intelligence.
It seems clear that conducting a deposition cannot be delegated to a “robot lawyer” in the way that DoNotPay proposed defending a traffic ticket in court. Conducting a deposition, as we’ve noted in a previous article, constitutes the practice of law. As such, this task can’t ethically be handed over to a nonlawyer or a technology assistant. For example, it would be unethical for a lawyer to pose a question to a deposition witness merely because a generative AI tool asserted that the question would be appropriate to ask.
Could a lawyer use generative AI to find inconsistencies in prior witness testimony or suggest possibly fruitful areas of inquiry based on its analysis of the law and facts of the case? Perhaps. But first, the lawyer must possess a basic understanding of how the AI deposition assistant operates, which would include a working knowledge of how the technology produces results as well as an appreciation of the limitations of the technology. The lawyer shouldn’t automatically accept the AI’s outputs as true or accurate.
Even if the lawyer possessed a sound understanding of the technology, the use of AI for depositions would seem to be limited – as an ethical matter – to pre-deposition preparation where the attorney has time to evaluate and double-check the accuracy of the AI’s suggestions. The fast-paced environment of most depositions may not afford the attorney an opportunity to carefully consider the merits of an AI’s suggestions.
Finally, there’s the not necessarily far-fetched question of whether an AI deposition assistant is allowed to “attend” a deposition. Several jurisdictions have rules limiting attendance at depositions. Maryland and Texas, for example, both have court rules that limit deposition attendance to a defined set of “persons,” “parties,” “officers,” “counsel,” and “attorneys.” All of these terms presumably refer to human beings. If an AI deposition assistant can be a “nonlawyer” or “person” for purposes of Model Rule 5.3, is it also a “person” that may – or may not – be allowed to attend a deposition?
No one quite knows how technology will be applied to deposition practice in the next few years. If the legal community’s recent experience with remote depositions and trials is any indication, any promising technology will be rapidly and enthusiastically put to work on behalf of clients. Whether the profession’s ethics code drafters can keep up is another matter entirely.