The buzz around artificial intelligence — its ethical pitfalls, its potential to shape a more efficient workflow, and the fear that it could replace humans in a variety of industries — can make it difficult to discern how the technology is actually being used right now.
On Friday, April 19, at the seventh annual Northern District of Texas Bench Bar Conference, which will take place at The Westin Irving Convention Center at Las Colinas, attendees will hear from federal judges and practitioners about the ethics of AI.
U.S. Fifth Circuit Court of Appeals Judge Jennifer Walker Elrod will moderate the panel that includes U.S. District Judge Brantley Starr, David Coale of Lynn Pinker Hurst & Schwegmann and Shamoil Shipchandler, who formerly served as director of the SEC’s Fort Worth regional office and currently is chief counsel for the risk and regulatory legal group at Charles Schwab.
Ahead of the conference, The Lawbook asked a handful of law firm leaders whether they are using AI internally and, if so, how.
Some firms declined to comment. David Beck of Beck Redden told The Lawbook his firm hasn’t made any final decisions on using the technology yet.
“We do not currently use the AI tool but are in the process now of evaluating potential tools,” he wrote in an email.
That’s similar to the approach Houston-based Robin Russell, Hunton Andrews Kurth’s deputy managing partner, said her firm is taking.
“Providing outstanding client service and innovative, effective solutions has always been our top priority,” she said. “We recognize the potential that artificial intelligence has to offer and are studying it to understand how it might eventually aid in the delivery of legal services.”
But some firms are testing the waters, seeing how AI can be used to streamline certain processes, with attorney oversight, and have already noticed an impact on clients’ bottom lines.
“Even now, being on the frontend of the AI revolution, what we’re finding is that our [litigation] budgets have come down — and I’m kind of spit-balling here — by about 10 percent,” said John Zavitsanos of AZA. “That’s pretty material because our litigation budgets [for clients] are usually going the other way. We’re able to kind of get to the tasks a little faster because of the initial use of AI. And I’ll tell you, it’s really going to be a boon for these mass tort lawyers on the plaintiffs side.”
Zavitsanos also predicted utilizing the technology will prove to be “the great equalizer” for smaller and solo law firms facing Big Law opponents. He said he’s used ChatGPT for things like gathering information about the public perception of certain companies or as a starting point in crafting discovery requests.
“And again, you can’t check common sense at the door,” he said of double-checking and verifying the results generated by the technology. “But from a client standpoint, it’s going to bring legal costs way down — assuming lawyers are being honest about how they’re using it.”
Yetter Coleman partner Jim Zucker said his firm has used AI technology to aid document review.
“We are comfortable and have used it to assist in document review and are looking at it as a research tool,” he said.
Because of client privacy concerns that arise with the use of public AI tools, the firm has directed its attorneys not to use those versions of the technology, Zucker said.
“I think there’s a sense that, in every industry, there’s going to be a role for AI. It’s coming. It’s just a question of timing and of its quality,” he said. “I think the legal industry raises particular issues because you have duties of candor to the court and privilege issues. … But it’s certainly on the horizon, and to give the best service to our clients we’re going to have to figure it out.”
The chair of Bracewell’s technology section, Houston partner Constance Rhebergen, is also a co-leader of the firm’s internal technology committee that serves its lawyers. In a recent interview with The Lawbook, she said it’s important for law firms to understand the distinction between using public or open AI programs such as ChatGPT — which raise client confidentiality issues — and using private or closed-system AI.
Rhebergen said Bracewell was an “early adopter” of using private, closed AI systems.
“We are embracing AI internally for use in our legal services, provided it meets the … important confidentiality restrictions that we as lawyers have,” she said.
The technology presents firms with two basic options, she said: ignore it or educate.
“Like every tool, we feel education is the key,” she said. “So we educate and continue to educate our lawyers about the use of any generative AI tool, which is basically that our ethics require we not provide any confidential information at all.”
The firm’s IT group is using AI tools to help educate firm attorneys and to perform certain tasks, she said, and Rhebergen said she uses AI “every day” to save time finding information.
“The beauty of that is it will give you what it believes to be the answer, and it will give you a cite so you can jump over to that site and verify the answer,” she said of using Microsoft’s Copilot AI program.
Rhebergen also noted that many vendors of legal technology services have started incorporating AI modules into the tools the firm uses.
“We’ve all read the horror stories,” she said, recounting media coverage of attorneys filing briefs that cite made-up cases. “But I still think, and my firm still thinks, it is a tool that, when used responsibly, can offer some positive benefits.”