Artificial intelligence now shapes an ever-expanding share of day-to-day legal work, and legal departments and law firms are navigating major decisions about how they manage matters, deliver value, and evaluate risk. That was the focus of the recent UnitedLex webinar, Applied AI for Legal: Practical Strategies to Stay Ethical and Efficient, which anchored this shift in operational reality.
Speakers from corporate legal operations, litigation support, and the judiciary focused on what AI changes, where it helps, and where it creates new pressure. Moderated by Beverly Rich, Vice President, AI and Innovation at UnitedLex, the panel of speakers included Chris Potter, Director of Legal Operations at Johnson & Johnson, Tara Emory, Principal at Aligned Discovery, and Hon. Allison H. Goddard, U.S. Magistrate Judge for the U.S. District Court, Southern District of California.
AI as a Practical Tool
From the outset, the panel framed AI as a tool already embedded in legal workflows. Judge Goddard described its role in judicial practice not as a substitute for analysis, but as support that speeds specific tasks. AI can help generate timelines, organize prior rulings, structure early drafts, and test arguments. These uses shorten drafting cycles and expand analytical reach, but accountability remains unchanged. Judicial responsibility, like legal responsibility, remains with humans.
Emory extended this view to law firms. As matters grow more data-heavy and deadlines tighten, AI supports document organization, search, and drafting. The benefits often appear upstream, bringing order to workflows before a client sees a deliverable. AI’s value, she noted, often lies in imposing structure on complexity rather than simply generating text.
Potter offered the corporate legal department perspective. Legal operations teams face constant pressure to control spend, move faster, and increase transparency. As AI tools become part of billing review, contract analysis, and internal knowledge systems, expectations for change have started to reach outside counsel. Efficiency is no longer treated as experimentation. Rather, it is part of performance.
Risk Expands with Capability
While the panel acknowledged the benefits of efficiency gains, the conversation repeatedly returned to risk. Judge Goddard emphasized the tension between moving too slowly and too quickly. Delay can create a disadvantage. Rapid adoption without a strategy can threaten competence and confidentiality. AI literacy is becoming essential even for those who limit direct use. Oversight requires an understanding of the technology.
Emory stressed that many operational risks, like data commingling and template errors, predate AI. However, new generative AI systems increase both the scale and speed of those mistakes, especially when relying on outdated or unclean data. Chat-based tools may pull prior information into new contexts. Citations may sound convincing yet be wrong. Polished language can encourage overconfidence, leading people to trust outputs simply because they read well.
Speed adds another layer. AI produces volume instantly; human review remains sequential and limited. That mismatch creates pressure to move faster when care is still required. The panel’s overall message: Automation increases the need for skepticism and verification. When designing AI-enabled legal workflows, these risks must be documented and accounted for.
Data Governance and the “Walled Garden”
Governance emerged as the central stabilizer. Potter outlined Johnson & Johnson’s approach to secure AI deployment, emphasizing controlled environments designed to prevent data leakage and misuse. Because AI systems learn from inputs, disciplined data practices become a core safeguard. Limits on personal information, prohibitions on data commingling, and structured policies reflect operational necessity.
These controls directly affect outside counsel. Ethical duties tied to confidentiality and competence remain fully intact in AI-assisted work. Judge Goddard reinforced that Rule 11 obligations—requiring attorneys to certify the accuracy and legitimacy of filings—apply regardless of drafting method. Courts judge submissions on diligence and accuracy, not on whether AI was used. Filing unverified AI-generated content exposes counsel to avoidable risk. For law firms and their clients, having a well-structured and consistent diligence and verification process is critical to mitigating risk while taking advantage of the efficiency AI has to offer.
Human-in-the-Loop as a Control Mechanism
Across roles, one safeguard anchored the discussion: Human oversight governs AI-assisted work. Judge Goddard referenced chambers guidance that reflects this principle, especially regarding sealed or confidential materials. AI may assist research or drafting, but sensitive information generally remains outside generative systems. When non-public materials are involved, disclosure and consent issues may arise.
The panel suggested that ethical AI use is shifting away from the simple question of adoption toward the quality of supervision. Responsibility attaches to decisions and outcomes, not to tools.
Evaluating Tools: Start with the Problem
When asked how legal teams should evaluate AI technologies, Emory’s guidance was to begin with a clearly defined problem. Test the tool and measure results. Without clear improvements in workflow, accuracy, or efficiency, AI risks adding complexity, defeating the purpose. Domain-specific tools, she noted, often provide stronger safeguards and more reliable results in legal settings. This is why an in-depth and well-informed evaluation and procurement process is central to success.
Efficiency Is Not Enough: The Justice Standard
Judge Goddard drew a clear line between saving time and improving results. The real question, she said, is whether AI improves the work and helps courts deliver justice faster, in a way the public can trust.
Speed alone is not enough. A shorter draft is of little value if it does not sharpen reasoning or improve clarity. The same holds for billing. If AI cuts the time required for a task, that gain should reflect real value, not simply higher margins. Courts, as well as clients, expect substance.
AI can help organize and structure the work. Responsibility remains human.
Final Perspective
AI is a force multiplier. It speeds research, drafting, and organization. It also exposes and can exacerbate the damage caused by weak controls and weak judgment.
Potter urged participants not to sit on the sidelines. Avoiding AI, he suggested, is not a strategy. Even skeptics should test the tools, critique them, and understand their limits. Otherwise, they risk being left behind. At the same time, he was clear about discipline. Adoption must be cautious and controlled: walled gardens, vetted vendors, clear training, and tighter guardrails in high-risk areas such as healthcare data. Move forward, but with structure.
Judge Goddard’s advice was equally direct: run toward it, not away from it. Putting your head in the sand is not an option. Lawyers who ignore AI risk falling short of their duty of competence. Curiosity is part of the profession’s core. Even those late in their careers must understand what these tools are and how they are being used. But she returned to a higher standard: Efficiency must enhance the work. Faster decisions only matter if they remain careful, reasoned, and worthy of public trust.
Emory echoed both themes. Early adopters will push the technology forward, but skeptics have a role, too. The cautious voices—the ones who question, slow the process, and demand proof—help build stronger systems. Progress requires builders and critics alike.
This philosophy is core to the recently launched UnitedLex AI Advisory Services and will serve as a guiding principle in how we help corporate legal departments and law firms turn AI ambition into high-impact adoption.