How much trust should you place in AI for contracts? This question anchored a recent UnitedLex Summit panel exploring where legal teams stand in adopting AI technologies. Panelists discussed AI’s potential to streamline contract workflows while examining the risks and accuracy concerns that must be addressed before full-scale implementation.
Key takeaways from the panel “Trusting AI in Contracts” include:
Start with quality, centralized, and organized data
To ensure accurate, consistent results from AI in contracts, organizations need a single source of truth. That means organizing and centralizing contracts and ensuring they are complete and up to date.
“When you start the journey, you need to know what is needed,” said David Gaiser, Head of Legal, UnitedLex. “Some tools may claim that they can find it all. And then it turns out contracts are in somebody’s drawer somewhere.”
Data quality is a serious consideration and shouldn’t be underestimated, agreed Alessandro Galtieri, Deputy General Counsel of Colt Technology Services. That is particularly true for companies that have been in business for a long time. Without thorough quality control, AI may be using information that is decades old. “If we were to trust or train a model to make decisions or suggestions based on that, there’s a fair chance there would be some hallucinations. Cleaning the data and bringing it into a data lake can be extremely painful if you have a history of legacy systems.”
However, the contract data doesn’t need to be perfect immediately to start seeing benefits. “It’s better to have at least all the contracts in one place and to have a set of data,” said Gian-Reto Schulthess, Legal Counsel for Swiss Mobiliar Insurance Co. “Even if it’s 90-95% accurate, it’s still better than what we had before. The important thing, of course, is to create awareness that if decisions are made based on this data, you have to do a manual review as to whether the data is correct.”
Ensure employees are on the same page before progressing with new tools
By creating enterprise policies, offering shared learning environments, and updating playbooks, companies can break down silos and minimize risks. Structured environments and centers of excellence allow employees to experiment with AI tools and understand what works, while offering opportunities for continuous learning.
“Our GC is working very closely with our CIO in terms of what the AI policy and responsible usage policy need to look like,” said Jo Walker, Head of General Counsel Capability Centre at Rolls Royce. “Before we get into adoption, we are training our people in terms of what is AI, what are the risks, what are the benefits, and what is the policy. The key for us is we are all coming along on the same journey at the same level.”
Colt has created a center of excellence, where employees can share what they’ve done, Galtieri said. “It provides a framework, because you don’t want people to go rogue.” It also allows employees to use AI in a way that is responsible, meets regulatory requirements, and is compatible with the company’s values.
Update contract playbooks and align processes
Walker encouraged participants to update contract playbooks, starting with the ones that are most frequently used or have the highest value. “If you’ve got thousands of different contract templates across an organization, it’s really overwhelming to fix that,” she suggested. “I wouldn’t underestimate the work that is needed in terms of the playbooks.”
Yulia Egorova, Head of Commercial Contracts at MSC Cruises S.A., agreed. “To use AI in any sort of structured and consistent way, you need to have your rule books and data in place and understand what you are going to search against.”
Leverage legal’s visibility across the organization
The legal department has unparalleled touchpoints across the entire organization. This gives legal teams a unique position to identify where AI can deliver value and surface critical obligations or exposures quickly.
“We’re the only function that really has all the touch points,” said Galtieri. “We literally look at every contract, where every dollar comes through the door, all the way to board advisory roles. We’re uniquely positioned to add value across the enterprise.”
That breadth of view allows the legal department to find solutions beyond just the legal environment. “When AI becomes an enabler for business transformation overall and we’re embedded in that, we will really see that transformation and sea change,” he said.
Hold AI accountable
While there are limits to AI and important guardrails to keep in mind, its effectiveness can be measured much like we would measure a human’s performance.
“If you have an agent, you should treat it like a team member,” said Schulthess. “You should have KPIs and goals for it. You should measure it and see whether it’s successful or not. And if it’s not successful, you should have some measures to improve it. If that doesn’t work, replace it.”
Walker is excited about the business enablement aspect of AI and is working on a project for contracting excellence that employs the right rigor, quality, and efficiency. Layering in systemization with AI is a “massive game changer,” she said. “It’s not going to replace people right now, but it’s definitely going to help us speed up our response and deliver more value to the bottom line. And that’s what the CEO is interested in.”
Get humans and AI to work together
AI and humans bring different strengths, and by working together they can unlock exponential value. AI excels at quality control, pattern recognition, and reducing knowledge leakage. As Egorova said, unlike people, AI doesn’t get tired. “You’re losing that level of consistency from team member to team member, and even from one day to the next,” she said.
However, humans are still essential for catching nuances, providing judgment, and determining key review touchpoints. The future is about collaboration, not replacement, with clear roles for both.
Gaiser pointed out that AI doesn’t have empathy and can’t view things from a lawyer’s point of view. “You lose that human ability to look at a contract and know the business situation, the cultural perspective.”
All the panelists agreed that it isn’t a question of replacing people with AI, but of how the two will cooperate. How can humans and agents work together? How can the system be set up to maximize value?
While organizations are still determining the best ways to deploy AI, there is no question it will have a profound impact on many aspects of legal operations, particularly contracts. By organizing data, training employees on proper usage, and ensuring that humans and AI work together, lawyers and other stakeholders can incorporate new AI tools with confidence. Talk to someone at UnitedLex to find the best way to evolve your legal operations.