Legal software company Luminance hosted reporters at its London office this week to demonstrate its AI tool, Autopilot, negotiating a nondisclosure agreement with a customer, with lawyers involved only for final sign-off.
When a laptop belonging to Luminance General Counsel Harry Borovick received an email with an NDA attached as a Microsoft Word document, Autopilot opened the document, scanned it and started making changes without input from any person, according to accounts of the demonstration by CNBC and the BBC.
“A six-year term is unacceptable, so it's changed to three years,” says the BBC report, describing the back-and-forth between the two laptops.
The software flags risky clauses and automatically makes changes, drawing on what Luminance says is a training set of 150 million legal documents.
“[One] risky clause imposes an unlimited liability, meaning there's no ceiling on how much Luminance might have to pay if the terms of the NDA were breached,” the BBC report says. So, the software replaced the clause with a proposed liability cap of £1 million and removed a hold harmless clause that “the AI knew ... wasn’t okay.”
The laptop belonging to the counterparty, Connagh McCormick, the general counsel of research company ProSapient, opened the revised contract and inserted a liquidated damages provision in response to the removal of the hold harmless provision. “That effectively turns the £1 million maximum liability into an agreed compensation to pay if the agreement is broken,” the BBC report says.
The laptop belonging to Luminance’s GC replaced the liquidated damages provision with language stating the company is only liable for direct losses incurred.
“Version four of the contract is acceptable to both parties,” the BBC report says, and the AI belonging to Luminance’s GC “accepts all the changes and sends it to DocuSign” for signing.
“At that point you get to decide whether we actually want the human to sign,” Jaeger Glucina, a managing director at Luminance, told reporters at the demo. “That would be literally the only part that the human would have to do. We have a mutually agreed contract that's been entirely negotiated by AI.”
The company says its tool is trained entirely on legal documents, not on content from the public internet as ChatGPT is. That could reduce exposure to the incorrect, private or protected information that has some AI critics concerned.
To teach the AI which contract terms are and aren’t standard for the user, which are negotiable and which are unacceptable, companies create what Luminance calls a knowledge bank built from their previously signed documents.
“The AI [is] not only legally trained … but also understands your business,” said Glucina in CNBC’s report.
Lawyers have the final say. “It ends up in the laps of both lawyers signing,” said McCormick. “At that point, I'm going to read the contract and the other lawyer's going to read the contract. If there's anything I disagree with, I've got the opportunity to flag it. I'm not committed to anything the AI has done.”
Autopilot isn’t out yet; it’s slated for beta testing before the end of the year, with plans for a 2024 rollout.
If it catches on, said McCormick, “a lawyer's time is going to be spent doing something more interesting, more valuable” than negotiating NDAs.