- Microsoft says it will defend users of its Copilot generative AI tools if they’re sued for copyright infringement.
- “We will defend the customer and pay the amount of any adverse judgments or settlements,” Chief Legal Officer Hossein Nowbar said in a Sept. 7 joint statement with Microsoft President Brad Smith.
- Copyright issues have become a major concern among intellectual property lawyers because of the way generative AI companies like OpenAI use published material to train their large language models.
The legal focus today is on generative AI companies like Microsoft and its partner OpenAI, which were hit with a major lawsuit late last year over how they incorporate code and other published content into their models. But users of AI products increasingly worry that they, too, will face lawsuits if content they generate with the tools contains copyrighted material.
“The chances that AI prompts might output proprietary code are very high,” Sean O’Brien, founder of the Yale Privacy Lab, has said. “Tools such as ChatGPT and Copilot … have been trained on a massive trove of code of both the open source and proprietary variety.”
O’Brien’s concern is over code, but some analysts say content is poised to become the next legal focus, especially if copyright trolls take a page from the patent trolls’ playbook and exploit the coming explosion of protected work incorporated into AI-assisted content.
“Not only will there be armies of legal trolls trying to find folks to sue, there will be hackers, criminals, rogue nation states, high school students, and crackpots, all attempting to feed erroneous data into every AI they can find, either for the [fun of the prank] or for much more nefarious reasons,” says David Gewirtz of ZDNet.
Microsoft released its first Copilot product last year as part of a GitHub tool that helps software engineers work more quickly and easily by suggesting AI-generated code.
The company has since added OpenAI technology to Outlook, Teams and Word, among other products and tools.
In the legal tech space, Microsoft has partnered with Thomson Reuters to support contract drafting.
Microsoft says its legal defense effort, called Copilot Copyright Commitment, is an extension of its IP indemnification coverage, which applies to customers who use or resell its open source software and Azure cloud products.
“For roughly two decades we’ve defended our customers against patent claims relating to our products, and we’ve steadily expanded this coverage over time,” Nowbar and Smith said. “Expanding our defense obligations to cover copyright claims directed at our Copilots is another step along these lines.”
To be covered for copyright claims, users can’t bypass the guardrails the company has built into the Copilot tools.
“Customers must use the content filters and other safety systems,” the Microsoft leaders said.
What they describe as Copilot’s guardrails include content filters, classifiers, metaprompts, monitors and detection systems.
“Our new Copilot Copyright Commitment requires that customers use these technologies,” Nowbar and Smith said.
Microsoft isn’t the only company offering this kind of indemnification to its customers; in 2021, Adobe rolled out a similar program to help protect users of its AI-assisted design tools who worried about inadvertently infringing copyright.
Adobe has said its tools use only content from its own databases, content licensed under Creative Commons, and content in the public domain.
“They must have some very strong assurances from their legal team that they’re in the clear,” Andres Guadamuz, an intellectual property law researcher at the University of Sussex, says in a Fast Company report. “I can’t imagine that they would do this if there was some doubt that they would get sued out of existence.”
Microsoft may face greater exposure, given the possibility that users will tap data sources containing protected material.
“There are potential ways that our technology could intentionally be misused to generate harmful content,” Nowbar and Smith said. “Customers … must not attempt to generate infringing materials, including not providing input to a Copilot service that the customer does not have appropriate rights to use.”