Embracing AI, Episode 3 | Courts grapple with AI litigation as regulation strives to catch up

  • Podcast 19 July 2024
  • UK & Europe

  • Cyber Risk

In the third instalment of our Embracing AI podcast series, host Chris Williams explores how litigation is developing in the generative AI space, as regulation strives to catch up with the technology. He welcomes three speakers from Clyde & Co's Dispute Resolution Group around the world: Alexandra Lester, Partner in the Middle East, Marc Voses, Partner in New York, and Dr Florian Pötzlberger, Counsel in Munich.

*This podcast was recorded before the hearing that took place on 11 July.

This episode kicks off with the key trends relating to Generative AI litigation, before discussing the main legal challenges and risks of using the technology. Guests then explore how regulation is evolving to help manage these risks and encourage innovation, concluding with a look at the measures clients should take to avoid facing legal action. 

Williams describes GenAI as the “new kid on the block” of AI, noting that courts are grappling with cases that primarily ask: “what happens when AI gets it wrong?” or what happens when it uses someone else's original work without permission. Voses outlines several cases where AI has gone awry, including the “horrific” Murphy vs EssilorLuxottica, involving a man who was wrongly sent to prison for theft due to faulty technology.

Pötzlberger expands on the IP and copyright theme, stating “there are currently 24 cases pending in the US,” including high-profile examples such as The New York Times vs OpenAI and Microsoft. Pötzlberger then discusses the case of Getty Images vs Stability AI in more detail, in which the judge determined in a preliminary judgment that “the claimant had a real prospect of success…” for its claim for IP infringement. Voses also mentions Scarlett Johansson vs OpenAI, which caused the AI product to be “promptly shut down,” noting: “There's been some pretty good success by those looking to limit its (genAI’s) use.”

On the legal risks of using GenAI, Lester raises the reliability of outputs, citing an Australian case where “a group of academics had to apologise after submitting fake evidence to an Australian Parliamentary inquiry…” This highlights the critical importance of verifying AI outputs and using trusted, industry-specific tools. Attributing AI authorship is another key challenge, with a range of approaches currently being taken around the world. Furthermore, there is a real danger of biased or discriminatory genAI outputs, due to training with incomplete data sets: “So if you train a facial recognition system on a data set that only includes light skin faces, then obviously that AI will perform better when dealing with light skin faces…,” explains Lester.

Turning to regulation, Pötzlberger provides a high-level view, spanning the world’s first genAI regulation in China and the EU AI Act approved in May this year. In the US, Colorado has recently enacted the Consumer Protections for Artificial Intelligence Act, which will come into effect in 2026. And, while the Middle East is yet to pass any AI-specific legislation, Lester mentions that “the DIFC courts in Dubai and the United Arab Emirates have issued guidance on the use of AI in (court) proceedings,” noting that countries around the world are following suit.

What measures should clients take to mitigate the risk of AI litigation? For Voses, they should aim to build “a culture of AI risk management,” including robust protocols and in-house policies, while Lester stresses the role of due diligence when selecting an AI provider, ensuring a watertight contract is in place. Added to this, Pötzlberger notes that reviewing insurance policies for AI coverage is critical, as is remaining cognisant of IP and data protection rights and ensuring transparency in use of AI-generated materials.

For any questions on topics raised in our Embracing AI podcast series, to suggest future topics, or to subscribe, please visit the dedicated AI webpage.

 

