With A-list actors walking out on the red carpet, you can’t have missed the news that the Screen Actors Guild - American Federation of Television and Radio Artists (SAG-AFTRA) has gone on strike, joining the ongoing strike by the Writers Guild of America (WGA). One of the catalysts for this action (the first time performers and writers have been on strike simultaneously in Hollywood in over 60 years) was a proposal that would allow studios to use AI-generated replicas of background actors in perpetuity in return for a single day’s pay. Both SAG-AFTRA and the WGA are concerned about “AI exploitation” and are proposing significant reforms to the regulation of AI.
Here in the UK, voice-over artists represented by Equity, the performing arts and entertainment trade union, have long been aware of AI’s potential to fundamentally transform performance. As is the case across the pond, the concern for performers stems from the development of tools that can mimic an actor’s voice, which can then be used for audiobook narration or advertising voiceovers.
The opportunities created by AI for performers are vast: we are potentially heading towards passive income for performers who, by licensing their image or voice, might never have to set foot on set or in a studio again. In the musical world, Grimes recently released her own AI tool offering a split of royalties for any works derived from her AI-generated voice, and an AI-generated Drake and The Weeknd song went viral on social media. The downside is that, given gaps in the current intellectual property (IP) framework, there is a significant risk that performers may be exploited or have their voice or likeness used in a manner detrimental to their reputation.
Frustrated by the lack of government intervention, Equity launched its “Stop AI Stealing the Show” campaign last year and, as part of that campaign, has put together a toolkit to help performers navigate this increasingly complex technological landscape.
Performers' AI considerations
Equity has long campaigned for performers' rights to be protected against exploitation by brands using AI. The union recently played a key role in blocking the UK Intellectual Property Office’s (IPO) proposal to expand the text and data mining (TDM) exception, which currently applies to non-commercial use only, to cover use for any purpose.
Equity has also produced guidance on the key issues for performers to consider when contracting with brands that might utilise AI:
- Gather as much information as possible about the work itself, including the scope of use of the performer’s image, voice or likeness.
- Find out where the performance will be used (e.g. online, in an app, or on social media).
- Ensure that the brand has obtained the performer’s permission for an agreed scope of work. Where work goes beyond the agreed scope, the performer should ensure that they retain the right to be paid for further exploitation of their voice or image.
- Find out how the work will be created and be wary if brands are asking for work to be created in unusual ways, such as over the phone.
Whether copyright law gives performers the right to control synthetic AI performances that reproduce their voice or likeness is a grey area. S182(1) of the Copyright, Designs and Patents Act 1988 (the “Act”) does not expressly treat the creation of a synthesised recording as an act of making a copy for the purposes of infringement. In response to a parliamentary question on this issue, the UK government recently stated that it considers the provisions of the Act to give performers the “right to control who is able to record and make reproductions of their performances… regardless of the technology used to make such reproductions, including AI technology”. However, we think the government’s response is something of a red herring, because the issue is not which technology is used to reproduce the performance, but rather that the technology allows a new “deepfake” performance to be created without the original performance actually being copied. It is worth noting that whilst current copyright law may not provide performers with any clear recourse, other legal routes remain open to them: for example, they could try to rely on their rights under data protection law, which gives individuals a right to erasure of their data.
Equity AI toolkit
Unless and until copyright law catches up with the rapid growth of AI across the entertainment industry, Equity’s newly released AI Toolkit gives performers practical tools they can use now to protect themselves against the misuse of their voice and likeness when contracting with brands. The toolkit includes sample AI clauses for performance cloning and digitisation, suggested terms for commissioning artists for performance cloning, and a template letter for content takedown.
What should brands be doing now?
Brands are already using AI for automation and to enhance analytics, and the use of voice avatars (i.e. a synthesised narrator that sounds like a human voice) and deepfakes (synthetic media in which a person in existing footage is replaced with someone else’s likeness) is becoming increasingly common.
Performers are becoming increasingly aware of their rights, and brands will need to make clear at the outset how they intend to use a performance, particularly with regard to performers' voices as well as their imagery. Where the talent is known, it is imperative that brands secure written approval before using their image, voice or likeness. However, the very nature of AI means that it will not always be possible to identify the talent or any other rights holders. Brands should therefore scrutinise the T&Cs of AI platforms before use to check what warranties the platform gives in respect of intellectual property rights (IPR). If, as we are routinely seeing across AI platforms, no warranties are given in respect of non-infringement of third-party IPR, brands need to consider, and get comfortable with, their exposure to this risk before using the tool.
For more on this, do check out our AI 101 series of articles, including this one on the infringement risks of using AI-generated works and this one on the legal backlash we are seeing as a result of AI use.
The use of artificial intelligence (AI) has grown rapidly across the entertainment industry in recent years, from automated audiobooks and voice assistants to deepfake videos and text-to-speech tools. But UK law has failed to keep pace, and this is leading to performers being exploited.
https://www.equity.org.uk/campaigns/stop-ai-stealing-the-show