Can trade mark rights be used to take down deepfakes?
Matthew McConaughey has registered trade marks of various short audio and video clips of himself, including his catchphrase “Alright, alright, alright”.
According to his lawyers, these rights could be used to prevent unauthorised (deepfake) copies of him.
Will it work? It seems unlikely (and McConaughey’s lawyers do acknowledge the position is untested). Take an example where I make an audio deepfake of Matthew McConaughey extolling the benefits of CMS’ “future-facing” approach.
Do McConaughey’s lawyers have a good case here? Well, remember that the registered trade marks don’t cover his voice generally but instead specific clips (which presumably bear no real resemblance to my deepfake save for the sound of Mr McConaughey’s voice).
Even assuming all other elements of a case could be satisfied, the requirement to show a sufficient level of similarity between the deepfake and registrations would likely prove challenging.
A case based on passing off (or publicity rights in the US) is more viable and currently remains the best course of action in this scenario – Eddie Irvine and Rihanna have both had success with passing off in previous, non-AI cases. But a claimant must prove goodwill in their voice or image, meaning the average Joe remains exposed to the risk of deepfakes.
A whole host of other rights, like copyright and data protection, could be useful avenues in the right circumstances, but none of these are perfectly fit for a deepfake world.
Bespoke rights to address deepfakes
Legislators are considering bespoke rights to address the deepfake threat.
Denmark, for example, proposes to amend its Copyright Act to give protection to all individuals over their body, facial features and voice.
We’re not quite at that stage in the UK yet, but a first step has been taken: a quasi-image right of sorts is imminent in the form of protection against the creation of sexually explicit deepfakes (see the GOV.UK article on the government’s crackdown on sexually explicit deepfakes).