Deep Fake Meets Real Law?


ITV recently premiered a new comedy TV show called ‘Deep Fake Neighbour Wars’. The premise of the show is as follows: deepfake technology is used to turn the UK’s best impressionists into the world’s most famous celebrities. The impressionists then act out life as ordinary neighbours, bickering over minor issues. Except they’re pretending to be celebrities, and therein lies the comedy. Celebrities ‘used’ in the show include Idris Elba, Harry Kane, Nicki Minaj and Tom Holland.

The show has been criticised for using AI technology to capture the likeness and voice of celebrities without compensating them and without obtaining their consent.

What is Deepfake Technology?

Deepfakes are a type of synthetic media. AI technology is used to manipulate photos, video and audio. Typically, a person in an image or video is replaced with someone else’s likeness.

Unfortunately, deepfakes have been used for malicious ends, ranging from fake news to revenge porn. However, a recent online trend has seen content creators use deepfake technology to attract viewers with humorous sketches, and larger-budget film production companies have caught onto it too: Tom Hanks and Robert Zemeckis’ upcoming film, “Here”, is set to use deepfake technology.

The introduction of deepfake technology has generally been negatively received in the entertainment industry. Equity, a union for performers, actors and creative practitioners, has launched its own campaign called ‘Stop AI Stealing the Show’.

Are Deepfakes Potentially Unlawful in the UK?

Intellectual Property

The immediate concern is that deepfake technology may infringe the intellectual property rights of celebrities and others who somehow “trade” off their image. However, intellectual property law in England and Wales is not clear cut when applied to deepfakes – perhaps unsurprising given the novelty of this emerging technology.

The four major intellectual property rights recognised under English law are:

  1. Copyright – which protects original literary, dramatic, musical and artistic works. Broadcasts, sound recording and films are also protected;
  2. Registered trade marks – a trade mark is a sign used by traders to differentiate their goods/services from those of other traders;
  3. Registered and Unregistered Designs – these rights protect the appearance of the whole or part of a product; and
  4. Patents – these protect inventions.

Of the above, copyright is the right that some commentators claim deepfakes infringe.

The argument is that a person’s image or voice should not be used in the show without their consent. However, for copyright to subsist, the work must fall into one of the categories listed above, and a person’s image or voice would not ordinarily do so. If a celebrity owned the copyright in a sound recording of their voice, and that specific recording was used in a show, the celebrity would have a claim for copyright infringement. Likewise, if words they had previously spoken or written had been fixed/recorded in some way and those words were used in the show, a claim for copyright infringement would arise. However, a celebrity would have no claim if it was only the sound of their voice that was imitated.

AI technology needs to be trained to know what the celebrity in question looks and sounds like. It does this by scouring the internet for photos, videos and music featuring the person being copied. The owners of the copyright in those photos, videos and music will have a cause of action for infringement if their works are copied without permission. But the celebrities themselves will not, unless they happen to be the copyright owners – which is unlikely in most cases.

Image rights

One could also argue that a TV show using the image of someone else without their permission is an infringement of that person’s image rights. However, English law does not recognise a person’s right to control their image per se, i.e. an ‘image right’. Rather, other laws have to be relied on, for example the common law tort of ‘passing off’ can sometimes be used to protect a celebrity’s image.

Passing off can occur where a purchaser of a product or service is likely to be misled into believing that the product/service in question has been endorsed by a celebrity, when in fact that is not the case. See, for example, Irvine -v- Talksport Ltd and Robyn Rihanna Fenty -v- Arcadia Group Brands Ltd (t/a Topshop).

In Irvine -v- Talksport, Eddie Irvine, a former Formula 1 driver, successfully persuaded the Court that Talksport had (without his consent) used his image in such a manner that consumers would (incorrectly) conclude that he had endorsed their radio show. Similarly, in Rihanna -v- Topshop, pop star Rihanna ran a similar argument and succeeded in a claim against Topshop, who (without her consent) were selling t-shirts displaying her image. Rihanna argued that consumers would purchase the t-shirts wrongly believing that Topshop had been granted a licence to use her image.

A key takeaway is that the Courts are slowly recognising the monetary value in a person’s image and offering it protection. However, would the public be ‘confused’ into thinking that the celebrities impersonated in Deep Fake Neighbour Wars have endorsed the show or licensed the producers to use their image/voice/likeness? Possibly, but it seems unlikely.

Defamation

Traditionally, in order to establish a claim in defamation, it is necessary to demonstrate that the Defendant published an untrue statement which diminished the Claimant’s reputation in the eyes of right-thinking members of society. If someone falsely claimed that Justin Bieber had, without provocation, kicked a swan, his reputation would be tarnished, likely giving rise to a claim for defamation. It should therefore follow that if someone created footage using deepfake technology which appeared to show Justin Bieber kicking a swan, that too would tarnish his reputation, and publication of such footage ought logically to give rise to a claim for defamation.

This is so far an untested legal proposition and, of course, in defamation claims context is key. Where the footage is so obviously a parody that it is clear it is not really Justin Bieber kicking the swan, it is unlikely to be defamatory.

As at the date of this article there is no English judgment in the civil courts concerning the legal implications of deepfakes, but it is only a matter of time before a Judge has to grapple with these issues. The Court has no interest in stifling creative freedom, but it will step in where creativity trespasses on the rights of others, particularly when it is pursued for profit. Creators and publishers of deepfake material ought to exercise caution, or they may find themselves a Defendant in a deepfake test case in the High Court.

If you would like to discuss the use of deepfakes generally, or any of the points of law arising out of this article, please contact Chris Fotheringham or Liam Tolen.
