
The rise of deepfakes

Alan Harper at Walker Morris discusses the challenges and concerns for intellectual property rights that AI-powered deepfakes present to businesses

 

The use of deepfakes is on the rise, having attracted an abundance of conflicting publicity. For the most part, this novel application of artificial intelligence (AI) is used for entertainment, with the likes of Channel 4 and ITV using deepfake celebrities in TV shows and advertisements.

 

However, deepfake technology is advancing faster than most people could have predicted, and it brings with it a range of challenges.

 

Deepfake technology uses AI to digitally manipulate images and videos into content that looks realistic but is not. Deepfake AI trains learning algorithms on existing data of real people and transforms the source content into something entirely different.

 

To do this, the AI alters the face, voice and mannerisms of the person in the original content to make it seem as though they have said or done something which they have not. Deepfakes have become so convincing that in many scenarios, it is nearly impossible to determine whether certain media content is real or has been altered by AI. 

 

One of the main legal issues raised by the increased use of deepfakes is their use as an instrument for fraud and for facilitating data protection breaches. For example, many people have fallen victim to deepfake AI that has breached their privacy and been used to create harmful and offensive content, putting their safety at risk.

 

Not only can deepfakes be used to spread misinformation and damage personal reputations, but they are also a growing concern for intellectual property rights holders.

 

Issues with deepfakes 

A major concern surrounding deepfakes is the potential for copyright infringement. Under the Copyright, Designs and Patents Act 1988, copyright protection is granted to various works, such as photographs, films and sound recordings, and their use without the consent of the copyright owner is copyright infringement.

 

However, bringing a successful copyright infringement claim in a deepfake scenario is difficult, given the complex way in which the AI is used to effectively ‘steal’ and recreate content. The AI uses data available on the internet to train itself on what someone (such as a celebrity) looks like, and uses this information to generate an entirely new piece of video or audio content.

 

Since the sound of a person’s voice or their image is not protected by copyright law, it is difficult to prove who is the rightful owner of an AI-generated deepfake and who owned the content used to create it.

 

It has also been widely, and controversially, debated whether AI and the deepfakes created by its algorithms should have protection too. While the AI itself would not have any rights, the creators of deepfake algorithms are claiming that the content produced by their AI, such as video and audio, should be rightfully owned by them and protected by copyright. This has set alarm bells ringing for intellectual property owners, who believe this would be a step in the wrong direction.

 

The increasing use of deepfakes to spread misinformation is also a growing concern for brand protection. Many brand protection agencies use metadata and keyword detection, or algorithmic tools, to expose deepfakes and determine what is real and what is not.

 

However, as deepfake authentication tools are still largely a work in progress, brands must stay vigilant to ensure that the media associated with their brand is not being manipulated online. Deepfake scam videos can have a catastrophic impact on brand image and, in extreme cases, can even affect share prices.
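To illustrate the kind of metadata check mentioned above, the short Python sketch below inspects an image file’s EXIF data for traces left by editing or generation software. It is a minimal, illustrative example rather than a description of any agency’s actual tooling: the file name and the list of “generative markers” are assumptions made for illustration, and many deepfakes carry no metadata traces at all, so a clean result proves nothing.

# A minimal sketch of an EXIF metadata check, assuming Pillow is installed
# (pip install Pillow). The file name and marker list below are hypothetical.
from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(path: str) -> dict:
    # Open the image and convert raw EXIF tag IDs into readable names.
    with Image.open(path) as img:
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in img.getexif().items()}

def looks_suspicious(tags: dict) -> bool:
    # Naive heuristic: flag files whose 'Software' tag mentions a known
    # generative tool, or which carry no camera make/model data at all.
    software = str(tags.get("Software", "")).lower()
    markers = ("stable diffusion", "midjourney", "dall-e", "generative")
    missing_camera_data = "Make" not in tags and "Model" not in tags
    return any(m in software for m in markers) or missing_camera_data

if __name__ == "__main__":
    tags = read_exif("suspect_image.jpg")  # hypothetical file
    print("Flag for manual review" if looks_suspicious(tags) else "No obvious markers found")

In practice, provenance standards such as C2PA and dedicated detection services go well beyond this kind of surface-level check, which is why the authentication tools referred to above remain a work in progress.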

 

Protection against deepfakes

Unlike the law of some countries, such as the US, English law does not recognise a person’s right to control their image, known as image rights, and alternative methods of protection are used instead. For example, the law of ‘passing off’ prevents someone from exploiting your image without your permission.

 

For this protection to apply, you must be able to prove that you have some commercial value, in the form of goodwill, in your name or appearance. You must also be able to show that exploiting this goodwill without your permission would deceive people into thinking that you have consented to the use of your image. This form of protection is especially relevant for celebrities and public figures, who are often the subject of deepfakes used to deceive and scam the public.

 

However, in contrast to other jurisdictions, passing off is unlikely to protect someone who has not previously commercially exploited their own image. There are other forms of protection, such as privacy and data protection laws; however, these cannot be relied upon to protect intellectual property rights.

 

Another alternative is to use licences as a form of protection. Many celebrities have licensed their image to allow others to use their likeness without being challenged. This type of agreement allows the licensor to take control of their image and to be remunerated where it might otherwise have been used without their consent.

 

However, this approach should still be treated with caution: fundamentally, licences are used to give rights away, and they are therefore still often exploited.

 

As is the case with most AI, there has been a delayed regulatory response to deepfake technology, with little clarification regarding protection. Music artists, producers and performers are at the forefront of the battle against deepfakes, as their work and industry are the easiest to target and manipulate.

 

Deepfake technology is becoming increasingly accessible to the average user, which means that without prompt regulatory changes, intellectual property will remain exposed to deepfake misuse.

 


 

Alan Harper is Partner and Head of Intellectual Property at Walker Morris 

 

Main image courtesy of iStockPhoto.com
