
Asda’s facial recognition trial: a battle worth pursuing?

Rachel Griffith at Foot Anstey argues that retailers attempting to stem theft-related losses with facial recognition technology are entering the eye of a technological storm

 

The backlash over Asda’s recent trial of live facial recognition technology (FRT) has shown just how sensitive customers are to the use of such technologies in a retail setting. Despite this sensitivity, retailers are increasingly turning to FRT for a wide range of reasons – from enhancing customer experience through biometric payments and more detailed retail analytics to combating a dramatic spike in theft and violence in stores.

 

As a result, retailers must navigate a minefield of data security and privacy concerns, and the stakes are high. One wrong turn and the reputational damage could be explosive. Striking the right balance between innovation and intrusion can be a huge challenge, though there are some helpful questions retailers can ask themselves when deciding whether FRT adoption is the right approach for them.

 

 

Why use FRT?

Importantly, retailers should start by identifying what problem (or indeed, problems) they are seeking to address by using FRT. According to Avery Dennison, 65% of food retailers in the US and UK say the impact of theft has reached a crisis point. In a retail environment where thousands of customers can enter daily, the precision and accuracy of FRT could offer greater visibility over threats such as known shoplifters.

 

FRT has also been used successfully to warn employees of violent shoplifters, help recover missing minors, close major organised retail crime cases, prevent unauthorised access to warehouses, measure customer footfall and alert staff to individuals who have registered to self-exclude from adult gaming centres to tackle problem gambling.

 

 

What will customers and employees think?

Do they think there are less invasive measures that can be used to prevent crime and analyse consumer behaviour? As the ‘Stop Asda Spying’ campaign demonstrates, using FRT in combination with pre-compiled ‘watch-lists’ of people of interest carries significant reputational risks. FRT has been linked to racial bias, profiling and even cases of mistaken identity, none of which a retailer wants to be associated with.

 

Asking customers and employees for feedback on FRT and/or running trials using dummy data are useful ways for retailers to assess whether the benefits of adopting such technology outweigh the risk of reputational damage.

 

 

Is there a lawful basis for using FRT?

FRT involves the processing of special category biometric data. As such, retailers will require a valid condition for processing it under UK data protection law, such as substantial public interest. The data found on ‘watch-lists’ used for matching will also require a valid condition for processing, especially if it relates to criminal convictions, to which specific processing rules apply.

 

 

Is the use of FRT necessary and proportionate?

Once a valid condition for processing data using FRT has been identified, retailers will need to continually assess whether they can achieve their objectives without using FRT. They can do so by preparing a data protection impact assessment (DPIA) to understand whether the use of FRT in the store is necessary and proportionate, whether there are less invasive means of achieving their objectives, and whether controls or safeguards should be implemented to keep its use necessary and proportionate.

 

As the Information Commissioner’s Office (ICO) found in the Serco case, even where there is a valid condition for processing, if retailers fail to show why it was necessary or proportionate to use biometric data, such processing may be considered unlawful.

 

 

Does the team understand how FRT works?

Understanding the mechanics of FRT is critical to ensure retail teams can explain how the technology works to affected individuals and fulfil transparency obligations under UK data protection law. Does the technology include hardware (e.g. CCTV cameras) and software (e.g. software that matches faces to a database of unique hexadecimal reference numbers)?
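
As a simplified, illustrative sketch only (not a description of Asda’s system or any particular vendor’s product), the software side typically reduces each detected face to a numerical embedding and compares it against the embeddings stored for the watch-list references; the function names and the 0.8 threshold below are assumptions for illustration.

# Illustrative sketch: how an FRT matching step can work in principle.
# The embedding model, watch-list contents and 0.8 threshold are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two face embeddings; values close to 1.0 suggest the same person."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_embedding: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.8) -> str | None:
    """Return the watch-list reference with the highest similarity above the
    threshold, or None if no entry is a plausible match."""
    best_ref, best_score = None, threshold
    for reference_id, stored_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, stored_embedding)
        if score > best_score:
            best_ref, best_score = reference_id, score
    return best_ref  # a match should be routed to trained staff, not acted on automatically

In practice the embedding model, the matching threshold and how watch-lists are compiled are all vendor-specific choices, which is exactly why retail teams need to understand and be able to explain them.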

 

If the FRT is AI-enabled, there are heightened risks involved, including bias. While not directly applicable in the UK, the risk mitigation measures in the newly passed EU AI Act offer a useful framework for risk management and potentially provide an indication of what a similar UK law might look like. Retailers may wish to consider these risk mitigations when contemplating the use of AI.

 

 

Is the store being sufficiently transparent?

Both the internal and external privacy information (i.e. your privacy policies) should be updated to reflect the introduction of the technology. Retailers should create and display public signage inside and outside the premises that clearly alerts affected individuals to the use of the technology. As Bunnings, the Australian DIY and hardware retailer, recently found, it is not enough simply to place signs and update your privacy policy: the signage also needs to be sufficiently prominent and must adequately inform customers about the collection and retention of their data.

 

Retailers must also consider how they will respond to any objections they receive from customers and ensure their messaging is consistent.

 

Does the process involve human oversight?

Facial recognition has well-documented issues with accuracy and bias, and has already led to distressing cases of innocent shoppers being publicly branded as shoplifters. FRT is not designed to replace human judgment but to enhance it.

 

Retailers should implement strict data governance and privacy practices with specific user access rules and permissions, ensure authorised users receive comprehensive training, and conduct regular audits of compliance, accuracy and use.
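
As a hedged sketch of what that can look like in practice (the reviewer roles, alert fields and logging format below are illustrative assumptions, not a prescribed design), each FRT alert can be routed to a trained, authorised reviewer, with every decision recorded in an audit log:

# Illustrative sketch: routing FRT alerts through human review with an audit trail.
# The reviewer roles, Alert fields and log format are assumptions for illustration.
import logging
from dataclasses import dataclass
from datetime import datetime, timezone

logging.basicConfig(filename="frt_audit.log", level=logging.INFO)

AUTHORISED_REVIEWERS = {"trained_security_lead", "duty_manager"}  # hypothetical roles

@dataclass
class Alert:
    alert_id: str
    watchlist_ref: str    # an opaque reference number, not a name or image
    similarity: float
    camera_id: str

def review_alert(alert: Alert, reviewer_role: str, confirmed: bool) -> bool:
    """Require a human decision before any action is taken on an FRT match."""
    if reviewer_role not in AUTHORISED_REVIEWERS:
        raise PermissionError(f"'{reviewer_role}' is not authorised to action FRT alerts")
    logging.info(
        "%s alert=%s ref=%s similarity=%.2f reviewer=%s confirmed=%s",
        datetime.now(timezone.utc).isoformat(), alert.alert_id,
        alert.watchlist_ref, alert.similarity, reviewer_role, confirmed,
    )
    return confirmed  # only a human-confirmed match should lead to any intervention in store

Records of this kind also give retailers something concrete to examine in the regular audits of compliance, accuracy and use described above.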

 

 

Has due diligence been undertaken?

Not all FRT providers are created equal. Retailers should consider their vendor’s track record of operational stability, availability and compliance with current and future privacy regulations. Engaging relevant stakeholders across the business (e.g. your DPO, IT team and HR) to conduct due diligence, including an information security assessment, can help determine whether the technology is cyber-secure.

 

Further, careful review of the supplier agreement is important to understand how risk and responsibility are allocated, and to ensure that data processing agreements are in place with your FRT providers (there may be multiple parties involved) imposing binding obligations on each of them to process data securely.

 

FRT is a powerful and potentially effective tool in a retail environment, but there are multiple reputational and legal considerations retailers need to work through prior to adoption. Taking a privacy-first approach could be the difference between protecting people and stock, and a PR nightmare.

 


 

Rachel Griffith is an Associate at Foot Anstey

 

Main image courtesy of iStockPhoto.com and gorodenkoff
