Sainsbury's ejects man misidentified as offender

Kateryna Pavlyuk, London
Warren Rajah says he is especially concerned for vulnerable customers

A man was instructed to leave a Sainsbury's supermarket without explanation, after staff incorrectly identified him as an offender flagged by facial recognition software.

Warren Rajah was in his local Elephant and Castle store in south London when he was told to abandon his shopping and escorted outside – an experience the data professional described as "Orwellian".

The store is one of six in London where Sainsbury's has recently rolled out Facewatch technology, in response to rising theft and violence against staff.

Facewatch later told Rajah there were "no incidents or alerts associated with [him]" on its database, and Sainsbury's has apologised for the "human error".


A Sainsbury's spokesperson said: "We have been in contact with Mr Rajah to sincerely apologise for his experience in our Elephant and Castle store. This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store."

Sainsbury's added that nobody had been wrongly identified by the Facewatch technology and that this was the first instance of someone being wrongly approached by a store manager.

But Rajah is not reassured. The Facewatch system relies on humans to interpret its alerts, he said, and he believes Sainsbury's staff are insufficiently trained to do so.

He said: "Am I supposed to walk around fearful that I might be misidentified as a criminal?"

Rajah is especially concerned for vulnerable customers. "Imagine how mentally debilitating this could be to someone vulnerable, after that kind of public humiliation," he said.

Rajah is not the first to fall foul of human error in a store using facial recognition technology. In November, Byron Long was wrongly accused of being a shoplifter, after staff at a B&M store in Cardiff incorrectly added him to their Facewatch watchlist.


Rajah said he had suddenly been approached by three Sainsbury's staff, including a security guard, and questioned about whether he shopped there regularly.

He described how one of the staff members had looked between him and a phone in her hand, before nodding to her colleagues. Rajah was then told to leave the supermarket.

When he asked why, he was directed to a poster in the shop's front window about the use of facial recognition technology and told to contact Facewatch.

When he got in touch with Facewatch, the company confirmed he was not on its database, and told him "Facewatch did not play any part in [him] being approached at the store".

He was redirected to Sainsbury's, which apologised and offered him a £75 shopping voucher.

To verify that Rajah was not on its system, Facewatch required him to send a copy of his passport and a photo of himself.

Facewatch told the BBC that, to comply with the law, it needed to conduct "appropriate identity checks" before it could "confirm or disclose sensitive information".

But Rajah questioned why he had had to hand his personal information to a third party just to learn what had happened and prove he had been misidentified.

He criticised Sainsbury's staff for failing to explain why he was being kicked out of the store, and for not giving "proper recourse to challenge" their decision.

He said: "What would happen if I asked the police to be called? What rights do I have?"

Instead, Rajah said he had faced a "trial" in the supermarket aisle, with the three Sainsbury's staff members acting as his "judge, jury and executioner".

"It was traumatic - being kicked out of a store, with everyone watching you," he said.

Jasleen Chaggar, legal and policy officer at Big Brother Watch, said her organisation "regularly hears from members of the public who are left traumatised after being wrongly caught in this net of privatised biometric surveillance".

The Information Commissioner's Office said: "Retailers should carefully consider the risks of misidentification and have robust procedures in place to ensure the accuracy and integrity of the personal information they collect and process."

Sainsbury's told the BBC that management in the Elephant and Castle store would be receiving additional training.

'Fewer frightening moments'

Sainsbury's, the UK's second-largest supermarket chain, announced in January that it would be expanding its use of the Facewatch software to include five London stores - Dalston, Elephant and Castle, Ladbroke Grove, Camden and Whitechapel.

The rollout follows the company's pilot of the technology in its Bath and Sydenham stores, which it reports led to 92% of offenders not returning, and a 46% drop in theft, aggression and antisocial behaviour incidents.

Sainsbury's said this meant "fewer frightening moments for colleagues and a more reassuring experience for customers".

Retail trade union Usdaw recently reported interim results from its annual survey, which found that 71% of staff had experienced verbal abuse, 48% had been threatened by a customer and 9% had been assaulted.

Facewatch describes itself as "the UK's leading facial recognition company providing a cloud-based facial recognition security system to safeguard businesses against crime".

Its software has been adopted by retail chains including Budgens, Sports Direct, B&M and Home Bargains.

Its cameras scan the faces of customers, which are compared against a database of recorded offenders. Any matches alert store managers, who can then verify the match.
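The two-stage process described above can be sketched in code. This is an illustrative outline only, not Facewatch's actual system: the similarity threshold, embeddings and names are all hypothetical assumptions.

```python
import math

# Hypothetical sketch of a match-then-verify flow. The threshold value,
# the embeddings and the watchlist format are illustrative assumptions,
# not details of Facewatch's real system.
SIMILARITY_THRESHOLD = 0.9  # assumed cut-off for raising an alert


def cosine_similarity(a, b):
    """Standard cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def scan(face_embedding, watchlist):
    """Stage 1: compare the scanned face against every watchlist entry.
    Stage 2 (the step where the human error occurred in Rajah's case)
    is outside the software: a store manager must verify any alert."""
    for name, embedding in watchlist:
        if cosine_similarity(face_embedding, embedding) >= SIMILARITY_THRESHOLD:
            return ("alert", name)  # passed to a store manager to verify
    return ("no_match", None)


watchlist = [("offender_123", [0.9, 0.1, 0.4])]
print(scan([0.9, 0.11, 0.41], watchlist))  # near-identical embedding -> alert
print(scan([0.1, 0.9, 0.2], watchlist))    # dissimilar embedding -> no match
```

The sketch makes the article's point concrete: the software only raises an alert, and everything after that, including approaching the right person, rests on the staff member reading it.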

In Rajah's case, the error was made at the second stage of human verification.

Both Facewatch and Sainsbury's point to the software's "99.98% accuracy" – but Rajah suspects the margin of error is higher, and has questions about the dataset behind the claim and whether it is representative of a range of body types and skin colours.
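A rough back-of-the-envelope calculation shows why even a high headline accuracy figure needs context at scale. The daily scan volume below is a hypothetical assumption for illustration, not a figure reported by Facewatch or Sainsbury's:

```python
# Illustrative only: the scan volume is an assumption, not a reported figure.
daily_scans = 10_000            # assumed faces scanned per store per day
false_match_rate = 1 - 0.9998   # implied by the quoted "99.98% accuracy"

false_alerts_per_day = daily_scans * false_match_rate
print(round(false_alerts_per_day, 1))  # about 2 mistaken alerts per store per day
```

Under these assumptions, a busy store could generate a couple of mistaken alerts every day, each one depending on staff judgement to catch.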

As someone working in the industry, Rajah said: "I still do believe that AI and tech can be an amazing thing.

"The caveat is that it's only ever as good as the people behind it."
