Academic says deepfake law delay was frustrating

Jason Arunn Murugesu, North East and Cumbria
Prof Clare McGlynn, of Durham University, said she was happy the law was due to come into force

An academic involved in drafting the law against creating non-consensual intimate images has said it was "frustrating" how long the government took to enact it.

Law researcher Prof Clare McGlynn, of Durham University, was part of a coalition that lobbied for the law, which will come into force in February amid the recent furore over X owner Elon Musk's Grok AI chatbot.

McGlynn said she had not been given a reason for the delay but was happy the law was due to be enacted, several months after it was passed last June.

Deputy prime minister David Lammy said he was "repulsed" by abusive behaviour online and perpetrators would face the "full force of the law". X said it had taken action.

"They could have done it in August, September, October," she said. "We have been quite frustrated."

She said that she understood laws could not be enacted immediately after they were passed, but said it felt like "quite a long delay" considering how "urgent" the problem was.

The law was passed following a campaign by McGlynn, who first recommended it in 2024, and others including Tory peer Baroness Charlotte Owen and the charities End Violence Against Women and the Revenge Porn Helpline.

"It was very much a group effort," said McGlynn.

'Firm and robust'

The issue made headlines in recent weeks after Grok AI - which can be used on social media platform X - was used to create explicit images of people without their consent.

While it is currently illegal to share intimate, non-consensual deepfakes of adults in the UK, it has not been a criminal offence to use an AI tool to create them.

McGlynn praised the government's response to the recent news.

"[The government] have been really firm and really robust in their reaction to X," she said. "And they have been really clear in what they expect from Ofcom."

McGlynn and her colleagues are working on a second law that is currently going through the House of Lords.

This law would require web platforms in the UK to take down non-consensual intimate images when asked to do so by the person pictured.

"If your image is up there and X aren't doing anything about it, there's very little you can do at the moment," she said.

In an announcement on the social platform, X said it had "implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing".

Lammy said: "I am repulsed by the disgusting, abusive behaviour we've seen online.

"X's announcement is welcome, but it is imperative this government continues to take urgent action to stop vile criminals using these tools to exploit innocent women and children online.

"On Monday, we said we would fast track work to bring this legislation into force, meaning it will soon be illegal for anyone to create or request deepfake intimate images of adults without consent.

"And today we have, meaning it will now become law within weeks."
