This week, it will finally be made illegal to create AI-generated sexual images – or ‘deepfakes’ – of someone without their consent. This follows a public outcry over Grok, the AI chatbot created for Elon Musk's social media platform X, which has generated hundreds of non-consensual images of women, and some children, at the behest of various X users.
For the past two years, Glamour has been campaigning to Stop Image-Based Abuse, which includes (but is not limited to) deepfake abuse, so-called ‘revenge porn’, and semen images (look it up). We've successfully lobbied two successive governments to criminalise the creation of AI-generated sexual images, in partnership with the End Violence Against Women Coalition (EVAW), Not Your Porn, Jodie Campaigns, and Clare McGlynn, Professor of Law at Durham University. So why is the legislation only coming into force now?
Last year, the government passed the Data (Use and Access) Act, which contained the vital legislation banning the creation of non-consensual deepfakes. However, the section has yet to be implemented – despite multiple requests for a clear timeline. In the intervening year, Glamour has published an investigation into Grok creating sexualised images without consent (as far back as June 2025), the Women and Equalities Committee has published a series of recommendations for the government to effectively deal with image-based abuse (most of which the government rejected), and, of course, hundreds of women have been victimised by this form of abuse.
The UK's media watchdog Ofcom has launched a formal investigation into Grok, which, if X fails to comply, could result in a fine of up to 10% of its worldwide revenue or £18 million, whichever is greater. Under powers granted to it by the Online Safety Act, Ofcom could also seek a court order that would force internet service providers in the UK to block access to X.
As well as announcing the implementation of the legislation on creating deepfakes, Liz Kendall, the Secretary of State for Science, Innovation and Technology, has confirmed that the government will criminalise apps which allow users to create fake nude images of people. Again, this is a promise the government had previously made under its long-awaited Violence Against Women and Girls strategy, published in December last year.
Elena Michael, the co-founder of Not Your Porn, a survivor-powered movement to end image-based abuse, tells Glamour, “We are pleased that these provisions are finally being brought into force. However, it shouldn’t have taken public outrage at Grok to get the Government to enact laws they’ve already set in motion a long time ago.”
She continues, “The government’s reactive approach is at odds with its commitment to tackle violence against women and girls. Tackling online violence against women and girls should not be a marketing exercise, which is what it currently seems like for the government, given the countless announcements about these provisions coming into force but no action until now.”
Jodie Campaigns, a deepfake abuse survivor-campaigner, says, "I am relieved that this law is finally being brought into force. But this is a moment that should have been one of collective celebration. Instead, it feels painfully late, arriving only as a reaction to fresh and very real harm unfolding in public view.
“It should never have taken days of outrage and new victims being created for action to be taken, when this legislation has been sitting ready, with Royal Assent, for months. Survivors and campaigners warned, again and again, that delaying this law would cause real harm. We were right.”
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
Glamour is campaigning for the government to introduce an Image-Based Abuse Law in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.
For more from Glamour UK's Lucy Morgan, follow her on Instagram @lucyalexxandra or on TikTok at @lucyalexxandra.
We've come so far in the fight against image-based abuse, but there's still more to do.