Campaign win! It will *finally* be illegal to create AI sexualised images using Grok

A Glamour campaign pushed the government to introduce this vital legislation in 2025 – why did it take so long to become law?

This week, it will finally be made illegal to create AI-generated sexual images – or ‘deepfakes’ – of someone without their consent. This follows a public outcry over Grok, the AI chatbot created for Elon Musk's social media platform X, which has generated hundreds of non-consensual images of women, and some children, at the behest of various X users.

For the past two years, Glamour has been campaigning to Stop Image-Based Abuse, which includes (but is not limited to) deepfake abuse, so-called ‘revenge porn’, and semen images (look it up). We've successfully lobbied two successive governments to criminalise the creation of AI-generated sexual images, in partnership with the End Violence Against Women Coalition (EVAW), Not Your Porn, Jodie Campaigns, and Clare McGlynn, Professor of Law at Durham University. So why is the legislation only coming into force now?


Last year, the government passed the Data (Use and Access) Act, which contained the vital legislation banning the creation of non-consensual deepfakes. However, the section has yet to be implemented – despite multiple requests for a clear timeline. In the intervening year, Glamour has published an investigation into Grok creating sexualised images without consent (as far back as June 2025), the Women and Equalities Committee has published a series of recommendations for the government to effectively deal with image-based abuse (most of which the government rejected), and, of course, hundreds of women have been victimised by this form of abuse.

The UK's media watchdog Ofcom has launched a formal investigation into Grok, which could result in a fine of up to 10% of X's worldwide revenue or £18 million, whichever is greater, if the platform fails to comply. Under powers granted by the Online Safety Act, Ofcom could also seek a court order forcing internet service providers in the UK to block access to X.

As well as announcing the implementation of the legislation on creating deepfakes, Liz Kendall, the Secretary of State for Technology, Science, and Innovation, has confirmed that the government will criminalise apps that allow users to create fake nude images of people. This, too, is a promise the government had previously made under its long-awaited Violence Against Women and Girls strategy, published in December last year.


Elena Michael, the co-founder of Not Your Porn, a survivor-powered movement to end image-based abuse, tells Glamour, “We are pleased that these provisions are finally being brought into force. However, it shouldn’t have taken public outrage at Grok to get the Government to enact laws they’ve already set in motion a long time ago.”

She continues, “The government’s reactive approach is at odds with its commitment to tackle violence against women and girls. Tackling online violence against women and girls should not be a marketing exercise, which is what it currently seems like for the government, given the countless announcements about these provisions coming into force but no action until now.”

Jodie Campaigns, a deepfake abuse survivor-campaigner, says, "I am relieved that this law is finally being brought into force. But this is a moment that should have been one of collective celebration. Instead, it feels painfully late, arriving only as a reaction to fresh and very real harm unfolding in public view.

“It should never have taken days of outrage and new victims being created for action to be taken, when this legislation has been sitting ready, with Royal Assent, for months. Survivors and campaigners warned, again and again, that delaying this law would cause real harm. We were right.”


Read the rest of Jodie's statement here:

Every week of inaction meant more women waking up to find their bodies manipulated without consent, more people experiencing the same shock, fear and violation that I did. And many of those victims will now be told that the law designed to protect them did not exist in time for them, forcing them to search for workarounds and weaker legal routes, just as I had to.

It is also telling that this is the first time we have heard the Prime Minister speak directly about this issue. Given the sustained campaigning, survivor testimony and expert warnings over many months, that silence has been deeply disappointing and once again highlights how reactive, rather than preventative, the government’s approach to online abuse remains.

Legislation, however, only protects people if it is enforced. Under the Online Safety Act, Ofcom must now treat X and Grok with the seriousness this harm demands and be prepared to use the strongest powers available to it. Platforms cannot be allowed to use technology to experiment on women’s bodies, issue apologies when caught, and then move on. Accountability has to be real, visible, and impact future action.

I want to recognise the campaigners, journalists and experts who have worked tirelessly to force this issue into the public and political consciousness. That includes Glamour UK, Not Your Porn, the End Violence Against Women Coalition, Professor Clare McGlynn, Charlotte Owen, the Revenge Porn Helpline, and so many others who have stood with survivors and refused to let this harm be ignored. Change has come because of collective pressure, persistence and courage.

This moment should never have come at this cost. My hope is that this marks a genuine turning point, where women and girls are no longer treated as collateral damage of technological progress, and where protection comes before harm, not after it.


Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.

Glamour is campaigning for the government to introduce an Image-Based Abuse Law in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.

For more from Glamour UK's Lucy Morgan, follow her on Instagram @lucyalexxandra or on TikTok at @lucyalexxandra.

Read More
GLAMOUR just went to parliament (again) to call for a dedicated Image-Based Abuse Law

We've come so far in the fight against image-based abuse, but there's still more to do.
