Westminster Policy News & Legislative Analysis

Kendall urges Ofcom action on xAI’s Grok under Online Safety Act

The UK Technology Secretary has called for rapid regulatory action after reports that xAI’s Grok still enables the generation or editing of intimate, sexualised images. In a statement on 9 January, Liz Kendall said she expects Ofcom to provide an update on next steps “in days not weeks” and reminded xAI that UK law allows access to be blocked if firms refuse to comply. The remarks follow overnight changes by xAI to Grok’s image functions. ([gov.uk](https://www.gov.uk/government/news/technology-secretary-statement-on-xais-grok-image-generation-and-editing-tool))

Under the Online Safety Act (OSA), Ofcom can issue statutory information notices, conduct investigations and impose penalties for breaches of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. In the most serious or persistent cases, the regulator may apply to the courts for business disruption measures, including orders aimed at ancillary services (such as app stores, advertisers or payment providers) and, if needed, access restriction orders requiring UK networks to block a service. Ofcom has stated it is ready to use the full extent of these powers. ([legislation.gov.uk](https://www.legislation.gov.uk/ukpga/2023/50/part/7/chapter/6/crossheading/business-disruption-measures?utm_source=openai))

Regulatory expectations have tightened this week. From 8 January 2026, “cyberflashing” is a priority offence under the OSA, which means in‑scope services must take proactive steps to prevent unsolicited sexual images from reaching users in the UK. Platforms that fail to implement effective preventative measures now risk enforcement, on top of takedown duties for illegal content. ([gov.uk](https://www.gov.uk/government/news/stronger-laws-for-tech-firms-to-ensure-you-dont-see-unsolicited-nudes?utm_source=openai))

Kendall’s statement also references Ofcom’s new guidance on protecting women and girls online. Final guidance, published in November 2025 after consultation, sets out practical steps providers should take, such as stronger design safeguards, reporting tools and measures to curb pile‑ons, and signals closer supervisory engagement by the regulator. Ofcom has already written to companies to set expectations for immediate action. ([ofcom.org.uk](https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/a-safer-life-online-for-women-and-girls/?utm_source=openai))

On the criminal law side, Parliament has created new offences of “creating” and “requesting the creation” of a purported intimate image of an adult without consent, inserted as sections 66E and 66F of the Sexual Offences Act 2003 by the Data (Use and Access) Act 2025. Ministers have indicated that commencement will follow, with Kendall saying the government will bring the provisions into force “in the coming weeks.” ([legislation.gov.uk](https://www.legislation.gov.uk/ukpga/2025/18/part/7/crossheading/purported-intimate-images?utm_source=openai))

Separately, ministers have announced plans to ban so‑called nudification tools that generate non‑consensual nude images. The Home Office and DSIT trailed legislation to target those who design and supply such apps, with measures expected to be taken forward through the Crime and Policing Bill now in Parliament. Providers should assume these proposals will tighten obligations on toolmakers alongside existing OSA duties on platforms. ([gov.uk](https://www.gov.uk/government/news/protecting-young-people-online-at-the-heart-of-new-vawg-strategy?utm_source=openai))

For platforms and AI developers, the immediate risk is regulatory non‑compliance. Services that enable image generation or editing should document illegal‑content risk assessments aligned to Ofcom’s illegal harms codes, disable or constrain prompts that can produce unlawful intimate images, and record the effectiveness of mitigations. Ofcom can escalate quickly, from information notices to penalties and, where warranted, business disruption or access restriction orders sought through the courts. ([gov.uk](https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer?utm_source=openai))
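
Neither the Act nor Ofcom’s codes prescribe how such controls should be built, but the basic shape of a prompt gate with an audit trail is simple. The sketch below is illustrative only: the policy categories, the `classify` callable and the JSONL audit log are assumptions, standing in for whatever moderation model and record‑keeping a provider actually uses.

```python
import hashlib
import json
import time
from typing import Callable

# Illustrative policy categories; the OSA and Ofcom's codes set duties, not an API.
BLOCKED_CATEGORIES = {"intimate_image_abuse", "nonconsensual_sexualisation"}

def gate_image_prompt(
    prompt: str,
    classify: Callable[[str], set],            # hypothetical prompt classifier
    audit_path: str = "mitigation_audit.jsonl",
) -> bool:
    """Return True if generation may proceed; log every decision so the
    effectiveness of the mitigation can be reviewed later."""
    flagged = set(classify(prompt)) & BLOCKED_CATEGORIES
    record = {
        "ts": time.time(),
        "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),  # avoid storing raw prompts
        "flagged": sorted(flagged),
        "allowed": not flagged,
    }
    with open(audit_path, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(record) + "\n")
    return not flagged

if __name__ == "__main__":
    # Trivial keyword classifier standing in for a real moderation model.
    naive = lambda p: {"intimate_image_abuse"} if "undress" in p.lower() else set()
    print(gate_image_prompt("undress this photo", naive))  # False: blocked and logged
```

The audit log is the second half of the duty described above: without a record of what was blocked and why, a provider has little evidence of the effectiveness of its mitigations to show the regulator.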

For compliance teams, Ofcom’s women and girls guidance points to design‑level changes, such as “abusability” testing for new features, stronger default safety settings, and rapid user‑reporting and evidence preservation, alongside technical measures like hash‑matching for intimate image abuse. While the guidance itself is not a statutory code, Ofcom has said it will supervise engagement and report on progress, with further enforcement under the OSA where duties apply. ([ofcom.org.uk](https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/a-safer-life-online-for-women-and-girls/?utm_source=openai))
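
The hash‑matching the guidance refers to is typically perceptual hashing of uploads against externally managed lists of known non‑consensual intimate images (schemes such as StopNCII), accessed through their own APIs. As a rough illustration of the matching step only, the sketch below uses the open‑source `imagehash` library; the in‑memory hash list and the Hamming‑distance threshold are assumptions for the example.

```python
# pip install pillow imagehash
from PIL import Image
import imagehash

HAMMING_THRESHOLD = 8  # illustrative; tighter thresholds mean fewer false positives

def load_known_hashes(hex_hashes):
    """Parse hex-encoded perceptual hashes, as a hash list might supply them."""
    return [imagehash.hex_to_hash(h) for h in hex_hashes]

def matches_known_abuse_image(path, known_hashes):
    """True if the uploaded image is perceptually close to any known hash."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects gives their Hamming distance.
    return any(candidate - known <= HAMMING_THRESHOLD for known in known_hashes)

if __name__ == "__main__":
    known = load_known_hashes(["d1d1d1d1d1d1d1d1"])  # placeholder value, not a real list entry
    print(matches_known_abuse_image("upload.jpg", known))
```

Perceptual hashes tolerate resizing and re‑compression, which is why they are used here in preference to cryptographic hashes; the trade‑off is that the distance threshold has to be tuned against false positives.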

Next steps centre on timing. Kendall expects Ofcom to confirm its approach within days. In parallel, the priority‑offence designation for cyberflashing is already live, the deepfake‑creation offences are legislated and awaiting commencement, and the planned ban on nudification tools is moving through the legislative process. Companies operating in the UK should assume heightened scrutiny and prepare for swift remedial action demands from the regulator. ([gov.uk](https://www.gov.uk/government/news/stronger-laws-for-tech-firms-to-ensure-you-dont-see-unsolicited-nudes?utm_source=openai))