Westminster Policy News & Legislative Analysis

UK consults on under-16 social media ban amid Lords pressure

A jury in Los Angeles on 25 March found Meta and Google liable for platform design that harmed a young user, a high‑profile verdict arriving just as the UK weighs tighter controls on children’s access to social media. Ministers are consulting on options that include an under‑16 access ban and measures targeting so‑called addictive features. (en.wikipedia.org)

The government consultation opened on 2 March 2026 and closes on 26 May 2026. It seeks views on setting a minimum age for social media access, enforcement models, and whether services should be required to disable features such as infinite scroll and autoplay for children, with a formal response due in summer 2026. (commonslibrary.parliament.uk)

Ministers have signalled a wide remit. Prime Minister Keir Starmer has said that no option is off the table, including a higher age limit and restrictions on addictive design, and that the UK will study international practice such as Australia’s statutory model. (apnews.com)

Peers have twice pressed the case for a firm age limit through the Children’s Wellbeing and Schools Bill. On 21 January, the House of Lords backed a cross‑party amendment tabled by former schools minister Lord Nash by 261 to 150. The amendment envisages regulations requiring services to use ‘highly effective’ age checks to prevent under‑16s from using social media, with a 12‑month window for ministers to specify which services are in scope. (itv.com)

When the Bill returned to the Commons on 9 March, MPs voted 307 to 173 to reject an immediate ban. Instead, the Government secured amendments in lieu creating powers for the Science Secretary to make secondary legislation restricting or preventing children’s access to specified internet services or to particular features, and to set an age between 13 and 16 for any such restriction. Opposition front‑benchers, including Shadow Education Secretary Laura Trott, urged swifter action. (theguardian.com)

The result leaves a live ‘ping‑pong’ between the Houses: peers have indicated they will continue to push for an outright statutory ban, while ministers argue for an enabling framework linked to the consultation’s evidence base and Ofcom’s online safety regime. Further changes are therefore possible before the Bill completes passage. (theguardian.com)

Addictive design is explicitly in scope for potential regulation. The consultation asks whether platforms should be required to switch off features that encourage prolonged use late at night, such as infinite scroll and autoplay, while the European Commission has issued preliminary findings that TikTok’s core design breaches the Digital Services Act for similar reasons. (gov.uk)

Any new age‑limit or feature restrictions would sit alongside the Online Safety Act 2023. Ofcom’s children’s safety codes took effect from 25 July 2025, requiring robust age assurance for the most harmful content; the regulator can levy fines up to the greater of £18 million or 10% of qualifying worldwide revenue for breaches. (ofcom.org.uk)
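The ‘greater of’ cap is a simple maximum, so the penalty ceiling scales with a provider’s size. A minimal sketch, with purely hypothetical revenue figures for illustration:

```python
def osa_fine_cap(qualifying_worldwide_revenue_gbp: float) -> float:
    """Maximum Online Safety Act penalty: the greater of £18 million
    or 10% of a provider's qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)

# Hypothetical examples: below £180m revenue the £18m floor binds;
# above it, the 10% share of revenue sets the cap.
small_provider_cap = osa_fine_cap(50_000_000)      # £18m floor applies
large_provider_cap = osa_fine_cap(2_000_000_000)   # 10% of £2bn = £200m
```

For the largest platforms, the revenue-linked limb dwarfs the fixed floor, which is why the percentage cap is the figure providers typically plan around.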

Internationally, policy is shifting. Australia’s Online Safety Amendment (Social Media Minimum Age) Act 2024 requires providers to prevent under‑16s from holding accounts; France’s National Assembly has approved a ban for under‑15s; in the United States, the federal Kids Online Safety Act has advanced and several states are testing age‑verification and curfew rules, many subject to legal challenge. (legislation.gov.au)

For departments, schools and platforms the near‑term milestones are clear: the consultation runs to 26 May, with a government response slated for summer 2026; in parallel, the Children’s Wellbeing and Schools Bill provides an immediate legislative route for targeted, feature‑specific restrictions by statutory instrument. Providers should plan for stronger age assurance, night‑time protections, and default‑off engagement mechanics for young users. (commonslibrary.parliament.uk)