Children’s charities and safety groups, including the Molly Rose Foundation, the NSPCC and Childnet, argue that a blanket prohibition on under‑16s using social media is the wrong solution. They say ministers should prioritise enforcement of existing online safety law. Their intervention comes as the Children’s Wellbeing and Schools Bill returns to the House of Lords for report stage on Monday 19 January, with peers expected to debate age‑limit amendments. ([sg.news.yahoo.com](https://sg.news.yahoo.com/molly-russells-dad-says-under-171823685.html?utm_source=openai))
Ian Russell, father of Molly Russell, told the BBC’s Newscast that a statutory ban risks unintended consequences and that bereaved families want the focus on implementation rather than headline bans. The Molly Rose Foundation has previously set out why prohibitions could push risks to other services and stall regulatory progress, calling instead for stronger, faster enforcement. ([sg.news.yahoo.com](https://sg.news.yahoo.com/molly-russells-dad-says-under-171823685.html?utm_source=openai))
A joint statement signed by the Molly Rose Foundation with the NSPCC and Childnet calls for a broader, targeted approach: keeping under‑13s off services, introducing evidence‑based feature blocks tailored by age, and strengthening duties so platforms deliver genuinely age‑appropriate experiences. The signatories argue that service‑specific minimum ages should reflect risk. ([sg.news.yahoo.com](https://sg.news.yahoo.com/molly-russells-dad-says-under-171823685.html?utm_source=openai))
Peers have tabled proposals to embed a 16+ expectation into the Online Safety Act via the Children’s Wellbeing and Schools Bill. One amendment, tabled by Lord Storey, would set 16 as the default minimum age for ‘social media services’ within children’s risk assessments, require ‘highly effective’ age assurance, and allow evidence‑based deviations under Ofcom guidance. An earlier committee‑stage amendment from Lord Nash sought regulations to prevent under‑16s becoming users, but it was withdrawn after debate. ([bills.parliament.uk](https://bills.parliament.uk/bills/3909/stages/20215/amendments/10031835?utm_source=openai))
Government positioning has shifted. Prime Minister Sir Keir Starmer told Labour MPs that all options are on the table following Australia’s move, while Health Secretary Wes Streeting has asked officials to hear external experts and examine tougher limits. Ministers continue to stress that any additional step must be evidence‑led alongside current statutory duties. ([theguardian.com](https://www.theguardian.com/politics/2026/jan/13/keir-starmer-tells-mps-he-is-open-to-australian-style-social-media-ban?utm_source=openai))
Opposition pressure is explicit: Conservative leader Kemi Badenoch says a future Conservative government would legislate for an under‑16 ban, and the NASUWT teaching union has urged ministers to act, citing deteriorating behaviour and mental‑health concerns linked by teachers to social media. ([itv.com](https://www.itv.com/news/2026-01-11/tories-would-ban-under-16s-from-social-media-badenoch-says?utm_source=openai))
The regulatory baseline already in force matters. Ofcom’s protection‑of‑children codes took effect on 25 July 2025: services likely to be accessed by children must complete and keep updated risk assessments and implement safety measures; platforms that allow pornography must apply ‘highly effective’ age assurance under separate duties. Ofcom has signalled it will enforce compliance. ([ofcom.org.uk](https://www.ofcom.org.uk/online-safety/protecting-children/protecting-children-from-harms-online?utm_source=openai))
Early enforcement activity includes penalties where checks fall short. In December 2025 Ofcom fined a pornography operator more than £1m for inadequate age verification and failing to meet information requests, and it has warned larger platforms over weak risk assessments. ([ft.com](https://www.ft.com/content/abe78aa2-6e62-419b-925c-9818e86e7179?utm_source=openai))
International context is shaping UK debate. Australia’s new social media age‑restriction regime took effect on 10 December 2025, requiring designated platforms to prevent under‑16s from holding accounts, with penalties up to A$49.5m. The eSafety Commissioner emphasises the obligation sits with platforms, not families; early figures indicate millions of accounts have been removed or restricted. ([esafety.gov.au](https://www.esafety.gov.au/about-us/industry-regulation/social-media-age-restrictions?utm_source=openai))
If Parliament adopts a 16+ threshold, compliance teams should expect materially broader age‑assurance obligations across mainstream social networks. Providers would need age‑checking that Ofcom deems ‘highly effective’, refreshed children’s risk assessments, and mitigations calibrated by user age, while continuing to meet the ICO Children’s Code requirements on high‑privacy defaults and avoiding design choices that nudge children to share data. ([ofcom.org.uk](https://www.ofcom.org.uk/online-safety/protecting-children/statement-age-assurance-and-childrens-access?utm_source=openai))
School‑day device use also falls within the wider bill: peers debated phones in schools at committee stage, while ministers note that most schools already restrict phones under existing guidance. Any new duty would add to, not replace, current online safety and behaviour frameworks. ([parliament.uk](https://www.parliament.uk/business/news/2025/may/childrens-wellbeing-and-schools-bill-committee-stage/?utm_source=openai))
Next steps: the Lords will continue report stage on Monday 19 January and Wednesday 21 January. If peers back a 16+ provision, the question returns to the Commons. The Department for Science, Innovation and Technology says a ban is not current policy, though all options remain under review; for now, the Online Safety Act’s enforcement remains the immediate lever. ([parliament.uk](https://www.parliament.uk/business/news/2026/jan-2026/childrens-wellbeing-and-schools-bill-house-of-lords-report-stage/?utm_source=openai))