In January 2026, the government announced a consultation on how to improve children’s relationships with mobile phones and social media. The consultation opened on 2 March 2026 and closes on 26 May 2026.
An amendment to the current Children’s Wellbeing and Schools Bill is in a state of “ping-pong” between the House of Commons and the House of Lords, with the amendment passing back and forth between the two Houses.
Speaking in the Commons on 27 April 2026, education minister Olivia Bailey said, “we are clear that under any outcome we will impose some form of age or functionality restrictions for children under 16” and, “I can also confirm that consideration of restrictions such as curfews will be in addition to, not instead of, this.”
It is anticipated that the Bill will be passed and receive Royal Assent at some point in 2026.
Comment from Terry Green, a partner at law firm Katten Muchin Rosenman LLP:
“The age or functionality restrictions imposed by the Government on social media platforms do not necessarily mean an outright ban for under-16s: they could range from an Australia-style outright ban to requirements to deploy proactive technologies, similar to some already provided for in the Online Safety Act. However, the announcement highlights the pace of change in online safety and the difficulty social media platforms face in keeping up with frequent policy shifts, coming just two months after the Government’s policy announcements on AI chatbots.
“Ofcom already expects major platforms to enforce their minimum age policies through highly effective age verification. Platforms such as Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube have already been asked by Ofcom to enforce their under-13s policies, so the Government may look to extend this beyond the biggest platforms.
“The ICO’s Age Appropriate Design Code also already requires platforms to adopt specific functionality restrictions and defaults to keep children safe, such as privacy and profiling defaults, geolocation switched off and restricted direct messaging functions. The Government may look to enshrine the use of these technologies and restrictions, along with other guidance, in the Online Safety Act, similar to the existing obligations to use accredited and proactive technology for preventing child sexual abuse and terrorism content. This would allow for stricter enforcement and greater clarity on platform obligations.
“Ofcom may also take inspiration from other jurisdictions: the European Union’s greater focus on restricting ‘addictive design features’ such as infinite scrolling, and on enhanced parental controls, or more restrictive methods such as time and usage management tools, as implemented on Douyin, China’s version of TikTok, where users under 14 are limited to 40 minutes of use per day, between 6am and 10pm.
“We will likely see more prescriptive requirements on platform functionality, as well as greater use of highly effective age verification to apply restrictions. This will also raise important data privacy questions around the use of children’s data, which platforms will need to consider alongside all the other changes.”