UK Pushes for Stricter Social Media Age Checks to Protect Children
Britain's regulators urge social media giants to enforce stricter age checks to protect children online. With growing concerns over harmful content, platforms like Facebook, YouTube, and TikTok face pressure to demonstrate improved safety measures. Failure to comply could result in substantial fines from Ofcom and the ICO.
Britain's media and privacy regulators have renewed calls for major social media platforms to strengthen measures that keep minors from accessing their services. The move comes amid growing concern that companies are not adequately enforcing minimum age rules, despite the known risks of children encountering harmful content online.
The UK is considering introducing stricter regulations akin to those in Australia, which would block under-16 users from social media platforms. Both Ofcom and the Information Commissioner's Office (ICO) have voiced worries about the exposure of minors to potentially addictive content via algorithmic feeds and are demanding that companies make children's safety a priority.
Ahead of the next phase of implementing Britain's Online Safety Act, Ofcom has told companies including Meta-owned Facebook and Instagram, Roblox, TikTok, YouTube, and Snapchat to show by April 30 how they plan to enforce age checks and safeguard young users. The ICO also issued statements urging these platforms to adopt modern age-verification technology, saying there is no excuse not to do so given the tools available.