Australia Holds Gaming Giants Accountable for Child Safety

Australia's internet regulator is demanding online gaming platforms like Roblox and Minecraft disclose how they safeguard children from grooming and radicalization. The eSafety Commissioner has issued transparency notices, requiring platforms to detail safety systems and face daily penalties for non-compliance. The move comes amid rising scrutiny of online gaming safety.


The Australian internet regulator has demanded that online gaming platforms, including Roblox and Microsoft's Minecraft, clarify their child protection measures against grooming by sexual predators and potential radicalization. This development highlights growing concerns over child safety on popular gaming platforms.

The eSafety Commissioner has issued legally binding transparency notices to platforms including Fortnite and Steam, requiring details on their safety protocols and safety staffing. Non-compliance could result in penalties of up to A$825,000 per day. Commissioner Julie Inman Grant emphasized that gaming services with encrypted messaging can serve as initial points of contact between children and offenders.

This scrutiny follows a rise in incidents in which predators use gaming environments to lure children into private messaging services. Inman Grant described the platforms as significant social spaces for minors and warned of heightened risks of criminal contact and radicalization. Meanwhile, Roblox faces more than 140 lawsuits in the U.S. alleging that it facilitated child exploitation, and the company is implementing changes to strengthen child safety.
