Australia Holds Gaming Giants Accountable for Child Safety
Australia's internet regulator is demanding online gaming platforms like Roblox and Minecraft disclose how they safeguard children from grooming and radicalization. The eSafety Commissioner has issued transparency notices, requiring platforms to detail safety systems and face daily penalties for non-compliance. The move comes amid rising scrutiny of online gaming safety.
Australia's internet regulator has demanded that online gaming platforms, including Roblox and Microsoft's Minecraft, explain how they protect children from grooming by sexual predators and from potential radicalization, reflecting growing concern over child safety on popular gaming platforms.
The eSafety Commissioner has issued legally binding transparency notices to platforms including Fortnite and Steam, requiring details of their safety protocols and staffing. Non-compliance could attract penalties of up to A$825,000 per day. eSafety Commissioner Julie Inman Grant warned that gaming services with encrypted messaging can serve as first points of contact between children and offenders.
The scrutiny follows a rise in reports of predators using gaming environments to lure children onto private messaging services. Inman Grant described the platforms as significant social spaces for minors, warning of heightened risks of criminal contact and radicalization. Roblox, meanwhile, faces more than 140 lawsuits in the U.S. for allegedly facilitating child exploitation and is rolling out changes to strengthen child safety.