The Office of the eSafety Commissioner announced on Tuesday that age verification will be required for users accessing online pornography platforms in Australia.
eSafety Commissioner Julie Inman Grant introduced nine new industry codes aimed at shielding minors from what she described as “lawful but awful” content: material that is legal but potentially harmful, such as online pornography and sexualised AI companions.
“We’ve been concerned about these chatbots for a while now and have heard anecdotal reports of children—some as young as 10—spending up to five hours per day conversing, at times sexually, with AI companions,” Inman Grant said. “We know there has been a recent proliferation of these apps online and that many of them are free, accessible to children, and advertised on mainstream services, so these codes must include measures to protect children from them.”
Under the new rules, the companies behind adult content platforms must implement “appropriate age assurance measures.” Acceptable methods include identity checks, credit card verification, AI-powered biometric age estimation, and similar technologies.
Noncompliance could result in civil penalties running into the millions of dollars. Companies are also required to test and monitor the effectiveness of their age verification systems on an ongoing basis, according to reports from News.com.au and Crikey.