NEW YORK — State officials are moving forward with plans to require age verification on social media platforms as part of new child protection measures. The proposal, unveiled Monday by the office of Attorney General Letitia James, stems from the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, passed by lawmakers during the 2023–2024 legislative session.
“Children and teenagers are struggling with high rates of anxiety and depression because of addictive features on social media platforms,” James said in a statement. “The proposed rules released by my office today will help us tackle the youth mental health crisis and make social media safer for kids and families.”
The draft regulations call for age assurance systems, a range of tools designed to determine users’ ages without necessarily requiring government-issued IDs. These measures have been endorsed by the Age Verification Providers Association (AVPA), a trade group representing companies that develop such technology.
“As an organization that represents more than 30 companies that provide privacy-preserving age assurance technology, we are certain the preliminary rules issued by Attorney General James establish a meaningful but flexible standard that online platforms can meet with existing solutions,” said AVPA executive director Iain Corby, praising the proposal as both practical and economical.
He added that AVPA hopes other U.S. states will follow New York’s lead.
The AVPA has been a vocal proponent of stricter online age checks, often clashing with free expression groups and adult entertainment stakeholders who argue that such measures threaten privacy and First Amendment rights.
If adopted, the regulations would directly affect the country’s largest social media platforms, including X, Instagram, TikTok, Reddit, and Snapchat — networks that are also widely used by adult content creators for marketing and audience engagement.