The New York Children’s Online Safety Act would require platforms to make children’s profiles private by default.
New York State Sen. Andrew Gounardes (D) is a driving force behind kids online safety laws. He sponsored the Stop Addictive Feeds Exploitation (SAFE) for Kids Act and the New York Child Data Protection Act, both signed by Governor Kathy Hochul earlier this year. But when parents talked to him about the rules, he says, they kept asking a question he hadn’t considered: “Does it cover Roblox?”
Like the vast majority of online regulation, New York’s rules were aimed at traditional social media companies like Meta, Snap, and TikTok. But there’s been increasing scrutiny of the massively popular and overwhelmingly child-focused social gaming platform, and Gounardes is introducing a new bill to target it.
The New York Children’s Online Safety Act (NYCOSA) regulates how minors can communicate on social networks, aiming to prevent strangers from contacting them. While the bill could apply to a vast range of online services with users under the age of 18, Gounardes told The Verge in an exclusive interview that the “seed” of the idea came from those Roblox questions. It’s a new moment of reckoning for a platform that’s flown under the radar until recently — possibly because many legislators barely realize it exists.
NYCOSA includes provisions that overlap with and build on a number of other state- and federal-level online regulation bills, including Gounardes’ earlier legislation. It would require any public or “semi-public” platform used by minors that lets users create profiles, post content, and interact with others to make minors’ profiles private by default, so strangers can’t view them. It would also bar users a minor isn’t connected with from tagging them in posts or sending them messages or digital currency.
The bill would also give parents more control over their children’s social media accounts. For kids under 13, sites would be required to let parents see and change privacy settings, approve all friend requests, and view connections. (Roblox is one of the few major platforms that allows children this young; Facebook, Instagram, TikTok, and most other networks set the minimum age at 13.) For all minors under 18, parents would have the right to approve financial transactions.
On top of that, covered platforms would have to implement “commercially reasonable and technically feasible age verification” and would be banned from using design tricks known as “dark patterns” to discourage using the features above.
Online age verification is a major regulatory flashpoint since the most bulletproof systems — like providing government-issued IDs — require collecting information that could fundamentally compromise privacy for children and adults alike. A longstanding Supreme Court precedent bars these intrusive methods, but the court is set to take up the issue again next year while considering a Texas anti-porn law.
What’s more, significant parts of the rule would be left up to New York’s attorney general to establish. The definition of a covered platform doesn’t have a size threshold or specify a precise kind of service — some elements, like letting users “create or post content that is viewable by other users” or “socially interact with each other,” could describe anything from a behemoth social network to a blog’s comment section.
The AG is also in charge of defining “reasonable” age verification. Gounardes says this shouldn’t require uploading a photo ID or storing large amounts of personal data. He says platforms already have strategies to infer users’ ages based on their activity, often for advertising purposes. And the bill asks the AG to consider the “size, financial resources, and technical capabilities” of a given platform. Even so, it means legislators would be passing a law that leaves wide latitude for how the attorney general interprets and enforces its rules.
Gounardes’ communications director, Billy Richling, tells The Verge in an email that the bill is not intended to cover news sites or blogs and says other definitions in the legislation make clear it’s focused on private communications like direct messages. “We’re really focused on ensuring unconnected accounts are not able to direct message under-18 accounts without a friend request being approved first,” Richling says. Likewise, he says, “a small app that’s really unable to conduct any age assurance” is unlikely to be a target, but “if we used an explicit size threshold, we’d risk arbitrarily exempting an offending app just below the cutoff.”
The broad definition — which avoids specifying particular kinds of covered service — is intended to make the rule “content-neutral” and avoid triggering strict scrutiny under the First Amendment in court. Numerous state-level laws regulating internet platforms have been the target of lawsuits, and several have been blocked by courts as likely unconstitutional. That includes the California Age-Appropriate Design Code Act, a high-profile law with provisions similar to NYCOSA’s, particularly a requirement that sites estimate users’ ages.
Gounardes based the legislation on the Federal Trade Commission’s 2022 settlement with Fortnite maker Epic Games over alleged violations of the Children’s Online Privacy Protection Act (COPPA) and alleged use of dark patterns to get kids to buy things on the platform. If enacted, NYCOSA would let the New York attorney general seek $5,000 in damages per violation.
While CEOs from Meta, Google, Snap, and X have been hauled before Congress to face angry lawmakers and parents, Roblox has so far mostly escaped the limelight, despite its large user base of kids and teens. Gounardes says that for lawmakers like himself, that may be due to a lack of familiarity with the service. At 40 years old, he says, “my initial understanding of the social media landscape is informed by what I remember when I was using it”: the early days of Facebook.
Plus, he says, his concept of video games was shaped by consoles like the Nintendo 64, not internet-connected games that double as hugely open online social spaces. Parents have been trained to worry about the dangers of kids doomscrolling, while playing video games can look quite different. “It’s not the same as seeing someone in their room just scrolling on Instagram for hours on end,” says Gounardes. “That’s a very different visual of what harms could be befalling them. That’s kind of why I think this has gone under the radar.”
But Gounardes says that lack of familiarity can have “serious repercussions.” He recalled a Bloomberg article from July describing the “Sisyphean task” of keeping the platform safe from child predators.
Facing increasing regulatory pressure and the threat of federal legislation, Roblox and others have already taken some of the steps Gounardes is proposing. Roblox announced earlier this week that it would prohibit kids under the age of 13 from sending private messages outside of games or experiences and require parental permission to send messages within games. Meta’s Instagram recently announced that all accounts for users under 18 would become default-private “Teen Accounts,” with more restrictive settings.
“These developments are only the result of the public pressure that has been growing from legislative action, from regulatory enforcement and just grassroots people power,” says Gounardes. “Companies are feeling the heat. They are smart to try to get ahead of this, but those measures are in and of themselves insufficient, and we should not be letting the wolf guard the hen house.”