June 21, 2024 – Albany, NY: Governor Kathy Hochul signs S.7694A/A.8148A, the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, into law, requiring social media companies to restrict addictive feeds on their platforms for users under the age of 18.
Last Thursday, Governor Kathy Hochul signed two bills into law in New York State, both aimed at protecting children from online harm. They are among hundreds of bills that U.S. states have considered, and in some cases passed, over the past few years, including age-appropriate design codes, laws requiring parental approval for social media use, and laws regulating access to pornographic sites.
The focus of this article is the social-media-focused Stop Addictive Feeds Exploitation (SAFE) for Kids Act (S.7694A). The New York Child Data Protection Act (S.7695B) is a broader law that restricts the collection, use, and sharing of data of users of online services (or connected devices) who are minors between the ages of 13 and 18 (the law states that rules on user data for those under 13 are set by the federal COPPA). For these users, a short list of "strictly necessary" purposes is defined, and services must obtain "informed consent" (as defined in the bill) for any collection, use, or sharing of data beyond those purposes. Notably, the standard for establishing a user's age is actual knowledge, relying on the user's declaration or on signals from the user's device or browser (not yet a common standardized technique, though some commentators have proposed it as the preferred model for age assurance). The restrictions for minors also apply to services that are primarily targeted at minors.
What’s inside the SAFE for Kids Act?
The SAFE for Kids Act appears to take the same approach as some previous bills: rather than enacting an entire code that would impose a patchwork of risk assessments, specific safety features, and general liability, it focuses on one aspect of social media that lawmakers deem harmful—in this case, the addictive potential manifested in nighttime notifications and algorithmic feeds. Such feeds have long been criticized for serving harmful content to minors, but here (presumably with First Amendment considerations in mind) the law targets only their "addictive" potential.
Who are the minors?
Similar to the New York Child Data Protection Act, the standard for determining who is a minor (under 18 years of age under this law) is actual knowledge, except that platforms must apply the restrictions unless they have used "commercially reasonable and technically feasible methods" to determine that the user is not a minor. How "commercially reasonable and technically feasible methods" will be defined is one of the key questions the New York Attorney General's rulemaking must answer before the law's impact can be assessed.
Restrictions
Feeds that rank content based on user data (such as past engagement with content, or the user's demographics and known interests) are prohibited for minors. Social media companies may offer these users only chronological feeds, feeds ranked on non-personalized criteria, and simple search. (Some platforms, most notably TikTok, rely heavily on personalized algorithmic curation, but major services, at least in Europe, already offer non-personalized feed options in compliance with Article 38 of the EU Digital Services Act. Exceptions are made for algorithmic ranking used to reduce the visibility of problematic content, which in any case is unlikely to be tied to a specific user.) A second restriction for minor users prohibits sending notifications about feed activity between midnight and 6 a.m. Eastern time.
Parental consent
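To make the two restrictions concrete, here is a minimal sketch of the logic a platform might implement; all names, fields, and the simple scoring model are hypothetical illustrations, not anything specified by the law or by any platform:

```python
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class Post:
    author: str
    created_at: datetime
    engagement_score: float  # platform-computed; must not drive ranking for minors

def rank_feed(posts, user_is_minor):
    """Engagement-based ranking for adults; reverse-chronological
    (non-personalized) ordering for users known or presumed to be minors."""
    if user_is_minor:
        return sorted(posts, key=lambda p: p.created_at, reverse=True)
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def notification_allowed(user_is_minor, local_time):
    """Suppress feed-activity notifications to minors between
    midnight and 6 a.m. (Eastern time assumed for the check)."""
    if not user_is_minor:
        return True
    return not (time(0, 0) <= local_time < time(6, 0))
```

Note that the actual compliance question is upstream of this logic: absent "commercially reasonable and technically feasible" age checks, the minor-mode branch would be the default.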
Either or both of the above restrictions can be lifted with “verifiable parental consent.” As with age verification, the acceptable implementation of this is also left to definition by the New York State Attorney General’s Office.
Enforcement
The Attorney General can bring lawsuits on behalf of New Yorkers to obtain damages and restitution, to require the disgorgement or destruction of any illegally obtained data or of algorithms trained on that data, and to impose fines of up to $5,000 per violation. Individuals can file complaints about platforms with the Attorney General through its website.
Law and freedom of expression
Despite its focus on addictive mechanics rather than content, as discussed above, the SAFE for Kids Act may still run afoul of the First Amendment: platforms arguably have an expressive right to arrange content as they see fit (itself a matter of debate), and minor users have a right to consume that arrangement. While this law's impact does not appear as dramatic as that of the parental-consent laws recently enacted in Florida and Ohio, some of the key reasoning in the district court ruling against the Ohio law could apply here as well: that such a law is inherently bound up with expression and content, since it applies only to social media platforms, and that minors' access to content is constitutionally protected. New York may fare better than those laws by having tailored its statute relatively narrowly and less ambiguously. Whether these states can demonstrate a compelling state interest is discussed further below.
Operational challenges
The definition of "commercially reasonable and technically feasible methods" for age verification and of "verifiable parental consent" is left to the Attorney General's Office, following many other laws that leave regulators to make difficult trade-offs between accuracy, privacy, fairness, convenience, and cost in age verification. On age verification, the law itself specifies what the Attorney General must consider and requires that several acceptable methods be approved. Since the law takes effect 180 days after the Attorney General issues the rules, the complexity of this work could significantly delay implementation.
What’s the harm?
No doubt, social media platforms use notifications and algorithmic feeds to keep users (of all ages) engaged; after all, that is their basic business model. And some people (including minors) may use them "too much" by their own standards or those of their friends and family. But it is not at all clear that social media, much less these specific features, causes "addiction" under any solid definition, let alone a clinical one, or whether courts would deem preventing it a "compelling interest" justifying some restriction on freedom of expression.
There is certainly evidence, from academic studies and leaked internal research, that a simple reverse-chronological feed on Facebook and Instagram reduces usage of the platforms. Perhaps that is enough for supporters of the New York law. The question is where the line lies between addictive and simply attractive. Makers of consumer products and services all strive to be as attractive as possible, but the consumer protection rules that limit this usually apply where direct physical harm can be proven, which may not be the case here.
***
While the law will likely be subject to legal challenges, it could also be an opportunity to think creatively about how to rank news and recommendation feeds if not by user demographics and engagement history. Some platforms already incorporate quality scores, but how can this be further developed or enhanced? There may be ways to select from widely popular content to go beyond what one researcher described as the “lowest common denominator in human culture.” New algorithms may actually improve society, as companies entering the current race hope. Bluesky’s open approach to feed algorithms could also be a promising model for replacing platform-determined personalization with user choice.
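As a thought experiment on the "quality scores" idea above, a non-personalized ranking might blend a platform-wide quality signal with overall popularity, consulting no per-user data at all. The field names and weighting below are purely hypothetical:

```python
def rank_by_quality(posts, weight=0.7):
    """Non-personalized ranking sketch: blend a platform-wide quality
    score (e.g., source reliability, originality) with overall
    popularity. No demographics or engagement history is consulted,
    so every user sees the same ordering."""
    def score(p):
        return weight * p["quality"] + (1 - weight) * p["popularity"]
    return sorted(posts, key=score, reverse=True)

feed = rank_by_quality([
    {"id": 1, "quality": 0.9, "popularity": 0.2},  # high quality, niche
    {"id": 2, "quality": 0.3, "popularity": 1.0},  # viral, low quality
])
```

With the default weighting, the high-quality niche post outranks the viral one, illustrating how such a blend could push back against the "lowest common denominator" effect of pure popularity.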
Many of us find ourselves joylessly scrolling through social media despite counting as "engaged" by standard platform metrics. Perhaps this law will inspire platforms to develop feeds with more enduring appeal than engagement-based personalization currently delivers.