The Demise of the California Age-Appropriate Design Code Act

Children’s online safety and privacy have been a major concern for regulators, and it was against this backdrop that California passed the California Age-Appropriate Design Code Act (CAADCA).  CAADCA, signed into law in 2022, was far reaching, swiftly opposed, and ultimately enjoined in its entirety.  NetChoice, a trade association of technology companies that challenges online speech regulation on behalf of its members, raised multiple First Amendment challenges.  Similar arguments have been successful in overturning age verification legislation in Texas, Arkansas, Mississippi, and Ohio, and are currently being used to challenge legislation in Georgia, Louisiana, Maryland, Tennessee, and Utah.

CAADCA Provisions

CAADCA required all businesses that provide online products and services that children access or “are likely to access” to adhere to specific requirements, including to: create a Data Protection Impact Assessment (DPIA) report describing whether the product could harm children or lead children to harmful activity; estimate the age of child users with a reasonable level of certainty or provide high default privacy settings to all users; provide information about data usage; and enforce published terms of service and community standards.  Covered businesses were prohibited from using the personal information of any child in a way materially detrimental to the physical health, mental health, or well-being of the child; collecting, selling, sharing, or retaining personal information not necessary to provide an online service, product, or feature; collecting or sharing geolocation information; or using dark patterns to encourage children to provide personal information.

NetChoice Lawsuits

NetChoice, LLC v. Bonta, 692 F. Supp. 3d 924 (N.D. Cal. 2023)

On December 14, 2022, NetChoice filed a lawsuit in the Northern District of California seeking to enjoin enforcement of the legislation, which was scheduled to take effect in July 2024.  NetChoice argued the legislation violated the First Amendment of the U.S. Constitution because it was an unlawful prior restraint on covered businesses’ speech, was facially overbroad and vague, and failed to satisfy strict scrutiny for regulating protected speech.

The Court focused its analysis on the dispositive question of whether the State could meet the requisite level of scrutiny by analyzing (1) whether the act regulated protected speech, (2) what level of scrutiny was required, and (3) whether the act met that burden.  

The Court inquired whether the law regulated protected speech by restricting speech based on its message, idea, subject matter, or content. It determined that CAADCA sought to regulate protected speech for four reasons.  First, the Court highlighted that the “creation and dissemination of information are speech,” and when a company possesses information that is subject to restrictions regarding the “availability and use” of that information, the statute in question regulates protected speech.  Second, the DPIA report contemplated by the statute required businesses to express their ideas about their business models, and thus compelled speech.  Third, the law required covered businesses to provide information to users about data usage, again compelling speech and turning businesses into censors for the government.  And finally, the requirement to estimate age had the likely effect of preventing both adults and children from accessing certain content, because companies choosing not to utilize age verification would self-censor and apply the standard of what content is appropriate for children to everyone.

The Court had some difficulty determining which level of scrutiny was appropriate.  Specifically, the Court was uncertain whether the Act regulated purely non-commercial speech, or alternatively non-commercial speech inextricably intertwined with commercial speech.  NetChoice failed to meet its burden to demonstrate strict scrutiny applied, so the Court applied the less rigorous standard for commercial speech.[1]

Even under the lower standard, CAADCA failed because the provisions did not directly advance the government interest of “protecting the physical and psychological well-being of minors.”   Specifically, the confidentiality of the DPIA report and the requirement to assess and create a plan to reduce risk due to data management practices did not further the goal of protecting the well-being of children. Furthermore, requiring age estimation would not provide greater data and privacy protections for children because accurately determining a user’s age would necessitate the collection of more sensitive personal information and companies choosing not to estimate age would be forced to “reduce the adult population . . . to reading only what is fit for children,” which the Court reasoned would be substantially excessive.

The Court also noted CAADCA went far beyond the existing Children’s Online Privacy Protection Act (COPPA).  COPPA limits collection of user data by operators of websites and services “directed to children,” while CAADCA covered businesses children are “likely to access.”  Where COPPA protects children under the age of 13, CAADCA would have applied to children under the age of 18.  And while COPPA gives parents authority to make decisions about the use of their children's personal information, CAADCA gave that authority to online providers.

The Result: The Court found the offending provisions were not functionally severable and granted the injunction as to all provisions of CAADCA.

NetChoice, LLC v. Bonta, 113 F.4th 1101 (9th Cir. 2024)

The State quickly appealed the Northern District’s injunction to the Ninth Circuit Court of Appeals. Before the Ninth Circuit could issue an opinion, the Supreme Court decided Moody v. NetChoice[2] (a separate challenge to a Florida law seeking to make platforms liable for content moderation) and reemphasized the two-step process for bringing a facial First Amendment challenge: (1) assess the state law’s scope, then decide which of the law’s applications violate the First Amendment, and (2) measure the unconstitutional applications against the constitutional ones.

Following Moody, the Ninth Circuit determined that strict scrutiny was required.  The Court reasoned that the DPIA provision both compelled speech with a particular message about controversial issues and deputized private actors into censoring speech based on its content.  Analyzing the DPIA provision under strict scrutiny, the Ninth Circuit determined that the means did not address the harm because what information is harmful is subjective and varies per child.  Additionally, the State chose to delegate the definition of what is harmful to businesses, making it unlikely CAADCA would address the harm identified in the legislation.  Finally, the required confidentiality of the DPIA report would undercut the goal of transparency to the public.

The Ninth Circuit remanded the case to the lower court to conduct a proper facial analysis under a strict scrutiny standard.  It additionally reversed the lower court’s ruling enjoining the provisions other than the DPIA report requirement, as the record was not sufficient to determine whether the remaining provisions would fail strict scrutiny.

NetChoice, LLC v. Bonta, No. 22-CV-08861-BLF, 2025 WL 807961 (N.D. Cal. Mar. 13, 2025)

On remand, NetChoice brought both facial and as-applied challenges to each provision of the statute and requested that the Court enjoin the statute in its entirety.  It also argued the coverage definition rendered the entire act unconstitutional.  Limiting coverage to businesses providing products or services likely to be accessed by children is a regulation targeted at specific subject matter, it argued, and is therefore content-based even if it does not discriminate among viewpoints within the subject matter.

The Court agreed with NetChoice, finding the coverage definition made the Act content-based in every application.  Businesses that provide online services, products, or features “likely to be accessed by children” would be subject to heightened regulation, while other businesses are not.  The State did not satisfy strict scrutiny because there was no evidence the harms were real and could be addressed by CAADCA, that the coverage was narrowly tailored, or that this was the least restrictive means to achieve the goal.  Judge Freeman also scolded the State for failing to protect children by enforcing other laws like COPPA: “a law is not narrowly tailored when the State's interest could be served by vigorous enforcement of existing, less restrictive regulations.”  Because the coverage definition applied to all applications, there was no permissible application of CAADCA.

As to the remaining provisions, the Court found NetChoice was not likely to prevail in challenging the information use restrictions and prohibitions on using dark patterns.  However, the Court found the policy enforcement requirement had no legitimate application because CAADCA required a business to enforce its own published content policies and standards, limiting the exercise of editorial judgment, which is protected by the First Amendment (“[a]n entity exercising editorial discretion in the selection and presentation of content is engaged in speech activity.” Moody, 603 U.S. at 731, 144 S.Ct. 2383).  The age estimation requirement met the same fate, as “[e]very covered business will be forced to choose between intruding into user privacy, thereby chilling publication of and access to protected speech, or publishing only child-appropriate content, thereby restricting access to protected speech for users of all ages.”

The Result:  As the offending provisions were not volitionally[3] severable, the Court preliminarily enjoined the entire statute.

What’s Next

The State filed its appeal on April 14, 2025, with the opening brief due June 10, 2025.  While it remains unclear which specific issues will be raised on appeal, one trend is unmistakable: plaintiffs are increasingly successful in challenging state-level age verification laws, and these legal battles are only intensifying.  The First Amendment has emerged as the primary tool in these challenges, and the legal landscape continues to evolve rapidly as new circuit court decisions are issued each month.

A pivotal moment is approaching with the Supreme Court expected to issue its opinion in Free Speech Coalition v. Paxton in the coming weeks. At the heart of that case is whether Texas’s age verification law, H.B. 1181, should be evaluated under strict scrutiny or the more lenient rational basis review.  The Fifth Circuit previously applied rational basis review, a decision now under scrutiny by the high court.  The outcome will have sweeping implications for how states can craft and defend age verification laws under the First Amendment.

Despite the uncertainty surrounding these legal developments, companies must remain vigilant.  COPPA, a longstanding federal statute, continues to require verifiable parental consent before collecting personal information from children under the age of 13.  This obligation remains in full force, regardless of the shifting judicial landscape.

As these legal battles unfold, the Supreme Court’s forthcoming decision is poised to shape the future of age verification laws nationwide—potentially redefining the balance between protecting minors and preserving free speech online.

 

[1] For that standard, the Court asks whether: (1) the commercial speech is misleading or related to illegal activity; (2) the asserted governmental interest is substantial; (3) the regulation directly advances the governmental interest; and (4) the regulation is not more extensive than is necessary to serve that interest. NetChoice, LLC v. Bonta, 692 F. Supp. 3d 924, 946 (N.D. Cal. 2023).

[2] Moody v. NetChoice, LLC, 603 U.S. 707, 724, 144 S. Ct. 2383, 2397, 219 L. Ed. 2d 1075 (2024).  

[3] A statutory provision “is volitionally separable if it was not of critical importance to the measure's enactment.” NetChoice, LLC v. Bonta, No. 22-CV-08861-BLF, 2025 WL 807961, at *27 (N.D. Cal. Mar. 13, 2025).