Nebraska and Vermont are the latest U.S. states to join the growing landscape of children’s online safety laws that have swelled in state chambers in recent years. On May 30, 2025, Nebraska Governor Jim Pillen signed the Age-Appropriate Online Design Code Act (the Nebraska AADC). On June 12, 2025, Vermont Governor Phil Scott signed the Vermont Age-Appropriate Design Code Act (the Vermont AADC). In doing so, Nebraska and Vermont join California and Maryland, which in 2022 and 2024, respectively, enacted age-appropriate design code laws of their own. Notably, the ongoing legal challenges[1] to the California and Maryland AADCs do not appear to have dissuaded state legislators from enacting AADC-style and other children’s online safety laws. The Nebraska AADC takes effect January 1, 2026 (though the state Attorney General (AG) must wait until July 1, 2026, to seek civil penalties). The Vermont AADC takes effect January 1, 2027.
The provisions and key takeaways of the two AADC laws are summarized below.
Nebraska’s Age-Appropriate Online Design Code Act
Scope
The Nebraska AADC sets forth obligations, restrictions, and prohibitions that “covered online services” must abide by when the covered online service has “actual knowledge” that particular users are minors (under 18) or children (under 13). In this respect, the Nebraska AADC is narrower than the California and Maryland versions, which apply to online services likely to be accessed by minors.
The Nebraska AADC defines “actual knowledge” as “all information and inferences known to the covered online service relating to the age of the individual,” which includes self-identified age and “any age the covered online service has attributed or associated with the individual for any purpose, including marketing, advertising, or product development” (emphasis added). The definition specifies that if the covered online service’s classification of an individual for purposes of marketing or advertising is inconsistent with the individual’s self-identified age, the covered online service must disregard self-identified age for purposes of compliance with the law. This is a significant departure from the knowledge standards used in most state comprehensive privacy laws and the Children’s Online Privacy Protection Act (COPPA), which generally do not adopt constructive knowledge standards for age.
A “covered online service” is defined as an entity that provides an online service that:
- conducts business in Nebraska;
- determines the purposes and means of processing consumers’ personal data;
- has an annual gross revenue greater than $25 million;
- annually buys, receives, sells, or shares the personal data of 50,000 or more consumers, households, or devices (alone or in combination with affiliates); and
- derives at least 50 percent of its annual revenue from the sale or sharing of consumers’ personal data.
This definition is significantly narrower than the coverage thresholds used in the California and Maryland AADCs, which use an “or” standard for meeting covered-business qualifications.
Nebraska’s definition contains a narrow exception for online services with actual knowledge that fewer than two percent of their users are minors. It also contains exemptions for “personal data subject to a statute or regulation that is controlled by a covered online service that is required to comply with” various laws, including the GLBA and HIPAA.
Tools for Covered Design Features
If a covered online service offers a “covered design feature,” meaning any feature or component that encourages or increases the frequency, time spent, or activity of a user on the online service, it must comply with various requirements. The law enumerates certain examples of these covered design features, which include: 1) infinite scroll; 2) rewards or incentives for frequency of visits or time spent; 3) notifications or push alerts; 4) in-game purchases; and 5) appearance-altering filters.
The Nebraska AADC requires that covered online services offer minors tools that accomplish the following with respect to covered design features: limiting communication with other users; preventing unauthorized access to their personal data; opting out of “unnecessary” covered design features; controlling personalized recommendation systems (by allowing the minor to opt into a chronological content feed or to prevent certain categories of content from being recommended); opting out of or placing limits on in-game purchases and transactions; and restricting the sharing of precise geolocation information and providing notice when precise geolocation information is tracked. Covered online services must establish default settings for these safeguards at the level or option that provides the highest level of protection available for the safety of the covered minor.
Covered online services are also required to provide parents with certain tools “to help parents protect and support” minors using the covered design features; these tools must be turned on by default for users known to be “children” (under 13). They include the ability for parents to view and control their child’s account settings, monitor the total time spent on the service, and set reasonable limits on usage, including the ability to restrict access during school hours and at night. For “minors” (under 18), parents must also be provided tools to restrict purchases and financial transactions. Further, while any of these tools are in effect, the covered service must notify the minor and explain which settings have been applied.
Data Minimization
Notably, covered online services may only collect and use the minimum amount of a minor’s personal data necessary to provide the specific elements of the service with which the minor is knowingly engaged, only use that data for the reason it was collected, and only retain such data for as long as necessary to provide those specific elements.
Additional Requirements
In addition, all covered online services must provide covered minors with options to limit the amount of time they spend on the service. They must also provide an “obvious signal” to a covered minor when precise geolocation information is being collected or used or when parental monitoring is being used.
Covered online services are prohibited from using personal data collected for age verification for other purposes, facilitating targeted advertising to minors, and profiling minors in ways that are not necessary for service provision. Notifications and push alerts to a covered minor are banned between 10 p.m. and 9 a.m. and between 8 a.m. and 4 p.m. on weekdays during the school year, in the minor’s local time zone. Covered online services must also establish mechanisms for covered minors and parents to report harms on the services, are prohibited from facilitating advertisements for prohibited products, such as tobacco and alcohol, to covered minors, and are prohibited from using dark patterns. Finally, covered online services must designate at least one officer to be responsible for compliance with the law.
Enforcement
The Nebraska AADC takes effect on January 1, 2026. However, the state AG must wait until July 1, 2026, to recover civil penalties. A violation of the Nebraska AADC constitutes a separate deceptive trade practice violation under Nebraska’s Uniform Deceptive Trade Practices Act. The maximum civil penalty for each violation is set at $50,000.
Vermont Age-Appropriate Design Code Act
The Vermont AADC applies to covered businesses that provide online services, products, or features that minors are reasonably likely to access. It is substantively similar to the Nebraska AADC with regard to many default setting requirements (e.g., restricting communication with minor accounts by unconnected users and push notifications) and prohibited data and design practices (e.g., data minimization and dark patterns). But it goes further than the Nebraska AADC in that it establishes a “minimum duty of care” for covered businesses to avoid using a minor’s personal data or design features in ways that will result in reasonably foreseeable “emotional distress,” reasonably foreseeable “compulsive use,” or discrimination against the minor. In contrast with the Nebraska AADC, the Vermont AADC does not contain specific requirements for parental tools. Further, the requirements for minor tools and default settings apply only to covered businesses that operate social media platforms. In addition, the Vermont AADC contains a number of unique provisions, described below.
Scope
Subject to enumerated exemptions, a “covered business” is defined as an entity that provides an online service that:
- conducts business in Vermont;
- determines the purposes and means of the processing of consumers’ personal data;
- collects consumers’ personal data or has consumers’ personal data collected on its behalf by processors;
- generates a majority of its annual revenue from “online services” (not defined); and
- offers online products, services, or features that are “reasonably likely to be accessed by a minor.”
The statute prescribes specific indicia for determining whether an online service, product, or feature is reasonably likely to be accessed by a covered minor. These indicia are:
- if the service is “directed to children” as defined by COPPA and the FTC’s COPPA Rule;
- if the service, product, or feature is determined—based on competent and reliable evidence regarding audience composition—to be “routinely accessed” by an audience that is composed of at least two percent minors aged 2-17;
- if internal research indicates a similar composition of minors aged 2-17; or
- if the business “knew or should have known” that at least two percent of the audience of the online service, product, or feature includes minors aged 2-17.
Notably, one of the threshold indicators for determining whether an online service is “reasonably likely to be accessed by a minor” is set at a very low bar: just two percent of the audience being aged 2-17, including where a business “should have known” that at least two percent of the audience comprises minors aged 2-17. This could mean that many online services, regardless of their primary or intended audiences, could suddenly find themselves covered by this law because a de minimis fraction of their visitors may be young users, such as teens.
Duty of Care
The Vermont AADC establishes a “minimum duty of care” for any covered business that processes a “covered minor’s data”[2] in any capacity. A “covered minor” is defined as “a consumer who a covered business actually knows is a minor or labels as a minor pursuant to age assurance methods in rules adopted by the Attorney General.” As discussed further below, the Vermont AADC requires the AG to adopt rules establishing methods for identifying when a user is a minor (but the rules presumably should not require covered businesses to take steps to verify users’ ages, as the Vermont AADC does not require covered businesses to do so).
Meeting the minimum duty of care entails ensuring that the use of the minor’s personal data and the design of any online service, product, or feature will not result in the following harms:
- reasonably foreseeable “emotional distress”;
- reasonably foreseeable “compulsive use,” which is broadly defined in the statute to mean “the repetitive use of a covered business’s service that materially disrupts one or more major life activities of a minor, including sleeping, eating, learning, reading, concentrating, communicating, or working”; or
- discrimination against a covered minor based upon race, ethnicity, sex, disability, sexual orientation, gender identity, gender expression, religion, or national origin.
Minor Tools
For social media platforms, the Vermont AADC also requires covered businesses to provide tools for covered minors to request the deletion or de-publication of their social media accounts.
Prohibited Data and Design Practices
The Vermont AADC also outlines certain prohibited data and design practices regarding minors. Covered businesses cannot collect, sell, share, or retain personal data of a minor that is unnecessary for the service with which the minor is actively and knowingly engaged, use previously collected data for any secondary purposes (except as required by law), or allow anyone, including parents or guardians, to monitor or track a minor’s online activity or location without providing a conspicuous signal to the minor. Further, covered businesses cannot use a minor’s personal data to select, recommend, or prioritize media for the minor, unless doing so aligns with the minor’s i) express requests to receive such content, similar content, or categories of content, ii) user-selected privacy or accessibility settings, or iii) search query, provided the search query is only used to select and prioritize media in response to that search. Similar to the Nebraska AADC, the Vermont AADC prohibits sending push notifications to minors between midnight and 6 a.m.
Transparency
The Vermont AADC imposes a number of prescriptive disclosure requirements on covered businesses, including, for every feature of the service that uses the personal data of covered minors, descriptions of the purpose of the feature, the personal data collected by the feature, how that data is used or disclosed, the identity of third parties or processors that receive the data, the purposes for the disclosures, and how long the personal data is retained.
Further, covered businesses that use an “algorithmic recommendation system,” defined as “a system that uses an algorithm to select, filter, and arrange media on a covered business’s website for the purpose of selecting, recommending, or prioritizing media for a user,” must also disclose the purpose of each algorithmic recommendation system and details about the inputs used by these systems, including how they are measured, their use of minors’ personal data, their influence on recommendations, and how the inputs are weighted relative to other algorithm inputs.
Age Assurance
The Vermont AADC requires the state AG to adopt rules establishing “commercially reasonable and technically feasible methods” for determining whether a user is a covered minor, including review processes that allow users to appeal their age designations and additional privacy protections for age assurance data. The law itself also outlines several privacy protections for data that covered businesses and processors collect during the age assurance process: it requires these entities to collect only personal data that is “strictly necessary” for age assurance, to use that data solely for age range determination purposes, and to delete it immediately after determining whether the user is a covered minor. Covered businesses conducting age assurance are also prohibited from using age assurance data for secondary purposes, combining it with other personal data that is not necessary to determine age range, or disclosing it to third parties not acting as processors.
Rulemakings
In addition to the Age Assurance rulemaking described above, the AG must also adopt rules prohibiting covered businesses from engaging in data processing or design practices that lead to compulsive use or that subvert or impair user autonomy, decision-making, or choice during the use of an online service, product, or feature of the covered business.
The AG may initiate its mandatory rulemaking procedure as early as July 1, 2025, with final rules required to be adopted by the law’s January 1, 2027, effective date.
Key Takeaways
The Nebraska and Vermont AADCs are the latest additions to the patchwork of state regulations focused on minors’ safety online. Notably, in an effort to avoid the type of constitutional defects that sank the California AADC, the Nebraska and Vermont AADCs are narrower. For example, Nebraska does not require covered businesses to assess and disclose whether content may harm minors, and the Vermont AADC includes language suggesting that there is no duty to mitigate or prevent harm stemming from user-generated content.[3] Further, the Nebraska and Vermont AADCs lack the prescriptive data protection assessment requirements that are included in the California AADC and Maryland AADC, and which have been challenged as unconstitutional.
Given state and federal regulators’ intense focus in this area, companies should expect to see increased regulatory scrutiny and enforcement on issues pertaining to child and teen online privacy and safety. While we think that an eventual legal challenge to these laws is likely, it remains to be seen how far a court will go in limiting or striking these laws, which were passed with overwhelming bipartisan lawmaker support. As such, companies should pay close attention to the requirements laid out in these laws as further AADC-copycat legislation may proliferate.
Wilson Sonsini Goodrich & Rosati routinely helps companies navigate complex privacy and data security issues and specializes in issues pertaining to children and teen privacy and online safety. We will continue to monitor developments at the state, national, and international level in order to assist companies with compliance. For more information, please contact Tracy Shapiro, Chris Olsen, Doo Lee, or another member of the firm’s Data, Privacy, and Cybersecurity practice.
[1] In March 2025, the U.S. District Court for the Northern District of California issued a preliminary injunction blocking enforcement of California’s Age-Appropriate Design Code on First Amendment grounds, finding that the CA AADC is content-based and likely fails strict scrutiny. In April 2025, the California Attorney General filed an appeal. The Maryland Age-Appropriate Design Code faces a similar legal challenge: in February 2025, NetChoice filed a lawsuit challenging the law. Both cases remain pending.
[2] This is a potentially broad threshold given that the duty applies to the processing of a covered minor’s “data,” rather than a covered minor’s “personal data.” It is unclear if the legislature intended to cover non-personal data such as publicly available information and aggregated data, or if this is an instance of imprecise drafting.
[3] Vermont’s duty of care section contains an express disclaimer that “the content of the media viewed by a covered minor shall not establish emotional distress, compulsive use, or discrimination” and that “[n]othing in this section shall be construed to require a covered business to prevent or preclude a covered minor from accessing or viewing any piece of media or category of media.”