On April 19, 2022, BBB National Programs' (BBB NP) Center for Industry Self-Regulation launched the TeenAge Privacy Program (TAPP) Roadmap, a new operational framework to help companies develop digital products and services attuned to the privacy risks facing teenage consumers. In the United States, children 12 and under are protected by the Children's Online Privacy Protection Act (COPPA). Once these children become teenagers, they age out of COPPA's protections and, with limited exceptions, are treated as adults online. Yet a growing body of research indicates that teenage consumers are uniquely affected by privacy risks, with resulting harms ranging from cyberbullying to platform addiction to amplified insecurities.1 Regulators are increasingly interested in investigating these harms. For instance, in a widely publicized example, a coalition of state attorneys general recently opened an investigation into Instagram following news reports of a whistleblower's allegations that Facebook's privacy practices harmed teenage users. Despite increased public and regulatory scrutiny, no federal law has been enacted to provide companies with guidance on these issues. While it is not legally binding, the TAPP Roadmap aims to help fill this gap by providing organizations with concrete operational considerations and best practices for addressing teen privacy risks.
These best practices are likely to be particularly relevant for technology companies that attract large teenage audiences, such as certain social media or gaming platforms. For example, a social media platform that is popular with teens may consider incorporating guardrails into its advertising and content practices to prevent teens from being targeted based on insecurities, or served age-inappropriate or inflammatory content. Companies that wish to align with the TAPP Roadmap may need to invest substantial technological and workforce resources to implement the BBB NP's recommended best practices around content moderation and abuse prevention. More broadly, the TAPP Roadmap asks companies to evaluate teenagers as a standalone consumer group with distinct and evolving privacy needs. Again, while these suggestions are best practices rather than legally enforceable mandates, they may help companies interested in incorporating privacy-by-design principles into new products or services. Moreover, as calls for teen-focused privacy legislation grow stronger,2 it is very possible that the BBB NP's framework will inform future legislation. In short, companies have all the more reason to consider proactive measures to protect their teenage audiences.
The full text of the TAPP Roadmap is available here. We provide an overview of some of the best practices identified in the Roadmap below:
Advertising to teens:
- Obtain opt-in consent before engaging in behavioral advertising to known teens, and/or limit the purpose of collection to contextual advertising. At the time of obtaining opt-in consent, provide conspicuous notice that targeted ads based on information collected from teen users may be shown to the teen user across different sites or devices they use.
- Avoid targeting content to teens using a single criterion that could be especially sensitive to teens or amplify existing insecurities (e.g., body odor, hair loss, weight). Consider that sensitivities may be specific to age groups.
- Allow teen users (or all users) to see what information was used to target them with ads.
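Taken together, the advertising practices above gate behavioral targeting on affirmative teen consent, falling back to contextual advertising otherwise. A rough, hypothetical sketch (the function and mode names are ours, not the Roadmap's):

```python
def ad_targeting_mode(is_known_teen: bool, behavioral_opt_in: bool) -> str:
    """Return which ad mode a hypothetical platform may use for this user.

    Known teens receive only contextual ads unless they have affirmatively
    opted in to behavioral advertising after conspicuous notice.
    """
    if is_known_teen and not behavioral_opt_in:
        return "contextual"
    return "behavioral"
```

For example, a known teen without opt-in would be served contextual ads only, while an adult account is unaffected by the teen-specific gate.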
Precise geolocation:
- By default, do not collect precise geolocation data unless the user opts in following a clear, up-front disclosure.
- Turn off precise geolocation collection and use by default after a period of inactivity or at the end of the session.
- Limit precise geolocation collection and use to clearly disclosed purposes.
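The geolocation items above describe a privacy-by-default configuration: collection is off until an opt-in, and switches back off when the session ends. A minimal sketch of what such defaults might look like, assuming a hypothetical `GeoSettings` object and a purely illustrative 30-minute inactivity timeout (the Roadmap does not prescribe specific values):

```python
import time
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class GeoSettings:
    # Privacy by default: precise geolocation is off until the user
    # opts in through a clear, up-front disclosure.
    precise_geolocation_opt_in: bool = False
    # Collection stops when the session goes inactive; 30 minutes is
    # an illustrative value, not a Roadmap-specified one.
    inactivity_timeout_s: int = 30 * 60
    last_activity_ts: float = field(default_factory=time.time)

    def may_collect_precise_location(self, now: Optional[float] = None) -> bool:
        """Collect only if the user opted in AND the session is still active."""
        now = time.time() if now is None else now
        session_active = (now - self.last_activity_ts) < self.inactivity_timeout_s
        return self.precise_geolocation_opt_in and session_active
```

Under this sketch a fresh `GeoSettings()` never permits collection, and even an opted-in account stops being collected from once the inactivity window lapses.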
Content targeting:
- Do not target salacious, incendiary, or highly polarizing content, such as political topics, to teens. Avoid content that could be especially sensitive to teens or amplify existing insecurities.
- Consider implementing NSFW filters, or 18+ content filters. For U.S. audiences, consider Common Sense Media guides for appropriateness.
- Provide information to teen users to explain why they are seeing specific content, for example due to the content they “Like” or the types of content they engage with for longer periods of time. Empower users to adjust their preferences over time.
User controls and safety mechanisms:
- Allow teen users to flag and remove unwanted reactions to their own user-generated content, and give teen users control over which users can contact them directly in areas where direct messaging is possible.
- Provide the option to use safety mechanisms anonymously to avoid retribution from other users or ostracization or reputational harm within peer groups. Safety mechanisms may include the functionality to block, mute, or pause other users, or to filter keywords or reduce frequency of certain content.
- Provide teen users with fine-tuned audience controls and the ability to limit the visibility of their own content.
- Provide user-friendly technical controls to empower users to flag and report harmful or illegal content or conduct.
Content moderation and abuse prevention:
- Implement technical features to monitor for inappropriate connections (“predator detection”) and harmful content (e.g., adult content, hate speech, drug use). Automate suppression of harmful content.
- Create and adhere to internal policies for suspending and removing users based on accumulated strikes or extreme policy violations.
- Implement measures to prevent banned users from opening new accounts.
- Consider mechanisms to report certain content types (e.g., CSAM, credible threats, violence, and self-harm) to relevant law enforcement.
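The strike-based suspension and removal policy described above could be modeled as follows; the thresholds and violation labels here are hypothetical placeholders, not values from the Roadmap, and real policies would be set by a platform's trust-and-safety team:

```python
from dataclasses import dataclass

# Illustrative thresholds only.
SUSPEND_AFTER = 3   # strikes before a temporary suspension
REMOVE_AFTER = 5    # strikes before permanent removal
EXTREME_VIOLATIONS = {"csam", "credible_threat"}  # removal on first offense

@dataclass
class Account:
    user_id: str
    strikes: int = 0
    status: str = "active"  # active -> suspended -> removed

def record_violation(account: Account, violation_type: str) -> str:
    """Apply the hypothetical strike policy and return the new status."""
    if violation_type in EXTREME_VIOLATIONS:
        # Extreme policy violations skip the strike ladder entirely.
        account.status = "removed"
        return account.status
    account.strikes += 1
    if account.strikes >= REMOVE_AFTER:
        account.status = "removed"
    elif account.strikes >= SUSPEND_AFTER:
        account.status = "suspended"
    return account.status
```

A companion measure, per the list above, would be checking removed accounts' identifiers at sign-up to prevent banned users from opening new accounts.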
Retention of personal information:
- As teenagers become adults, minimize the potential for profiling them based on their teenage interests, behaviors, and activities. Consider shortening retention periods for teen information where there is a reasonably known increased risk of harm.
- Give teens control over their digital footprint and allow for changes in behavior and interests to be reflected.
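The retention guidance above amounts to a tiered retention schedule, with a shorter window for teen data where elevated risk is reasonably known. A sketch with illustrative, hypothetical periods (the Roadmap does not prescribe specific durations):

```python
from datetime import timedelta

# Hypothetical retention periods for illustration only.
DEFAULT_RETENTION = timedelta(days=365)
TEEN_RETENTION = timedelta(days=90)  # shortened window for higher-risk teen data

def retention_period(is_teen: bool, elevated_risk: bool) -> timedelta:
    """Shorten retention for teen data when increased risk of harm is known."""
    if is_teen and elevated_risk:
        return TEEN_RETENTION
    return DEFAULT_RETENTION
```

The same schedule could drive periodic re-evaluation, so that stored interests and behaviors are refreshed or purged as a teen's preferences change.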
Companies interested in distinguishing their privacy practices by implementing the TAPP Roadmap can contact Maneesha Mithal, Libby Weingarten, Erin Delaney, or any member of the firm's privacy and cybersecurity practice for additional information and advice.
1 See, e.g., Lisa M. Jones, Trends in Youth Internet Victimization: Findings from Three Youth Internet Safety Surveys 2000-2010, 50 J. Adolescent Health 179 (2012); Jennifer S. Saul & Rachel F. Rodgers, Adolescent Eating Disorder Risk and the Online World, 27 Child and Adolescent Psychiatric Clinics of N. Am. 221 (2018); BBB National Programs' TeenAge Privacy Program, Risky Business: The Current State of Teen Privacy in the Android App Marketplace (2020), https://industryselfregulation.org/docs/librariesprovider5/default-document-library/tapp_whitepaper.pdf.
2 See, e.g., President Joseph Biden, Remarks of President Joe Biden—State of the Union Address as Prepared for Delivery (Mar. 1, 2022) (“It’s time to strengthen privacy protections, ban targeted advertising to children, demand tech companies stop collecting personal data on our children.”), https://www.whitehouse.gov/briefing-room/speeches-remarks/2022/03/01/remarks-of-president-joe-biden-state-of-the-union-address-as-delivered/.