On September 15, 2022, the Federal Trade Commission (FTC) held an open Commission meeting that covered three agenda items: (1) a rulemaking on impersonation scams, (2) a policy statement on enforcement related to gig work, and (3) a staff report on dark patterns. While items (1) and (3) moved forward with bipartisan 5-0 votes, the policy statement on the gig economy was adopted by a 3-2 vote along party lines. This alert provides some insight into the implications for future FTC activity in these areas.

Notice of Proposed Rulemaking on Impersonation Scams

During the meeting, the FTC voted 5-0 to issue a notice of proposed rulemaking codifying the established principle that impersonation scams violate the FTC Act. The proposed rule would also allow the FTC to recover money from, or seek civil penalties against, scammers who impersonate businesses or governments.

Analysis/Takeaways: This proposed rulemaking is not particularly controversial. Although some have expressed skepticism about the FTC’s exercise of its so-called Magnuson-Moss rulemaking authority in other contexts, there appears to be bipartisan support for using it for this type of narrow issue. This stands in sharp contrast to the disagreements surrounding the FTC’s other recent proposed rulemaking on privacy, which moved forward on a 3-2 vote, asked for public comment in 95 areas affecting the entire economy, and generated concerns about the FTC potentially exceeding its statutory authority.

Policy Statement on Enforcement Related to Gig Work

The FTC set forth multiple areas for enforcement priority with respect to the gig economy:

  • Earnings claims: The FTC stated that false, misleading, or unsubstantiated claims about workers’ earnings could be considered unfair or deceptive under Section 5 of the FTC Act. The FTC also stated that, under the Business Opportunity and Franchise Rules, gig companies that require new participants to make payments may need to disclose their earnings claims and materials that support those claims.
  • Undisclosed costs or terms of work: As with earnings claims, the FTC said that deceptive claims or nondisclosures about start-up costs, training fees, other expenses, or other material terms of gig work can violate Section 5 of the FTC Act, the Franchise Rule, or the Business Opportunity Rule.
  • Algorithmic decision making: The FTC highlighted how gig economy companies could violate Section 5 of the FTC Act when they use algorithms to dictate employment-related decisions, such as hiring and firing, amount of pay, availability of work, and performance evaluation.
  • One-sided contractual terms: The FTC warned against lopsided, non-negotiable gig worker contracts that include provisions prohibiting workers from posting negative reviews of the company or from seeking other employment during or after their time with the company. It stated that these one-sided terms could be considered unfair under Section 5 of the FTC Act.
  • Unfair competition: The FTC stated that it will investigate evidence of agreements between gig companies to illegally fix wages, benefits, or fees for gig workers. The FTC will also challenge mergers that would substantially lessen competition and investigate exclusionary or predatory conduct that could harm customers or lead to reduced compensation or poorer working conditions for gig workers.

Commissioners Noah Phillips and Christine Wilson dissented. Both commissioners suggested that the FTC should focus its activities on enforcement efforts, rather than policy statements. Commissioner Wilson expressed concern that the FTC was overstepping its mission by addressing worker harms, as opposed to consumer harms.

Analysis/Takeaways: The discussions at the meeting confirmed what has been clear to FTC watchers: competition, consumer protection, and privacy issues in the gig economy will continue to be a major focus for Chair Lina Khan’s agenda.

Staff Report on Dark Patterns

The FTC voted 5-0 to issue a staff report on dark patterns, stemming from an April 2021 FTC workshop on the same topic. The FTC defined dark patterns as “design practices that trick or manipulate users into making choices they would not otherwise have made and that may cause harm,” and stated that it would take enforcement action when companies use these patterns to deceive consumers. In its report, the FTC provided many examples of problematic dark patterns. Some examples are well grounded in law and precedent, such as the use of deceptive testimonials or endorsements, the formatting of advertisements to falsely appear to be independent journalism or other content, and the failure to inform consumers of recurring subscription charges or to allow easy cancellation of subscriptions. But the FTC also highlighted newer and more unexpected examples of dark patterns, such as the following:

  • In the area of sales tactics:
    • Creating pressure to buy a product by falsely saying that demand is high (“20 other people are viewing this item”) or stock is low (“only one left!”)
    • Baseless/fake countdown clocks, false limited time messages (e.g., offer ends in 00:59:48), and even false “discount” or “sales” claims
    • Keeping shoppers from easily comparing prices by bundling things, using different measures (price per unit v. price per ounce), or listing the price per payment (such as $10 per week) without disclosing the total number of payments or overall cost
    • Adding hidden fees or introducing fees very late in the purchasing process without prior disclosure (e.g., unexpected “convenience fee” appearing only right before checkout)
  • In the area of privacy:
    • Obscuring or subverting privacy choices by using double negatives (“Uncheck the box if you prefer not to receive email updates”), confirm shaming (e.g., “No, I don’t want to save money”), and preselecting defaults that are “good for the company and not the consumer”
    • Making users create an account or share their information to complete a task
    • Asking repeatedly and disruptively if a user wants to take an action
  • With respect to children’s advertising:
    • Hiding real costs by requiring consumers to buy things with virtual currency (e.g., “coins” or “acorns” in kids’ apps)
    • Automatically playing another video once one video ends in a manner that is unexpected or harmful (e.g., after the first video, a less kid-friendly video—or a sponsored ad camouflaged to look like a recommended video—automatically plays)
    • Using cartoon characters to encourage children to make in-app purchases

Analysis/Takeaways: Although it is questionable whether the FTC would be able to prove that some of its specific examples rise to the level of deceptive or unfair practices, many of these examples mirror examples of dark patterns provided in the proposed California Privacy Protection Agency Regulations issued this summer. Given the regulatory scrutiny of these issues, companies should review their consumer interfaces in light of these examples to make sure their practices will not catch the eye of these regulators.

Wilson Sonsini Goodrich & Rosati routinely helps companies navigate complex privacy, data security, and consumer protection issues and respond to FTC and other regulatory investigations. For more information on privacy issues, please contact Maneesha Mithal, Lydia Parnes, Roger Li, or another member of the firm’s privacy and cybersecurity practice. For more information on antitrust issues, please contact Michelle Yost Hale or another member of the firm’s antitrust and competition practice.