Application of FTC Act and Other Initiatives on Dark Patterns

By Aron Berger

Introduction

The Federal Trade Commission (“FTC”) Act declares unlawful any “unfair or deceptive acts or practices in or affecting commerce.”[1]  While this prohibition is clearly applicable to false or misleading advertisements, the FTC has brought enforcement actions in many distinct commercial contexts and concerning a wide swath of commercial conduct.[2]

By design, the FTC Act does not define “unfair or deceptive acts or practices.”  Instead, the FTC was intentionally given a “broad grant of discretionary authority” to protect consumers and competitors alike in an evolving marketplace.[3]  Using this broad grant, the FTC has enforced the Act against misleading advertisements and price claims, inadequate disclosures concerning product risks, hidden bundling of products and services, and the use of deceptive techniques such as bait-and-switch or pyramid schemes.[4]  But all actions under Section 45(a)(1)’s deceptiveness prong have a common thread: the presence of “a representation, omission or practice that is likely to mislead the consumer.”[5]

This blog post will examine the burgeoning application of the FTC Act and other consumer-protection initiatives to a recently recognized form of deception in the design of electronic user interfaces: "dark patterns."

Dark Patterns

“Dark patterns” refer to the intentional design of digital user interfaces to influence user decision-making to the benefit of the service provider.[6]  The term was first used by UK-based user interface researcher Harry Brignull in 2010 and has since gained significant attention in the user interface design community.[7]  By capitalizing on advances in cognitive science and psychology, designers can construct environments that subtly influence user behavior, steering or nudging consumers towards pre-determined ends.[8]

Some of the most obvious examples of dark patterns come from online sales transactions.  Employed in this context, dark patterns simply attempt to influence consumers into “spending more than they otherwise would.”[9]

[Screenshot of an online checkout page with a ticking countdown timer; from Arvind Narayanan et al., Dark Patterns: Past, Present, and Future, 18 ACM Queue 67 (2020)]

In the screenshot above, the ticking countdown timer creates the impression of urgency in the decision to make a purchase; however, the supposedly limited time offer continued even after the countdown reached zero.[10]
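The mechanics of a fake countdown are simple to see in code. The sketch below is a hypothetical illustration (the function and constant names are invented, not taken from any cited source): the displayed timer is not tied to any real offer expiration, and when it runs down it silently wraps back to a fresh "limited-time" window.

```typescript
// Hypothetical sketch of a countdown-timer dark pattern. Names are
// illustrative only. The "deadline" is derived from when the page loaded,
// not from any real offer expiration, and the modulo arithmetic makes the
// timer restart instead of ever reaching a true end.

const OFFER_WINDOW_MS = 10 * 60 * 1000; // a fake "10 minutes remaining"

// Milliseconds to show on the countdown widget. Because elapsed time is
// taken modulo the window, the timer resets every OFFER_WINDOW_MS and the
// "limited" offer is effectively unlimited.
function displayedTimeRemaining(pageLoadedAt: number, now: number): number {
  const elapsed = now - pageLoadedAt;
  return OFFER_WINDOW_MS - (elapsed % OFFER_WINDOW_MS);
}
```

Five minutes after page load the widget shows five minutes remaining, but the instant the window elapses it jumps back to the full ten minutes; the urgency is manufactured, which is the misrepresentation at issue.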

[Screenshot of an email sign-up pop-up with a graphically deprioritized opt-out link; retrieved from https://www.uxbooth.com/articles/ux-dark-patterns-manipulinks-and-confirmshaming/; last accessed Oct. 3, 2021]

Here, the screenshot exhibits two specific dark patterns: misdirection and "confirmshaming."[11]  The opt-out option is graphically deprioritized and phrased to guilt users into opting in.

The design choices illustrated above are backed by a straightforward rationale: increasing conversion rates.  By capitalizing on online users' habits and short attention spans, businesses can create platforms that increase the percentage of views that turn into completed transactions.[12]  But dark patterns are not limited to "traditional" retail-oriented transactions such as those above.  They are ubiquitous on the internet, appearing everywhere from subscription advertising[13] to videogame microtransactions and loot boxes.[14]  And while these design choices can be used to influence purchase decisions, they can also influence users' decisions about their private data.[15]  Cookie consent notices serve as a particularly compelling case study of these "privacy dark patterns."

Following the European Union's General Data Protection Regulation, which took effect in May 2018, many websites began presenting visitors with cookie consent notices purporting to inform users of the types of personal data that the website would collect and supposedly offering the ability to opt out.[16]  Recent data suggest that only around 0.1% of users would opt in to data collection via cookies if the default state were non-collection.[17]  But many of the studied cookie consent notices employed dark patterns to ensure collection of as much data as possible: setting the most intrusive options as the default, placing the notice in a bottom corner of the screen, using color and text size to draw users toward the opt-in button, or omitting an opt-out option altogether.[18]  And beyond cookie consent notices, privacy dark patterns have been observed in terms of service, account creation interfaces, and social media address books, to name only a few instances.[19]  In short, the potential for privacy-negative design choices is almost unlimited.
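The "intrusive defaults" pattern described above comes down to the initial state of the consent form. The sketch below is a hypothetical illustration (the category names and functions are invented for this post, not drawn from any real consent library): the dark-pattern variant pre-enables every data-collection category, so a user who clicks through without reading has "agreed" to all of them, while a privacy-protective design enables only strictly necessary cookies.

```typescript
// Hypothetical sketch of consent-notice defaults. Category names and
// function names are invented for illustration.
interface ConsentState {
  necessary: boolean;        // strictly necessary cookies
  analytics: boolean;        // usage tracking
  advertising: boolean;      // ad targeting
  thirdPartySharing: boolean; // sale/sharing of data with third parties
}

// Dark-pattern default: every category pre-enabled, so clicking "Accept"
// without reading opts the user in to maximal collection.
function darkPatternDefaults(): ConsentState {
  return { necessary: true, analytics: true, advertising: true, thirdPartySharing: true };
}

// Privacy-protective default: only strictly necessary cookies enabled;
// everything else requires an affirmative opt-in.
function privacyByDefault(): ConsentState {
  return { necessary: true, analytics: false, advertising: false, thirdPartySharing: false };
}
```

Given the cited finding that roughly 0.1% of users affirmatively opt in when the default is non-collection, the choice between these two initial states largely determines how much data the site collects.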

Growing Attention from Regulators

The applicability of Section 45(a) to the practices described above should be clear.  For example, an online advertisement's representation that "time is running out" for a particular offer when there really is no time limit plainly meets the definition of a misrepresentation.  And designing cookie opt-out interfaces to make it difficult for users to decline could well be a deceptive practice within the scope of Section 45(a).  Given this clear applicability, online dark patterns have not gone unnoticed by regulators.  Just last year, FTC Commissioner Rohit Chopra released a statement concerning the use of dark patterns in ABCmouse, an online children's learning service that the FTC had sued for violations of Section 45(a).[20]  Though the FTC's complaint made no mention of the phrase "dark patterns," it included various allegations directed at the service's design and interface, which the FTC alleged misdirected, confused, and frustrated consumers who were attempting to discontinue their subscriptions.[21]  The FTC secured a $10 million settlement with ABCmouse, along with a stipulated order enjoining certain conduct and mandating simple disclosures and a simple opt-out process.[22]

Aside from announcing the ABCmouse case, Commissioner Chopra's statement targeted dark patterns generally, describing them as "design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent."[23]  And while Commissioner Chopra did not call out any particular service providers, he did indicate a new effort from the FTC to "go after large firms that make millions, or even billions, through tricking and trapping users through dark patterns."[24]  To be sure, the FTC was bringing enforcement actions against digital service providers long before it released this statement.[25]  But Commissioner Chopra's statement could signal a broader regulatory effort to enforce the FTC Act's consumer protection provisions against larger, more systemic users of dark patterns.  Just this September, the FTC adopted a resolution authorizing ten years of investigations targeting unfair, deceptive, and exploitative practices "relating to the marketing of goods and services on the Internet, the manipulation of user interfaces (including, but not limited to, dark patterns)."[26]

A similar effort has been gaining steam in policymaking circles as well.  In 2019, Senator Mark Warner (D-VA) introduced the "Deceptive Experiences To Online Users Reduction Act" (or "DETOUR Act"), which was designed to curtail deceptive user interface design and encourage the development of industry standard-setting.[27]  And though the DETOUR Act failed to pass in 2019, state-level initiatives have been more successful.  In 2020, California voters approved a series of updates to the California Consumer Privacy Act of 2018, named the California Privacy Rights Act (CPRA) of 2020.[28]  The CPRA defines dark patterns as "user interface[s] designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice" and requires the California Attorney General's Office to adopt regulations preventing the use of dark patterns in obtaining opt-in and consent to the sale of sensitive user data.[29]  The Act further defines "consent" by explicitly providing that agreement obtained through the use of dark patterns does not constitute consent.[30]

Conclusion

Though dark patterns have been a subject of scholarly and in-field research for over a decade, lawmakers and regulators have only recently begun defining the limits of acceptable design in digital interfaces.  And while the FTC's recent enforcement efforts, the DETOUR Act, and the CPRA all signal increased attentiveness to deception in the digital marketplace, the future trajectory of these efforts remains uncertain.  For example, this September, the regulatory body responsible for implementing the CPRA's provisions (the California Privacy Protection Agency) requested comments from members of the public on how it should go about regulating dark patterns.[31]  Similarly, the FTC held a workshop on dark patterns in April, seeking to define "effective prevention, mitigation, and remediation of the harmful effects of dark patterns."[32]  While the ultimate shape and effectiveness of these efforts remain to be seen, one thing is certain: privacy-conscious regulators and lawmakers are not willing to let dark patterns flourish for much longer.


[1] 15 U.S.C. § 45(a)(1).

[2] See, e.g., F.T.C. v. All. Document Preparation, 296 F. Supp. 3d 1197, 1201 (C.D. Cal. 2017); F.T.C. v. LeadClick Media, LLC, 838 F.3d 158, 162 (2d Cir. 2016).

[3] F.T.C. v. IFC Credit Corp., 543 F. Supp. 2d 925, 934-35 (N.D. Ill. 2008).

[4] See, e.g., Federal Trade Commission, FTC Policy Statement on Deception (1983); available at: https://www.ftc.gov/system/files/documents/public_statements/410531/831014deceptionstmt.pdf

[5] Id.

[6] See Federal Trade Commission, Statement of Commissioner Rohit Chopra Regarding Dark Patterns in the Matter of Age of Learning, Inc. (2020); available at: https://www.ftc.gov/system/files/documents/public_statements/1579927/172_3086_abcmouse_-_rchopra_statement.pdf

[7] Id.; see also Arvind Narayanan et al., Dark Patterns: Past, Present, and Future, 18 ACM Queue 67 (2020) at 67-70; Yael Grauer, Dark Patterns are designed to trick you (and they’re all over the Web), Ars Technica (2016); available at: https://arstechnica.com/information-technology/2016/07/dark-patterns-are-designed-to-trick-you-and-theyre-all-over-the-web/

[8] Arvind Narayanan et al., Dark Patterns: Past, Present, and Future, 18 ACM Queue 67 (2020) at 71-73, 77.

[9] Id. at 77.

[10] Id. at 67-68.

[11] Confirmshaming is “[t]he act of guilting the user into opting into something. The option to decline is worded in such a way to shame the user into compliance.”  Types of Dark Pattern, available at: https://www.darkpatterns.org/types-of-dark-pattern

[12] Lauren E. Willis, Deception by Design, 34 Harv. J.L. & Tech. 115 (2020) at 134-36.

[13] Id. at 137-38.

[14] Scott A. Goodstein, When the Cat’s Away: Techlash, Loot Boxes, and Regulating “Dark Patterns” in the Video Game Industry’s Monetization Strategies, 92 U. Colo. L. Rev. 285 (2021) at 287-89.

[15] Arvind Narayanan et al., Dark Patterns: Past, Present, and Future, 18 ACM Queue 67 (2020) at 77.

[16] Christine Utz et al., (Un)informed Consent: Studying GDPR Consent Notices in the Field, 2019 ACM SIGSAC Conference on Computer and Communications Security (2019) at 1-2.

[17] Id. at 2.

[18] Id. at 3-4.

[19] See, e.g., Christoph Bösch et al., Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns, 4 Proceedings on Privacy Enhancing Technologies 237 (2016) at 238, 248-52.

[20] Federal Trade Commission, Statement of Commissioner Rohit Chopra Regarding Dark Patterns in the Matter of Age of Learning, Inc. (2020); available at: https://www.ftc.gov/system/files/documents/public_statements/1579927/172_3086_abcmouse_-_rchopra_statement.pdf

[21] F.T.C. v. Age of Learning, Inc., No. 2:20-cv-07996, Dkt. No. 1 (C.D. Cal. Sept. 1, 2020), ¶¶ 25-39.

[22] Todd Kossow, Children’s Online Learning Program ABCmouse to Pay $10 Million to Settle FTC Charges of Illegal Marketing and Billing Practices, available at https://www.ftc.gov/news-events/press-releases/2020/09/childrens-online-learning-program-abcmouse-pay-10-million-settle; F.T.C. v. Age of Learning, Inc., No. 2:20-cv-07996, Dkt. No. 4-1 (C.D. Cal. Sept. 1, 2020) at 4-10.

[23] Federal Trade Commission, Statement of Commissioner Rohit Chopra Regarding Dark Patterns in the Matter of Age of Learning, Inc. (2020); available at: https://www.ftc.gov/system/files/documents/public_statements/1579927/172_3086_abcmouse_-_rchopra_statement.pdf

[24] Id.; see also Bringing Dark Patterns to Light: An FTC Workshop, available at: https://www.ftc.gov/news-events/events-calendar/bringing-dark-patterns-light-ftc-workshop

[25] See, e.g., F.T.C. v. Ross, 743 F.3d 886, 889-91 (4th Cir. 2014) (use of “scareware” advertising tactics to market software violates Section 45(a)); see also In the Matter of HTC America Inc., F.T.C. File No. 122 3049, Dkt. No. C-4406, at ¶¶ 24-26 (alleging violation of Section 45(a) via design of user interface) (available at: https://www.ftc.gov/sites/default/files/documents/cases/2013/07/130702htccmpt.pdf)

[26] FTC, Resolution Directing Use of Compulsory Process Regarding Deceptive and Manipulative Conduct on the Internet, File No. 212 3125 (Sept. 2, 2021), available at: https://www.ftc.gov/system/files/attachments/press-releases/ftc-streamlines-consumer-protection-competition-investigations-eight-key-enforcement-areas-enable/omnibus_resolutions_p859900.pdf

[27] DETOUR Act of 2019, S. 1084, 116th Cong. § 2 (2019).

[28] Shelby Dolen et al., CPRA Regulations: California Privacy Protection Agency Commences Preliminary Rulemaking Process, JDSupra (2021); available at: https://www.jdsupra.com/legalnews/cpra-regulations-california-privacy-7867021/

[29] Cal. Civ. Code §§ 1798.140(l), 1798.185(a)(20)(C)(iii).

[30] Id. § 1798.140(h).

[31] California Privacy Protection Agency, Invitation for Preliminary Comment on Proposed Rulemaking Under the California Privacy Rights Act of 2020, Proceeding No. 01-21 (2021); available at: https://cppa.ca.gov/regulations/pdf/invitation_for_comments.pdf

[32] Bringing Dark Patterns to Light: An FTC Workshop (2021); available at: https://www.ftc.gov/news-events/events-calendar/bringing-dark-patterns-light-ftc-workshop