
Jennifer Huddleston, Policy Counsel

1401 K St NW, Ste 502

Washington, DC 20005

netchoice.org

California Privacy Protection Agency

Pre-Rulemaking Stakeholder Sessions: Dark Patterns

May 5, 2022

Dear Chair Urban and Members of the California Privacy Protection Agency board:

Thank you for the opportunity to participate in today’s stakeholder session.

 

As the CPPA considers how to approach privacy rulemaking, the agency should avoid overly expansive actions that would penalize neutral technologies, such as algorithms, in ways that undermine their many everyday uses that benefit consumers and can even provide solutions related to privacy, security, and authentication.

The CPPA should carefully consider the impacts its decisions may have beyond privacy and how they interact with existing laws and tools for resolving underlying consumer concerns. As with any regulation, the agency should weigh the impact its rules will have on these technologies and their users, ensure the rules remain grounded in its privacy mandate, and balance concerns about other issues such as speech. The agency should avoid dictating a specific design that does not take into account differences in technologies, types of data collected, and user preferences, and it should consider how existing laws and regulations may already address the underlying concerns.

When it comes to “dark patterns,” the agency should be cautious about the negative impacts of over-regulation and should seek to address specific harms. Any regulations the agency considers should clearly define the harmful behavior they seek to redress, to avoid unintentionally prohibiting neutral or beneficial practices or overriding consumer privacy preferences.

But as research on dark patterns discusses, many of the concerns about these manipulative design choices are likely already addressed by existing precedents around unfair or deceptive practices. An overzealous approach could result in the agency dictating user interface designs without full consideration of the distinctions among products, services, audiences, and communication methods. In some cases, a very specific and clear feature like an “unsubscribe” button may work simply; in others, it is more complicated to communicate the variety of features that may be affected by a consumer’s privacy choices. Such designs may reflect not malicious intent but an attempt to ensure consumers fully understand the impact of their choices.

As in many privacy scenarios, the two best tools available to policymakers are often consumer education and redress of harmful conduct. This should include pursuing bad actors engaged in deceptive and manipulative practices, just as would be done for comparable offline consumer protection violations, with enforcement tied to the specific harms those practices cause. It can also include providing clarity around any specifically prohibited practices, while recognizing the design differences that may arise depending on the product or service being offered. Policymakers should be cautious about presuming that data collection or interaction with consumers is inherently harmful.

In addition to regulation, the agency should consider less interventionist approaches that empower consumers and innovators to make privacy decisions aligned with consumer preferences and to help identify deceptive and unfair practices.

Thank you for the opportunity to speak during this pre-rulemaking phase.

Sincerely,

Jennifer Huddleston

Policy Counsel

 

NetChoice is a trade association that works to make the internet safe for free enterprise and free expression.