
Testimony Against MN HF 3724—violates First Amendment

Minnesota House Commerce Committee Hearing Testimony


Jennifer Huddleston, Policy Counsel

1401 K St NW, Ste 502

Washington, DC 20005

netchoice.org

Minnesota House Commerce Committee Hearing

HF 3724 Undermines Parental Choice, Fails Children and Teens, and Violates the First Amendment

March 22, 2022

Dear Chair Becker-Finn, Vice-Chair Moller, and Members of the House Judiciary Committee:

While well-intentioned, we ask that you not advance HF 3724 as it:

  • Undermines parental choice,
  • Removes young people's access to beneficial technologies, and
  • Raises clear concerns regarding its constitutionality.

 

Many policymakers and voters are understandably concerned about the content children and teenagers may be exposed to online and offline.

As a former educator myself, I understand the good intentions of these bills; however, proposals such as HF 3724 are not the solution and will be unconstitutional on First Amendment grounds.

As a result, even though the intentions of protecting teenagers are laudable, the committee should not advance HF 3724.

The Bill Undermines Parental Choice and Creates a False Sense of Security

Today, parents hold a wide range of opinions about what content is appropriate for their children at any given age. This includes the decision about whether to allow their teenagers to be on social media.

But HF 3724 would undermine parental choice by dictating to content providers that these teens cannot interact with technologies as they currently work. In doing so, HF 3724 sends a message to parents that the state, not parents, will determine the specific age at which a child or teenager is ready to interact with technology, and it takes away parents' ability to choose the balance of benefits and risks that works best for their family.

Additionally, the law would create a false sense of security and might mean that parents and other adults forgo important conversations with teens about what to do when they encounter harmful content. With HF 3724, parents are less likely to talk with their teenagers about how to identify reliable sources or recognize the distortion that can come from photo filters. Parents may even let their teenagers engage on social media before they are ready or without critical online safety tools.

A better solution is to empower parents and teenagers to understand the content they consume online and make appropriate choices. HF 3724, by contrast, is a one-size-fits-all approach that lacks the nuance of different online experiences and treats all recommendations as equally harmful. The result could be that parents are less likely to talk to their teenagers about the content they consume online, encouraging more secretive behavior around devastating issues such as bullying, eating disorders, or depression.

The Bill Undermines Children's and Teenagers' Beneficial Use of Technology and Requires More Data Collection

Definitions of terms like “social media” used in the bill are incredibly broad and presume all recommendations are potentially harmful to young people. The result would be a negative impact on children's and teenagers' ability to use technology in beneficial ways. HF 3724, however, holds all uses of algorithms equal and so penalizes the good elements that can empower, encourage, and protect teens along with the bad.

But HF 3724 would affect not only traditional social media platforms like Instagram or YouTube; it would potentially reach a far wider array of user-generated content sites, including those useful for research and education. For example, given that social media includes any website or app that “allows users to create, share, and view user-created content,” this could include resources such as the book review site Goodreads or even newspapers with comment sections. As a result, young people would be unable to receive algorithmic recommendations that guide them to books based on their previous interests and the reviews of similar readers. A newspaper might not be able to algorithmically recommend related news stories to a student doing research if its articles have comment sections.

Finally, while the goal of HF 3724 is to protect teenagers, it would actually require companies to collect more information about users under 18. The proposal is unlikely to be technologically feasible in many cases, but even if it were, a company would have to know both a user's age and that the user is located in Minnesota before disabling algorithmic recommendations. Doing so would require collecting information about age and location that might not otherwise be collected.

The Bill Violates the First Amendment and Raises Dormant Commerce Clause Issues

In many cases the proposal might not be technologically feasible, and it would undermine the safety features and spam filters that many of us rely on.

In Sorrell v. IMS Health, the Supreme Court ruled that information is speech and that a Vermont law could not prohibit the creation and dissemination of information, including the sale of data to a database. Even more relevant here, multiple courts have held that the distribution of speech, including by algorithms such as those used by search engines, is protected by the First Amendment. This proposal would have the government restrain platforms' distribution of speech and Minnesotans' access to information. Thus, courts will likely deem HF 3724 a violation of the First Amendment.

And while the government is entitled to take reasonable steps to protect minors from harmful content that might otherwise be constitutionally protected, it may not do so in a way so broad that it limits adults' access to legal content. In Ashcroft v. ACLU, the Supreme Court struck down a federal law that attempted to prevent the posting of content harmful to teenagers on the web, both because of that impact on adults and because of the chilling effect the associated fines could have on legally protected speech. This bill would face similar challenges.

This proposal is distinct from those that require libraries and schools to place filters on computers or other connected devices through which children and teens access the internet; those requirements are constitutional, in part, because they are tied to the receipt of federal and state funding.

However, HF 3724 enjoys no such protection, as it mandates that a specific type of technology be disabled for all content for users under 18, even when the algorithm is being employed to protect the user from harmful content.

Beyond the significant First Amendment concerns, the proposal also raises concerns that it may violate the Dormant Commerce Clause due to its impact on interstate commerce.

The internet is by its very nature interstate, and state regulations such as the one proposed impose a significant burden on interstate commerce and on interactions well beyond the young people the law seeks to protect.

The bill's currently broad definitions make it likely that it would be interpreted as applying beyond Minnesota's borders, but it is entirely unclear to whom and how. For example:

  • If a Minnesota teenager is on a class field trip to Washington, DC, does the law still apply?
  • What about an Arizona teenager visiting their grandparents in Minnesota?

This lack of clarity creates both uncertainty for companies and constitutional concerns in these scenarios.

Given the bill's negative impacts on parental choice and on young people themselves, and its likely unconstitutionality, we ask that you not advance HF 3724.

Thank you, and we welcome the opportunity to speak with you further about the protection of children and teens online.

Sincerely,

Jennifer Huddleston

Policy Counsel

 

NetChoice is a trade association that works to make the internet safe for free enterprise and free expression.