
Republicans and Democrats Just Came Together To Protect America’s Kids | Opinion


Something extraordinary happened this summer. During one of the most divisive periods in modern U.S. history, Senate Republicans and Democrats came together almost unanimously to advance legislation that protects America’s children. This bill, the Kids Online Safety Act (KOSA), requires internet companies to keep children’s safety in mind when designing the online platforms they use—something required of every other industry.

This new bill focuses specifically on the platforms’ design—not their content. Since senators likely do not agree about what content should be online, the bill instead provides a much-needed, common-sense response to new evidence that social media companies knowingly designed their products with features that exploit the still-developing brains of children and cause them harm.

Now, Big Tech is doing everything it can to kill the bill in the House of Representatives. Why? Money. The six most popular social media platforms earned roughly $11 billion in just one year from advertising that targets children, according to a recent study by researchers at Harvard University and Boston Children’s Hospital.

Though Big Tech had told Congress that its platforms’ harmful effects were unavoidable consequences of the medium, we now know this claim is false. Through congressional hearings, leaked internal documents, company whistleblowers, and investigative reporting, the Senate amassed a mountain of evidence that the tech industry knows its products are dangerous and designs them to exploit young people’s still-developing brains and keep them online as much as possible.

Historically, Big Tech has relied on two tactics to scare Congress out of passing legislation that might touch its corporate billions: it argues either that a bill violates the 1996 Communications Decency Act or that it violates the First Amendment, often with no basis. However, the Senate anticipated these arguments and, through careful drafting, largely avoided both problems in this bill.

Regarding Section 230 of the Communications Decency Act, which protects internet service providers from liability for content created by third parties, the new bill explicitly states that it neither expands nor contracts the provisions of Section 230.

Regarding the First Amendment, the bill avoids focusing on content. Instead, it imposes modest requirements on specific acts of covered platforms. The Senate expressly seeks to hold platforms responsible for their failure to “use reasonable care” in the design of their products’ features—not because of content that sits on the internet.

Despite Congress’ proactive steps, Big Tech continues, through its surrogates, to advance a false claim that this is a censorship bill. It is not.

In fact, the Senate has largely avoided even the suggestion of government regulation of content.

First, the bill explicitly states that nothing in KOSA prevents anyone from “deliberately and independently” searching for the information they seek. Second, it targets pernicious design features, including endless scrolling, dark patterns, rewards for staying online, late-night notifications, and unrequested recommendations. Third, though the bill does require the Federal Trade Commission and a Kids Online Safety Council (which includes representatives of children, parents, and platforms) to issue guidance for the platforms, the text again explicitly says that any enforcement action cannot be based on a failure to follow this guidance.

PENZANCE, UNITED KINGDOM – AUGUST 15: A 13-year-old boy looks at an iPhone screen on August 15, 2024 in Penzance, England. The amount of time children spend on screens each day rocketed during the Covid pandemic by more than 50 percent, the equivalent of an extra hour and twenty minutes. Researchers say that unmoderated screen time can have long-lasting effects on a child’s mental and physical health. Recently TikTok announced that every account belonging to a user below age 18 has a 60-minute daily screen time limit automatically set.

Matt Cardy/Getty Images

To comply, online platforms have no obligation to stop anyone from searching for any content. Rather, covered platforms must simply do what other businesses must do: engage in “safety by design.” This means, for example, setting the privacy settings for children at their most protective by default—allowing children and families to decide for themselves what content to access. Rather than requiring that content be removed from the internet, this approach allows children and parents to opt out of content and stimuli pushed by Big Tech to get and keep kids hooked.

In the words of one congressional witness, “The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they put their astronomical profits before people.”

Big Tech asserts that the bill violates the Constitution even before it has been enforced. The U.S. Supreme Court, however, recently expressed concern over broad attacks on legislation that come before courts can address specific cases.

There will likely be litigation over how this statute is enforced—Big Tech seems certain to make sure of that. But by advancing this broad censorship argument, Big Tech is essentially claiming that Congress can do nothing about a company knowingly designing a defective product just because it involves the internet.

If that were true, it would mean Congress could never pass any law regarding these platforms’ intentional behavior. But Congress has long allowed sensible regulation that might restrict actions somewhat in exchange for significant gains in public health and safety. We have laws, for example, against fraud (which implicates speech); prohibitions on the sale of alcohol, tobacco, and drugs to children (which implicates freedom); and limits on child labor (which implicates economic liberty).

The Supreme Court reserves its highest First Amendment scrutiny for government regulations that target speech because of its content and viewpoint. It allows reasonable non-content-based regulations—particularly those that neutrally affect the time, place, and manner of speech. We believe KOSA’s restrictions fall within the government’s right to limit the manner of speech, rather than its content.

Even if Big Tech were right that this bill might affect some speech, however, it would still not run afoul of the First Amendment.

As the Supreme Court made clear more than four decades ago, in New York v. Ferber (1982), the government has a compelling interest in “protecting the physical and emotional well-being of youth.” Indeed, the Court observed in Prince v. Massachusetts (1944) that “A democratic society rests, for its continuance, upon the healthy, well-rounded growth of young people into full maturity as citizens.”

While that does not mean Congress can pass any law to protect children regardless of its effect on free speech, it does mean Congress can pass laws that protect children from businesses that knowingly design products that harm them.

Mary Graw Leary is a Professor of Law at The Catholic University of America, Columbus School of Law. Warren Binford is a Professor of Pediatrics and the W.H. Lea Endowed Chair for Justice in Pediatric Law, Policy and Ethics at the University of Colorado School of Medicine. John Yoo is the Heller Professor of Law at the University of California, Berkeley, a nonresident senior fellow at the American Enterprise Institute, and a distinguished visiting professor in the School of Civic Leadership at the University of Texas at Austin.

The views expressed in this article are the writers’ own.
