
New bill would require companies like Facebook, Google to add features that protect children

Key Points
  • Two senators introduced a new bill Wednesday that would give online platforms a duty to prevent or mitigate certain harms to minors, including suicide, eating disorders and substance abuse.
  • It would have a significant effect on the design of platforms made by companies like Facebook parent Meta, Snap, Google and TikTok.
  • The Senate subcommittee received thousands of pages of documents from former Facebook employee Frances Haugen, who also testified before the panel.
Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., hold a news conference in the Capitol.
Tom Williams | CQ-Roll Call, Inc. | Getty Images

Two senators introduced a new bill Wednesday that would give online platforms a duty to act in kids' best interests and to prevent or mitigate the risk of certain harms, including suicide, eating disorders and substance abuse.

The Kids Online Safety Act was introduced by Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., respectively the chair and ranking member of the Senate Commerce subcommittee on consumer protection. If passed, the bill would have a significant effect on the design of platforms made by companies like Facebook parent Meta, Snap, Google and TikTok.

The subcommittee received thousands of pages of documents from former Facebook employee Frances Haugen, who also testified before the panel. The documents revealed in part that the company had researched its platforms' impact on children and found negative effects on the mental health of some teen girls. Lawmakers who later confronted executives from Facebook, including Instagram chief Adam Mosseri, were outraged the company hadn't done more to alter its services after the research findings.

The Kids Online Safety Act would raise the standards for online platforms that are "reasonably likely to be used" by kids aged 16 or younger, requiring them to better protect those users.

It requires those companies to implement safeguards that minors or their parents can easily access to "control their experience and personal data."

That would include platform settings that help them limit the ability of others to find minors online, restrict the amount of data that can be collected on them, allow them to opt out of algorithmic recommendation systems that use their data and limit their time spent online.

"I think we are on the cusp of a new era for Big Tech imposing a sense of responsibility that has been completely lacking so far," Blumenthal told reporters at a press conference Wednesday. "And we know that it is not only feasible and possible, but that it works."

He pointed to similar standards in Europe that experts described at one of the subcommittee's hearings.

Blumenthal said he's willing to hear from the tech companies that would be affected by the bill, though he remains wary of their motives.

"If the tech companies want to come to the table, we're always ready to hear them," Blumenthal said. "But all too often in the past, they have unleashed their armies of lawyers and lobbyists to oppose legislation."

Blumenthal said he thought of the bill as similar to a product safety law that exists for other types of goods.

"For way too long, the internet was regarded as different from all other products," Blumenthal said. "Well, now we're going to have guardrails and safeguards for the internet that will enable children and their parents to protect themselves."

Notably, the bill also requires platforms to make the strongest version of these safeguards the default setting on their services. What's more, it would prohibit services from encouraging minors to turn off those controls.

Covered platforms would need to release annual public reports based on an independent, third-party audit of the risks of harm to minors on their services. They would also need to provide access to data for researchers vetted by the National Telecommunications and Information Administration to conduct public interest research on the harms to minors online.

Unlike other bills targeting online platforms, this one does not include a minimum size threshold to be considered liable under the act. Blumenthal said that's because "responsibility is not a matter of the numbers of people who are viewing or seeing toxic content."

The bill also directs government agencies to figure out the best ways to protect minors on these services. For example, it directs the Federal Trade Commission to create guidelines for covered platforms on how to conduct market- and product-focused research on minors. It also requires the NTIA to study how platforms can most feasibly and accurately verify ages of their users.

The bill would create a new council of parents, experts, tech representatives, enforcers and youth voices, convened by the Commerce secretary to give advice on how to implement the law. It would be enforced by the FTC and state attorneys general.

Blumenthal said he's hopeful the Commerce Committee will move efficiently toward a markup so the bill can eventually reach the floor for a full Senate vote.

Blackburn told reporters that the issue of child safety online is one she and her colleagues hear about constantly from their constituents.

"This issue of what is happening online to children is something that comes up repeatedly, with them saying, there has to be something done about this," she said.


WATCH: How Facebook makes money by targeting ads directly to you
