Senators put executives from YouTube, TikTok and Snapchat on the defensive Tuesday, questioning them about what they’re doing to ensure young users’ safety on their platforms.
Citing the harm that can come to vulnerable young people from the sites — ranging from eating disorders to exposure to sexually explicit content and material promoting addictive drugs — the lawmakers also sought the executives’ support for legislation bolstering protection of children on social media. But they received little firm commitment.
“The problem is clear: Big Tech preys on children and teens to make more money,” Sen. Edward Markey, D-Mass., said at a hearing by the Senate Commerce subcommittee on consumer protection.
The subcommittee recently took testimony from a former Facebook data scientist, who laid out internal company research showing that the company’s Instagram photo-sharing service appears to seriously harm some teens. The subcommittee is widening its focus to examine other tech platforms, with millions or billions of users, that also compete for young people’s attention and loyalty.
“We’re hearing the same stories of harm” caused by YouTube, TikTok and Snapchat, said Sen. Richard Blumenthal, D-Conn., the panel’s chairman.
“This is for Big Tech a big tobacco moment ... It is a moment of reckoning,” he said. “There will be accountability. This time is different.”
To that end, Markey asked the three executives — Michael Beckerman, a TikTok vice president and head of public policy for the Americas; Leslie Miller, vice president for government affairs and public policy of YouTube’s owner Google; and Jennifer Stout, vice president for global public policy of Snapchat parent Snap Inc. — if they would support his bipartisan legislation that would give new privacy rights to children, and ban targeted ads and video autoplay for kids.
In a lengthy exchange as Markey tried to draw out a commitment of support, the executives avoided providing a direct endorsement, insisting that their platforms already are complying with the proposed restrictions. They said they’re seeking a dialogue with lawmakers as the legislation is crafted.
That wasn’t good enough for Markey and Blumenthal, who perceived a classic Washington lobbying game in a moment of crisis for social media and the tech industry. “This is the talk that we’ve seen again and again and again and again,” Blumenthal told them. Applauding legislative goals in a general way is “meaningless” unless backed up by specific support, he said.
“Sex and drugs are violations of our community standards; they have no place on TikTok,” Beckerman said. TikTok has tools in place, such as screen-time management, to help young people and parents moderate how long children spend on the app and what they see, he said.
The company says it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. The video platform, wildly popular with teens and younger children, is owned by the Chinese company ByteDance. In only five years since launching, it has gained an estimated 1 billion monthly users.
Earlier this year, after federal regulators ordered TikTok to disclose how its practices affect children and teenagers, the platform tightened its privacy practices for users under 18.
Pressed by Sen. Amy Klobuchar, D-Minn., about a 19-year-old said to have died from counterfeit pain medication he bought through Snapchat, Stout said, “We’re absolutely determined to remove all drug dealers from Snapchat.” She said the platform has deployed detection measures against dealers but acknowledged that they are often evaded.
Stout made the case that Snapchat’s platform differs from the others in relying on humans, not artificial intelligence, for moderating content.
Snapchat allows people to send photos, videos and messages that are meant to quickly disappear, an enticement to its young users seeking to avoid snooping parents and teachers. Hence its “Ghostface Chillah” faceless (and word-less) white logo.
Though only 10 years old, Snapchat says an eye-popping 90% of 13- to 24-year-olds in the U.S. use the service. It reported 306 million daily users in the July-September quarter.
Miller said YouTube has worked to provide children and families with protections and parental controls, such as time limits, and to limit viewing to age-appropriate content. The offshoot YouTube Kids, available in around 70 countries, has an estimated 35 million weekly users.
“We do not prioritize profits over safety. We do not wait to act,” she said.
The three platforms are woven into the fabric of young people’s lives, often influencing their dress, dance moves and diet, potentially to the point of obsession. Peer pressure to get on the apps is strong. Social media can offer entertainment and education, but platforms have been misused to harm children and promote bullying, vandalism in schools, eating disorders and manipulative marketing, lawmakers say.
The panel wants to learn how algorithms and product designs can magnify harm to children, foster addiction and intrude on privacy. Blumenthal in particular asked the executives whether independent research had been conducted on the platforms’ impact on young people, and said lawmakers want the companies to provide information on such research soon.
TikTok, testifying before Congress for the first time, received especially fierce criticism during the hearing, particularly from conservative Republican lawmakers who highlighted its Chinese ownership. The company says it stores all TikTok U.S. data in the United States, with a backup facility in Singapore.
“TikTok actually collects less data than many of our peers,” Beckerman said.
Sen. Ted Cruz, R-Texas, told Beckerman that he dodged questions more than any witness he’s ever seen in Congress.