Meta, TikTok and other social media CEOs testify in heated Senate hearing on child exploitation


Sexual predators. Addictive features. Suicide and eating disorders. Unrealistic beauty standards. Bullying. These are just some of the issues young people are dealing with on social media, and children's advocates and lawmakers say companies are not doing enough to protect them.

On Wednesday, the CEOs of Meta, TikTok, X and other social media companies went before the Senate Judiciary Committee to testify as lawmakers and parents grow increasingly concerned about the effects of social media on young people's lives.


The hearing began with recorded testimony from kids and parents who said they or their children were exploited on social media. Throughout the hours-long event, parents who lost children to suicide silently held up pictures of their dead kids.

"They're responsible for many of the dangers our children face online," U.S. Senate Majority Whip Dick Durbin, who chairs the committee, said in opening remarks. "Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk."

In a heated question and answer session with Mark Zuckerberg, Republican Missouri Sen. Josh Hawley asked the Meta CEO if he has personally compensated any of the victims and their families for what they have been through.

"I don't think so," Zuckerberg replied.

"There's families of victims here," Hawley said. "Would you like to apologize to them?"

As parents rose and held up their children's pictures, Zuckerberg turned to face them and apologized for what they have been through.

Video below: Meta CEO Mark Zuckerberg apologizes to families during Senate hearing on child exploitation

Hawley continued to press Zuckerberg, asking if he'd take personal responsibility for the harms his company has caused. Zuckerberg stayed on message and repeated that Meta's job is to "build industry-leading tools" and empower parents.

"To make money," Hawley cut in.

South Carolina Sen. Lindsey Graham, the top Republican on the Judiciary panel, echoed Durbin's sentiments and said he's prepared to work with Democrats to solve the issue.

"After years of working on this issue with you and others, I've come to conclude the following: social media companies as they're currently designed and operate are dangerous products," Graham said.

He told the executives their platforms have enriched lives but that it is time to deal with "the dark side."

Beginning with Discord's Jason Citron, the executives touted existing safety tools on their platforms and the work they've done with nonprofits and law enforcement to protect minors.

Snapchat broke ranks ahead of the hearing by backing a federal bill that would create legal liability for apps and social platforms that recommend harmful content to minors. Snap CEO Evan Spiegel reiterated the company's support on Wednesday and asked the industry to back the bill.

TikTok CEO Shou Zi Chew said TikTok is vigilant about enforcing its policy barring children under 13 from using the app. X CEO Linda Yaccarino said the platform, formerly Twitter, doesn't cater to children.

"We do not have a line of business dedicated to children," Yaccarino said. She said the company will also support the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue tech companies.

Yet child health advocates say social media companies have failed repeatedly to protect minors.

"When you're faced with really important safety and privacy decisions, the revenue in the bottom line should not be the first factor that these companies are considering," said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media. "These companies have had opportunities to do this before; they failed to do that. So independent regulation needs to step in."

Republican and Democratic senators came together in a rare show of agreement throughout the hearing, though it's not yet clear if this will be enough to pass legislation such as the Kids Online Safety Act, proposed in 2022 by Sen. Richard Blumenthal of Connecticut and Sen. Marsha Blackburn of Tennessee.

Meta is being sued by dozens of states that say it deliberately designs features on Instagram and Facebook that addict children to its platforms and has failed to protect them from online predators.

New internal emails between Meta executives released by Blumenthal's office show Nick Clegg, president of global affairs, and others asking Zuckerberg to hire more people to strengthen "wellbeing across the company" as concerns grew about effects on youth mental health.

"From a policy perspective, this work has become increasingly urgent over recent months. Politicians in the U.S., U.K., E.U. and Australia are publicly and privately expressing concerns about the impact of our products on young people's mental health," Clegg wrote in an August 2021 email.

The emails released by Blumenthal's office don't appear to include a response, if there was any, from Zuckerberg. In September 2021, The Wall Street Journal released the Facebook Files, its report based on internal documents from whistleblower Frances Haugen, who later testified before the Senate.

Meta has beefed up its child safety features in recent weeks, announcing earlier this month that it will start hiding inappropriate content from teenagers' accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders. It also restricted minors' ability to receive messages from anyone they don't follow or aren't connected to on Instagram and on Messenger, and added new "nudges" to try to discourage teens from browsing Instagram videos or messages late at night. The nudges encourage kids to close the app, though they do not force them to do so.

Video above: The good, the bad and the ugly: Facebook's 20th anniversary

But child safety advocates say the companies' actions have fallen short.

"Looking back at each time there has been a Facebook or Instagram scandal in the last few years, they run the same playbook. Meta cherry picks their statistics and talks about features that don't address the harms in question," said Arturo Béjar, a former engineering director at the social media giant known for his expertise in curbing online harassment, who recently testified before Congress about child safety on Meta's platforms.

Google's YouTube is notably missing from the list of companies called before the Senate on Wednesday, even though more kids use YouTube than any other platform, according to the Pew Research Center. Pew found that 93% of U.S. teens use YouTube, with TikTok a distant second at 63%.