
Are YouTube and Facebook doing enough to protect your kids online?

The National Center for Missing and Exploited Children runs an online tip line that receives reports of suspected online child sexual exploitation from technology companies and internet users, and routes those tips to law enforcement. The line received more than 18.4 million tips last year.

Five years ago, it received 1.1 million.

As the numbers grow, child internet protection experts have placed part of the blame on large tech and social media companies for fostering an online environment that’s dangerous for young internet users.

The Senate Judiciary Committee heard testimony on those concerns Tuesday. Sen. Lindsey Graham, the South Carolina Republican who chairs the committee, said he would consider legislation stripping lawsuit protections from tech companies that do not actively try to protect children from online sexual predators or explicit content.

Experts and researchers with Protect Young Eyes, an internet safety group that teaches parents about safe online behavior for children, have witnessed the “troubling and pervasive darkness that exists in the pockets of millions of young people today,” Christopher McKenna, the group’s founder, told the committee.

McKenna said that his group created multiple fake test accounts on Instagram that would “mimic the behavior of an average teen girl.”

“We posted two selfies with three hashtags each, searched a few hashtags, and liked a few photos,” he said. “Within a week we had dozens of men sending us images of their penises, telling us they were horny, and sending us pornography through direct messages.”

McKenna blamed tech companies like Facebook, which owns Instagram, for not better regulating their user bases and for not moving more quickly to bar potential sexual predators from their services.

Facebook officials could not be reached for comment.

“Things would change tomorrow if you could get sued,” said Graham, who added that many large web-based companies had strong protections against lawsuits but they were “not doing a lot to earn it.”

Graham specifically cited Section 230 of the 1996 Communications Decency Act, which generally shields companies like YouTube and Facebook from penalties or lawsuits over content their users post.

He said that he hoped to work with lawmakers to create legislation that established a “set of best business practices” for tech companies to follow to protect young users from explicit content or sexual predators.

Graham didn’t specify what he wanted those best practices to look like, but he said companies that failed to meet the standards should be open to lawsuits from the parents of children who may have been sexually exploited on their platforms.

“If you meet those best business practices you’re okay,” Graham said. “If you don’t, you’re going to get sued. Seems to me that that will do more good than anything else.”

Several lawmakers also took aim at YouTube, after news organizations reported that the company’s automated video recommendation system, an algorithm that suggests new videos to users based on their viewing history, was allowing sexual predators to share and access hundreds of videos of young, partially clothed children.

Sen. Richard Blumenthal of Connecticut, the only Democrat at the hearing, said that he and Sen. Marsha Blackburn, a Tennessee Republican, had asked YouTube to change its policy after the news broke, and that he was “frankly disappointed” the video-sharing platform had yet to announce changes.

Legislation introduced by Sen. Josh Hawley, a Missouri Republican, would fine YouTube if the platform did not stop automatically recommending videos that primarily feature children.

“This report was sickening,” Hawley said of a researcher’s report on the YouTube video recommendation algorithm. “But I think what was more sickening was YouTube’s refusal to do anything about it.”

YouTube has responded by changing its recommendation system and removing some videos that were sexually exploitative of children.

“Protecting kids is at the top of our list,” Jennifer O’Connor, YouTube’s product director for trust and safety, told the New York Times last month.

Graham said during the hearing that he would try to bring representatives of major social media companies before the committee in the near future.

Asked why the social media companies were not invited to Tuesday’s hearing, Taylor Reidy, a spokesperson for the committee’s Republican majority, said the hearing was just the beginning of its “discussion regarding protecting children online.”
