The Federal Trade Commission is in the advanced stages of an investigation into YouTube’s handling of videos aimed at children, according to two people with knowledge of the inquiry.
The investigation, which could result in fines against YouTube, comes after complaints by parents and consumer groups that the video giant had collected the personal data of young users.
The groups also complained that YouTube allowed harmful and adult content to appear in searches for children’s content, said the two people, who were not authorized to speak about the investigation because it was private. In addition, misinformation and inappropriate content appeared in YouTube’s recommendation engines, according to the complaints.
The FTC is pursuing the investigation of YouTube as regulators and lawmakers in Washington are signaling their interest in curbing the power and influence of some of the biggest tech companies. The agency is poised to announce a settlement with Facebook over the social network’s handling of user data and potential violations of a 2011 consent decree with the agency over previous privacy violations.
The House Judiciary Committee announced a broad antitrust investigation into Big Tech earlier this month. And the two top federal antitrust agencies, the Justice Department and the FTC, agreed to divide oversight over Apple, Amazon, Facebook and Google as they explore whether the companies have abused their market power to harm competition and consumers.
The FTC has put more focus on children's privacy. In February, the agency fined the music video-sharing app Musical.ly, now known as TikTok, a record $5.7 million for violating children's privacy law. The FTC said in its settlement that TikTok allowed children under the age of 13 to use the site with little enforcement of its minimum-age requirement.
YouTube has been considering significant changes to its handling of children’s videos, including how its algorithms work with the videos, according to the two people briefed on the talks. The Wall Street Journal earlier reported the internal discussions.
“We consider lots of ideas for improving YouTube, and some remain just that — ideas. Others, we develop and launch, like our restrictions to minors livestreaming or updated hate speech policy,” Andrea Faville, a YouTube spokeswoman, said in a statement.
The Washington Post was the first to report news of the FTC’s investigation of YouTube.
The federal investigation of YouTube is the latest cloud hanging over the video service, which has more than 2 billion users. It has come under criticism for not doing enough to filter inappropriate or dangerous content, while promoting videos that espouse extreme points of view and inconsistently applying its own rules on harassment.
YouTube’s main site and app are intended for viewers 13 and older. The company directs younger children to the YouTube Kids app, which contains a filtered set of videos from the main site.
YouTube’s distinction between its main product and YouTube Kids is significant because of the rules on disclosure and parental consent under the Children’s Online Privacy Protection Act, which kick in for sites with “actual knowledge” that they are trafficking in the personal information of children younger than 13.
But consumer advocacy groups have argued that YouTube, which is owned by Google, is able to collect data on children under 13 through its main site, where cartoons, nursery-rhyme videos and those ever-popular toy-unboxing clips garner millions of views.
Dealing with children’s videos is particularly thorny for YouTube. Children are among the most avid users of YouTube, and videos geared toward them are popular on the platform. However, YouTube has struggled to keep inappropriate content away from children’s videos, in part because of the volume of videos being uploaded to the platform.
In February, YouTube was rocked by a video documenting how pedophiles used the comments on videos of children to guide other predators. After brands announced plans to boycott YouTube, the company said it would disable comments on most videos featuring children under 13.
Earlier this month, The New York Times published an investigation into how YouTube’s automated recommendation system promoted videos of scantily clad children to people who had watched other videos of young children in compromising positions or sexually themed content.
The Campaign for a Commercial-Free Childhood, an advocacy group, said it had filed two complaints — one in 2016 and another in 2018 — with the FTC about YouTube.
In 2016, the group said YouTube and other companies were using so-called influencer marketing to target children. Last year, the group was part of a coalition of children’s advocacy groups that filed a complaint saying YouTube was collecting the personal information of children under 13 without obtaining parental consent.
“Action by the FTC is long overdue,” said David Monahan, campaign manager at the Campaign for a Commercial-Free Childhood. “If action is coming, we hope it comes with the conditions that YouTube clean up its act and stop targeting children.”
With skepticism toward the tech industry growing in Washington, the FTC investigation is likely to be applauded by lawmakers. Sen. Edward J. Markey, D-Mass., a frequent YouTube critic, welcomed the inquiry. In March, Markey and Sen. Josh Hawley, R-Mo., introduced a bill that would extend the children’s online privacy law to cover teenagers up to age 16.
“It is no secret that kids flock to YouTube every day, but the company has yet to take the necessary steps to protect its youngest users,” Markey said in a statement. “I am pleased to see reports that the FTC is working to hold YouTube accountable for its actions.”