YouTube violates its own content policies: Mozilla

YouTube’s very own algorithm actively recommended over 71 per cent of all videos that volunteers reported as regrettable.

"YouTube needs to admit their algorithm is designed in a way that harms and misinforms people," said Brandi Geurkink, Mozilla's Senior Manager of Advocacy. (Photo: iStock)

Mozilla has alleged that YouTube keeps pushing harmful videos, and its algorithm recommends videos with misinformation, violent content, hate speech and scams to its over two billion users.

The in-depth study also found that people in non-English speaking countries are far more likely to encounter videos they considered disturbing.

Mozilla conducted the research using RegretsReporter, an open-source browser extension that converted thousands of YouTube users into YouTube watchdogs.

YouTube’s controversial algorithm recommends videos considered disturbing and hateful that often violate the platform’s very own content policies, according to a 10-month-long, crowdsourced investigation released by Mozilla late on Wednesday.

YouTube’s recommendation system drives more than 200 million views a day from its homepage and pulls in more than 80 billion pieces of information, as per media reports.

“We constantly work to improve the experience on YouTube, and over the past year alone, we’ve launched over 30 different changes to reduce recommendations of harmful content,” YouTube said.

Through RegretsReporter, people voluntarily donated their data, giving researchers access to a pool of YouTube’s tightly held recommendation data.

Research volunteers encountered a range of regrettable videos, reporting everything from Covid fear-mongering to political misinformation to wildly inappropriate “children’s” cartoons.

“The non-English speaking world is most affected, with the rate of regrettable videos being 60 per cent higher in countries that do not have English as a primary language,” the findings showed.

Almost 200 videos that YouTube’s algorithm recommended to volunteers have since been removed from YouTube — including several that the platform deemed violated its own policies. These videos had racked up a collective 160 million views before they were removed, said the Mozilla report.

“YouTube needs to admit their algorithm is designed in a way that harms and misinforms people,” said Brandi Geurkink, Mozilla’s Senior Manager of Advocacy.

“Our research confirms that YouTube not only hosts but actively recommends videos that violate its very own policies. We also now know that people in non-English speaking countries are the most likely to bear the brunt of YouTube’s out-of-control recommendation algorithm,” Geurkink emphasised.

Recommended videos were 40 per cent more likely to be regretted than videos volunteers searched for. Several Regrets recommended by YouTube’s algorithm were later taken down for violating the platform’s own community guidelines, the report mentioned.

Last month, Firefox published an analysis of Google’s Federated Learning of Cohorts (FLoC) proposal for targeted ad tracking, with Firefox CTO Eric Rescorla saying the system has major privacy problems that could pose “significant” risks to users.
