YouTube’s Algorithm Pushes Hateful Content and Misinformation


According to a Mozilla Foundation report published July 7, 2021, violent videos and misinformation are amplified by YouTube’s algorithm, despite the company’s rules designed to limit their spread. The flagged videos included conspiracies about 9/11 and the coronavirus pandemic, as well as content promoting white supremacy.

“YouTube’s algorithm is working in amplifying some really harmful content…” said Brandi Geurkink, the Foundation’s senior manager of advocacy. “…its recommendation algorithm is not even functioning…it’s actually going off the rails.”

A YouTube spokesperson said the company “constantly” works to improve users’ experience and has launched 30 different changes in the last year to reduce recommendations of harmful content. YouTube is owned by Google and has long resisted sharing information about its algorithms…

But a growing body of evidence implicates social media recommendation algorithms in the spread of misinformation and violent content…and these studies have convinced lawmakers to draft new rules to pry open tech platforms’ opaque algorithms…

YouTube is the most visited website in the world after Google, with users watching around one billion hours of video on the platform every day. Mozilla’s investigation enabled more than 37,000 users from 91 countries to report “regrettable recommendations”…

Read the full July 2021 Mozilla Foundation article on the Politico website.
