Week 8 NYT article un-paywalled: YouTube, the Great Radicalizer

YouTube, the Great Radicalizer

By Zeynep Tufekci

At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.

What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with – or to incendiary content in general.

Is this suspicion correct? Good data is hard to come by; Google is loath to share information with independent researchers. But we now have the first inklings of confirmation, thanks in part to a former Google engineer named Guillaume Chaslot.

Mr. Chaslot worked on the recommender algorithm while at YouTube. He grew alarmed at the tactics used to increase the time people spent on the site. Google fired him in 2013, citing his job performance. He maintains the real reason was that he pushed too hard for changes in how the company handles such issues.

The Wall Street Journal conducted an investigation of YouTube content with the help of Mr. Chaslot. It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material. If you searched for information on the flu vaccine, you were recommended anti-vaccination conspiracy videos.

It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content. In the run-up to the 2016 election, Mr. Chaslot created a program to keep track of YouTube’s most recommended videos as well as its patterns of recommendations. He discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended.
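(A quick aside for the technically minded: the article doesn't describe how Mr. Chaslot's program actually worked. A minimal sketch of the general idea, under my own assumptions, might follow the top "up next" suggestion from a seed video for a few hops and tally where the chains land. The recommended_videos() lookup and the toy data below are placeholders, not Chaslot's code and not any real YouTube API.)

```python
# Illustrative sketch only: not Chaslot's actual program, whose details the
# article does not give. recommended_videos() is a placeholder standing in for
# whatever source of "up next" recommendations a real crawler would scrape.
from collections import Counter
from typing import Callable, Dict, List


def follow_recommendations(
    seed_video: str,
    recommended_videos: Callable[[str], List[str]],
    hops: int = 5,
) -> List[str]:
    """Follow the top recommendation from a seed video for a fixed number of hops."""
    chain = [seed_video]
    current = seed_video
    for _ in range(hops):
        recs = recommended_videos(current)
        if not recs:
            break
        current = recs[0]  # take the top "up next" suggestion
        chain.append(current)
    return chain


def tally_endpoints(
    seeds: List[str],
    recommended_videos: Callable[[str], List[str]],
    label: Callable[[str], str],
    hops: int = 5,
) -> Counter:
    """Count which kinds of videos the recommendation chains end up at."""
    counts: Counter = Counter()
    for seed in seeds:
        chain = follow_recommendations(seed, recommended_videos, hops)
        counts[label(chain[-1])] += 1
    return counts


if __name__ == "__main__":
    # Toy data standing in for scraped recommendations (purely hypothetical).
    toy_recs: Dict[str, List[str]] = {
        "clinton_rally": ["mainstream_news"],
        "trump_rally": ["partisan_commentary"],
        "mainstream_news": ["partisan_commentary"],
        "partisan_commentary": ["conspiracy_video"],
        "conspiracy_video": [],
    }
    lookup = lambda vid: toy_recs.get(vid, [])
    label = lambda vid: "conspiratorial" if "conspiracy" in vid else "other"
    print(tally_endpoints(["clinton_rally", "trump_rally"], lookup, label))
```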

Combine this finding with other research showing that during the 2016 campaign, fake news, which tends toward the outrageous, included much more pro-Trump than pro-Clinton content, and YouTube’s tendency toward the incendiary seems evident.

YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Fla., are “crisis actors” masquerading as victims. Jonathan Albright, a researcher at Columbia, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories, including the claim that the 2012 school shooting in Newtown, Conn., was a hoax.

What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.

Human beings have many natural tendencies that need to be vigilantly monitored in the context of modern life. For example, our craving for fat, salt and sugar, which served us well when food was scarce, can lead us astray in an environment in which fat, salt and sugar are all too plentiful and heavily marketed to us. So too our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes and misinformation.

In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.

This situation is especially dangerous given how many people – especially young people – turn to YouTube for information. Google’s cheap and sturdy Chromebook laptops, which now make up more than 50 percent of the pre-college laptop education market in the United States, typically come loaded with ready access to YouTube.

This state of affairs is unacceptable but not inevitable. There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.


thank you Sam! just added this link to the curriculum

I just wanted to highlight this line:
It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm.
That just amused (and saddened) me.

Thank you for sharing this article. The fact that YouTube’s algorithms have a bias toward inflammatory content shouldn’t surprise me, but then WHY is this surprising me? I like to think I’m reasonably smart, can look at media critically…

It’s alarming to realize how little defense I may have against YouTube rabbit holes radicalizing my politics. If that’s true for me, how can I expect others in our civil society to have a defense? Are inflammatory YouTube rabbit holes the reason I feel as though the non-mask-wearing stranger next to me in the store is almost seething with anger and daring me to challenge them?

Are these types of algorithms supposed to drive me toward apathy so I can mentally survive the day-to-day?

I believe it was @shelley who spoke recently about the sensation where it can be hard to know where you begin and where the algorithm ends - how it takes over and manipulates you so you’re no longer just your ‘self’ - you’re buying the shoes, or the conspiracy theory, that is being sold to you.

Like, I know my mom isn’t (wasn’t) really afraid of immigrants “invading” the country, but she’s been talking as if it’s a real, personal concern of hers. Where DOES this come from? I feel like she is slipping into this… alternative Fox News reality. Not to be dramatic, but it almost feels like she’s slipping away from me. I can’t talk to her about so. many. issues.

Just processing all this… thanks for reading 🙂


I think it’s good to retain the ability to feel surprise or shock at just how sinister this all is.

One major criticism I have of this article, and of Zeynep Tufekci’s arguments in general, is that she equates left and right radicalism as both bad/too extreme. To me this is a meaningless position to take, especially when you consider a few things: one, the radical right has absolute dominance on platforms like YouTube, and on platforms with more of a mix of people and opinions, like Facebook, it’s been shown again and again that far-right content is favored. The far right also dominates in how much funding is behind its propaganda.

The other reason it’s meaningless and harmful to equate left and right radicalism is that when you examine what these positions actually are, you find that only one of them is… basically a death cult. For example, the so-called radical left view on responding to coronavirus amounts to something like: we should close everything down for as long as it takes, pay people a living wage from the treasury, everyone should wear a mask to protect their neighbor, and everyone should get Medicare because health care is a human right. The radical right position is that 200,000 dead is not a high number, that the virus is maybe a hoax, etc., etc. I know you are all familiar with it.

But that said, your question also brings up the important point that while we can get mad at the people who get brainwashed by these hideous ideas, when it comes down to it these are not individual issues but systemic ones. Rather than expecting individuals to mount a defense against these horrible ideas, we need to address these problems at the root (the etymology of “radical”).

I think the short answer is “yes,” because even if the algorithm is really just designed to make you buy more products, apathetic people are probably more pliable and manipulable by brands. But I think it’s fair to assign even more sinister intentions to these algos, about even greater types of social control, and we’ll actually be getting into that this week with our speaker Varoon Mathur from AI Now Institute.

Such an important point and so true. Even just taking it from the position of how much of an attention suck all of these platforms are: if I do nothing but doomscroll, who am I?

This is really sad to hear. I have a few family members poisoned by QAnon and “coronavirus was manufactured in a lab” conspiracies, but none as close to me as a parent. I’m so sorry you have to deal with that, and I don’t think you’re being dramatic at all…these conspiracies really do make people slip away.

And I don’t know what happened with your mom personally, but I do know that both YouTube and Fox News share the age-old strategy of stoking white American racisms, blaming Black and Brown folks for the various social and economic failures that are attributable to the American ruling class.

This stinks - I’m sorry Jodi. It reminds me of the segment John Oliver did about conspiracy theories: when something dramatic happens, something that feels big, a simple answer, even if it’s the truth, never feels satisfactory. And because of this malaise, the right has an opportunity to swoop in with an answer, however false, because it’s an answer that makes sense to our lizard brain.

An interesting thought experiment would be to come up with a leftist conspiracy theory that convinces folks to wear masks and take 'Rona seriously. Aerosolized “vaccinations,” a la chemtrails, that only masks can protect you from? Stay home, don’t go to work or the “vaccines” will get you?
