Perspective: Tackling Misinformation on YouTube

By Neal Mohan, Chief Product Officer, YouTube

Misinformation has moved from the margins to the mainstream. No longer confined to the sealed-off worlds of Holocaust deniers or 9/11 truthers, it now stretches into every facet of society, sometimes tearing through communities with blistering speed. Seemingly no topic is immune. All too frequently, we’ve seen misinformation spin up in the midst of breaking news. Following tragic events like violent attacks, theories emerge by the second on everything from a shooter’s identity to their motive. In these moments, what happens in the world also happens on YouTube. We reflect the world around us, but we know we can also help shape it. And that’s why we’ve made stopping the spread of misinformation one of our deepest commitments.

The solution might not be what you’re thinking: that we just need to get better at removing more content, more quickly, from our site. We’ve had that focus since the beginning of YouTube, enforced through our Community Guidelines. Today, we remove nearly 10 million videos a quarter, the majority of which never even reach 10 views. Speedy removals will always be important, but we know they’re not nearly enough. Instead, it’s how we treat all the content we’re leaving up on YouTube that gives us the best path forward.

The most important thing we can do is increase the good and decrease the bad. That’s why, at YouTube, we’re elevating information from trusted sources and reducing the spread of videos with harmful misinformation. When people now search for news or information, they get results optimized for quality, not for how sensational the content might be. Here’s why we’ve made this core to our approach:

First, if we focus only on what we remove, we miss the massive amount of content that people actually see. Bad content represents only a tiny fraction of the billions of videos on YouTube: roughly 0.16% to 0.18% of total views turn out to be content that violates our policies. Our removal policies center on videos that can directly lead to egregious real-world harm. For example, since February 2020 we’ve removed over one million videos related to dangerous coronavirus misinformation, like false cures or claims that the pandemic is a hoax. In the midst of a global pandemic, everyone should be armed with absolutely the best information available to keep themselves and their families safe.

But in order to identify clearly bad content, you need a clear set of facts. For COVID, we rely on the expert consensus of health organizations like the CDC and WHO to track the science as it develops. In most other cases, misinformation is less clear-cut. By nature, it evolves constantly and often lacks a primary source to tell us exactly who’s right. In the aftermath of an attack, for example, conflicting information can come from all directions, and crowdsourced tips have even identified the wrong culprits or victims, to devastating effect. In the absence of certainty, should tech companies decide when and where to set boundaries in the murky territory of misinformation? My strong conviction is no.

We saw this play out in the days that followed the 2020 U.S. presidential election. Without an immediate official election certification to point to, we allowed voices from across the spectrum to remain up, but our systems delivered the most trustworthy content to viewers. That first week, the most-watched channels and videos for election coverage came from reliable news outlets. Once states certified their election results in early December, we began removing content with false claims that widespread fraud changed the outcome of any past U.S. presidential election. We’ve since taken down thousands of videos for violating our elections-related policies, with over 77% removed before reaching 100 views.

An overly aggressive approach to removals would also have a chilling effect on free speech. Removals are a blunt instrument, and if used too widely, they can send a message that controversial ideas are unacceptable. We’re seeing disturbing new momentum around governments ordering the takedown of content for political purposes, and I personally believe we’re better off as a society when we can have an open debate. One person’s misinformation is often another person’s deeply held belief, including perspectives that are provocative, potentially offensive, or that may not pass a fact checker’s scrutiny. Yet our support of an open platform means an even greater accountability to connect people with quality information. And we will continue investing in and innovating across all our products to strike a sensible balance between freedom of speech and freedom of reach.

I sometimes get asked whether we leave up hot-button content because we benefit financially. In fact, we’ve found that this type of content doesn’t perform well on YouTube, especially compared to popular content like music and comedy, and that it erodes trust with viewers and advertisers. We’ve committed significant time and money to addressing this, and as we’ve done so, our company, and therefore our creator economy, has benefited. In short, responsibility is good for business.

Some will likely disagree with our approach, asking us either to remove more content or to leave more up, but I’m buoyed by the progress of these early investments. Our teams continue to work around the clock to improve our systems, and we’ll keep building on the foundational work that helps us combat misinformation. I’ll share more detail on that soon, but I hope these perspectives shed light on how we’re thinking about the broader misinformation challenge at YouTube.

Taken from: https://blog.youtube/inside-youtube/tackling-misinfo/