Disinformation is no longer a marginal phenomenon; it occupies an ever-larger space. It is no longer confined to fringe groups that deny the Holocaust or invent false theories about 9/11: it now reaches every facet of society and spreads through entire populations at a terrifying speed. No subject seems immune to the spread of misleading information. We often see disinformation emerge in the midst of breaking news. After tragic events, such as violent attacks, theories quickly surface about everything from the shooter's identity to the motive behind the act. At moments like these, what happens in the world also happens on YouTube. We are a reflection of reality, but at the same time we can help shape it. That is why preventing the spread of misleading information is one of our most important commitments.
The solution may not be what users imagine: simply getting better at removing content from our platform, in greater volume and at greater speed. Removal has been a focus since YouTube's earliest days, guided by our Community Guidelines. We currently remove nearly 10 million videos per quarter, most of which never reach ten views. Removing inappropriate content quickly will always matter. But we know that is not enough. The real question is how we treat all the content that remains available on YouTube.
The most important thing we can do is raise up the good and reduce the bad. With that in mind, YouTube highlights information from reliable sources and limits the spread of videos containing harmful misinformation. When people search for news or information, they see results optimized for quality, not for how sensational the content is. Below, we explain the reasons behind this strategy.
First, if we focus only on what must be taken down, we can lose sight of the enormous amount of content that people actually watch. Harmful content represents a small fraction of the billions of videos viewed on YouTube: between 0.16% and 0.18% of total views are of content that violates our rules. Our policies focus on removing any video that could directly lead to real-world harm. One example: since February we have taken down more than a million videos containing dangerous information about the coronavirus, such as fake cures or scams that exploit the disease. In the midst of a pandemic, everyone must have access to the best information available to keep themselves and their families safe.
However, identifying clearly harmful content requires solid facts. In the case of COVID-19, we rely on the consensus of experts at health organizations such as the WHO (World Health Organization) and the CDC (Centers for Disease Control and Prevention, in the United States), who track the latest scientific advances and discoveries. In other cases, though, defining misinformation is not so easy. By its nature, misleading information is always changing, and there is often no authoritative source able to say with absolute certainty who is right. As with the aftermath of attacks, conflicting information can come from many sides. Crowdsourced tips have even misidentified culprits and victims, with serious consequences. When there is no certainty, should technology companies be the ones to decide when and where to draw the borders of the nebulous territory of disinformation? I am convinced the answer is "no."
We saw this play out in the days following the 2020 presidential election in the United States. Because there was no official, final determination of the outcome immediately after the polls closed, we decided to allow voices from different sides to remain available. Even so, our systems delivered the most trusted content to the public: in that first week, the most-watched channels and videos covering the election came from credible sources. In early December, when states began to certify official results, we started removing content alleging electoral fraud. Since then, we have taken down thousands of videos that violated our rules on election-related content, and more than 77% of those videos were removed before they reached 100 views.
An overly aggressive removal strategy could also have a chilling effect on freedom of expression. Taking content down is a blunt instrument. Overused, it can signal that divergent views and opinions are not welcome. We have seen a disturbing trend of governments ordering content removed for political reasons. Personally, I believe society works best when there is open debate. Often, one person's misinformation is another person's deeply held belief, including controversial, possibly offensive ideas and, in some cases, even information that would not survive a fact-check. At the same time, our support for an open platform carries an even greater responsibility to connect people with trusted information. We will continue to invest and innovate in our products, aiming for a balance between freedom of expression and access to a wide range of reliable information, always guided by common sense.
Critics say we keep controversial content up because it benefits us financially. In fact, this type of content does not perform well on YouTube, especially compared with popular content such as music or comedy. What's more, misinformation erodes the trust of both the public and advertisers. We devote considerable time and resources to this issue, and in doing so our company wins, and so does the community of creators. In short: being responsible is good for business.
Some people will likely disagree with our strategy, arguing that we should remove more content or leave more of it up. But I stand by the progress our investments have delivered. Our teams work around the clock to improve our systems and advance important initiatives against misinformation, and we will share more news on this soon. I hope this article has shown you a little more about how YouTube approaches the challenge of fighting misleading information.