Content moderation won’t stop the spread of disinformation. Here’s why.

Our social discourse is tainted by mis- and disinformation that started long before Facebook and Twitter existed.

Over the past month, as new, more dangerous variants of the novel coronavirus have cropped up in various countries, some social media platforms have ramped up their fight against mis- and disinformation about the disease. Facebook, for instance, consulted with the WHO before expanding its list of false claims about COVID-19; the company announced it will delete posts that contain any of those claims.

There’s no denying that mis- and disinformation are real problems that plague our societies. The former represents untrue information spread without the intent to deceive, while the latter is more insidious: Information that is intentionally circulated to mislead, sow chaos, or indoctrinate. Nearly all of us, at some point in our lives, have accidentally spread misinformation. Most of us have encountered it as well, whether from friends and family or from authorities we were taught to trust.

As a child growing up in the United States, I encountered misinformation at public school regularly, taught as unquestionable “facts”: Columbus discovered America; the United States single-handedly defeated the Nazis; America is the greatest country on earth; colonizers “civilized” the savage natives; Pluto is a planet; marijuana is a gateway drug…and so forth. In most cases, I was taught not to question these “facts.” Some were based on scientific error, but others were intentional. I was presented with a one-sided version of history that aligned with a certain narrative propagated by the country in which I was raised.

Of course, the United States is not alone in brainwashing its youth. In Morocco, where I lived during my early twenties, every schoolchild is taught the same line about Morocco’s colonization of the Western Sahara. Soviet schools taught children to revere Stalin—at least until they didn’t, following Khrushchev’s de-Stalinization campaign that saw Stalin’s image erased from history books. In Germany, where I live now, most friends say they were never taught about the country’s colonial past. And the vast majority of us throughout the world have spent our lives looking at a world map that distorts the size of certain countries.

Schools are not the only institutions that impart misinformation. All over the world, various faith traditions teach different and sometimes competing sets of values and histories. I was raised in a secular household and taught to respect believers, which I do—and yet, I have spent my entire life trying to reconcile the diverse and often conflicting teachings of various religions. Many others, raised in a particular faith, don’t struggle like that; instead, they believe firmly that whatever they were told as children is the ultimate truth. While diversity is part of what makes our world so complex and beautiful, these competing sets of beliefs have also caused countless wars and deaths. And yet, freedom of religious thought is generally upheld as a vital right, despite the fact that it’s simply impossible for all of these ideas to be factually accurate.

The thing is, there is absolute fact and there is the unknowable. There’s a reason why we don’t treat religion as disinformation despite the harms its adherents have caused throughout history: Because we can’t, in fact, know whether the deities in which we put so much faith exist.

What we do know, however, is that some of the information presented as fact by religious traditions has been proven to be scientifically false. And yet, we continue to allow it to propagate for fear of challenging traditions. Some disinformation, it seems, is simply not a priority.

Fact-checking as industry

During the Trump administration and particularly during the pandemic, fact-checking has been emphasized as a key measure in the war against disinformation, with numerous major publications engaging in fact-checking initiatives. The trouble is, many of the same publications that stress the importance of fact-checking and regularly deride social media companies for their failure to act against disinformation all too often engage in misinformation themselves. 

The New York Times infamously threw its considerable support behind the invasion of Iraq in 2003 and played a major role in disseminating the lie about weapons of mass destruction; the paper of record also employs several columnists who frequently propagate falsehoods presented as opinions. There are also numerous publications that report on conflicts in the Middle East through the lens of nationalism, putting an emphasis on U.S. interests over the price paid by civilians on the ground. 

The legacy media outlets, in other words, have played a significant role in creating a public discourse that is tainted by the pervasive belief that there is no such thing as objective truth.

Nor is the World Health Organization unfailingly committed to the truth. As social media platforms scramble to counter new disinformation about COVID-19, some critics have pointed out that the WHO was an early perpetrator of misinformation, telling people not to wear masks for fear that they could create a higher risk of infection. The sociologist Zeynep Tufekci—whose insights have often been a breath of fresh air throughout the past year—noted on Twitter that the WHO and the mainstream media were guilty of propagating falsehoods during the early days of the pandemic.

All of these examples demonstrate that mis- and disinformation are serious problems. And yet, the way certain types of disinformation are prioritized for debunking while others are allowed to flourish (often for nationalistic or propagandistic reasons) should illustrate why our current dialogue around tackling mis- and disinformation—and particularly its emphasis on combating these ills with technology and censorship—is set to fail. As a society, we must become more comfortable with admitting that we don’t always have the answer; this is a project that must start with our youth.

An article in Vice about a new app called Clubhouse illustrates my point well. The subhead of the article reads: “The increasingly popular social media app is allowing conspiracy theories about COVID-19 to spread unchecked.” The article itself is well-reported, noting how falsehoods shared on the audio-based platform by well-known figures spread like wildfire. The piece also gets into the difficulty of moderating speech on an app where that speech is not only audio-based but ephemeral—Clubhouse does not allow conversations to be recorded, meaning that moderation can only happen in real time, an impossible venture at scale.

And yet, a number of the experts quoted in the piece speak of the problem as one to be solved by technology, pointing to the human and artificial intelligence moderation done on other platforms as positive examples, rather than the hopeless games of whack-a-mole that they are.

It’s easy to see why tech companies and media ventures would seek to root out disinformation through moderation measures. It’s also easy to understand why they would try to tackle the “worst of the worst”—that is, the most pressing issues—in this manner. And there are indeed some moderation measures (such as going after repeat offenders, particularly those with power) that are reasonable. And yet, over the past few years I’ve watched countless panel discussions about “tackling” or “fighting” misinformation through technical measures, as if social media were the key battlefield and content moderation the weapon that will win the war.

It is eminently reasonable to fight certain disinformation using short-term means. Although I have concerns about some of the key details of, say, Facebook’s latest measure, I understand the importance of cutting off COVID-19 disinformation amidst far too many deaths and rising infection numbers. But I will not pretend that this is how we’ll solve the root causes of the problem.

If lawmakers are serious about combating disinformation, then they should start looking inside classrooms and churches. They should follow the money trail and look a bit harder at why our democratic systems are failing. And most importantly, they should step away from technosolutionism and stop viewing content moderation as anything but what it is: A stopgap measure.