“Violation of Disinformation Policy: Content that makes false claims that widespread fraud, errors, or glitches altered the outcome of the 2020 U.S. presidential election is not permitted on YouTube.”
That’s the message I got from YouTube after it deleted the video of a recent interview I conducted with Peter Wood, an anthropologist and scholar who believes — like tens of millions of Americans — that the 2020 election was stolen from Donald Trump.
During the podcast, which YouTube eventually decided to restore after further appeal and scrutiny, Wood also claimed the Jan. 6 riots were instigated by liberal activists working in collusion with the FBI, a theory trumpeted by Trump and conservative media commentators.
I did not count the 2020 votes myself, nor was I at the Capitol on January 6. However, based on news sources I trust, I am very confident that Joe Biden was the legitimate winner of the election and that the vast majority of the January 6 rioters were Trump supporters determined to disrupt the peaceful transfer of power.
But millions of my fellow citizens are equally confident in the theories espoused by Wood and other conservative influencers.
Most Americans trust in good faith the stories they believe to be true, whatever their accuracy. They believe their side is protecting democracy, while the other side is deliberately working to undermine it. Misinformation thrives in an environment of low trust. And closing this widening “trust chasm” is essential to saving America from a cycle of extreme polarization and political violence.
I am sympathetic to the efforts of tech platforms to restrict content that spreads dangerous misinformation related to the 2020 election, the Capitol riot, and COVID-19 vaccines – especially when it is created and amplified by cynical, bad-faith actors who know they are spreading misinformation. The desire to de-platform perspectives deemed false or detrimental to public health and democracy is generally not the result of mean-spirited authoritarianism; it stems from a sincere concern for justice, truth and the lives of our neighbors. But completely excluding Americans who have fallen prey to false narratives from the conversation will only further erode trust and allow misinformation to mutate and spread. Indeed, limiting conversation to shield people from the consequences of their misconceptions also prevents them from being exposed to the truth.
YouTube’s decision to reinstate my podcast under its exceptions for educational, documentary, scientific or artistic content, which take into account a video’s context as well as its contents, was a sound one. Although the platform lacks the ability to discern in real time the nuances that distinguish a violation from legitimate content (it took over a week to restore the video), it is clear that YouTube is trying to develop policies that weigh nuance, context and intent. Other tech platforms would do well to do the same. In an environment where user trust is low, it is crucial to state policies clearly, explain guidelines, and communicate transparently when a decision has been reversed.
Whether on YouTube or at the kitchen table, it’s critical that we talk with — rather than just about — our fellow Americans and political opponents, including those who have placed their trust in false narratives. Establishing lines of communication and foundations of trust across partisan and epistemological divides will enhance our collective ability to ultimately build fact-based consensus and reduce the power of misinformation.
With the right approach, we can choose engagement over exclusion, dialogue over de-platforming, and empathy over contempt. We can invite one another into the common pursuit of the truth upon which our experiment in ordered liberty, however fragile and imperfect, depends. And perhaps, despite our differing views of what is fact and what is fiction, together we can find transcendent truth in each other’s values, experiences and identities.
Ciaran O’Connor is a leader of Braver Angels, a national nonprofit organization that works to depolarize America.