How Digital Disinformation Is Destroying Society
The virus affects old people far more than young people, is transmitted innocuously from person to person, and is usually noticed too late to prevent its spread. It is a novel threat, poorly understood, and authorities are struggling to catch up and contain it. Efforts to combat it are usually too little, too late, and undermine our faith in the political institutions meant to protect us.
I am, of course, talking about the pandemic of digital disinformation and fake news that is infecting our democracies. The stakes are stark: authorities may be able to do little to prevent the 2020 Presidential election from being stolen, and may not even know if it is.
The coronavirus, which we now take seriously enough to address with radical intervention, has a similar effect: it disrupts our economy and society, and spreads fear and mistrust. Digital disinformation aims to do exactly the same, albeit by design rather than by random mutation.
It is a brutally effective political tool, and its wielders have an interest in keeping it well under the radar. They circumvent the usual flow of information from trusted news sources to voters, creating alternate realities that stoke fear, spread rumor, and relentlessly smear. It works.
It takes place where few can see it: in a Facebook feed visible only to you, in private WhatsApp groups, spreading via Twitter accounts too numerous to rein in.
The key artery of our democratic pact — that voters have the information necessary to make judgments favoring their interests and values — is being severed. James Madison warned in 1822, “A popular Government, without popular information, or the means of acquiring it, is but a Prologue to a Farce or a Tragedy; or, perhaps both.”
That farce, or tragedy, is upon us. In 2017, 28 countries suffered a “social media manipulation campaign” at the hands of a government or political party, according to a study by Oxford University. By 2019, the same researchers found evidence of such campaigns in 70 countries.
I was commissioned to do the digital postmortem on last year’s shock election result in Australia. There, I found that a deliberate disinformation campaign, waged largely on Facebook, had been an unrecognized contributing factor in a result that confounded all observers, including every public opinion poll.
My findings should ring a warning bell for American observers of this year’s election: what you see on TV and in candidates’ public Facebook feeds is not representative of the messages flowing to persuadable voters. In the Australian context, the disinformation ranged from Facebook ads warning about a fake new tax on pickup trucks (using Facebook’s tools to show these ads to pickup truck owners), to doctored images of fake immigration policies on WeChat targeting Chinese-Australian voters.
Study after study shows people are unable to reliably distinguish between fake and real news (older demographics even more so), and that fake news spreads faster and further than real news. In addition, attaching a warning to fake news articles (as Facebook has begun doing in an effort to mitigate their spread) can be counterproductive.
Disinformation in political campaigns is not a new tactic. Even at its founding, American democracy was plagued by it. In the 1800 election, John Adams’ supporters spread the rumor that Thomas Jefferson was actually dead. In the 2000 primary campaign, Karl Rove effectively torpedoed John McCain’s Presidential bid with a telephone poll that asked voters if they would be less likely to vote for McCain “if you knew he had fathered an illegitimate black child?”
What is new is the fracturing of traditional information flows and, in the vacuum left behind, the rise of deliberate disinformation. With digital communications, this disinformation can now spread like a virus, well beyond the disinfecting sunlight of a free press.
Just as many have fixated on the rush to find a vaccine for COVID-19 instead of the immediate social distancing needed to flatten the current exponential curve, the touted cures for digital disinformation have been wildly misplaced.
We can’t put the digital genie back in the bottle, any more than we can win the disinformation wars by flagging and fact-checking articles on Facebook. This is not a failing of cyber defenses; it is a failing of society, rooted in its inequality of knowledge, information, media and web literacy.
This was a vacuum created by decades of inequality in education and a lack of preparedness for the information age. A vacuum the Russians and Chinese were only too keen to fill. A focus on moralizing about them and other bad actors, or on the self-regulation of social media platforms, misses the point: only a pandemic-level mobilization across all sectors can shine light into the dark recesses of the political internet in time to ensure this year’s election is fought in daylight.
Ed Coper is Executive Director of the Center for Impact Communications.