My finger hovered over the mouse as I briefly considered retweeting the “Battle of Snake Island” footage. You may have seen it; you may have retweeted it yourself. It was, apparently, the last moments alive of 13 heroic Ukrainian soldiers, with a Russian ship demanding surrender, and the imperishable Ukrainian response, “Russian warship, go fuck yourself”. The coda: all 13 had immediately been killed by a Russian bombardment.
But while my finger hovered, I did not retweet to my nearly 200,000 followers. Why not? There was something a bit too perfect about the tiny tale of courage and atrocity, and I had no way of knowing whether any of it was true. For all I knew, the photograph was taken in Shetland and the conversation was audio from a Romanian pizza advert.
After a few days it became clear that while the exchange was genuine, the coda was not. The defiant soldiers had been captured alive. The Russians had said so all along; even a stopped clock is right twice a day.
For those of us fortunate enough to be a safe distance from the horrors of war, the Russian invasion of Ukraine is providing a crash-course in how to think both about accidental misinformation and deliberate disinformation.
I draw five lessons.
First, we should recognise that a lot of disinformation is absurdly simple. For many decades, people have fretted about “damned lies and statistics”, fearing that cleverly manipulated data was the ultimate weapon of disinformation. More recently there has been something of a panic about “deepfake” video technology. But it doesn’t take a master of video effects to fool us. For a receptive or distracted audience, a simple lie will do.
A lot of the disinformation that is circulating is kindergarten-level stuff: clips from computer games or relabelled footage. UkraineFacts.org, a collaboration between fact-checking organisations, has hundreds of examples, including video of paratroopers filmed years ago in North Carolina, a photo of a Soviet-era missile taken in a museum, and footage from the movie Deep Impact. The camera may not be lying, but the caption is.
Such “recontextualised media” are ideally suited to social sharing. TikTok’s main function, for example, is to make it easy to edit and share clips of media, stripped of their original context.
Second, we should slow down and pay attention both to the claim and to our reaction. We don’t fall for misinformation because we’re stupid but because we’re emotionally aroused. We can often spot the lie if we think calmly. But if we are angry, fearful, lustful or laughing out loud, calm thinking is what we don’t do.
Third, we have allies in our fight for the truth. There’s a growing movement of diligent independent fact-checkers, and there are also people out there called “journalists” whose job it is to figure out what is going on. Some of them are pretty good at what they do, and some of them are risking their lives right now to do it.
Fourth, we must remember that indiscriminate disbelief is at least as damaging as indiscriminate belief. It might seem smart to reject every claim as potential disinformation, but it is wiser to try to figure out the difference between truth and lies.
Indeed, disinformation is often designed less to con the gullible, and more to force us all into a reflexive crouch, instinctively rejecting the very idea that the truth will ever be known. Few people are fooled by clumsy footage of a fake President Zelensky ordering Ukrainians to surrender, but rather more will go on to reject footage that is perfectly genuine.
The non-profit news organisation ProPublica recently reported the phenomenon of fake fact-checking. Social media posts, amplified by Russian state TV, appear to be fact-checkers debunking Ukrainian disinformation. In reality, they are themselves disinformation, debunking claims that were never made. It’s a more sophisticated version of the UK’s Conservative party briefly rebranding itself on Twitter in 2019 as a fact-checking organisation. The aim, in both cases, is probably not straightforward deception. It is to breed confusion, cynicism and distraction.
Which brings me to lesson five: we mustn’t lose sight of what matters. I’m writing this column about disinformation because I know more about disinformation than Kremlinology or combined-arms warfare. But it is vital not to let a discussion of disinformation distract us from what is happening — an outrageous war, an economic crisis and a humanitarian catastrophe.
While most of us are far from the tanks and the bombs, we are all participating in an information war. The good news is that every one of us has been in training for it all our lives. We have developed a keen sense for bullshit, and filled our cognitive toolboxes with sharp and sturdy tools for thinking clearly.
Every newspaper article you’ve read, every political debate you’ve watched, every conversation you’ve had and every social media post you’ve ever thought about sharing — all of them have prepared you for the challenging yet essential task of taking responsibility for what you read, share, believe and disbelieve.
For most of us, the scarce resources in this information war aren’t years of study or intellectual brilliance. They are softer assets: curiosity, patience, persistence and judgment. It is not too late to bring them to the battle.
Tim Harford’s new book is ‘How to Make the World Add Up’