The video brought tears to my eyes. A Ukrainian father wept as his wife and daughter boarded a bus to flee the country. His daughter’s hands stretched toward him in a silent plea to leave with the family. His wife’s eyes were swollen and red with tears already spent.
There was only one problem.
It wasn’t actually a Ukrainian father. It was a Russian one. The news agency sharing the story got it all wrong. They probably found the story on Twitter.
Speaking of Twitter, a day before I saw that photo, a friend texted me a screenshot of a tweet by Matt Walsh where he claimed most Americans (including himself) experience pedophilic desires. I was shocked, so I visited his page for more context.
Sure enough, the tweet had gone viral and Matt was responding to it. But not because it was real. He was responding because the image was a photoshopped fabrication.
Two days before that, Elon Musk added me to his “NFT & Crypto” Twitter list. He offered to enter me into a contest to win 15,000 Bitcoin. To enter, I simply needed to tweet at him and five other people on the list. For the next day, my Twitter feed jammed up with posts from other people on the list tweeting at me and Elon.
Unfortunately, none of them won anything. It was a fake Twitter account. The giveaway links were a phishing scam.
In one week, I narrowly avoided one case of misinformation and two cases of disinformation. But rather than feeling pride, I asked myself: How many falsehoods slipped past my radar? Probably more than a few.
God’s ninth commandment—do not bear false witness—is being obliterated by social media (Ex. 20:16). Where untruth takes root, social trust declines. Friends begin doubting friends because we increasingly agree, “I can’t tell what’s true anymore.”
Jesus warned his followers, “I am sending you out as sheep in the midst of wolves, so be wise as serpents and innocent as doves” (Matt. 10:16). In the era of the social internet, this means rejecting the temptation of cynicism while simultaneously growing wise to how falsehood spreads. Currently, two kinds of falsehood thrive on social media: misinformation and disinformation. Understanding both is the first step toward resistance.
Misinformation comes in two forms: (1) unintentional falsehoods shared without malice and (2) falsehoods created by decontextualizing or deceptively reframing true events.
Unintentional misinformation: This often takes the form of heartwarming stories, like the fabricated transcript of Todd Beamer’s call from aboard Flight 93 on 9/11. The “transcript” is certainly moving, so it wasn’t surprising when screenshots of this reconstructed conversation went viral on Christian Twitter last September. The volume of retweets convinced most people (including me) it was authentic.
If it wasn’t true, why did it go viral?
Every social media algorithm is designed to pinpoint successful posts (i.e., posts that attract engagement such as comments, likes, or shares) and put them in front of as many people as possible. This keeps users on the platform longer, which means more ad revenue. Emotionally gripping posts are great for business. Who cares if they’re true?
Decontextualized misinformation: Partisan news sites and influencers often remove the context of a quote or video clip in order to make it fit their narrative. This stokes outrage, which drives up engagement and leads the algorithm to highlight it. We love to assume the worst about our enemies, so few people ever get around to investigating the broader context that makes sense of the ostensibly outrageous story or quote.
Disinformation is a false story, photo, infographic, or quote fabricated by nefarious agents for the purpose of stoking division, mistrust, and hate. China, Iran, and Russia have all proven themselves adept at manipulating social media algorithms to spread viral disinformation in an effort to destabilize American society.
Examples of foreign-made stories are many: ISIS causing a chemical explosion in Louisiana, a deadly phosphorous leak in Ohio, Ukrainian fighter jets shooting down a commercial flight, Alaskans petitioning to secede from the union, and Queen Elizabeth II warning about the imminent onset of World War III in 2017.
Most Americans were unaware of digital disinformation campaigns until the 2016 election, when Russian agents posted an ocean of false stories about both candidates, stories that partisans on each side were all too willing to believe.
Their disinformation playbook—called active measures—was developed well before the internet by the KGB. In an excellent docuseries, the New York Times lays out the KGB’s seven-step program for seeding disinformation:
1. Find a crack: Identify places where a society is most divided and hostile, so you can break them apart.
2. The big lie: Create a lie so enormous that it’s difficult to believe it’s made up.
3. The kernel of truth: Construct the lie around a small bit of truth to lend credibility.
4. Conceal your hand: Make sure that the lie cannot easily be traced to its foreign source.
5. The useful idiot: Identify people whose hostility makes them easy targets for the lie. Use them as mules to spread it.
6. Deny everything: No matter the evidence, deny any involvement in the lie’s fabrication.
7. The long game: Only a few seeds will take root over a long period of time. So sow as many as possible, and nurture the promising ones.
This is exactly how Pizzagate happened. Russians hacked the emails of John Podesta and released them on WikiLeaks. This allowed them to conceal their hand and later deny involvement. Fast-forward several months, and anonymous parties on 4chan began to fixate on emails between Podesta and Comet Ping Pong, a Washington, D.C., pizzeria. The emails were the kernel of truth. The big lie was that they included coded language revealing that the Clintons were running a secret child sex-trafficking ring out of the restaurant. Useful idiots bought the lie and spread it onto mainstream social media platforms. Several months later, a gun-toting Christian showed up at Comet Ping Pong to break children free from a nonexistent basement.
Voilà! Division and mistrust sown. And the rumors about Clinton-run sex-trafficking rings continue to this day.
6 Principles for Resisting Disinformation and Misinformation
Understanding how misinformation and disinformation work is the first step in arming yourself against falsehood. Here are six additional principles for wisely avoiding falsehood online.
1. Corroborate sensational headlines. If a story or quote sounds outlandish, it probably is. Because we’re prone to believe headlines that demonize our enemies, show extra caution when you read takedowns of people you’re prone to dislike. The algorithm knows what you hate, and it’s happy to inflame your anger if it keeps you on the platform for a second longer.
2. Check the source before you click, comment, share, or like. When you engage with a post, you tell the algorithm, “Give me more stuff like this.” Protect yourself by looking at the source of the material before you click. Usually, the source website is visible below the picture. If it’s not a reputable, well-known institution, then google before you engage. This will protect your future feed from algorithm-generated falsehoods.
3. Seeing isn’t believing. Pictures can be doctored. Videos can be stripped of context. Deepfakes are a thing. By dragging and dropping a photo into Google Images, you can often find its original source and context. By searching for a speech on YouTube, you can find a quote’s original context.
4. Don’t believe it just because a Christian said it. Unfortunately, Christians are one of three groups in the United States that foreign powers actively target with disinformation. According to the MIT Technology Review, 19 of the top 20 Christian Facebook pages are run by foreign troll farms. Their strategy—post 95 percent Christian content stolen from other pages and then slip in 5 percent insanity—is effective in making Christians the “useful idiots” spreading disinformation.
5. Read more than headlines. Marketers love clickbait. They put half-truths into headlines just to get you to click. However, many people don’t click and instead operate under the assumption that the headline is entirely true. Even honest headlines rarely sum up the whole story. Make sure to read the whole thing.
6. Admit when you get it wrong. My friend Michael Graham shared the Flight 93 “transcript” I mentioned earlier. Then he did something shocking: he issued a mea culpa. Most people don’t have the honesty or courage to do this. But when we do it, we show that we care more about truth than our reputation. This lends tremendous credibility to our claims about Jesus.
“I can’t trust anything” cynicism is not the answer to the abundance of untruth online. Instead, we must love truth by seeking it out. Christians should be known as those who seek to understand context in the face of decontextualization, who seek truth in the face of disinformation.