

How To Recognize Falsehoods
In June 2021, televangelist Jim Bakker's ministry paid $156,000 in restitution and accepted a permanent ban on disease treatment claims. The product: "Silver Sol," promoted on his show for $80 to $125 as COVID-19 prevention. The pitch reached millions before regulators acted.[1]
Why did it work? And why did it spread so far before anyone stopped it?
Understanding both questions is the difference between being a target and seeing clearly.
The fraud succeeded because it started with truth. Silver does have antimicrobial properties; silver-containing wound dressings are FDA-approved for burns. Silver ions do disrupt bacterial cell membranes; this is established science. The guest on Bakker's show claimed the government had proven silver "has the ability to kill every pathogen it has ever been tested on, including SARS and HIV." The leap from "antimicrobial properties exist" to "drink this to prevent COVID" happened after the audience had already agreed twice.
The most effective falsehoods don't announce themselves. They build credibility through accurate statements, then leverage that trust toward conclusions that don't follow. By the time the logic jumps, you've been nodding along. True premises smuggling false conclusions past your defenses. That's the Trojan Horse, and the comfortable feeling of agreement is exactly what it exploits.
The mechanism powers health fraud, financial scams, and political radicalization alike. The specific content varies; the structure doesn't. Accurate statements first. False conclusion last. Your prior agreement smoothing the path.
The defense is counterintuitive: treat agreement as a warning signal. When content confirms what you already believe, that's precisely when to examine whether the conclusions actually follow from the premises.
But knowing why the scam worked doesn't explain why it spread. That's a different problem.
The information environment you navigate daily wasn't designed to inform you. It was designed to engage you. Those aren't the same thing.
In 2018, MIT researchers analyzed 126,000 stories shared on Twitter over eleven years. False news reached 1,500 people about six times faster than accurate news.[2] The researchers controlled for bots. The disparity remained. Humans preferentially spread falsehoods. The study found that false stories triggered fear, disgust, and surprise: the emotions that drive sharing.
The Engagement Engine doesn't care what's true. It cares what spreads. Internal Facebook documents revealed in 2021 showed the company knew its recommendation systems drove users toward extremist content. Their own researchers wrote:[3]
"Our recommendation systems grow the problem."
Employees warned that reshares were the main vector for misinformation. The warnings were ignored. Engagement drives revenue, and outrage engages.
So a scam like Silver Sol gets two accelerants: a persuasion structure that exploits how agreement works, and a distribution system that rewards emotional intensity over accuracy. The Trojan Horse is ancient. The Engagement Engine is engineered.
There's a third pattern worth recognizing. The line between reporting and commentary has dissolved, and this happened for economic reasons.
A 2013 Pew Research Center study found cable news overall was 63% opinion content versus 37% straight reporting. MSNBC ran 85% opinion, Fox News 55%, CNN 46%.[4] Pew identified the cause:
"A good deal of on-scene reporting has been replaced with interviews, which, although they may be live, are far less expensive to produce."
Talk shows need no foreign bureaus, no investigative teams, no travel budgets. Opinion is cheap. Reporting is expensive.
The result is News Theater. Viewers encounter content that looks like news (professional sets, authoritative delivery, network branding) but functions as commentary. The packaging says "information." The content says "here's what to think."
Compare two statements about the same event:
"The bill passed 52-48 along party lines after six hours of debate."
"The vote revealed the true priorities of senators who claim to support working families."
The first gives you information to evaluate. The second gives you a conclusion to adopt. Both might appear in the same broadcast, same article, same feed, often without clear demarcation. When consuming news, ask whether you're being told what happened or what to think about what happened. Both have value. Knowing which you're consuming changes how you should process it.
Beyond recognizing these patterns, a few verification habits help.
Credible sources show their work. They explain methodology, not just conclusions. They acknowledge limitations rather than projecting false confidence. They correct errors publicly, treating mistakes as opportunities to demonstrate reliability. A source that never acknowledges being wrong is telling you something important about its relationship to accuracy.
Repetition across outlets means nothing if they're echoing the same original source. True verification requires ideologically diverse sources with independent reporting. When outlets that disagree about interpretation still converge on facts, those facts are probably solid. This applies to expert opinion too. Expertise is domain-specific; a Nobel physicist isn't an authority on epidemiology. Consensus matters more than any single credential.
Information degrades through intermediaries. "A study found that..." invites questions: Which study? Who funded it? What did it actually conclude versus what does the headline claim? Original documents, complete transcripts, and raw footage beat summaries. Each layer of processing introduces potential distortion.
Numbers create an illusion of precision that may not exist in the underlying data. Key questions: What was the sample size? Who gathered this and why? What methodology? Watch for framing games. "Risk doubled" sounds alarming until you learn it went from 1 in 10,000 to 2 in 10,000. Correlation presented as causation remains the most common statistical sleight-of-hand.
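To see the framing game in numbers, here is a minimal sketch in Python, using the hypothetical 1-in-10,000 figures from the example above, that computes both framings from the same underlying data:

```python
# Relative vs. absolute risk: the same data, two framings.
baseline_risk = 1 / 10_000   # risk without exposure: 0.01%
exposed_risk = 2 / 10_000    # risk with exposure: 0.02%

relative_change = exposed_risk / baseline_risk   # 2.0 -> "risk doubled!"
absolute_change = exposed_risk - baseline_risk   # 0.0001 -> one extra case per 10,000

print(f"Relative framing: risk multiplied by {relative_change:.1f}")
print(f"Absolute framing: +{absolute_change * 100:.2f} percentage points, "
      f"about 1 extra case per {round(1 / absolute_change):,} people")
```

Both outputs describe the same data; the choice between them is a choice about how alarmed the reader should feel.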
Images carry implicit authority. "Seeing is believing" makes visual misinformation effective. In March 2018, images circulated showing Parkland survivor Emma González appearing to tear up the U.S. Constitution. The original Teen Vogue photo showed her ripping a shooting-range target; the Constitution was digitally added. Despite rapid debunking, the doctored image drew thousands of shares.[5] Reverse image search can reveal whether an image predates the event it supposedly depicts. Check for crops and edits that remove context. Video context matters too: when was it filmed, where, by whom?
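Metadata checks complement reverse image search. Here is a minimal sketch, assuming Python with the Pillow library installed and a hypothetical filename, that prints an image's embedded EXIF tags. A capture date that contradicts the claimed event is a red flag, though EXIF data is easy to strip or forge and proves nothing on its own:

```python
# Minimal EXIF inspection sketch using Pillow (pip install Pillow).
# EXIF data is easy to strip or forge; treat it as one signal, not proof.
from PIL import Image, ExifTags

def print_exif(path: str) -> None:
    image = Image.open(path)
    exif = image.getexif()
    if not exif:
        print("No EXIF metadata found (common for re-shared or scrubbed images).")
        return
    for tag_id, value in exif.items():
        # Map numeric tag IDs to readable names like "DateTime" or "Model".
        name = ExifTags.TAGS.get(tag_id, str(tag_id))
        print(f"{name}: {value}")

print_exif("suspect_photo.jpg")  # hypothetical filename
```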
Strong emotional response is a signal for increased scrutiny, not decreased. Content engineered to provoke outrage or vindication is optimized for sharing, not accuracy. The more a piece makes you feel righteous without requiring you to think, the more likely someone has done your thinking for you.
This isn't just about protecting yourself from being fooled. Every time you share something false, you become part of the distribution network. Every time you recognize and decline to share something false, you break a link in the chain.
The asymmetry makes prevention essential: filling an empty mind with falsehoods requires far less effort than correcting a mind already filled with misinformation. Once a false belief integrates into someone's worldview, dislodging it threatens their sense of self. People double down when confronted with contradictory evidence.
But there's a deeper point. The same skills that protect you from misinformation help you see systems clearly, recognizing when institutions work as designed versus when they malfunction, evaluating whether proposed solutions address root causes or just shift problems elsewhere.
The Trojan Horse, the Engagement Engine, News Theater: these aren't going away. But neither are the tools to see through them. The battle for accurate information won't be won by institutional gatekeepers alone. It requires distributed verification, people applying these skills independently, each functioning as a node in a network of truth-seeking. That network is still emerging, imperfectly, through the choices of individuals who refuse to be passive consumers of whatever appears in their feeds.
These skills are the prerequisite. What gets built with them comes next.

