I’ve noticed a strange psychological phenomenon. I’ve seen it crop up in myself, and I’ve seen it in others. You have, too—perhaps you’ve even experienced it yourself.
I have in mind a phenomenon that occurs when someone’s views in a contested space are shown to be correct—whether by empirical data, overwhelming consensus, or some other means. That is, I’m talking about something that happens when someone’s views are truly vindicated, often multiple times, and particularly when that vindication comes in spite of scoffers and impassioned adversaries.
What tends to occur is something that might best be described as “muscle memory” or “momentum” toward being right.
Let me explain.
In epistemology, the problem of induction refers to the challenge of justifying predictions about the future based on patterns from the past. For example, we believe we’re justified in thinking the sun will rise tomorrow, but when we try to explain why, things get tricky. We initially reason that tomorrow will resemble today because today resembled yesterday. But of course, that simply assumes the very thing we’re trying to justify: that the future will follow the past.
Gallons of ink have been spilled trying to explain why we are justified in believing the future will resemble the past without falling into circularity. I won’t bore you with proposed solutions.
What’s important here is that there is an analogous (though distinct) problem of induction that rears its head when we enter into debate and dialogue.
Perhaps initially someone wades into controversial conversations a bit timidly, wielding a quivering sword (or shield). They’ve quietly held their convictions but have now decided it’s time to speak up. In doing so, they encounter strong opposition, which may lead them to question whether they’ve gotten things right or if they truly know as much as they thought.
But they stick to their guns. Eventually, their claims are validated in one way or another, and a powerful—and natural—feeling of vindication washes over them. So far, so good.
Curiously, however, this psychological effect of vindication does not remain an artifact of the past. Rather, it lingers—even in the absence of a particular target.
Something like the following begins to happen, and it gains strength with each new vindication:
“I was criticized last time for my view of X, and I was right. My defense or critique of Y will probably play out the same way. I won’t make the mistake of doubting myself this time. Time to stick to my guns and await vindication.”
In other words, people who have been right—and it honestly doesn’t take many instances—develop a kind of snowballing, psychological “muscle memory” for being right, often to the point of quietly saying to themselves:
“Remember all those times I’ve been right in contested waters? When the next issue comes along, I’ll probably be right about that, too.”
They come to believe that their pattern of past “victories” will follow them, even into areas unrelated to their earlier vindications.
It’s a kind of momentum—an inertia, if you will.
Winning teams in football expect to win on the field. The “winners” I’m describing expect to be right whenever they develop a strong opinion. After all, that’s what they’ve done in the past—sometimes while being ridiculed by experts and the highly educated. They’ve demonstrated to themselves that they have an edge in truth-perception. Why wouldn’t they stick to their guns?
There’s a lot we could unpack here about the problem of “argumentative induction.” For what it’s worth, I do believe there are principled reasons to think that one’s pattern of grasping truth might (imperfectly) continue into the future—say, due to training in research or critical thinking that provides a skillset for approaching problems more generally.
But my primary concern is this: people who develop muscle memory for vindication often become the least teachable, most uncorrectable, and most stubborn individuals in society—and in the church.
Achilles always sneered at the next man up—so do they.
Whatever sense of exploration characterized their initial inquiries is now gone. Anyone who disagrees becomes someone to either educate or eventually dismiss—“agree to disagree” at best.
They barely entertain the possibility that they could be wrong. And if they are, it won’t be you who points it out to them, but someone they believe has cornered the market on truth even more than they have.
To be clear—I’m not talking about people who stick to their guns because they genuinely know more than most in a particular area (or even several). I’m talking about people whose confidence comes from a sense of momentum established by being right in the past.
The reasoning goes: the next time will resemble the last time, because the last time resembled the time before that—hence the snowball effect.
This is a deeply concerning phenomenon for at least two reasons.
1. It threatens our humility.
If we’re not extraordinarily careful, muscle memory for being right will destroy our humility, and we will be at risk of developing a superiority complex that borders on the embarrassing.
If our resilience in argument is rooted in a pattern of being right across unrelated subjects—not in our ability to actually argue well about the issue at hand—we will struggle to feel the need to truly listen. We don’t need to hear others’ arguments—except, perhaps, to help them understand things better. All we need to know is that it’s us versus them. And “them” looks a lot like those we’ve already triumphed over.
More concretely:
If someone was right about the (very bad) public mismanagement of COVID-19 and was called an idiot, and they were right about the harms of transgender ideology and called a bigot, it is incredibly unlikely that they’ll approach immigration debates with tentativeness once they form an opinion.
But how does being right about pandemic policy and gender ideology translate into confidence on immigration?
Answer: The muscle memory for being right—and the momentum it builds.
Once more, I’m not talking about true polymaths who deep-dive every topic of interest. In a world where reading three significant books on a topic can put you in the top 5% of knowledge on that subject, some people genuinely become very well informed across multiple disciplines. After all, in 2016 the median American read only four books in the entire year—many of which were fiction. That figure hasn’t changed much since. While expertise requires years of high-level study, comparative competence does not, particularly when that competence is informed by said experts.
Even so, most of us are not time-rich, hyper-disciplined, exceptionally talented polymaths. And thus, in many cases, we’re not being driven by mastery—we’re being pushed forward by momentum.
We want to believe our confidence is rooted in a stable foundation: superior cognitive ability, better research habits, deeper discernment, and so on. And sometimes that may be true.
But that very yearning—even when justified—often betrays our vanity and pride.
Bottom line: Even if we happened to be smarter or more well-read than all our interlocutors, our muscle memory for being right should never determine our confidence. Only our ability to rigorously defend each position on its own terms can play that role.
Failure here will result in us becoming prideful people driven more by being right than by getting it right.
2. It makes us unbearable to others.
If we become known as people carried along by the momentum of past vindication, others will ignore us.
They may do so because they don’t want to spar with someone who’s always waiting to coach them up. Or they may give up even when we’re badly wrong, simply because they don’t have the emotional energy to try to convince us we’re mistaken.
It just won’t be worth it.
Being left to ourselves to shout in a vacuum or throw red meat to our yes-men is the wages of being insufferably stubborn and prideful. We should seek to avoid this at all costs for the sake of our own souls and our own worldviews.
So what should we do instead?
When we feel momentum from past vindication fueling our confidence, pause and recalibrate. Focus on the issue at hand.
Refuse to believe we’ve ever “arrived” at the perfect view. Stay open to correction. Ask good questions.
In humility, acknowledge that we all have blind spots—even if we don’t know where they are.
Only speak with real confidence when we’ve:
Read at least three substantial books or peer-reviewed articles on a topic, and
Had in-person conversations with thoughtful dialogue partners, ideally some of whom know the subject better than we do.
Don’t mistake real, comparative competence for genuine expertise—they are not even in the same ballpark.
If we adopt this posture and framework (or something like it), we’ll have everything to gain—and only a dubious, psychological form of justification to lose.

