
We like to think of ourselves as rational beings who make decisions based on facts and evidence. We pride ourselves on our ability to think logically, weigh options carefully, and arrive at reasonable conclusions. Yet every day, millions of people share misinformation on social media, cling to beliefs despite contradictory evidence, and make choices that seem to defy logic. Welcome to the fascinating world of human irrationality—where our minds play tricks on us in ways that would make a magician jealous.
This disconnect between how we think we think and how we actually think has never been more consequential. In an age of information overload, social media echo chambers, and deliberate disinformation campaigns, understanding our cognitive limitations isn’t just intellectually interesting—it’s essential for navigating modern life.
Traditional economics has long operated on the assumption that humans are perfectly rational actors—homo economicus—who process information objectively, calculate probabilities accurately, and make optimal decisions to maximize their utility. This neat theoretical framework makes for elegant mathematical models but fails spectacularly when it comes to predicting actual human behavior.
Behavioral economics tells a dramatically different story. We’re not computers running on pure logic; we’re humans, complete with cognitive shortcuts, emotional baggage, and mental blind spots that profoundly shape how we perceive truth. Our brains didn’t evolve to handle the complex information environment of the 21st century; they evolved to make quick survival decisions on the African savanna over hundreds of thousands of years.
Dan Ariely, a pioneer in behavioral economics and professor at Duke University, has spent decades studying these quirks of human nature. Through hundreds of experiments, his research reveals that we’re “predictably irrational”—our mistakes aren’t random but follow consistent, measurable patterns (Ariely, 2008). This predictability is both troubling and hopeful: troubling because it means we’re systematically prone to error, but hopeful because understanding these patterns can help us design better systems and make better choices.
“We’re not just bad at being rational,” Ariely observes. “We’re skilled at being irrational in ways that serve our emotional needs, even when those needs conflict with our stated goals or values” (Ariely, 2012).
To understand why we struggle with truth, we need to examine the mental machinery that processes information. Our brains are essentially prediction machines, constantly making assumptions about the world based on incomplete information. These mental shortcuts—called heuristics—are incredibly useful for making quick decisions but can lead us astray when accuracy matters more than speed.
Consider how we form beliefs. Rather than carefully weighing evidence like impartial judges, we often start with an emotional or intuitive reaction and then work backward to find supporting evidence. This process, called motivated reasoning, helps explain why intelligent, well-educated people can hold wildly different views about the same set of facts.
The problem is compounded by our social nature. Humans are tribal creatures who derive security and identity from group membership. Our beliefs aren’t just intellectual positions; they’re signals about which tribe we belong to. Changing your mind isn’t just about updating your information—it can feel like betraying your community and losing your identity.
Our brains come equipped with dozens of cognitive biases—systematic errors in thinking that affect our decisions and judgments. These biases served our ancestors well in simpler environments but often lead us astray in modern life. Here are some key players in our irrational relationship with truth:
Confirmation Bias: Perhaps the most famous cognitive bias, confirmation bias describes our tendency to seek information that confirms what we already believe while ignoring contradictory evidence (Wason, 1960; Nickerson, 1998). It’s like having a personal yes-man in your head who only tells you what you want to hear. In the digital age, algorithms exploit this bias by showing us content similar to what we’ve previously engaged with, creating information bubbles that reinforce our existing views (Pariser, 2011).
Motivated Reasoning: We work backward from our desired conclusion, cherry-picking facts that support our position while dismissing inconvenient evidence. This isn’t conscious deception—we genuinely believe we’re being objective while our subconscious minds filter information to protect our preferred beliefs.
The Backfire Effect: Sometimes, when presented with facts that contradict our deeply held beliefs, we actually become more entrenched in our original position (Nyhan & Reifler, 2010). It’s as if our minds have an immune system that rejects inconvenient truths. While recent research suggests this effect may be less universal than originally thought (Wood & Porter, 2019), it still occurs, particularly with beliefs tied to our identity or values.
Availability Heuristic: We judge the likelihood of events based on how easily we can remember examples (Tversky & Kahneman, 1973). This is why people overestimate the danger of plane crashes (which receive extensive media coverage) while underestimating more common risks like car accidents or heart disease. In the age of social media, vivid anecdotes and dramatic stories often carry more psychological weight than dry statistics.
Anchoring Bias: The first piece of information we encounter about a topic serves as an “anchor” that influences all subsequent judgments (Tversky & Kahneman, 1974). Even when we know this initial information is irrelevant or unreliable, it continues to shape our thinking.
Social Proof: We look to others to determine what’s true or appropriate, especially in uncertain situations (Cialdini, 2006). This can create cascades where incorrect information spreads rapidly because each person assumes others have verified its accuracy.
Our beliefs don’t just reflect reality—they actively shape it. This is particularly dangerous in the age of social media, where algorithms create echo chambers that reinforce our existing views. We end up living in parallel universes of information, each with its own version of “truth.”
The psychology behind this is deceptively simple: beliefs aren’t just intellectual positions; they’re part of our identity. When someone challenges our beliefs, our brains respond much as they would to a physical threat. Brain imaging studies suggest that challenging a person’s political beliefs activates neural regions associated with threat and negative emotion.
This explains why rational arguments often fail to change minds. You’re not just asking someone to update their information; you’re asking them to question their identity, potentially alienate their social group, and admit they’ve been wrong. The psychological cost can be enormous (Kahan et al., 2012).
Disinformation thrives in this environment of cognitive bias and tribal thinking. Bad actors understand our psychological vulnerabilities and exploit them deliberately. They craft messages that appeal to our emotions, confirm our existing beliefs, and spread faster than fact-checkers can respond.
Modern disinformation campaigns don’t just spread false information—they undermine our collective ability to distinguish between true and false. By flooding the information environment with contradictory claims, conspiracy theories, and partisan interpretations, they create what researchers call “truth decay”—a condition where facts become negotiable and reality splits into competing narratives (Kavanagh & Rich, 2018).
Social media platforms amplify these problems through their attention-based business models. Outrageous, emotionally charged content generates more engagement than nuanced, factual reporting. The result is an information ecosystem that rewards sensationalism over accuracy, passion over precision.
Dan Ariely’s research on dishonesty reveals another troubling dimension: we’re remarkably good at rationalizing unethical behavior, including the spread of false information. We tell ourselves we’re sharing important truths, fighting against bias, or protecting vulnerable people—all while propagating content we haven’t verified.
Perhaps most unsettling is our capacity for self-deception. We don’t just fool others; we fool ourselves with remarkable consistency. Ariely’s experiments suggest that people can cheat in small ways they are at some level aware of while still genuinely believing they are acting ethically.
This self-deception serves a psychological function: it allows us to maintain a positive self-image while pursuing our interests. We’re not lying; we’re interpreting events in the most favorable light. We’re not being biased; we’re seeing truths that others miss. This internal narrative protects our ego but distorts our relationship with objective reality.
The implications for critical thinking are profound. It’s not enough to be smart or well-educated—intelligence can actually make us better at rationalizing poor reasoning. Highly intelligent people are often more susceptible to certain biases because they’re better at constructing sophisticated justifications for their beliefs (Klayman & Ha, 1987; Stanovich & West, 2007).
The good news is that awareness is the first step toward improvement. While we can’t eliminate our cognitive biases—they’re hardwired features, not bugs—we can learn to work with them more effectively. Here’s how we can strengthen our relationship with truth:
Embrace Intellectual Humility: Accept that you might be wrong about important things. The smartest people in history have been spectacularly incorrect about basic facts. Galileo was wrong about tides. Darwin was wrong about heredity. Einstein was wrong about quantum mechanics. You’re probably not the exception to this pattern of human fallibility.
Seek Disconfirming Evidence: Actively look for information that challenges your views. Set up Google alerts for opposing perspectives. Follow thoughtful people who disagree with you. Join forums where your ideas will be challenged respectfully. It’s uncomfortable but necessary for intellectual growth.
Consider the Source: Who benefits if you believe this information? What are their motivations? Are they selling something? Seeking attention? Advancing a political agenda? This doesn’t automatically make their claims false, but it should make you more skeptical.
Check Your Emotional Response: If a piece of information makes you feel strongly validated or outraged, be extra skeptical. Your emotions might be clouding your judgment. The most dangerous misinformation often feels emotionally satisfying—it confirms our fears about our enemies or our hopes about our allies.
Use Structured Decision-Making: Create systems that force you to consider multiple perspectives before reaching conclusions. Keep a decision journal. Use devil’s advocate processes. Establish cooling-off periods for important choices. These formal structures can help counteract our natural biases.
Practice Probabilistic Thinking: Instead of thinking in terms of absolute certainty, think in probabilities. How confident are you in this belief? What would change your mind? What’s the strongest argument against your position? This mental habit makes it easier to update your views when new evidence emerges.
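To make this concrete, here is a minimal worked example of belief updating using Bayes’ rule; the numbers are purely illustrative assumptions, not results from any study cited here:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}$$

Suppose you start 70% confident that a claim is true, so P(H) = 0.7. You then encounter evidence that would appear only 30% of the time if the claim were true, but 90% of the time if it were false. Plugging in:

P(H | E) = (0.3 × 0.7) / (0.3 × 0.7 + 0.9 × 0.3) = 0.21 / 0.48 ≈ 0.44

Your confidence should drop from 70% to roughly 44%: a substantial revision, but not a flip to certainty in the other direction. The habit worth building is not the arithmetic itself but the reflex it encodes: confidence should move, sometimes a lot, when the evidence fits the opposing view better than your own.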
Beyond individual cognitive strategies, we need better systems for processing information in the digital age. This means developing new literacies and habits adapted to our current media environment:
Source Diversification: Just as financial advisors recommend diversifying investments, we should diversify our information sources. Read across the political spectrum. Consume international news. Seek out expert perspectives from multiple disciplines.
Fact-Checking Hygiene: Before sharing information, especially emotionally charged content, take a moment to verify its accuracy. Check reputable fact-checking sites. Look for the original source. Ask whether the claim passes basic common-sense tests.
Slow Down: Our brains are wired to make quick judgments, but accuracy often requires patience. Implement a “24-hour rule” for sharing provocative content. Sleep on important decisions. Create friction between your initial reaction and your final response.
We can’t eliminate irrationality—it’s hardwired into our humanity. But we can learn to work with our cognitive limitations rather than against them. As Dan Ariely suggests, the goal isn’t to become perfectly rational but to become “better irrational”—to understand our biases and design systems that help us make better choices despite them (Ariely, 2008).
This isn’t just an individual challenge; it’s a collective one. We need institutions, technologies, and social norms that account for human psychology rather than ignoring it. We need education systems that teach critical thinking alongside content knowledge. We need technology platforms designed to promote accuracy rather than engagement. We need political and media cultures that reward nuance over sensationalism.
In an era of information overload and deliberate misinformation, developing these skills isn’t just helpful—it’s essential for the health of our democracy and our society. The truth may be complicated, but our commitment to seeking it doesn’t have to be.
Remember: being wrong isn’t a character flaw; it’s a human feature. The real mistake is refusing to update our beliefs when presented with better evidence. In the words of behavioral economics, we might be predictably irrational, but we’re also predictably capable of learning and growth. The first step is acknowledging that even our most cherished beliefs might be wrong—and that’s not a weakness, it’s the beginning of wisdom.
Like what you’re reading? Want more consciously prepared brain food?
Listen to the Harvesting Happiness episode Belief, Bias, and Behavior: Rational Thinking and Behavioral Economics with Dan Ariely, PhD, wherever you get your podcasts.
Get More Mental Fitness bonus content by Harvesting Happiness on Substack and Medium.

Dan Ariely, PhD, is the James B. Duke Professor of Psychology and Behavioral Economics at Duke University and a founding member of the Center for Advanced Hindsight.
He researches the irrational ways in which people behave and explains these findings in plain language.
Dan has written New York Times bestsellers, including Predictably Irrational, The Upside of Irrationality, and The Honest Truth About Dishonesty.
His most recent book is Misbelief: What Makes Rational People Believe Irrational Things.
Disclaimer: This communication is for information only and does not constitute mental health treatment or indicate a therapeutic relationship. Individuals in need of treatment for mental health or psychological concerns should seek out services from appropriate mental healthcare professionals.
References:
Ariely, D. (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. HarperCollins.
Ariely, D. (2012). The Honest Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves. Harper.
Cialdini, R. B. (2006). Influence: The Psychology of Persuasion. Harper Business.
Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732-735.
Kavanagh, J., & Rich, M. D. (2018). Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation.
Klayman, J., & Ha, Y. W. (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94(2), 211-228.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.
Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330.
Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Press.
Stanovich, K. E., & West, R. F. (2007). Natural myside bias is independent of cognitive ability. Thinking & Reasoning, 13(3), 225-247.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207-232.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.
Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129-140.
Wood, T., & Porter, E. (2019). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, 41(1), 135-163.
Lisa Cypers Kamen is a lifestyle management consultant who explores the art and science of happiness in her work as a speaker, author, and happiness expert. Through her globally syndicated positive psychology podcast, books, media appearances, and documentary film, Kamen has impacted millions of people around the world.