We pay attention to, easily recall, and feel positive emotions towards things we deem interesting or useful. We dismiss, downplay, and react with negative emotions to information that threatens our objectives or our self-image, or that conflicts with our expectations or pre-existing beliefs. Things that don’t seem particularly significant in either direction, we largely ignore (even though these neglected details often prove quite important in retrospect).
…When good things happen that could be plausibly laid at our feet, we attribute those positive outcomes to stable and internal factors that are within our control – i.e. positive characteristics we possess and wise actions we took. When bad things happen, we tell the opposite story. Adverse outcomes are attributed to contingent and fleeting circumstances – things external to us and outside of our control.
…Most businesses fail within six years. An overwhelming majority of romantic relationships end in less than a year. Most employment relationships eventually stop working out for one or more parties (relative to the alternatives) – typically leading to resignation or termination within five years. Social movements rarely achieve their stated ends. Most innovations are maladaptive. The modal result of a publication submission is rejection. An overwhelming majority of published scientific findings are wrong, trivial, and/or non-impactful. If we allowed these kinds of probabilities to govern our attitudes and behaviors, we’d rarely invest ourselves in anything.
In reality, however, people defy the odds all the time. Ostensibly irrational levels of confidence, conviction, resilience, and optimism often play an important role in these outcomes. Our biases and blind spots are, therefore, not just a product of our cognitive limitations – they empower us to accomplish things we otherwise may not.
…people who are highly educated, intelligent, or rhetorically skilled are significantly less likely than most others to revise their beliefs or adjust their behaviors when confronted with evidence or arguments that contradict their preferred narratives or preexisting beliefs. Precisely by virtue of knowing more about the world or being better at arguing, we are better equipped to punch holes in data or narratives that undermine our priors, to come up with excuses to “stick to our guns” irrespective of the facts, or to interpret threatening information in ways that flatter our existing worldview. And we typically do just that.
In a decades-long series of ambitious experiments and forecasting tournaments, psychologist Philip Tetlock has demonstrated that, as a result of their inclinations toward epistemic arrogance and ideological rigidity, experts are often worse than laypeople at anticipating how events are likely to play out . . . especially with respect to their areas of expertise. What’s worse, cognitively sophisticated people tend not to be very self-aware about their error rates either, because they excel at telling stories about how they were “basically right” even when they were, in fact, clearly wrong – inhibiting their ability to learn from mistakes and miscalculations.
In a similar vein, experts have been shown to perform a bit worse than laypeople at predicting the likely effects of behavioral science interventions. Political practitioners have been found to be no better than laypeople at predicting which political messages are persuasive. Comparative and longitudinal studies have found that highly educated political leaders perform no better than less educated ones, and may even be a bit worse in some respects.
Rather than converging on the same position, people tend to grow more politically polarized on contentious topics as their knowledge, numeracy, and reflectiveness increase, or when they try to think in actively open-minded ways.
These empirical patterns would be shocking and difficult to explain under the assumption that humans’ cognitive and perceptual systems are primarily oriented towards objective truth. However, these tendencies are exactly what one would expect if we instead work from the premise that our cognitive capacities are fundamentally geared toward group building and coalitional struggle, and that we typically reason in ways that help us achieve our goals with and through other people.
On this understanding of how our brains work, we might likewise expect that the kinds of people the symbolic professions select for (cognitively sophisticated, academically high-performing, highly educated) would be especially prone to tribalism, virtue signaling, and self-deception.