No matter how thoughtful, intelligent, well-informed,
or open-minded you think you are, you’re wrong. Even if you are humble enough to admit what
you don’t know, odds are you are still falling victim to a major cognitive fallacy that keeps
you from having an accurate impression of reality. Cognitive fallacies are so common, they are
almost synonymous with thinking, and like a chicken-and-egg puzzle, they make it exhausting
to try to think your way out of them. So whether you want to shatter the illusion of your false
humility, or can’t wait to leave a comment at the bottom of this article explaining how
you actually are never wrong about anything, this list should give you plenty to think
about… in your own fundamentally flawed and biased way.

10. Your Lizard Brain

It turns out, size does matter–but mostly
because of how you use it; or more precisely, how it can be used. Different areas of the
brain have different functions, so it isn’t the total size that is important so much as
the relative size of different areas, which support different cognitive processes. And
this isn’t an endorsement of phrenology; it is a question of capacity, which may or
may not actually be utilized. Human brains are like evolutionary onions:
they have developed several layers, and each layer adds both mass and the capacity for higher functions. Because we are all superficial, we tend to heap attention on the newest, outermost layer–the cerebral cortex–which is capable of complex thought, reasoning, and many of the features we think of as defining intelligence or humanity. But superficiality is stupid, and we tend
to give less credit to the relatively more important, less-evolved layers–namely, the
Lizard Brain that secretly controls everything (Reptilian Conspiracy theorists, rejoice–you
were right all along). At the core of all brain activity is a small
combination of brain parts found in all vertebrates that serve mostly to keep you alive, regulating
heart rate, breathing, equilibrium, etc. Surrounding this is the limbic system, also known as The
Lizard Brain, because that’s about all there is inside the average lizard’s head. It
interacts with the central core to process sensory information, generate basic emotions,
and manage decision-making at its most rudimentary level. In short, this bit of the brain is
where instinct lives. Every unconscious habit or automatic response–jumping
when startled, eating what tastes good, or trying to make babies–originates in the
reptilian core. Because it is obsessed with survival (eating,
escaping, reproducing), the Lizard Brain has an uncanny ability to shout over your higher
reasoning brain and take control, rendering the more-evolved lobes (the ones that make
you smarter than the average ape) impotent right when it counts. Since it is embarrassing to admit how often
our behavior is driven by base animal impulses, we tend to employ the rest of our mammalian
brains to justify our behavior, rather than to control it. This is why rationalization
is common to the point of being unconscious: it takes a lot of mental effort and concentration
to perpetually silence the Lizard Brain, but a lot less to delegate decision-making to
it, and let the cerebral cortex invent excuses to make those decisions seem rational. Like so many MFA graduates who wind up writing
generic marketing copy to pay the rent, our higher functions squander their talent by
simply dressing up our primal instincts as deliberate, thoughtful actions. Human evolution
has turned us into bullsh*t artists, because it is easier to invent excuses for our behavior
than to take active control of it.

9. Threat Perception

As a species, we like to pat ourselves on
the back for being aware of our own mortality. As far as we can tell, no other animals on
earth possess this innate knowledge of their inevitable death, which supposedly gives us
a powerful cognitive and behavioral advantage. Except our brains are still hardwired to ignore
long-term consequences so that we can better enjoy short-term pleasures. Just as our Lizard Brains don’t let our
higher functions waste time trying to reason us out of harm’s way, they tend not to let
thoughtful consideration (or impending disaster) get in the way of a good time. The “Fight
or Flight” response is really only applicable in dealing with an immediate threat; when
the risk is more long-term, like obesity or the robot apocalypse, the Lizard Brain becomes
obsessed with instant gratification. Outside of the Fight or Flight response, the
Lizard Brain is basically playing a never-ending game of *Boff*, Marry, or Kill, only it isn’t
as interested in monogamy as it is in getting fed, so the game becomes *Boff*, Eat, or Kill.
Even when there isn’t an immediate demand for its services, your Lizard Brain still
wants its voice heard, and will always drive you to choose short-term pleasure over long-term
security. And when pleasure is involved–as opposed to fear–your Lizard Brain feels like it is winning the game, and wants to keep playing. This is also known as addiction–it literally happens in your brain, and depends far less on the chemical properties of any substance involved than you might think; that is why behaviors can become just as addictive as anything you can consume. That is also why treating addiction
takes more than just willpower; you have to physically change your brain to develop new
habits. So while you call upon all your knowledge
and experiences to make a decision about how to behave in any given situation, you are
simultaneously working to suppress your instinct to simply do the easiest, most carnally satisfying
thing possible. Hence the popularity of Netflix, Cheetos, and porn.

8. Authority

You know that old saying, “Never meet your heroes”? That’s because there is a gap between what we expect people to be and what they are. Only it isn’t just heroes–it is anyone pretty, tall, confident, or even carrying a friggin’ clipboard. There’s a whole cocktail of fallacies at
play here, but aside from the false attribution of authority based purely on appearance,
there is the timeless appeal to authority. This can manifest as anything from a meme
image that uses a famous historical figure to make a quote sound profound (when it is
actually nonsense), to using Rotten Tomatoes to determine what is or isn’t a “good”
movie. It is easy to come up with an opinion, but simply having more of them doesn’t make
them objective or informed. Even credentialed experts–like doctors–are
often operating more on informed opinions than cut-and-dried facts, and that’s assuming the subject in question is actually within their area of expertise. If a nurse and a doctor disagree over the best way to deliver a baby, for example, people tend to assume
the doctor is the superior authority on the matter. Of course, if the nurse was a certified nurse
midwife (a maternity specialist trained in women’s health and pregnancy issues) and
the doctor was a colorectal surgeon (butt doctor), then deferring to the doctor suddenly
seems a little misguided. But the natural inclination we all feel is to simply look
for an authority, without first qualifying what makes that person an authority–because,
all too often, nothing does.

7. Binary Thinking

Sports, amirite? The popular narrative holds
that youth sports build character, teach leadership and cooperation, and impart other such valuable life lessons. All that may be true, but they also train participants and spectators
alike to view everything as a tribal contest between good and bad, right and wrong, us
and them. This is why team sports are so eternally popular: they reflect the way we think. Even though our brains evolved to tolerate
and process complexity, doing so takes extra processing power,
so we subconsciously simplify whenever possible. And it turns out, it is always possible, because
there are few circumstances that can’t be (fallaciously) boiled down to an A/B option,
just like the opposing teams facing off in an athletic competition. This is the essence of binary thinking–A/B,
ones and zeroes, us and them. Surviving as a tribe requires cooperation, which is greatly
aided by the perception of a shared threat–a competing tribe, an opposing team. Because
this kind of cooperative-competitive worldview has been so famously helpful in human survival,
we project its simplistic calculus onto damn near everything else, so that we can better
make decisions, forge alliances, and elect the right candidate for president. No matter how much you holler about how party
membership is a representation of ideological cohesion, millions of years of evolution make
a compelling counter-argument that you actually just picked a team and want to see them win. Even when it comes to what we eat, we can’t
tolerate the dietary complexity of the omnivore. Instead, we make wild extrapolations–low
in fat must also mean low in anything that might kill us; non-GMO means healthy; reduced
sodium means vitamin-enriched. Our selection process for food is overwhelmed by creative
labels that speak directly to our monkey minds, looking for the team we perceive as healthy.

6. The Internet

Believing your opinions are valid and correct
is like thinking you know which way is up in an M.C. Escher drawing–it only works
if you limit your focus and deliberately ignore all contradictory evidence, and the internet
is a near-perfect tool for doing just that. The internet is well on its way to bringing
the sum total of human knowledge within reach of anyone who connects. We have never had
more access to information, or more user-friendly platforms for consuming and disseminating
data. Following the patterns established by the advent of written script or the printing
press, we should be seeing an explosion of intelligence, progress, and intellectual transformation. But we aren’t. What we’re seeing instead
is that people are more interested in being right than in being informed, in making obnoxious punchline comments than in advancing discourse, and in Rickrolling earnest YouTubers than in
providing a video infotainment experience. Rather than an escalator to the pinnacle of
human development, the internet is really just a utility for feeding confirmation bias. This is the human tendency to agree with people
who say things we already think, and to feel a surge of confidence when other people think the
way we do. Online, you can find someone arguing for,
or agreeing with, virtually anything. This means no matter how objectively wrong you
are, you can find someone else on your side. Now, through the power of binary thinking,
you can boldly assume infallibility by association. After all, if people agree with you, then
the real problem isn’t with your ideas… it is with the people who disagree with your
ideas. The internet, in combination with social media,
feeds a similar-but-different problem: selection bias. This is where we tend to seek out opinions
that match ours, and ignore those that don’t. The fact that you can block Facebook friends
from your feed seems like a feature, but it is really just proof that the internet is an automated cognitive fallacy. Multiply your friend network by the news
channels you watch, the papers you read, the people you talk to, the movies you watch,
the restaurants you patronize, and the house of worship you attend, and you can see how
we aren’t so much seeking truth as avoiding conflicting reports on existence. A possible defense against the overwhelming
human tendency to indulge these biases has cropped up in the form of big data: a massive,
automated, algorithmic assessment of raw data through various query-based lenses. President
Obama even created the post of U.S. Chief Data Scientist in an effort to bring some measure
of scientific and mathematical objectivity to the direction of U.S. federal policy. Either
that, or even the White House is losing ground in the fight against Artificial Intelligence.

5. Distorted Memories

Memories are less like images etched into
granite, and more like soft Play-Doh we accidentally step on and mix together until everything
is a gross grey color. In less metaphorical terms, memories are never
complete. As we learn, make new memories, experience new emotions, and find ourselves
in new contexts trying to remember something, all this new mental baggage lands directly
on the memories we are trying to recall. When we remember something, we aren’t simply
recalling it like a photo in an album; we are interacting with it, projecting new memories,
knowledge, and biases onto the old memory. Sometimes, we construct memories through some
combination of suggestion and desire–or fear. For example: if you are about to go on a date,
you might find yourself visualizing disaster scenarios and the date ending in lonely tears.
The emotionally jarring imagery might even compel you to picture the disaster all over
again…and again, until the repetition emblazons the whole scene on your memory, as though
it had actually happened. Alternatively, you might find the experience
of that hot date so sensual that you expedite its storage straight to long-term memory.
Unfortunately, however, you can have efficiency, or you can have accuracy, but not both. In
order to make memories easier to recall, our brains take shortcuts, and then fill in the
gaps whenever we call on them. This sort of adaptive memory relies on your present-day
intellectual context and capacity to make up for what you didn’t actively commit to
memory. It is sort of like saving your finished novel
as a Word document, and then opening it back up only to find a 900-page Mad Libs epic,
except that you don’t even realize that you are filling in random nouns and adjectives
to complete the story. It is also highly suggestible, which is why
eyewitness testimony is basically worthless. If witnesses are gently nudged toward thinking something, they retroactively convince themselves that that is what they actually saw. Outside of the occasional trauma or joy that
is the stuff of dynamic memory, there is the unassuming, everyday drudgery that makes up
the bulk of our lives. Like the commute to work or the layout of your home, this is the
background scenery that seems so unchanging that our brains actually stop noticing it
and just assume it is there–a cognitive function known as habituation. Our brains are much more engaged by novelty
than familiarity, so when stimulation becomes routine, it stops penetrating all the way
through every layer of the brain, and starts getting filtered out as noise.

4. Aging

Getting old is bad for your brain. If you’ve
ever met an old person, you might already be familiar with the telltale signs of cognitive
decline: forgetfulness, slower reflexes, impaired judgement, etc. Of course, there is no set
age at which people become “old” and their brains start getting unreliable; Alzheimer’s
can begin to develop in patients as young as 40–and 40 is the new 30, so really no
one is immune. But even if you reach your golden years in
peak physical fitness, that still means you’ve spent a lifetime recalling old memories, which
you’ll recall is not a passive process. By then, your long-term memory will be a mess
of reconstruction, adaptation, suggestion, and pure invention that is hardly recognizable
as an account of what you actually experienced in life. And when you start preparing yourself
to face the grim specter of death, you’ll very likely want to believe that you lived
a good life, which means you’ll subjectively recall memories through a very thick lens
of nostalgia. The future may be uncertain, but you’ll
always have “the good ol’ days” to reflect on, because you are probably making them up
as a compensatory mechanism for the lack of true, reliable memories. And you’ll likely be spending a lot more
time wrapped up in the comfort-blanket of your imaginary past, because your brain’s
ability to perceive sensory information is also going to decline as you age. Sugar literally
will taste less sweet, and all your favorite foods will become less appetizing. You will
remember everything as being better back in the day, because in the current day you
aren’t capable of fully experiencing the simple pleasures of life. Bearing that in mind, maybe the next grumpy
geezer you run across will seem less like a curmudgeon and more like another example
of the quiet tragedy of the human experience.

3. Groupthink

If American anti-drug advertising is to be
believed, peer pressure’s primary function is to get kids to take drugs in a desperate
bid to seem cool. It is easily countered by the forceful repetition of “No” to any
offer that involves social conformity. In reality, peer pressure is one of the key
evolutionary advantages that helps humans survive and thrive. Adopting a cooperative
strategy might allow hunters to take down a mammoth, and enjoy more spoils than they
would have hunting individually. Toddlers learn most of their earliest lessons (like
how to talk, or when to laugh) by trying to imitate the people around them. The military
indoctrinates soldiers to follow orders and work as a unit, because everyone having their
own idea about how to behave in combat is a solid way to get everyone killed. But our tendency to crowdsource our decisions
isn’t limited to learning and survival tactics. Similar to the appeal to authority, groupthink
functions as the individual defaulting to the apparent will of the crowd. It isn’t just
high schoolers who want to be part of the in-crowd; that’s why angry mobs, bandwagons,
and the wave are all such standard features of people assembling. It is also why your
opinion on a single issue can be extrapolated to an entire political platform: when you
find a group who agrees with you, you naturally want to continue agreeing with them. By extension, even when we find ourselves
disagreeing with a group, we will self-police that disunity and silence dissent so as not
to jeopardize our membership. This, in combination with deference to authority, is how humans
are able to come together and produce such hallmark achievements as genocide. Much as we celebrate illusory notions of
individuality, this tendency to try to blend in, mirror behavior, and offload thinking
to a group is all but impossible to prevent, once enough people start mingling. It is unconsciously
activated by socialization, and the only defense requires going against instinct and popular opinion, and being unpopular is just the worst.

2. Conditioning

What is free will? Trick question–you have no idea, because
most of what you perceive as deliberate behavior is actually a conditioned response. From your
minor daily habits to your entire sense of identity, you are overwhelmingly the product
of conditioning–otherwise known as “training,” the exact same way you get pets and babies
to defecate with more precision. If you aren’t a pet-owner, you’ve still
more than likely heard of the experiments of Pavlov–particularly the part where the
dog began salivating upon hearing a bell ringing. Well, “culture” is to people what “ringing
bell” was to the dog. Except in society, the people holding the
bell may not realize what they are doing, so both the trainer and the trainee are participating
in the conditioning process unawares. This is basically how superstitions are born: by
failing to differentiate between correlation and causation, we find ourselves adjusting
behavior purely out of a perceived association with the desired outcome. (Editor’s note:
As one with a BS in Psychology, I can attest that it was drilled into our brains that “correlation
does not equal causation,” and it took an entire semester to learn that.) Plus, memory distortions can directly feed
conditioning–we remember certain key features of what appears to be a cause and effect,
and then link the two. Without reconsidering the basis for that connection, we simply remember
it, and proceed to act on it as though it is fact. Whether it starts as an effort to avoid pain
(or embarrassment, public shaming, loneliness) or as the pursuit of pleasure (friends, laughter,
comfort, social acceptance), your life soon becomes one long chain of conditioned responses
that only seem like choices because of the stories you tell yourself about them. In fact…

1. Narrative Thinking

Most people think of empathy as the capacity
to put oneself in someone else’s shoes. If you can imagine what something feels like,
that must mean you understand how someone is actually feeling, right? The key here is that in order to relate to
someone, you need to see yourself in that person. Everything we perceive or think about,
we convert to a first-person perspective; our minds are inherently self-centered. This
is the essence of narrative thinking: all of existence is a story, and we all think
we are the main character. Not only is this the bridge connecting us
to our fellow humans, it is one of the primary ways we make decisions, big or small. Intrinsic or even relative value matters much less to us when we express a preference for a particular
musician, brand, lifestyle, religion, or hair color. The value they add to our lives starts
as the story we tell ourselves about them, and how they enrich our own sense of ourselves
as the leading player. We tell stories to explain everything–even
our everyday speech is riddled with metaphors–but nothing more so than our own behavior. Narrative
thinking is how we persistently ignore the bundle of instinct and emotion regulating
our behavior, and turn it all into a blend of destiny and deliberation. It is how we
take perfectly good memories and sensory perception, and muddle them up into whatever we want them
to be. Our brains even tell stories subconsciously, filling in perceptual gaps to make sense of
stimuli–otherwise known as illusions.