In this episode we interview Dean Burnett, author of "Idiot Brain: What Your Brain is Really Up To." Burnett's book is a guide to the neuroscience behind the things that our amazing brains do poorly.

In the interview we discuss motion sickness, the pain of breakups, why criticisms are more powerful than compliments, impostor syndrome, anti-intellectualism, irrational fears, and more. Burnett also explains how the brain is kinda sorta like a computer, but a really bad one that messes with your files, rewrites your documents, and edits your photos when you aren't around.

Dean Burnett is a neuroscientist who lectures at Cardiff University and writes about brain stuff over at his blog, Brain Flapping, hosted by The Guardian.

In this divisive and polarized era how do you bridge the political divide between left and right? How do you persuade the people on the other side to see things your way?

New research by sociologist Robb Willer and psychologist Matthew Feinberg suggests that the answer is in learning how to cross something they call the empathy gap.

When we produce arguments, we do so from within our own moral framework and in the language of our own moral values. Those values rest on a set of psychological tendencies, influenced by our genetic predispositions and shaped by our cultural exposure, that blind us to alternate viewpoints. Because of this, we find it very difficult to construct an argument using the same facts but framed in a different morality. Willer's work suggests that if we did, we would find it a much more successful route to persuading people we usually think of as unreachable.

One of the most effective ways to change people's minds is to put your argument into a narrative format, a story, but not just any story. The most persuasive narratives are those that transport us. Once transported out of normal reality and into the imagined world of a story, we become highly susceptible to belief and attitude change.

In this episode, you'll learn from psychologist Melanie C. Green the four secrets to creating the most persuasive narratives possible.

For computer scientist Chenhao Tan and his team, the internet community called Change My View offered something amazing, a ready-made natural experiment that had been running for years.

All they had to do was feed it into the programs they had designed to understand the back-and-forth between human beings and then analyze the patterns that emerged. When they did, they discovered two things: what kinds of arguments are most likely to change people's minds, and what kinds of minds are most likely to be changed.

If you wanted to build a team in such a way that you maximized its overall intelligence, how would you do it? Would you stack it with high-IQ brainiacs? Would you populate it with natural leaders? Would you find experts on a wide range of topics? Well, those all sound like great ideas, but the latest research into collective intelligence suggests that none of them would work.

To create a team that is collectively intelligent, you likely need to focus on three specific factors that psychologist Christopher Chabris and his colleagues recently identified in their research, and in this episode of the You Are Not So Smart podcast, he will tell you all about them and why they seem to matter more than anything else.

If you could compare the person you were before you became sleep deprived to the person after, you’d find you’ve definitely become...lesser than.

When it comes to sleep deprivation, you can’t trust yourself to know just how much it is affecting you. You feel fine, maybe a bit drowsy, but your body is stressed in ways that diminish your health and slow your mind.

In this episode, we sit down with two researchers whose latest work suggests sleep deprivation also affects how you see other people. In tests of implicit bias, negative associations with certain religious and cultural categories emerged after people started falling behind on rest.

What effect does Google have on your brain? Here's an even weirder question: what effect does knowing that you have access to Google have on your brain?

In this episode we explore what happens when a human mind becomes aware that it can instantly, on command, at any time, search for the answer to any question, and then, most of the time, find it.

According to researcher Matthew Fisher, one of the strange side effects is an inflated sense of internal knowledge. In other words, as we use search engines, over time we grow to mistakenly believe we know more than we actually do even when we no longer have access to the internet.

From the opioid crisis to the widespread use of lobotomies to quiet problem patients, celebrity scientists and charismatic doctors have made tremendous mistakes, but thanks to their fame, they escaped the corrective mechanisms of science itself. Science always corrects the problem, but before it does, many people can be harmed, and society can suffer.

In this episode, we sit down with Dr. Paul Offit to discuss how we can get better at catching those mistakes before they happen and mitigating the harm once Pandora's Lab has been opened.

In late 2014 and early 2015, the city of Starkville, Mississippi, passed an anti-discrimination measure that led to a series of public debates about an issue that people there had never discussed openly.

In this episode, we spend time in Starkville exploring the value of argumentation and debate in the process of change, progress, and understanding our basic humanity.

In this episode, psychologist Per Espen Stoknes discusses his book: What We Think About When We Try Not to Think About Global Warming.

Stoknes has developed a strategy for science communicators who find themselves confronted with climate change deniers who aren’t swayed by facts and charts. His book presents a series of psychology-based steps designed to painlessly change people’s minds and avoid the common mistakes scientists tend to make when explaining climate change to laypeople.

In this episode, Tali Sharot, a cognitive neuroscientist and psychologist at University College London, explains our innate optimism bias.

When the brain estimates the outcome of future events, it tends to reduce the probability of negative outcomes for itself, but not so much for other people. In other words, if you are a smoker, everyone else is going to get cancer. The odds of success for a new restaurant change depending on who starts that venture, you or someone else. Sharot explains why and details how we can use our knowledge of this mental quirk to our advantage both personally and institutionally.

We are each born labeled. In moments of ambiguity, those labels can change the way people make decisions about us. As a cognitive process, it is invisible, involuntary, and unconscious – and that’s why psychology is working so hard to understand it.

Our guest for this episode is Adam Alter, a psychologist who studies marketing and communication. His New York Times bestselling book is titled Drunk Tank Pink, after the shade of pink used on the walls of police holding cells once research suggested it lessened the urge to fight.

Confirmation bias is our tendency to seek evidence that supports our beliefs and confirms our assumptions when we could just as well seek disconfirmation of those beliefs and assumptions instead.

This is such a prevalent feature of human cognition, that until recently a second bias has been hidden in plain sight. Our past beliefs and future desires usually match up. Desirability is often twisted into confirmation like a single psychological braid - but recent research suggests that something called desirability bias may be just as prevalent in our thinking. When future desires and past beliefs are incongruent, desire wins out.

Is psychology too WEIRD? That's what this episode's guest, psychologist Steven J. Heine, suggested when he and his colleagues published a paper arguing that psychology wasn't the study of the human mind, but the study of one kind of human mind, the sort generated by the kinds of brains that happen to be conveniently located near the places where research is usually conducted - North American college undergraduates. They called them the WEIRDest people in the world, short for Western, Educated, Industrialized, Rich, and Democratic - the kind of people who make up less than 15 percent of the world's population. In this episode, you'll learn why it took psychology so long to figure out it was studying outliers, and what it means for the future of the field.

In psychology, they call it naive realism, the tendency to believe that the other side is wrong because they are misinformed, that if they knew what you knew, they would change their minds to match yours.

According to Lee Ross, co-author of the new book, The Wisest One in the Room, this is the default position most humans take when processing a political opinion. When confronted with people who disagree, we tend to assume there must be a rational explanation for their error. What we rarely consider, however, is that maybe WE are the ones who are wrong. We never go into the debate hoping to be enlightened, only to crush our opponents.

Listen in this episode as legendary psychologist Lee Ross explains how to identify, avoid, and combat this most pernicious of cognitive mistakes.

"Science is wrong about everything, but you can trust it more than anything."

That's the assertion of psychologist Brian Nosek, director of the Center for Open Science, who is working to correct what he sees as the temporarily wayward path of psychology.

Currently, psychology is facing what some are calling a replication crisis. Much of the most headline-producing research of the last 20 years isn't standing up to attempts to reproduce its findings. Nosek wants to clean up the processes that have led to this situation, and in this episode, you'll learn how.

In medical school they tell you half of what you are about to learn won't be true when you graduate - they just don't know which half. In every field of knowledge, half of what is true today will be overturned, replaced, or refined at some point, and it turns out that we actually know when that will be for many things. In this episode, listen as author and scientist Sam Arbesman explains how understanding the half-life of facts can lead to better lives, better institutions, and, of course, better science.

The cyberpunks, the Founding Fathers, 19th-century philosophers, and the Enlightenment thinkers - they all looked forward to the world in which we now live, a multimedia psychedelic freakout in which information is free, decentralized, democratized, and easy to access. What they didn't count on, though, was that we would choose to keep a whole lot of it out of our heads.

In this episode, we explore a psychological phenomenon called active information avoidance, the act of keeping our senses away from information that might be useful, and that we know is out there, but that we'd rather not learn.

Before we had names for them or a science to study them, the people who could claim the most expertise on biases, fallacies, heuristics and all the other quirks of human reasoning and perception were scam artists, con artists, and magicians. On this episode, magician and scam expert Brian Brushwood explains why people fall for scams of all sizes, how to avoid them, and why most magicians can spot a fraudster a mile away.

Show notes at: www.youarenotsosmart.com - Become a patron at: www.patreon.com/youarenotsosmart

Do we have the power to change the outcome of history? Is progress inevitable? Is it natural? Are we headed somewhere definite, or is change just chaos that seems organized in hindsight? In this episode we explore these questions with University of Chicago historian Ada Palmer.

If dumping evidence into people’s laps often just makes their beliefs stronger, would we just be better off trying some other tactic, or does the truth ever win?

Do people ever come around, or are we causing more harm than good by leaning on facts instead of some other technique?

In this episode we learn the answers to these questions and others from two scientists who have learned how to combat the backfire effect. One used an ingenious research method to identify the breaking point at which people stop resisting and begin accepting the fact that they might be wrong. The other literally wrote an instruction manual for avoiding the backfire effect and debunking myths using the latest psychological research into effective persuasive techniques.

If you try to correct someone who you know is wrong, you run the risk of alerting their brain to a sort of existential, epistemic threat, and if you do that, when that person expends effortful thinking to escape, that effort can strengthen their beliefs instead of weakening them.

In this episode you'll hear from three experts who explain how trying to correct misinformation can end up causing more harm than good.

The research shows that when a strong yet erroneous belief is challenged, yes, you might experience some temporary weakening of your convictions, some softening of your certainty, but most people rebound from that and not only reassert their original belief at its original strength, but go beyond that and dig in their heels, deepening their resolve over the long run.

Psychologists call this the backfire effect, and this episode is the first of three shows exploring this well-documented and much-studied psychological phenomenon, one that you’ve likely encountered quite a bit lately.

In this episode, we explore its neurological underpinnings as two neuroscientists at the University of Southern California's Brain and Creativity Institute explain how their latest research sheds new light on how the brain reacts when its deepest beliefs are challenged.