In 2017, YANSS did three episodes about the backfire effect, and by far, those episodes were the most popular that year. Then, in 2018, part four was the most popular.

The backfire effect has a special allure to it, because, on the surface, it seems to explain something we’ve all experienced -- when we argue with people who believe differently than we do, who see the world through a different ideological lens -- they often resist our views, refuse to accept our way of seeing things, and it often seems like we do more harm than good, because they walk away seemingly more entrenched in their beliefs than before the argument began.

But…since those first three shows, researchers have produced a series of new studies into the backfire effect that complicate things. Yes, we are observing something here, and yes, we are calling it the backfire effect, but everything is not exactly as it seems, and so I thought we should invite these new researchers on the show and add a fourth episode to the backfire effect series based on what they’ve found. And this is that episode (again).

In this episode, we sit down with negotiation expert Misha Glouberman who explains how to talk to people about things -- that is, how to avoid the pitfalls associated with debate when two or more people attempt to come to an agreement that will be mutually beneficial.

In late 2014 and early 2015, the city of Starkville, Mississippi, passed an anti-discrimination measure that led to a series of public debates about an issue that people there had never discussed openly.

In this episode, we spend time in Starkville exploring the value of argumentation and debate in the process of change, progress, and understanding our basic humanity.

Our guest in this episode of the You Are Not So Smart Podcast is Dave Levitan, a science journalist with a new book titled Not a Scientist: How Politicians Mistake, Misrepresent, and Utterly Mangle Science.

In the book, Levitan takes us through 12 repeating patterns that politicians fall into when they mistake, misrepresent, and mangle science. Some are nefarious and intentional, some are based on ignorance, and some are just part of the normal business of politicians managing their public image or trying to appeal to their base.

On this episode, we welcome journalist Kate Leaver to talk about her new book The Friendship Cure, in which she explores the crippling, damaging, life-threatening impact of loneliness and the severe mental health consequences of living a life disconnected from a support network of close contacts. But...there is a cure...learning how to connect with others and curate better friendships.

In the interview we talk about loneliness, how to make friends, the difference between male and female friendship, platonic friendships, friends with benefits, and lots, lots more.

In this episode, we sit down with psychologist Julia Shaw, an expert in memory and criminal psychology, to discuss her new book - Evil. In the book, she makes a case for something she calls "evil empathy," seeing people who do heinous things as fellow human beings instead of as monsters. According to Shaw, othering criminals by categorizing them as a separate kind of human allows us to put them out of our minds and disappear them to institutions or prisons. The result is we become less able to prevent the sort of behavior that harms others from happening again and again. In fact, she says "there's no such thing as evil," and sees the term as an antiquated, magical label that dehumanizes others, preventing us from accumulating the sort of scientific evidence that could lead to a better society.

One of the most effective ways to change people’s minds is to put your argument into a narrative format, a story, but not just any story. The most persuasive narratives are those that transport us. Once we depart from normal reality into the imagined world of a story, we become highly susceptible to belief and attitude change.

In this episode, you’ll learn from psychologist Melanie C. Green the four secrets to creating the most persuasive narratives possible.

In this episode we explore prevalence-induced concept change. In a nutshell, when we set out to change the world by reducing examples of something we have deemed problematic, and we succeed, a host of psychological phenomena can mask our progress and make those problems seem intractable -- as if we are only treading water when, in fact, we’ve created the change we set out to make.

In this episode, Tali Sharot, a cognitive neuroscientist and psychologist at University College London, explains our innate optimism bias.

When the brain estimates the outcome of future events, it tends to reduce the probability of negative outcomes for itself, but not so much for other people. In other words, if you are a smoker, everyone else is going to get cancer. The odds of success for a new restaurant change depending on who starts that venture, you or someone else. Sharot explains why and details how we can use our knowledge of this mental quirk to our advantage both personally and institutionally.

More about Tali Sharot and her book The Optimism Bias here: theoptimismbias.blogspot.com/

In this episode we sit down with psychology legend Richard Petty to discuss the Elaboration Likelihood Model, a theory he developed with psychologist John Cacioppo in the 1980s that unified the study of attitude change and persuasion and has since become one of the most robust models for explaining how and why some messages change people’s minds, some don’t, and what makes some stick and others fade in influence over time.

In this episode, we welcome Lilliana Mason on the program to discuss her new book, Uncivil Agreement, which focuses on the idea: “Our conflicts are over who we think we are, rather than reasoned differences of opinion.”

Personally, I feel like this is just about the most important thing the social sciences are studying right now, and I think Mason is one of its most brilliant scientists - I promise, the insights you are about to hear will change the way you think about politics, tweeting, elections, and arguing with people on the other side of just about everything.

Is it true that all it takes to be an expert is 10,000 hours of practice? What about professional athletes? Do different people get more out of practice than others, and if so, is it nature or nurture? In this episode we ask all these things of David Epstein, author of The Sports Gene, who explains how practice affects the brain and whether or not greatness comes naturally or after lots and lots of effort.

In medical school they tell you half of what you are about to learn won't be true when you graduate - they just don't know which half. In every field of knowledge, half of what is true today will be overturned, replaced, or refined at some point, and it turns out that we actually know when that will be for many things. In this episode, listen as author and scientist Sam Arbesman explains how understanding the half-life of facts can lead to better lives, institutions, and, of course, better science.
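The decay Arbesman describes works like radioactive half-life: if a field's facts have a half-life of H years, the fraction expected to still stand after t years is 0.5^(t/H). A minimal sketch of that arithmetic (the 45-year half-life is illustrative only, in the ballpark of figures discussed for parts of the medical literature):

```python
def fraction_still_true(years: float, half_life: float) -> float:
    """Expected fraction of a field's facts still standing after `years`,
    assuming they decay exponentially with the given half-life."""
    return 0.5 ** (years / half_life)

# Illustrative: a hypothetical field whose facts have a 45-year half-life.
for t in (10, 45, 90):
    print(f"after {t} years: {fraction_still_true(t, 45.0):.0%} still true")
```

After one half-life, half the facts survive; after two, a quarter, and so on, which is why "they just don't know which half" is the only missing piece.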

Confirmation bias is our tendency to seek evidence that supports our beliefs and confirms our assumptions when we could just as well seek disconfirmation of those beliefs and assumptions instead.

This is such a prevalent feature of human cognition that, until recently, a second bias has been hidden in plain sight. Our past beliefs and future desires usually match up. Desirability is often twisted into confirmation like a single psychological braid - but recent research suggests that something called desirability bias may be just as prevalent in our thinking. When future desires and past beliefs are incongruent, desire wins out.

What makes you happy? As in, what generates happiness inside the squishy bits that reside inside your skull? That's what author and neuroscientist Dean Burnett set out to answer in his new book, Happy Brain, which explores both the environmental and situational factors that lead to and away from happiness, and the neurological underpinnings of joy, bliss, comfort, love, and connection. In the episode you'll hear all that and more as we talk about what we know so far about the biological nature of happiness itself.

In this episode, we sit down with author Will Storr to talk about his new book -- Selfie: How We Became so Self-Obsessed, and What it is Doing to Us.

The book explores what he calls “the age of perfectionism” -- our modern struggle to meet newly emerging ideals and standards that tell us we are falling short of the person we ought to be. As he says in the book, "Perfectionism is the idea that kills," and you’ll hear him explain what he means by that in the interview.

Despite its relative invisibility, a norm, even a dying one, can sometimes be harnessed and wielded like a weapon by conjuring up old fears from a bygone era. It’s a great way to slow down social change if you fear that change. When a social change threatens your ideology, fear is the simplest, easiest way to keep more minds from changing.

In this episode of the You Are Not So Smart Podcast, we explore how the separate spheres ideology is still affecting us today, and how some people are using it to scare people into voting down anti-discrimination legislation.

When faced with an inescapable and unwanted situation, we often rationalize our predicament so as to make it seem less awful and more bearable, but what if that situation is a new law or a new administration? The latest research suggests that groups, nations, and cultures sometimes rationalize the new normal in much the same way, altering public opinion on a large scale.

In this episode we explore new research that suggests for the majority of the mind change we experience, after we update our priors, we delete what we used to believe and then simply forget that we ever thought otherwise.

In the show, psychologists Michael Wolfe and Todd Williams take us through their new research, which suggests that because brains so value consistency, and are so determined to avoid the threat of decoherence, we hide the evidence of our belief change. That way, the story we tell ourselves about who we are can remain more or less heroic, with a stable, steadfast protagonist whose convictions rarely waver -- or, at least, they don’t waver as much as those of shifty, flip-flopping politicians.

This can lead to a skewed perception of the world, one that leads to the assumption that mind change is rare and difficult to come by. And that can lead to our avoiding information that might expand our understanding of the world, because we assume it will have no impact.

The truth, say Wolfe and Williams, is that mind change is so prevalent and constant that the more you expose yourself to counterevidence, the more your worldview will erode, replaced by a better, more accurate one -- it's just that you probably won't realize it until you look back at old posts on social media and cringe.

Little did the champions of the Enlightenment know that once we had access to all the facts…well, reason and rationality wouldn’t just immediately wash across the land in a giant wave of enlightenment thinking. While that may be happening in some ways, the new media ecosystem has also unshackled some of our deepest psychological tendencies, things that enlightenment thinkers didn’t know about, weren’t worried about, or couldn’t have predicted -- many of which we’ve discussed in previous episodes, like confirmation bias, selective skepticism, filter bubbles, and so on. These things have always been with us, but modern technology has provided them with the perfect environment to flourish.

In this episode, we explore another such invasive psychological species called active information avoidance, the act of keeping our senses away from information that might be useful, that we know is out there, that would cost us nothing to obtain, but that we’d still rather not learn. From choosing not to open bills, visit the doctor, check your bank account, or read the nutrition information on the back of that box of Girl Scout Cookies, we each choose to remain ignorant when we’d rather not feel the anguish of illumination, but that same tendency can also cause great harm both to individuals and whole cultures when it spreads through politics, science, markets, and medicine. In this show, you’ll learn how.

Last year on this show, we did three episodes about the backfire effect, and by far, those episodes were the most popular we’ve ever done.

In fact, the famous web comic The Oatmeal turned them into a sort of special feature, and its comic based on those episodes was shared on Facebook a gazillion times, which led to stories about the comic in popular media, and then more people listened to the shows, and on and on it went. You can go see it at The Oatmeal right now at the top of their page. It’s titled “You are not going to believe what I am about to tell you.”

The popularity of the backfire effect extends into academia. The original paper has been cited hundreds of times, and there have been more than 300 articles written about it since it first came out.

The backfire effect has a special allure to it, because, on the surface, it seems to explain something we’ve all experienced -- when we argue with people who believe differently than we do, who see the world through a different ideological lens -- they often resist our views, refuse to accept our way of seeing things, and it often seems like we do more harm than good, because they walk away seemingly more entrenched in their beliefs than before the argument began.

But…since those shows last year, researchers have produced a series of new studies into the backfire effect that complicate things. Yes, we are observing something here, and yes, we are calling it the backfire effect, but everything is not exactly as it seems, and so I thought we should invite these new researchers on the show and add a fourth episode to the backfire effect series based on what they’ve found. And this is that episode.


Our guest for this episode, Will Storr, wrote a book called The Unpersuadables: Adventures with the Enemies of Science.

In that book, Storr spends time with Holocaust deniers, young Earth creationists, people who believe they’ve lived past lives as famous figures, people who believe they’ve been abducted by aliens, people who stake their lives on the power of homeopathy, and many more – people who believe things that most of us do not.

Storr explains in the book that after spending so much time with these people it started to become clear to him that it all goes back to that model of reality we all are forced to generate and then interact with. We are all forced to believe what that model tells us, and it is no different for people who are convinced that dinosaurs and human beings used to live together, or that you can be cured of an illness by an incantation delivered over the telephone. For some people, that lines up with their models of reality in a way that’s good enough. It’s a best guess.

Storr proposes you try this thought experiment. First, answer this question: Are you right about everything you believe? Now, if you are like most people, the answer is no. Of course not. As he says, that would mean you are a godlike and perfect human being. You’ve been wrong enough times to know it can’t be true. You are wrong about some things, maybe many things. That leads to a second question – what are you wrong about? Storr says that when he asked himself this second question and started listing all the things he believed, checking them off one at a time as true, he couldn’t think of anything about which he was wrong.

In this episode of the YANSS Podcast, we sit down with legendary science historian James Burke.

For much of his career, Burke has been creating documentaries and writing books aimed at helping us to make better sense of the enormous amount of information that he predicted would one day be at our fingertips.

In Connections, he offered an “alternate view of history” in which great insights took place because of anomalies and mistakes, because people were pursuing one thing, but it led somewhere surprising or was combined with some other object or idea they could never have imagined by themselves. Innovation took place in the spaces between disciplines, when people outside of intellectual and professional silos, unrestrained by categorical and linear views, synthesized the work of people still trapped in those institutions, who, because of those institutions, had no idea what each other was up to and therefore couldn’t predict the trajectory of even their own disciplines, much less history itself.

In The Day the Universe Changed, Burke explored the sequential impact of discovery, innovation, and invention on how people defined reality itself. “You are what we know,” he wrote, “and when the body of knowledge changes, so do we.” In this view of change, knowledge is invented as much as it is discovered, and new ideas “nibble at the edges” of common knowledge until values considered permanent and fixed fade into antiquity just like any other obsolete tool. Burke said that our system of knowledge and discovery has never been able, until recently, to handle more than one or two ways of seeing things at a time. In response we have long demanded conformity with the dominant worldview or with similarly homogenous ideological binaries.

My favorite line from the book has to do with imagining a group of scientists who live in a society that believes the universe is made of omelettes and who go about designing instruments to detect traces of interstellar egg residue. When they observe evidence of galaxies and black holes, to them it all just seems like noise. Their model of nature cannot yet accommodate what they are seeing, so they don’t see it. “All that can accurately be said about a man who thinks he is a poached egg,” joked Burke, “is that he is in the minority.”

In this episode we interview Dean Burnett, author of "Idiot Brain: What Your Brain is Really Up To." Burnett's book is a guide to the neuroscience behind the things that our amazing brains do poorly.

In the interview we discuss motion sickness, the pain of breakups, why criticisms are more powerful than compliments, the imposter syndrome, anti-intellectualism, irrational fears, and more. Burnett also explains how the brain is kinda sorta like a computer, but a really bad one that messes with your files, rewrites your documents, and edits your photos when you aren't around.

Dean Burnett is a neuroscientist who lectures at Cardiff University and writes about brain stuff over at his blog, Brain Flapping hosted by The Guardian.

In this divisive and polarized era, how do you bridge the political divide between left and right? How do you persuade the people on the other side to see things your way?

New research by sociologist Robb Willer and psychologist Matthew Feinberg suggests that the answer is in learning how to cross something they call the empathy gap.

When we produce arguments, we do so from within our own moral framework and in the language of our moral values. Those values rest on top of a set of psychological tendencies influenced by our genetic predispositions and shaped by our cultural exposure that blind us to alternate viewpoints. Because of this, we find it very difficult to construct an argument with the same facts, but framed in a different morality. Willer’s work suggests that if we did that, we would find it a much more successful route to persuading people we usually think of as unreachable.

One of the most effective ways to change people’s minds is to put your argument into a narrative format, a story, but not just any story. The most persuasive narratives are those that transport us. Once we depart from normal reality into the imagined world of a story, we become highly susceptible to belief and attitude change.

In this episode, you’ll learn from psychologist Melanie C. Green the four secrets to creating the most persuasive narratives possible.

For computer scientist Chenhao Tan and his team, the internet community called Change My View offered something amazing, a ready-made natural experiment that had been running for years.

All they had to do was feed it into the programs they had designed to understand the back-and-forth between human beings and then analyze the patterns that emerged. When they did that, they discovered two things: what kinds of arguments are most likely to change people’s minds, and what kinds of minds are most likely to be changed.

If you wanted to build a team in such a way that you maximized its overall intelligence, how would you do it? Would you stack it with high-IQ brainiacs? Would you populate it with natural leaders? Would you find experts on a wide range of topics? Well, those all sound like great ideas, but the latest research into collective intelligence suggests that none of them would work.

To create a team that is collectively intelligent, you likely need to focus on three specific factors that psychologist Christopher Chabris and his colleagues recently identified in their research, and in this episode of the You Are Not So Smart podcast, he will tell you all about them and why they seem to matter more than anything else.

If you could compare the person you were before you became sleep deprived to the person after, you’d find you’ve definitely become...lesser than.

When it comes to sleep deprivation, you can’t trust yourself to know just how much it is affecting you. You feel fine, maybe a bit drowsy, but your body is stressed in ways that diminish your health and slow your mind.

In this episode, we sit down with two researchers whose latest work suggests sleep deprivation also affects how you see other people. In tests of implicit bias, negative associations with certain religious and cultural categories emerged after people started falling behind on rest.

What effect does Google have on your brain? Here's an even weirder question: what effect does knowing that you have access to Google have on your brain?

In this episode we explore what happens when a human mind becomes aware that it can instantly, on-command, at any time, search for the answer to any question, and then, most of the time, find it.

According to researcher Matthew Fisher, one of the strange side effects is an inflated sense of internal knowledge. In other words, as we use search engines, over time we grow to mistakenly believe we know more than we actually do even when we no longer have access to the internet.

From the opioid crisis to the widespread use of lobotomies to quiet problem patients, celebrity scientists and charismatic doctors have made tremendous mistakes, but thanks to their fame, they escaped the corrective mechanisms of science itself. Science always corrects the problem, but before it does, many people can be harmed, and society can suffer.

In this episode, we sit down with Dr. Paul Offit to discuss how we can get better at catching those mistakes before they happen and mitigating the harm once Pandora's Lab has been opened.

In late 2014 and early 2015, the city of Starkville, Mississippi, passed an anti-discrimination measure that led to a series of public debates about an issue that people there had never discussed openly.

In this episode, we spend time in Starkville exploring the value of argumentation and debate in the process of change, progress, and understanding our basic humanity.

In this episode, psychologist Per Espen Stoknes discusses his book: What We Think About When We Try Not to Think About Global Warming.

Stoknes has developed a strategy for science communicators who find themselves confronted with climate change deniers who aren’t swayed by facts and charts. His book presents a series of psychology-based steps designed to painlessly change people’s minds and avoid the common mistakes scientists tend to make when explaining climate change to laypeople.

In this episode, Tali Sharot, a cognitive neuroscientist and psychologist at University College London, explains our innate optimism bias.

When the brain estimates the outcome of future events, it tends to reduce the probability of negative outcomes for itself, but not so much for other people. In other words, if you are a smoker, everyone else is going to get cancer. The odds of success for a new restaurant change depending on who starts that venture, you or someone else. Sharot explains why and details how we can use our knowledge of this mental quirk to our advantage both personally and institutionally.

We are each born labeled. In moments of ambiguity, those labels can change the way people make decisions about us. As a cognitive process, it is invisible, involuntary, and unconscious – and that’s why psychology is working so hard to understand it.

Our guest for this episode is Adam Alter, a psychologist who studies marketing and communication. His New York Times bestselling book is titled Drunk Tank Pink, after the color used to paint the walls of police holding cells once research suggested it lessened the urge to fight.

Confirmation bias is our tendency to seek evidence that supports our beliefs and confirms our assumptions when we could just as well seek disconfirmation of those beliefs and assumptions instead.

This is such a prevalent feature of human cognition that, until recently, a second bias has been hidden in plain sight. Our past beliefs and future desires usually match up. Desirability is often twisted into confirmation like a single psychological braid - but recent research suggests that something called desirability bias may be just as prevalent in our thinking. When future desires and past beliefs are incongruent, desire wins out.

Is psychology too WEIRD? That's what this episode's guest, psychologist Steven J. Heine, suggested when he and his colleagues published a paper arguing that psychology wasn't the study of the human mind, but the study of one kind of human mind, the sort generated by the kinds of brains that happen to be conveniently located near the places where research is usually conducted - North American college undergraduates. They called them the WEIRDest people in the world, short for Western, Educated, Industrialized, Rich, and Democratic - the kind of people who make up less than 15 percent of the world's population. In this episode, you'll learn why it took the field so long to figure out it was studying outliers, and what it means for the future of psychology.

In psychology, they call it naive realism, the tendency to believe that the other side is wrong because they are misinformed, that if they knew what you knew, they would change their minds to match yours.

According to Lee Ross, co-author of the new book The Wisest One in the Room, this is the default position most humans take when processing a political opinion. When confronted with people who disagree, we tend to assume there must be a rational explanation for their error. What we don't consider, however, is that maybe WE are the ones who are wrong. We never go into a debate hoping to be enlightened, only to crush our opponents.

Listen in this episode as legendary psychologist Lee Ross explains how to identify, avoid, and combat this most pernicious of cognitive mistakes.

"Science is wrong about everything, but you can trust it more than anything."

That's the assertion of psychologist Brian Nosek, director of the Center for Open Science, who is working to correct what he sees as the temporarily wayward path of psychology.

Currently, psychology is facing what some are calling a replication crisis. Much of the most headline-producing research of the last 20 years isn't standing up to attempts to reproduce its findings. Nosek wants to clean up the processes that have led to this situation, and in this episode, you'll learn how.

In medical school they tell you half of what you are about to learn won't be true when you graduate - they just don't know which half. In every field of knowledge, half of what is true today will be overturned, replaced, or refined at some point, and it turns out that we actually know when that will be for many things. In this episode, listen as author and scientist Sam Arbesman explains how understanding the half-life of facts can lead to better lives, institutions, and, of course, better science.

Before we had names for them or a science to study them, the people who could claim the most expertise on biases, fallacies, heuristics and all the other quirks of human reasoning and perception were scam artists, con artists, and magicians. On this episode, magician and scam expert Brian Brushwood explains why people fall for scams of all sizes, how to avoid them, and why most magicians can spot a fraudster a mile away.

Show notes at: www.youarenotsosmart.com - Become a patron at: www.patreon.com/youarenotsosmart

Do we have the power to change the outcome of history? Is progress inevitable? Is it natural? Are we headed somewhere definite, or is change just chaos that seems organized in hindsight? In this episode we explore these questions with University of Chicago historian Ada Palmer.

If dumping evidence into people’s laps often just makes their beliefs stronger, would we just be better off trying some other tactic, or does the truth ever win?

Do people ever come around, or are we causing more harm than good by leaning on facts instead of some other technique?

In this episode we learn the answers to these questions and others from two scientists who have learned how to combat the backfire effect. One used an ingenious research method to identify the breaking point at which people stop resisting and begin accepting the fact that they might be wrong. The other literally wrote an instruction manual for avoiding the backfire effect and debunking myths using the latest psychological research into effective persuasive techniques.

If you try to correct someone who you know is wrong, you run the risk of alerting their brain to a sort of existential, epistemic threat, and if you do that, the effortful thinking that person expends to escape can strengthen their beliefs instead of weakening them.

In this episode you'll hear from three experts who explain how trying to correct misinformation can end up causing more harm than good.

The research shows that when a strong-yet-erroneous belief is challenged, yes, you might experience some temporary weakening of your convictions, some softening of your certainty, but most people rebound from that and not only reassert their original belief at its original strength, but go beyond that and dig in their heels, deepening their resolve over the long run.

Psychologists call this the backfire effect, and this episode is the first of three shows exploring this well-documented and much-studied psychological phenomenon, one that you’ve likely encountered quite a bit lately.

In this episode, we explore its neurological underpinning as two neuroscientists at the University of Southern California’s Brain and Creativity Institute explain how their latest research sheds new light on how the brain reacts when its deepest beliefs are challenged.

Gordon Pennycook and his team at the University of Waterloo set out to discover if there was a spectrum of receptivity for a certain kind of humbug they call pseudo-profound bullshit – the kind that sounds deep and meaningful at first glance, but upon closer inspection means nothing at all. They wondered, is there a “type” of person who is more susceptible to that kind of language, and if so, what other things about personalities and thinking styles correlate with that tolerance and lack of skepticism, and why?

Even when the prison doors are left wide open, we sometimes refuse to attempt escape. Why is that?

In this rebroadcast of one of our most popular episodes we learn all about the strange phenomenon of learned helplessness and how it keeps people in bad jobs, poor health, terrible relationships, and awful circumstances despite how easy it might be to escape any one of those scenarios with just one more effort.

You'll learn how to defeat this psychological trap with advice from psychologists Jennifer Welbourne, who studies attributional styles in the workplace, and Kym Bennett who studies the effects of pessimism on health.

Legendary science historian James Burke returns to explain his newest project, a Connections app that will allow anyone to "think connectively" about the webs of knowledge available on Wikipedia.

Burke predicted back in 1978 that we’d one day need better tools than just search alone if we were to avoid the pitfalls of siloed information and confirmation bias, and this month he launched a Kickstarter campaign to help create just such a tool - an app that searches connectivity and produces something Google and social media often don’t - surprises, anomalies, unexpected results, and connections, in the same style as his documentary series, books, and other projects.

In the interview, Burke shares his latest insights on change, technology, the future, social media, models of reality, and more.

To support the Kickstarter campaign for the Connections app, here are some links:

In this divisive and polarized era, how do you bridge the political divide between left and right? How do you persuade the people on the other side to see things your way?

New research by sociologist Robb Willer and psychologist Matthew Feinberg suggests that the answer is in learning how to cross something they call the empathy gap.

When we produce arguments, we do so from within our own moral framework and in the language of our moral values. Those values rest on top of a set of psychological tendencies influenced by our genetic predispositions and shaped by our cultural exposure that blind us to alternate viewpoints.

Because of this, we find it very difficult to construct an argument using the same facts but framed in a different morality. Willer's work suggests that if we did, we would find it a far more successful route to persuading the people we usually think of as unreachable.

For computer scientist Chenhao Tan and his team, the internet community called Change My View offered something amazing, a ready-made natural experiment that had been running for years.

All they had to do was feed it into the programs they had designed to understand the back-and-forth between human beings and then analyze the patterns that emerged. When they did that, they discovered two things: what kinds of arguments are most likely to change people’s minds, and what kinds of minds are most likely to be changed.

In this episode you’ll hear from the co-founder of Reddit, the moderators of Change My View, and the scientists studying how people argue on the internet as we explore what it takes to change people’s perspective and whether the future of our online lives is thicker filter bubbles or the whittling away of bad ideas.

Julia Shaw's research demonstrates that there is no reason to believe a memory is more accurate just because it is vivid or detailed. Actually, that’s a potentially dangerous belief.

Shaw used techniques similar to police interrogations, and over the course of three conversations she and her team were able to convince a group of college students that those students had committed a felony crime.

In this episode, you’ll hear her explain how easy it is to implant the kind of false memories that cause people just like you to believe they deserve to go to jail for crimes that never happened and what she suggests police departments should do to avoid such distortions of the truth.

Why do people cheat? Why are our online worlds often so toxic? What motivates us to "catch 'em all" in Pokemon, grinding away for hours to hatch eggs?

In this episode, psychologist Jamie Madigan, author of Getting Gamers, explains how by exploring the way people interact with video games we can better understand how brains interact with everything else.

In this episode we interview Dean Burnett, author of "Idiot Brain: What Your Brain is Really Up To." Burnett's book is a guide to the neuroscience behind the things that our amazing brains do poorly.

In the interview we discuss motion sickness, the pain of breakups, why criticisms are more powerful than compliments, the imposter syndrome, anti-intellectualism, irrational fears, and more. Burnett also explains how the brain is kinda sorta like a computer, but a really bad one that messes with your files, rewrites your documents, and edits your photos when you aren't around.

Dean Burnett is a neuroscientist who lectures at Cardiff University and writes about brain stuff over at his blog, Brain Flapping hosted by The Guardian.

This episode’s guest, Michael Bond, is the author of The Power of Others, and reading his book I was surprised to learn that despite several decades of research into crowd psychology, the answers to most questions concerning crowds can still be traced back to a book printed in 1895.

Gustave Le Bon’s book, “The Crowd: A Study of the Popular Mind,” explains that humans in large groups are dangerous, that people spontaneously devolve into subhuman beasts who are easily swayed and prone to violence. That viewpoint has informed the policies and tactics of governments and police forces for more than a century, and like many prescientific musings, much of it is wrong.

Listen in this episode as Bond explains that the more research the social sciences conduct, the less the idea of a mindless, animalistic mob seems to be true. He also explains what police forces and governments should be doing instead of launching tear gas canisters from behind riot shields, a tactic that often creates the very situation they are trying to prevent. Also, we touch on the psychology of suicide bombers, which is just as surprising as what he learned researching crowds.

In this episode, psychologist Per Espen Stoknes discusses his book: What We Think About When We Try Not to Think About Global Warming.

Stoknes has developed a strategy for science communicators who find themselves confronted with climate change deniers who aren't swayed by facts and charts. His book presents a series of psychology-based steps designed to painlessly change people’s minds and avoid the common mistakes scientists tend to make when explaining climate change to laypeople.

Oddly enough, we don’t actually know very much about how to change people’s minds, not scientifically. That's why the work of a group of LGBT activists in Los Angeles is offering something valuable to psychology and political science - uncharted scientific territory.

The Leadership Lab has been developing a technique for the last eight years that can change a person’s mind about a contentious social issue after a 20-minute conversation.

This episode is about that group's redemption after their reputation was threatened by a researcher who, in studying their persuasion technique, committed scientific fraud and forced the retraction of his paper. That research and the retraction got a lot of media attention in 2015, but the story didn't end there.

In the show, you will meet the scientists who uncovered that researcher's fraud and then decided to go ahead and start over, do the research themselves, and see if the technique actually worked.

Common sense used to dictate that men and women should only come together for breakfast and dinner.

According to Victorian historian Kathryn Hughes, people in the early 19th century thought the outside world was dangerous and unclean and morally dubious and thus no place for a virtuous, fragile woman. The home was a paradise, while men went out into the world and got their hands dirty.

By the mid-1800s, women were leaving home to work in factories and much more, and if you believed in preserving the separate spheres, the concept that men and women should only cross paths at breakfast and dinner, then the approach of the 20th century created a lot of anxiety for you.

In this episode of the You Are Not So Smart Podcast, we explore how the separate spheres ideology is still affecting us today, and how some people are using it to scare people into voting down anti-discrimination legislation.

Hypothetical situations involving dragons, robots, spaceships, and vampires have all been used to prove and disprove arguments.

Statements about things that do not exist can still be true, and can be useful thinking tools for exploring philosophical, logical, sociological, and scientific concepts.

The problem is that sometimes those same arguments accidentally require those fictional concepts to be real in order to support their conclusions, and that’s when you commit the existential fallacy.

In this episode we explore the most logical logical fallacy of them all, the existential fallacy. No need to get out your pens and paper; we will do that for you, as we make sense of one of the most brain-breaking thinking mistakes we’ve ever discovered.

Here is a logic puzzle created by psychologists Daniel Kahneman and Amos Tversky.

Linda is single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with the issue of discrimination and social justice, and also participated in demonstrations. Which of the following is more probable: Linda is a bank teller or Linda is a bank teller AND is active in the feminist movement?

In studies, when asked this question, more than 80 percent of people chose the second option. Most people said it was more probable that Linda is a bank teller AND active in the feminist movement, but that's wrong. Can you tell why?

This thinking mistake is an example of the subject of this episode - the conjunction fallacy. Listen as three experts in logic and reasoning explain why people get this question wrong, why it is wrong, and how you can avoid committing the conjunction fallacy in other situations.
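For readers who want to check the logic directly, here is a minimal sketch in Python. The specific probabilities are hypothetical, chosen only for illustration, but the inequality it demonstrates holds for any values: a conjunction can never be more probable than either of its conjuncts.

```python
# Conjunction fallacy check: P(A and B) = P(A) * P(B | A), and since
# P(B | A) can never exceed 1, P(A and B) can never exceed P(A).
# The numbers below are hypothetical, picked only for illustration.

p_teller = 0.05                # P(Linda is a bank teller)
p_feminist_given_teller = 0.9  # P(active feminist | bank teller), even if high

p_both = p_teller * p_feminist_given_teller  # P(teller AND feminist)

# No matter what numbers you plug in, the conjunction loses:
assert p_both <= p_teller
```

However representative the feminist description feels, adding a condition can only shrink the probability, which is why the intuitive answer is the wrong one.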

If you traced back the ad hominem attack and the argument from authority to their shared source, you would find the genetic fallacy, a fallacy that appears when people trace things back to their sources.

We often overstate and overestimate just how much we can learn about a claim based on where that claim originated, and that's the crux of the genetic fallacy.

In this episode listen as three experts in logic and reasoning explain when we should and when we should not take the source of a statement into account when deciding if something is true or false.

If you believe something is bad because it is...bad, or that something is good because, well, it's good, you probably wouldn't use that kind of reasoning in an argument, yet, sometimes, without realizing it, that's exactly what you do.

In this episode three experts in logic and rationality explain how circular reasoning leads us to "beg the question" when producing arguments and defending our ideas, beliefs, and behaviors.

Some of our beliefs we see as either true or false, correct or incorrect. Others we see as probabilities, chances, odds. In one world, certainty; in the other, uncertainty.

In this episode you will learn from two experts in reasoning how to apply a rule from the 1700s that makes it possible to see all of your beliefs as being in “grayscale,” as neither black nor white, neither 0 nor 100 percent, but always somewhere in between, as a shade of gray reflecting your confidence in just how wrong you might be...given the evidence at hand.
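That rule from the 1700s is Bayes' theorem, and a minimal sketch shows how it turns evidence into a new shade of gray. The prior and likelihoods below are hypothetical numbers chosen purely for illustration.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior: updated confidence, always between 0 and 1."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Start 70% confident; then see evidence three times more likely
# if the belief is true than if it is false.
posterior = bayes_update(prior=0.7, p_evidence_if_true=0.6, p_evidence_if_false=0.2)
# Confidence rises (to 0.875 here) but never snaps to 0 or 100 percent.
```

Each new piece of evidence nudges the shade of gray up or down, which is the grayscale view of belief described above.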

In this episode, we explore why we are unaware that we lack the skill to tell how unskilled and unaware we are.

The evidence gathered so far by psychologists and neuroscientists seems to suggest that each one of us has a relationship with our own ignorance, a dishonest, complicated relationship, and that dishonesty keeps us sane, happy, and willing to get out of bed in the morning. Part of that ignorance is a blind spot we each possess that obscures both our competence and incompetence called the Dunning-Kruger Effect.

It's a psychological phenomenon that arises sometimes in your life because you are generally very bad at self-assessment. If you have ever been confronted with the fact that you were in over your head, or that you had no idea what you were doing, or that you thought you were more skilled at something than you actually were – then you may have experienced this effect. It is very easy to be both unskilled and unaware of it, and in this episode we explore why that is with professor David Dunning, one of the researchers who coined the term and a scientist who continues to add to our understanding of the phenomenon.

Does the Bermuda Triangle seem quite as mysterious once you know that just about any triangle of that size drawn over the globe just about anywhere will contain as many, if not more, missing planes?

When you desire meaning, when you want things to line up, when looking for something specific, you tend to notice patterns everywhere, which leads you to ask the question, "What are the odds?" Usually, the odds are actually pretty good.

Though some things in life seem too amazing to be coincidence, too odd to be random, too similar to be chance, given enough time (and enough events) randomness will begin to clump in places. You are born looking for clusters where chance events have built up like sand into dunes. Picking out clusters of coincidence is a predictable malfunction of a normal human mind, and it can lead to the Texas Sharpshooter Fallacy.

What do you do when a member of a group to which you belong acts in a way that you feel is in opposition to your values? Do you denounce the group, or do you redefine the boundaries of membership?

When our identities become intertwined with our definitions, we can easily fall victim to something called the No True Scotsman fallacy.

In this episode, you will learn from three experts in logic and argumentation how to identify, defend against, and avoid deploying this strange thinking quirk that leads to schisms and stasis in groups both big and small.

Obviously, the world isn't black and white, so why do we try to drain it of color when backed into a rhetorical corner?

Why do we have such a hard time realizing that we've suggested the world is devoid of nuance when we are in the heat of an argument?

In this episode we explore the black and white fallacy and the false dichotomies it generates. You'll learn how to spot this fallacy, what to do when someone uses it against you, and how to avoid committing it yourself.

When confronted with dogma-threatening, worldview-menacing ideas, your knee-jerk response is usually to lash out and try to bat them away, but thanks to a nearly unavoidable mistake in reasoning, you often end up doing battle with arguments of your own creation.

Your lazy brain is always trying to make sense of the world on ever-simpler terms. Just as you wouldn’t use a topographical map to navigate your way to Wendy’s, you tend to navigate reality using a sort of Google Maps interpretation of events and ideas. It’s less accurate, sure, but much easier to understand when details aren’t a priority. But thanks to this heuristic habit, you sometimes create mental men of straw that stand in for the propositions put forth by people who see the world a bit differently than you. In addition to being easy to grasp, they are easy to knock down and hack apart, which wouldn’t be a problem if only you noticed the switcheroo.

This is the essence of the straw man fallacy, probably the most common of all logical fallacies. Setting up and knocking down straw men is so easy to do while arguing that you might not even notice that you are doing it.

In this episode, you’ll learn from three experts in logic and arguing why human brains tend not to realize they are constructing artificial versions of the arguments they wish to defeat. Once you’ve wrapped your mind around that idea, you’ll then learn how to spot the straw man fallacy, how to avoid committing it, and how to defend against it.

If you have ever been in an argument, you've likely committed a logical fallacy, and if you know how logical fallacies work, you've likely committed the fallacy fallacy.

Listen as three experts in logic and arguing explain just what a formal argument really is, and how to spot, avoid, and defend against the one logical fallacy that is most likely to turn you into an internet blowhard.

How strong is your bullshit detector? And what exactly IS the scientific definition of bullshit?

In this episode we explore what makes a person susceptible to bullshit, how to identify and defend against it, and what kind of people are the most and least likely to be bowled over by bullshit artists and other merchants of pseudo-profound, feel-good woo.

Our guest in this episode of the You Are Not So Smart Podcast is psychologist Laurie Santos who heads the Comparative Cognition Laboratory at Yale University. In that lab, she and her colleagues are exploring the fact that when two species share a relative on the evolutionary family tree, not only do they share similar physical features, but they also share similar behaviors. Psychologists and other scientists have used animals to study humans for a very long time, but Santos and her colleagues have taken it a step further by choosing to focus on a closer relation, the capuchin monkey; that way they could investigate subtler, more complex aspects of human decision making – like cognitive biases.

One of her most fascinating lines of research has come from training monkeys how to use money. That by itself is worthy of a jaw drop or two. Yes, monkeys can be taught how to trade tokens for food, and for years, Santos has observed capuchin monkeys attempting to solve the same sort of financial problems humans have attempted in prior experiments, and what Santos and others have discovered is pretty amazing. Monkeys and humans seem to be prone to the same biases, and when it comes to money, they make the same kinds of mistakes.

What effect does Google have on your brain? Here's a weirder question: what effect does knowing that you have access to Google have on your brain?

In this episode we explore what happens when a human mind becomes aware that it can instantly, on command, at any time, search for the answer to any question, and then, most of the time, find it.

According to researcher Matthew Fisher, one of the strange side effects is an inflated sense of internal knowledge. In other words, as we use search engines, over time we grow to mistakenly believe we know more than we actually do even when we no longer have access to the internet.

In psychology, they call it naive realism, the tendency to believe that the other side is wrong simply because they are misinformed.

According to Lee Ross, co-author of the new book, The Wisest One in the Room, naive realism has three tenets. One, you tend to believe that you arrived at your political opinions after careful, rational analysis through unmediated thoughts and perceptions. Two, since you are extremely careful and devoted to sticking to the facts and thus free from bias and impervious to persuasion, anyone else who has read the things you have read or seen the things you have seen will naturally see things your way, given that they’ve pondered the matter as thoughtfully as you have. And three, if anyone does disagree with your political opinions it must be because they simply don’t have all the facts yet.

Since this is the default position most humans take when processing a political opinion, when confronted with people who disagree, you tend to assume there must be a rational explanation. Usually, that explanation is that the other side is either lazy or stupid or corrupted by some nefarious information-scrambling entity like cable news, a blowhard pundit, a charming pastor or a lack thereof.

Ross and Ward concluded that naive realism leads people to approach political arguments with the confidence that “rational open-minded discourse” will naturally lead to a rapid narrowing of disagreement, but that confidence is usually short-lived. Instead, they say our “repeated attempts at dialogue with those on the ‘other side’ of a contentious issue make us aware that they rarely yield to our attempts at enlightenment; nor do they yield to the efforts of articulate, fair-minded spokespersons who share our views.” In other words, it’s naive to think evidence presented from the sources you trust will sway your opponents because when they do the same, it never sways you.

Listen in this episode as legendary psychologist Lee Ross explains how to identify, avoid, and combat this most pernicious of cognitive mistakes.

Just as you can change your body at the atomic level by lifting weights, you can willfully alter your brain by...thinking in a certain way. In this episode we explore using your brain to change your brain at the level of neurons and synapses beyond what is possible through other methods like learning a new language or earning a degree in chemistry. With mindfulness meditation, the evidence seems to suggest that one can achieve a level of change that would be impossible otherwise. The more you attempt to focus, the better you get at focusing on command, and so a real change begins taking place - you slowly become able to think differently, to hold thoughts differently and to dismiss thoughts that before led to attention difficulties or what feels like unwanted thoughts or clutter - and that’s not magical or the result of shaking hands with a deity, it’s biological. Listen as author and meditation teacher Michael Taft explains the benefits of the secular, scientific practice of modern mindfulness meditation.

Is all this new technology improving our thinking or dampening it? Are all these new communication tools turning us into navel-gazing human/brand hybrids, or are we developing a new set of senses that allow us to benefit from never severing contact with the people most important to us?

That's the topic of this episode of the You Are Not So Smart Podcast, and to answer these questions we welcome this episode's guest, Clive Thompson, who is the author of Smarter Than You Think: How Technology is Changing Our Minds for the Better. As the title suggests, he disagrees with the naysayers, and his book is an impressive investigation into why they are probably (thankfully) wrong.

Reframing is one of those tools that emerged from psychology that just plain works...it’s practical, simple, and with practice and repetition leads to real change in people with a variety of problems. It works because we rarely question our own interpretations, the meanings we construct when examining a set of facts, or our own introspection of internal emotional states. So much of what we feel in anticipation is just best guesses and assumptions, models of reality that may or may not be accurate and will likely pan out much differently than we predict.

In this episode, we meet Tom Bunn, a former pilot, and Robert Morris, a startup CEO, who are both exploring the power of reframing to change people's thoughts and behaviors - one to conquer the fear of flying, the other to crowdsource a new social network devoted to mental health.

Just how much control do you really have over your life, relationships, happiness, and the events around the world? Well, much less than we would like to admit.

Yet, time and again, we see in psychology that in situations in which the outcomes are clearly, undoubtedly random, people tend to latch onto any shred of evidence that could be interpreted otherwise.

Our guests in this episode are psychiatrist Michael I. Bennett and his comedy-writer daughter Sarah Bennett, whose new book, Fuck Feelings, makes the case for accepting the illusion of control as a guiding principle for living a better life.

Ten years after Katrina, residents of New Orleans and portions of Mississippi are still experiencing PTSD. In this episode we explore what causes this disorder, why it happens, what triggers the symptoms, and how to combat the effects with University of New Orleans psychologist Robert D. Laird.

Before we had names for them or a science to study them, the people who could claim the most expertise on biases, fallacies, heuristics and all the other quirks of human reasoning and perception were scam artists, con artists, and magicians. On this episode, magician and scam expert Brian Brushwood explains why people fall for scams of all sizes, how to avoid them, and why most magicians can spot a fraudster a mile away.

Is psychology too WEIRD? That's what this episode's guest, psychologist Steven J. Heine, suggested when he and his colleagues published a paper arguing that psychology wasn't the study of the human mind, but the study of one kind of human mind, the sort generated by the kinds of brains that happen to be conveniently located near the places where research is usually conducted - North American college undergraduates. They called them the WEIRDest people in the world, short for Western, Educated, Industrialized, Rich, and Democratic - the kind of people who make up less than 15 percent of the world's population. In this episode, you'll learn why psychology took so long to figure out it was studying outliers, and what it means for the future of the field.

Is the person you believe to be the protagonist of your life story real or a fictional character? In other words, is your very self real or is it an illusion? According to psychologist Bruce Hood, the person at the center of your life isn't really there; it's all neurological smoke and mirrors. Sure, you have the sensation that you have a self, and that sensation is real, but the beliefs and ideas that spring from it are not. Learn all about it in this episode in which you'll hear some new material mixed with a rebroadcast of episode four's interview with the author of The Self Illusion, Bruce Hood.

Can new computer programs rid us of the cognitive errors that lead to learned helplessness in the classroom? In this episode Ulrik Christensen, senior fellow of digital learning at McGraw-Hill Education, explains how adaptive learning tools are changing the way teachers approach students, empowering educators to provide the kind of attention required to pass along mastery in areas where traditional approaches don't seem to work.

Stuck in a bad situation, even when the prison doors are left wide open, we sometimes refuse to attempt escape. Why is that? In this episode learn all about the strange phenomenon of learned helplessness and how it keeps people in bad jobs, poor health, terrible relationships, and awful circumstances despite how easy it might be to escape any one of those scenarios with just one more effort. In the episode, you'll learn how to defeat this psychological trap with advice from psychologists Jennifer Welbourne, who studies attributional styles in the workplace, and Kym Bennett who studies the effects of pessimism on health.

It’s peculiar, your inability to predict what will make you happy, and that inability leads you to do stupid things with your money. Once you get a decent job that allows you to buy new shoes on a whim, you start accumulating stuff, and the psychological research into happiness says that stuff is a crappy source of lasting joy. In this rebroadcast, listen as psychologist Elizabeth Dunn explains how to get more happiness out of your money...with science!

Work sucks, but it doesn't have to. In this episode we go inside Google in an interview with Laszlo Bock, head of People Operations. Bock has helped the company conduct experiments and introduce policies and procedures that apply knowledge gained from psychology and neuroscience concerning biases, fallacies, and other weird human behavior quirks. The result has been a workplace where people are happier, more productive, and better able to pursue that which fulfills their ambitions. Learn all about Google's approach as Bock describes his new book, Work Rules, a collection of insights from Google's evidence-based, data-driven approach to human resources.

What if you could give yourself a superpower? That's what Jia Jiang wondered when he began a quest to remove the fear of rejection from his brain and become the risk-taking, adventurous person he always wanted to be. Hear how he forced himself to feel the pain of rejection 100 times in 100 days in an effort to desensitize himself, and how he recorded every moment on his way to making himself a better person.

Can you change a person's mind on a divisive social issue? The latest science says...yes. But it will require two things: contact and disclosure. In this episode you'll travel to Mississippi to see how professional mind changers are working to shift attitudes on LGBT rights, and you'll learn how a man in Los Angeles conducted 12,000 conversations until he was able to perfect the most powerful version of contact possible. In one 22-minute chat, Dave Fleischer can change people's minds on issues they've felt strongly about for decades, and change them forever.

Public shaming is back. Once done in town squares, the subjects of our ridicule locked in pillories and unable to avoid the rotten fruit and insults we hurled at them, now the shaming takes place on the internet. No longer our neighbors, the new targets are strangers and celebrities, and instead of courts meting out justice, it is the aggregate outrage of well-meaning people on Twitter just like you. Listen as author Jon Ronson describes his new book, “So You’ve Been Publicly Shamed,” in which he spends time with people who have had their lives ruined by modern, web-based public shamings in an attempt to reveal to each of us what can happen when, alone but together, we obliterate people for unpopular opinions, off-color jokes, offensive language, and professional faux pas.

In this inbetweenisode you will hear an excerpt from a lecture I gave at DragonCon 2014 and an interview with neurologist and host of The Skeptic's Guide to the Universe Steven Novella, who discusses the psychology and neuroscience behind conspiracy theories and conspiracy theorists.

In this episode, we talk to Danielle Ofri, a physician and author of "What Doctors Feel" - a book about the emotional lives of doctors and how compassion fatigue, biases, and other mental phenomena affect their decisions, their motivations, and their relationships with patients.

You'll also hear Ofri discuss emotional epidemiology, the viral-like spread of fear and other emotions that can lead to panics like those we've seen surrounding Ebola, the Swine Flu, SARS, and other illnesses.

This episode is a rebroadcast of two interviews from episode 20, all about how we are very, very bad at predicting the future, both in our personal lives and as a species.

The first interview is with Matt Novak who writes for Paleofuture, a blog at Gizmodo that explores how people from the past imagined, often very incorrectly, what the future might be like in the decades to come. The second is with James Burke, the legendary science communicator and historian who created Connections and The Day the Universe Changed.

Scientists are using rubber hands and virtual reality to transfer people's minds into avatars designed to look like members of groups and subcultures to which the subjects do not belong, and the results have been - well, trippy. Can changing your body, even just for a few minutes, change your mind? Can a psychological body transfer melt away long-held opinions and unconscious prejudices? Learn what cognitive neuroscientist Lara Maister has discovered in her unconventional experiments.

In this episode, two stories, one about a football game that split reality in two for the people who witnessed it, and another about what happened when a naked man literally appeared out of thin air inside a couple's apartment while they were getting ready for work.

How far back can we trace our irrational behaviors and cognitive biases? Evolutionarily speaking, why do we even do these things? Can we blame our faulty logic on our cultures and institutions, or should we blame it on our biology and our genetic inheritance? Our guest on this episode is psychologist Laurie Santos who has created a novel approach to solving these questions - a marketplace where monkeys learn how to use money just like humans, and where they also tend to make the same kind of mistakes we do.