In medical school they tell you half of what you are about to learn won't be true when you graduate - they just don't know which half. In every field of knowledge, half of what is true today will be overturned, replaced, or refined at some point, and it turns out that we actually know when that will happen for many things. In this episode, listen as author and scientist Sam Arbesman explains how understanding the half-life of facts can lead to better lives, better institutions, and, of course, better science.

Confirmation bias is our tendency to seek evidence that supports our beliefs and confirms our assumptions when we could just as well seek disconfirmation of those beliefs and assumptions instead.

This is such a prevalent feature of human cognition that, until recently, a second bias has been hidden in plain sight. Our past beliefs and future desires usually match up. Desirability is often twisted into confirmation like a single psychological braid - but recent research suggests that something called desirability bias may be just as prevalent in our thinking. When future desires and past beliefs are incongruent, desire wins out.

What makes you happy? As in, what generates happiness inside the squishy bits that reside inside your skull? That's what author and neuroscientist Dean Burnett set out to answer in his new book, Happy Brain, which explores both the environmental and situational factors that lead to and away from happiness, and the neurological underpinnings of joy, bliss, comfort, love, and connection. In the episode you'll hear all that and more as we talk about what we know so far about the biological nature of happiness itself.

In this episode, we sit down with author Will Storr to talk about his new book -- Selfie: How We Became so Self-Obsessed, and What it is Doing to Us.

The book explores what he calls “the age of perfectionism” -- our modern struggle to meet newly emerging ideals and standards that tell us we are falling short of the person we ought to be. As he says in the book, "Perfectionism is the idea that kills," and you’ll hear him explain what he means by that in the interview.

Despite its relative invisibility, a norm, even a dying one, can sometimes be harnessed and wielded like a weapon by conjuring up old fears from a bygone era. It's a great way to slow down social change if you fear that change. When a social change threatens your ideology, fear is the simplest, easiest way to keep more minds from changing.

In this episode of the You Are Not So Smart Podcast, we explore how the separate spheres ideology is still affecting us today, and how some people are using it to scare people into voting down anti-discrimination legislation.

When faced with an inescapable and unwanted situation, we often rationalize our predicament so as to make it seem less awful and more bearable, but what if that situation is a new law or a new administration? The latest research suggests that groups, nations, and cultures sometimes rationalize the new normal in much the same way, altering public opinion on a large scale.

In this episode we explore new research that suggests for the majority of the mind change we experience, after we update our priors, we delete what we used to believe and then simply forget that we ever thought otherwise.

In the show, psychologists Michael Wolfe and Todd Williams take us through their new research, which suggests that because our brains so value consistency, and are so determined to avoid the threat of decoherence, we hide the evidence of our belief change. That way, the story we tell ourselves about who we are can remain more or less heroic, with a stable, steadfast protagonist whose convictions rarely waver -- or, at least, they don’t waver as much as those of shifty, flip-flopping politicians.

This can lead to a skewed perception of the world, one built on the assumption that mind change is rare and difficult to come by. And that can lead us to avoid information that might expand our understanding of the world, because we assume it will have no impact.

The truth, say Wolfe and Williams, is that mind change is so prevalent and constant that the more you expose yourself to counterevidence, the more your worldview will erode, replaced by a better, more accurate one -- it's just that you probably won't realize it until you look back at old posts on social media and cringe.

Little did the champions of the Enlightenment know that once we had access to all the facts…well, reason and rationality wouldn’t just immediately wash across the land in a giant wave of enlightenment thinking. While that may be happening in some ways, the new media ecosystem has also unshackled some of our deepest psychological tendencies, things that enlightenment thinkers didn’t know about, weren’t worried about, or couldn’t have predicted. Many of which we’ve discussed in previous episodes like confirmation bias, selective skepticism, filter bubbles and so on. These things have always been with us, but modern technology has provided them with the perfect environment to flourish.

In this episode, we explore another such invasive psychological species called active information avoidance, the act of keeping our senses away from information that might be useful, that we know is out there, that would cost us nothing to obtain, but that we’d still rather not learn. From choosing not to open bills, visit the doctor, check your bank account, or read the nutrition information on the back of that box of Girl Scout Cookies, we each choose to remain ignorant when we’d rather not feel the anguish of illumination. But that same tendency can also cause great harm to both individuals and whole cultures when it spreads through politics, science, markets, and medicine. In this show, you’ll learn how.

Last year on this show, we did three episodes about the backfire effect, and by far, those episodes were the most popular we’ve ever done.

In fact, the famous web comic The Oatmeal turned them into a sort of special feature, and that comic of those episodes was shared on Facebook a gazillion times, which led to stories about the comic in popular media, and then more people listened to the shows, and on and on it went. You can go see it at The Oatmeal right now at the top of their page. It’s titled “You are not going to believe what I am about to tell you.”

The popularity of the backfire effect extends into academia. The original paper has been cited hundreds of times, and there have been more than 300 articles written about it since it first came out.

The backfire effect has a special allure to it because, on the surface, it seems to explain something we’ve all experienced -- when we argue with people who believe differently than we do, who see the world through a different ideological lens, they often resist our views and refuse to accept our way of seeing things, and it often seems like we do more harm than good, because they walk away seemingly more entrenched in their beliefs than before the argument began.

But…since those shows last year, researchers have produced a series of new studies into the backfire effect that complicate things. Yes, we are observing something here, and yes, we are calling it the backfire effect, but everything is not exactly as it seems, and so I thought we should invite these new researchers on the show and add a fourth episode to the backfire effect series based on what they’ve found. This is that episode.

• The Great Courses: Free month at www.thegreatcoursesplus.com/smart
• Squarespace: Use the offer code SOSMART at www.squarespace.com for 10 percent off your first purchase.
• BeachBody on Demand: Get a free trial membership, with access to the entire platform, when you text SMART to 303030.

Our guest for this episode, Will Storr, wrote a book called The Unpersuadables: Adventures with the Enemies of Science.

In that book, Storr spends time with Holocaust deniers, young Earth creationists, people who believe they’ve lived past lives as famous figures, people who believe they’ve been abducted by aliens, people who stake their lives on the power of homeopathy, and many more – people who believe things that most of us do not.

Storr explains in the book that after spending so much time with these people it started to become clear to him that it all goes back to that model of reality we all are forced to generate and then interact with. We are all forced to believe what that model tells us, and it is no different for people who are convinced that dinosaurs and human beings used to live together, or that you can be cured of an illness by an incantation delivered over the telephone. For some people, that lines up with their models of reality in a way that’s good enough. It’s a best guess.

Storr proposes you try this thought experiment. First, answer this question: Are you right about everything you believe? If you are like most people, the answer is no. Of course not. As he says, a yes would mean you are a godlike and perfect human being. You’ve been wrong enough times to know it can’t be true. You are wrong about some things, maybe many things. That leads to a second question – what are you wrong about? Storr says that when he asked himself this second question and started listing all the things he believed, checking them off one at a time as true, he couldn’t think of anything about which he was wrong.

In this episode of the YANSS Podcast, we sit down with legendary science historian James Burke.

For much of his career, Burke has been creating documentaries and writing books aimed at helping us to make better sense of the enormous amount of information that he predicted would one day be at our fingertips.

In Connections, he offered an “alternate view of history” in which great insights took place because of anomalies and mistakes, because people were pursuing one thing, but it led somewhere surprising or was combined with some other object or idea they could never have imagined by themselves. Innovation took place in the spaces between disciplines, when people outside of intellectual and professional silos, unrestrained by categorical and linear views, synthesized the work of people still trapped in those institutions, who, because of those institutions, had no idea what the others were up to and therefore couldn’t predict the trajectory of even their own disciplines, much less history itself.

In The Day the Universe Changed, Burke explored the sequential impact of discovery, innovation, and invention on how people defined reality itself. “You are what we know,” he wrote, “and when the body of knowledge changes, so do we.” In this view of change, knowledge is invented as much as it is discovered, and new ideas “nibble at the edges” of common knowledge until values considered permanent and fixed fade into antiquity just like any other obsolete tool. Burke said that our system of knowledge and discovery has never been able, until recently, to handle more than one or two ways of seeing things at a time. In response, we have long demanded conformity with the dominant worldview or with similarly homogenous ideological binaries.

My favorite line from the book has to do with imagining a group of scientists who live in a society that believes the universe is made of omelettes and goes about designing instruments to detect traces of interstellar egg residue. When they observe evidence of galaxies and black holes, to them it all just seems like noise. Their model of nature cannot yet accommodate what they are seeing, so they don’t see it. “All that can accurately be said about a man who thinks he is a poached egg,” joked Burke, “is that he is in the minority.”