THE SEARCH BAR

Why do we believe what we believe?

38 minutes

What is justified belief? And what drives belief in conspiracy theories?

Guest: Josh Smith, philosophy professor at Central Michigan University

Summary 

In this episode of The Search Bar, host Adam Sparkes interviews Josh Smith, a professor of philosophy, about what makes a belief justified and what drives belief in conspiracy theories. They delve into epistemology, the field that asks what knowledge is and how we can tell whether our beliefs are justified, and they explore the idea of relevant alternatives: the view that evidence gives you knowledge when it rules out the relevant ways you could be wrong. They discuss the challenges of weighing evidence, the biases that can skew our reasoning, the importance of recognizing our own blind spots, and the difficulty of self-policing our thoughts. They also take up misinformation and how to spot it, treating the surprise a piece of information elicits as a cue to seek more evidence. The episode concludes with a discussion of how to have productive conversations with people who hold unjustifiable beliefs, and why empathy, understanding, and open-mindedness matter in those discussions.

Transcript

Introduction

Adam: What is justified belief, and what drives belief in conspiracy theories? Welcome to The Search Bar. You've got questions. Let's find some answers. I'm your host, Adam Sparkes, and on today's episode, we're searching for answers on why people believe what they believe. Josh Smith, professor of philosophy at Central Michigan University, is here to help us do just that. Well, thanks for coming in today, Josh. We're going to talk a little bit about philosophy because you're a philosopher, and we're going to talk more specifically about epistemology, which is a term that I had heard in my life, but I wasn't entirely certain what it was until about a week ago.

And I think I have at least a better grasp now. So I thought we could start by trying to define what that is for folks that are listening. We were talking before we started recording, and I thought we could do so from at least somewhat of a place of frightening nihilism, which is to say, how do we know that we're even having this conversation right now? How do we know, or do we know?

What is epistemology?

Josh: Yeah, so this is one of those terms that has kind of come down through the years, and we use it because it's stuck. Basically, it's just about what we know and when we believe in a way that is sensible. So how do we know we're having this conversation right now? There are a number of different competing views about what knowledge is, but many of them share a kind of family resemblance. You can think about it in terms of what people call relevant alternatives. Sometimes when you're worried about whether you've gotten a true belief, you want to make sure that you know what you're talking about. So some people pop the hood of their vehicle and they look and they point and they say, yeah, that's the engine, but they can't pick out the parts of the engine. Other people pop the hood and they're like, yep, there's the alternator, there's the starter.

Here's the exhaust manifold. Oh, it's on the top of this engine. Things like that. And those people have some information that they're able to use that other people lack. I mean, none of us is a specialist in everything. There are only 24 hours in a day, so some of us pick up information about this, other people pick up information about that. So the relevant alternatives approach to knowledge just says, look, if your evidence rules out the relevant alternatives to what you believe, then you know. So here's a coffee cup in front of me. Here's a relevant alternative: it is in fact a football.

If I can't tell a coffee cup from a football, I don't know a coffee cup when I see one.

Adam: Right?

Josh: On the other hand, if I don't know a coffee cup from a really clever hologram of a coffee cup, I mean, suppose there are some projectors up in the ceiling somewhere and there isn't in fact a coffee cup here, it's just a hologram. That's not something we generally have to worry about, and so it's not a relevant alternative. Of course, the crucial question for a view like this is, well, how do you separate the relevant from the irrelevant alternatives? Different people have different answers to that question, but by and large, a lot of people agree on that framework for thinking about knowing. So how do we know we're sitting here having this conversation? Well, what are the alternatives to that?

Adam: That we're in a dream.

Josh: Yeah, something like that, and those alternatives are by and large going to be irrelevant. So that's how we know.

What is justified belief?

Adam: And that's where, just for everyone that's listening, for me, when I was trying to learn about this, I ended up at this place of nihilism. Like, well, if you really try to flatten your mind out and make as much surface available as you can, to use, I guess, a really gross metaphor, just peel the brain open and lay it flat, I try to give myself as much surface as possible to be a little less pragmatic about thinking about what is true.

I have to first assume that what I'm experiencing is valid at all, and that's where that nihilism comes in, and that's, I think, where epistemology sort of starts: how do I have kind of graded affirmations of what is true, whether I'm here or not, whether my senses are failing me or not? We have to have a baseline for what we agree is a way to make a belief justifiable, and I think justifiable belief is sort of your area.

Josh: Thinking about when we believe well is interesting. So in my backyard I have some fruit trees, and so local deer like to show up in my backyard and I love them. I think they're great. My neighbors don't because they get into their gardens and things like this. My deer aren't good at picking out which people who come near are friendly and which aren't.

They just run because that's their best way to survive. Deer tend to be skittish. We're a little more complicated than that. We don't just have instincts that kick in. We have robust mental lives and we're able to do things like consider evidence and weigh it.

Now, that's really hard to do, it turns out, but we've been thinking about this for a long time and have a pretty good handle on when we're doing pretty well believing and when we're not. As for having justified beliefs, the simplest view is just that you believe well when you believe in accordance with your evidence.

And it sounds super simple, but exactly what that means is tricky. So you can think about it legalistically, right? If you think about what lawyers try to do with juries, they present evidence, and then the jury is supposed to weigh the evidence. But what you want is both sides presented, so to speak. Even still, that doesn't address the question of how you weigh evidence. Is all evidence created equal? For instance, suppose somebody tells you something. Is that evidence for believing that what they said is true? Should you trust them? Do you need to have evidence that they are the kind of person who says true things about that particular topic? So there are all those questions like that. The way I think it's easiest to think about it is this: it's really hard to get justified beliefs, but what's important is recognizing that, whatever you think, you might be wrong. I mean, there are all kinds of topics. Some of them are really mundane and boring, like, oh, there's a coffee cup in front of me. It doesn't take a whole lot of evidence to have a justified belief in that.

Adam: Unless you want to do the nihilism thing.

Josh: Unless you want to get really radically skeptical, right? And then the question is, how do you not consider that? And again, if you just think about it in terms of relevance and what our ordinary practices are, then you just say, I'm not worried about it. So, John Locke has this famous quote at the beginning of his Essay Concerning Human Understanding where he says, look, I'm not interested in the kind of knowledge that answers every doubt, because we're going to have doubts, right? I'm interested in the kind of knowledge the sailor has when he is trying to figure out how long his anchor chain needs to be, or how long the rope needs to be to effectively lash the ship to the dock. That's the kind of knowledge I care about. So you have to think, look, what's the project we're engaged in? If we're worried about answering every possible challenge, we're doomed. We can't, right? I mean, our best science tells us we have very limited access to information because we only have certain kinds of sensory systems.

What is knowledge?

Adam: So, I'm guessing, from the place you're at as a philosopher, that I need to have a baseline: that I can trust that I am here, and that I can trust that, for the most part, the sensory perceptions that I'm having are reliable. In the early oughts, there was a movie about metaphysics. It was really popular, and it had this scene where they talk about people directing thoughts at water and the water changing under the microscope. And I was like, this is the most ridiculous thing I've ever seen in my life. But people thought it was real. And I think as we start to examine and understand these types of things, how we beings interact with metaphysics and astrophysics and all this stuff, how does that all come together? It becomes so muddy.

Josh: Yeah. This is why it's really crucial to step back and say, okay, what's the project? What are we trying to do? If we're trying to answer every challenge, we're doomed. We can't do it. Here's a project that maybe we can get a handle on. We say that we know things. I say that I know things. I say that other people know things. I say that there are some things that I don't know. Same with other people. We have this practice of talking about knowledge, and we do things like say, you shouldn't believe that, or, you don't have enough evidence to think that. So instead of asking, what is it to know anything at all, can we answer every challenge, what we can ask is, what are we doing with that practice? When do we correctly say of each other that we know things, in a way that makes sense of our practice? And that's a much more tractable project. So that's the sort of thing that I tend to think about: what are we up to when we're talking about knowing things and not? And there is always, in the background, that sort of radical skeptical specter, where you're like, yeah, can I say something about that too?

And this is part of why people find the relevant alternatives approach attractive: because you can say no. Just don't worry about that. The difficult position we're in with respect to knowing certain things is that we just lack information and there's no way to get it. We have the written history we have, and that's it. It's unfortunate that we don't have more and we don't have time machines. It would be great to be able to go back and just kind of witness what actually happened at various points in history, but we can't. And so we have to form a picture of what happened with the limited resources we have. Now, are we going to call any of that knowledge? It would be very easy to get it wrong. But there's a separate question: have we done a pretty good job, with the information we have, of putting together a picture? If so, then it seems like it's reasonable to accept it.

Do we need evidence to support our beliefs?

Adam: For us to realize that while I might be able to look at the car and read the manual and learn a lot of stuff, me validating those beliefs, yeah, I now know where the alternator is, I now know where the timing belt is, I don't trust the mechanic, I can read this myself, there is a weird gray area where, yes, the mechanic isn't necessarily the ultimate authority on this, but there still may be a better authority than you. And I'm saying that in a roundabout way to say that the

Catholic church was the only mechanic, probably, for a while when it came to crop rotations, and again, I'm just using Western authority figures here. At the time, that's where you got your knowledge from. The priest gave you the knowledge. I can't read Latin; I can't read at all.

Now, fast forward 300 years: I can read the owner's manual of my car. Am I as informed as the mechanic? Am I justified in my belief about how to fix the car in the same way?

Josh: You can read the owner's manual, you can go on the internet and read a ton more. You can watch videos of people doing things with their vehicles that might be just the same as yours. And that's a much more challenging question, because there's a thing that people forget, right? There are all kinds of simple issues and then there are really challenging issues, and people look at the simple issues and kind of extrapolate to the more challenging issues and think, oh, okay, it's just as simple as that, and it really isn't. Getting a picture of the world that makes sense is really, really challenging. So one of the key things that people kind of lose as they do this is the possibility that they might be wrong.

And when you're weighing evidence, that's one of those things you really need to keep in mind: I could be wrong. So this crops up in a number of ways; people forget to think about, well, okay, if I was wrong, what should I expect to see here? A really simple example of this is anytime somebody gets sick. Especially right now in mid-Michigan, there are so many things going around: flu, COVID, all kinds of various colds. So suppose somebody starts feeling a little under the weather. They might think, oh, COVID, right, because that's still a big thing for us, and it might be at the front of their mind because it's salient to them, maybe. But what if they were wrong? What if it isn't? Should they still expect to have those same experiences that they're having? Well, yeah, there are a number of things that could account for that, which is why often it's important that you do things like run certain tests.

So, if you have a home COVID test, that would be a way to get more information, because if you're wrong, then that might help you understand that. Go to your doctor, go to the expert, and they'll do some more tests to get more information. So there's a sense in which the lone free thinker is a great kind of approach to things, and it's really fun, because one of my biggest complaints about a lot of people is that they just aren't curious enough. I mean, it's a big, fascinating world out there. So you want people to be curious and investigate things. On the other hand, it's important that they remember that they could be wrong at any stage. You used the word prove, which has a couple of different meanings, and on one way of thinking about it, it's definitive, which, when we can get that, is fantastic.

Adam: Feels good.

Josh: But we are almost never in that position. Almost never. Usually what we get is some kind of evidence that makes it more likely that something is right or wrong. And weighing that likelihood is really, really difficult, because we have all kinds of biases; some are cultural, some are maybe hardwired. I mean, like the deer in my yard: if they're biased towards running when they see the shape of a person, maybe that helps 'em survive. And maybe we have some biases that are like that, and others too. We all like to be right. So when we find things that suggest that we are indeed right, we tend to overemphasize that and think, ah yes, see, here's more confirmation that I'm right. And when we see stuff that suggests we're wrong, we're like, ignore that, that's minimal evidence. That ability to try to remain objective in weighing evidence is really, really challenging, which is why the scientific community is, I think, a really good community to look at, because the overall project of what they're all doing is pretty near ideal in terms of how the rest of us could approach trying to form our own pictures of the world.

Adam: If you want to be that uncertain and act like everybody's snap judgment of their own truth is valid, then you'd better not get on an airplane, and you'd better not put your cell phone to your head, because you'd have to treat all of the worst possible outcomes as a lot more likely than we normally assume them to be, I guess, if that makes sense in the way that I've structured it. But we have to go on some of those presumptions.

Whether we've really validated them ourselves or not, because we've kind of culturally validated that commercial airlines are pretty safe.

They're not comfortable, but they're safe.

Josh: Right? Yeah, incredibly safe. Even on really bad days, the odds of you getting where you're going safely are really, really good.

Adam: And yet, because our rational brain and our kind of, I dunno if it's a reptilian brain or our ancient mammalian brain or whatever it is, are always fighting, flying is a super common fear.

But most of those people get into a car and jump on a highway and do a 40-minute average commute every day, and it is far more dangerous. In fact, they might come home from the airport, wake up the next day, and go for a joy ride, after having had three gin and tonics to get through a two-and-a-half-hour flight. And I'm not making fun of that person; I have irrational fears too. But it just goes to show you that we act like we don't know anything, compared to what we actually do know.

Why is it difficult to get an unbiased view of the world?

Josh: And part of the challenge is we also don't know what we don't know. We're in a situation where we have blind spots. It can be incredibly difficult to learn that you have a blind spot because from your own perspective, how are you supposed to figure out that you have a blind spot? We need other people for that, but again, that's challenging because they might be like us. It's funny that you mentioned certain kinds of risk aversion because I'm a very risk-averse person and I like to watch videos about all of the things in Australia that are terrible.

Adam: Oh, my gosh.

Josh: I know that the likelihood of encountering any of them is low, and yet I am in no hurry to head to Australia.

Adam: I'm actually sort of with you on that. And I'm a guy who spends a lot of time outdoors; I like the outdoors. And then I'll watch Aussies and I'm like, not there, though, right?

Josh: Yeah.

Adam: Almost anywhere but there. Despite what we've been talking about here for the last hour, we have all this information, but we're starting to question, rightly or wrongly, and I would argue wrongly, what information is good information. What information is real? What gets people set off in this direction, on beliefs that, at least to a lot of us, seem absurd? Earlier you said this could be a football, right?

If you told me it was a football and then you left, I would be like, what makes you start being the guy who thinks that the coffee cup is a football? But then how do you convince Aaron and Kyle, who are in the room here with us, that it's also a football? You'll never convince me. I like to think so, anyway. But how do we fall for that type of thought?

And I know it's a ridiculous example, and you can certainly feel free to use one that you're more familiar with.

Josh: But these things sometimes actually happen. So for instance, I don't have a lot of sympathy for the anti-vax community, except for people who are aware of Southern history, usually because they're from the South, and who say, listen, it's the government telling me I need to do that. I can't trust that. And then if you think, wow, that's kind of conspiratorial, what do you think they're doing? And if they come out with something like nanobots, then, alright, well, let's have a talk about that, right?

Adam: By Southern history, do you mean the American South or the global South?

Josh: The American South. But if people know about the Tuskegee syphilis study.

Adam: Right? That's what I was driving at.

Josh: Where the government found that there was a syphilis outbreak in a Black community in Tuskegee, Alabama, and apparently decided to see what happens if we let syphilis go unchecked. So they told the community, okay, we're going to treat you, did not treat the community, and then collected data for literally decades beyond what they needed. You can sort of sympathize with people who say, look, the government is untrustworthy when it comes to medicine. Okay, that's one example. Now think about everything else the government has done with respect to medicine: polio vaccines, et cetera. How did that go for us? It went well. So maybe you're focusing a little too much on that one example. But because those things happened, they become salient for people, in the same way that I have this sort of irrational fear of venomous animals, and then I say, oh, here's a place where there are a bunch of them.

And even if I know how likely it is that I even encounter one, much less have a bad encounter with one, I'm still sort of stuck. I can't get out of my own way to get my beliefs right. So I've got this picture of the world, and for whatever reason I struggle to get it right. And what's interesting is that even when you train people to do this, right, to weigh evidence properly, they still screw up. So for instance, the psychologists Kahneman and Tversky did all these studies about how bad we are at probabilistic reasoning, and there are all kinds of explanations for why that is, but we screw up a lot. For instance, we can be given the same information in two different ways, and depending on how it's presented, it can affect us differently. So suppose that I'm a physician and I think the best course of action for you is to have a particular procedure, but I know that you might be worried about the procedure.

So I come in and say, listen, there's a 90% chance this is going to work perfectly. No problems, right? Okay. That's one way of presenting it. If you hear it in terms of, oh, high success rate, you're more likely to say, oh, that sounds good. But suppose the situation is slightly different and I'm now a physician who doesn't want you having this done, and I come in and say, okay, listen, we could do this, but you need to know there's a 10% chance that something could go wrong here.

I've given you the same information, I've just packaged it in two different ways, but that can affect you. And it turns out, so, Kahneman and Tversky did all these studies among ordinary folks about how they reason probabilistically, and then they started asking, well, what happens with people who are trained to reason well? So they did various kinds of studies on physicians and found out that they make the same mistakes everybody else does. The challenge of getting a picture right is a really, really difficult one, and it's the sort of thing that we're constantly fighting against, because it's hard work. It's exhausting.
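To make the framing effect concrete, here is a minimal sketch in Python (not from the episode) showing that the two presentations Josh describes are one and the same probability, just stated two ways; the numbers are the ones he uses.

```python
# The physician's two framings of the same procedure (Josh's numbers).
p_success = 0.90            # "a 90% chance this works perfectly"
p_failure = 1 - p_success   # "a 10% chance something goes wrong"

# Identical information, different emphasis.
print(f"Positive frame: {p_success:.0%} chance everything goes fine.")
print(f"Negative frame: {p_failure:.0%} chance something goes wrong.")
# Both statements describe the same outcome distribution; framing
# effects are about how the presentation, not the evidence, moves us.
```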

What's the role of skepticism and cynicism?

Adam: This is one I love. Being skeptical can be great. Being a cynic is almost always just annoying for everyone around you - and dangerous. How do you see those two things being different?

Josh: Having a healthy skepticism is one thing: just sort of stopping and saying, listen, how might things be wrong here? How might I be wrong? How might other people be wrong? Being cynical, thinking everybody's out to make a buck or everybody's out to get me or something like that, is not a good way to try to get a good picture of the world. The people who are constantly kind of asking questions and are concerned with who might be wrong here, okay, that's fine. It's annoying. But the people who are always offering alternative explanations for somebody's behavior: how can you be sure that this is the right explanation of that behavior? They said a thing; maybe they're doing something other than just trying to share information, but maybe not. So the cynical approach, as I understand it, is an approach that's always trying to look behind the curtain to see what's really going on. Are you familiar with Birds Aren't Real?

Adam: Yeah, that guy's wild. Yes, I'm pretty familiar.

Josh: So, it started as a joke, and then he just kept building on it, asking, how can I make this funnier by making it fit the evidence better? And he just kept going. Then people were like, there's something here. So some people started to think, no, that's actually right. He openly says no.

Adam: It's a movement that's a parody of other movements.

Josh: And folks got sucked up into it. Is that because they just want to troll, they want to be contrarian? I don't know. That sort of asking of questions to try to make the picture whole is one of those things that people don't stop to do. And I think people would be better off if they were more okay just saying, look, I don't understand that. I don't know.

Because then they'd be thinking to themselves, what I want to do here is get true beliefs and avoid false ones, instead of whatever else might lead them to want to accept things, like, look, I want to be part of a community or whatever. I mean, the community point is really interesting, because there are various religious groups that are having problems with their communities becoming very secular. People don't leave their association with the religion; they just kind of stop talking about the religion, because it's the community that they love. And what they care about is not the deep metaphysics; they're worried about being decent people and being part of something. We're social creatures. So if you don't care what's true or false, you might join a community anyway.

How do we prevent the spread of conspiracy theories and misinformation?

Adam: Conspiracies are hard, and if they had to have tens of thousands of people to operate, they would be much harder. And yet that logic seems hard for people to follow. It's easy for you to agree with me because we're on the same page on this, but suppose I'm having a conversation with a relative or somebody who I care about, and let's assume I'm not interested in eviscerating somebody in an argument. How do we try to put some of that logical thinking, some of that epistemological thinking, into their argument?

Because it strikes me, as you said earlier, that they're not going to hear me just give opposing evidence. How do I fight that conspiratorial thinking or not fight it?

How do I challenge it in a way that may be productive for the person?

Josh: Yeah, so I think the single best example of getting through to people who you think have problematic views is a fellow named Daryl Davis, if I've got that right. So he's a musician, a Black musician from the South, who got interested in why people join the Klan. He's like, these people hate me. Why? He didn't go out and start yelling at people or start organizing things. He wanted to understand them and say, okay, what's going on? And so he talked to people, and he found out that a lot of the people who were in the Klan were perfectly happy to talk to him. And after talking with them for a while, eventually they were like, yeah, what I was on about was not hating people. That wasn't it. I was concerned about these other issues, and you've helped me see that. So thanks. And then he was like, well, can I have your hood? And he's collected something like 200 of them now.

Adam: Whoa.

Josh: Yeah, just talking to people. So I think it's really important to recognize, as you say, that it's not about eviscerating somebody. It's not a win-or-lose thing, especially if it's somebody you care about. You have to be careful not to fall into familiar roles; with siblings especially, this can be really hard, or with family members in general. If it's mom or dad, they might think, listen, I'm the parent here, you have to listen to me. Okay, well, if we're having this role issue, we're going to have problems. But also you want to make clear that what you both care about is getting it right.

So if you can get on that same page, then you can start saying things like, wait, hold on, it looks like you're reasoning in this way. You think that there's this huge conspiracy of people who are keeping this big secret. Are people really that good at keeping secrets? Don't you think it's more likely that, if there were that many people involved, somebody would say something and it would get screwed up in the way that all these other examples have? And then let them ruminate on that. People aren't good at changing their minds quickly; often they need to reflect. So when Davis would go talk to people, it wasn't just a one-time thing. He basically befriended these people, and all the better for everybody involved: they sort their views out, they get a new friend, he gets new friends, and they change their behavior in ways most of us would say are for the better.

So that sort of approach: it's not a debate that you're going to win, you're not just falling into familiar roles, and you're making sure that they're really engaged in trying to get things right. If they're not, then point out things like, look, here's your reasoning; isn't that problematic? If I reasoned this way, that would obviously be problematic, so what's the difference? If they're just trolling or trying to be contrary, they're not going to respond well, and at that point I don't know what you would do. But if you can get them thinking, yeah, I care about being right on this one, then you can actually have a conversation.
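Adam's earlier point about conspiracies needing tens of thousands of people, and Josh's point here about secret-keeping, have a simple probabilistic shape worth making explicit: if each person involved has even a tiny, independent chance of leaking, the odds the secret survives fall off exponentially with the number of people. A minimal sketch follows; the per-person leak rate and time span are illustrative assumptions, not figures from the episode.

```python
# Chance a secret stays intact if each of n_people independently has a
# small probability of leaking it in any given year.
def p_secret_intact(n_people: int, p_leak_per_person_year: float, years: int) -> float:
    return (1 - p_leak_per_person_year) ** (n_people * years)

# Assume a 0.1% per-person chance of leaking per year, over five years.
for n in (10, 1_000, 100_000):
    print(f"{n:>7} people: {p_secret_intact(n, 0.001, 5):.4f}")
# Roughly 0.95 for 10 people, 0.007 for 1,000, and effectively 0 for
# 100,000: secrets shared by that many people rarely stay secret.
```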

Adam: Most people aren't a villain. At least they don't think they are, right? Yeah. I mean, I'm sure there are villains out there. I'm sure they exist. But most people that you disagree with, most people that you think are doing something that's bad or that's going to harm other people, they don't think they're the villain. Not most of the time, anyway. Even Thanos didn't think he was the villain, right? I mean, we all saw the movie.

Josh: When we start talking about morality, people start thinking about the big issues, which is fun. They're the big issues because they're hard. But the little issues are really hard too.

I love asking my students, how many of you think that you ought to call your grandmother on her birthday? And it's a totally mixed bag. Some of them are like, yeah, it's grandma, I have to show my love for grandma; to not do so would be a deep moral failing. Other people are like, whatever she gets from me is a bonus; I owe her nothing. And even with questions like, is it ever okay to date a friend's ex, a bunch of them are immediately like, no. And then you start asking questions like, well, what if they broke up six years ago? If it's six months ago, okay, maybe not, but if it was six years, haven't they both moved on? What if your friend broke it off? And then you start getting into all these different kinds of scenarios, and people are like.

Maybe that does change it a little bit for me. Because this idea that on any given particular moral question there's a clear right and a clear wrong is seriously difficult to make sense of when you look out at the actual world. And I think if you start from a place of, I'm the good person, you're not going to do well in terms of getting a good picture of what the world is actually like, because other people are probably trying to do good too. It's just that they're seeing it a little differently than you are. So maybe having a chat with them, figuring out where they're coming from, could be incredibly valuable.

How do you spot misinformation or fake news?

Adam: So, we've talked a lot about trying to engage in conversations and why our friends or family members might be thinking this way, but what if it's us? What if I'm kind of getting caught up in something where it's fact or cap, and it's cap, right? If I am holding an unjustifiable belief, is there a way for me to spot that, or a way for me to spot that it's being fed to me?

Josh: Yeah. One of the more difficult positions we find ourselves in is the self-policing of our own thoughts. And it is notoriously difficult, because we're rational creatures, we like to tell ourselves, and part of what that means is that we kind of take stuff in and we're really good at backfilling reasons: I found myself here, let me make that make sense for myself. And that's the trick: you have to try to be honest with yourself and say, look, what am I doing? Am I really trying to follow the principles of good reasoning, or am I doing something else? And it can be very hard to get outside of your own head to make sure that you're not doing something else, that you're not just backfilling reasons for yourself, that you are indeed proactively trying to follow arguments where they lead you, in accordance with principles of good reasoning. The best approach, I think, is just to try to be honest with yourself and really work through whether or not your evidence supports where you've gotten. And if it doesn't, then you need to back up and at the very least say, yeah, I probably shouldn't go there. I'm not sure where I should go, but I'm not there yet.

Adam: In addition to that, in the media, the term fake news has gotten to where I almost want to suck air through my teeth when I say it, because it doesn't even mean anything anymore. But misinformation, or, I guess, fake news.

How do we spot that? Is there a red flag, if I'm reading something or hearing something, that tells me I should either steer clear of it or put on those skeptical goggles while I'm engaging with it?

Josh: Yeah, I think the simplest approach to this is: if something is really surprising, the way to think about that in terms of evidence is that you would give it a low probability of being true.

What you don't want to do is hear something surprising and then just say, well, they said it, so it must be true. Instead, what you need to do at that point is say, okay, how should I be adjusting from here, given that I find this so surprising? So I think that's the simplest, most direct thing a person can do: think about how surprising a bit of information sounds to them. And if they find it surprising, they should probably think to themselves, I need to do more work here.
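Josh's rule of thumb has a simple Bayesian reading: a surprising claim is one with a low prior probability, so even a moderately reliable report should leave you short of belief. Here is a minimal sketch of that arithmetic; the specific numbers are illustrative assumptions, not from the episode.

```python
# Bayes' rule for "someone told me something surprising".
def posterior(prior: float, p_report_if_true: float, p_report_if_false: float) -> float:
    """P(claim is true | it was reported)."""
    evidence = p_report_if_true * prior + p_report_if_false * (1 - prior)
    return p_report_if_true * prior / evidence

# Assume a surprising claim (1% prior) from a source that passes along
# true claims 90% of the time and false ones 10% of the time.
print(posterior(prior=0.01, p_report_if_true=0.9, p_report_if_false=0.1))
# About 0.08: the report raises the claim's credibility roughly
# eightfold, yet the claim is still probably false. In other words,
# "I need to do more work here."
```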

Adam: If I hear, oh man, they found tiny aliens in the Hadron Collider, I don't know, I'm making up something wild, I don't know anything about that. So, if a particle accelerator finds the God particle, a lot of us react to that article and go, wow, a God particle? But for people who are involved in physics, it's more of a fancy term for something that maybe they were expecting to find or thought they might find. So it seems like maybe that skepticism should also be calibrated to what your background is, right?

Josh: Yeah. Even there, because, look, the difference between the layperson and the physicist is that the physicist has way more information. And so when they get the information about what happened in the Hadron Collider, it's going to fit much better into their picture than it will for somebody like me who doesn't really pay attention to that. So here's another great example that I love. I don't know how many people are aware that the sperm whale is the loudest animal on the planet, and that it can kill you if you're in the water within a certain proximity when it clicks as loud as it can. That's very surprising.

Adam: That is really surprising.

Josh: Right? It is remarkable how loud those creatures can be, and it's remarkable that such a loud noise can have that dramatic an effect on us. So even then, I think the appropriate response is: that can't be right. That's so surprising; I need to do some more work here to figure out if that's something I want to add to my picture of the world.

Adam: So, I think we've come full circle on your fear of not only poisonous animals, but really loud animals.

Josh: The sea is terrifying.

Adam: And I'm looking that one up either way, because you should not believe him right now.

Josh: Yeah, no, go check it out.

Adam: I'm going to do that. Everyone else should go do that. Josh, thank you so much for coming in and making me question everything I believe.

Josh: Yeah, you're welcome. It's been a pleasure.

Adam: Thanks for stopping by the search bar. Make sure that you like and subscribe so that you don't have to search for the next episode.

The views and opinions expressed in these episodes are strictly those of the host and guest speaker.