Recently I’ve had the pleasure of speaking with some of our country’s leading historians and philosophers, and have been enlightened by their conceptions of morality, truth, and cognitive-emotional processes. They have detailed for me the societal and cognitive obstacles to interpreting the truth — and what we can do to establish common understanding.
Jonathan Zimmerman, a Professor of History of Education at the University of Pennsylvania, said that “the real danger to free speech on campuses is self censorship,” highlighting the fact that many individuals aren’t comfortable voicing their genuine opinions out of fear of repercussion.
Zimmerman mentioned that “The Foundation for Individual Rights in Education did a very elaborate study on 50 campuses over the last few years, and on most campuses people told the surveyors that they weren’t speaking their mind out of fear of repercussion – of being socially ostracized and stigmatized.” This could be problematic for the sharing of new ideas, and for challenging the status quo.
Additionally, many learning opportunities arise from hearing opinions that oppose your own.
“We don’t learn from people that we agree with. We learn from being challenged, sometimes by being offended, and by being exposed to things that are different.”
In order to develop our understanding of the world, we must absorb new information and new perspectives. To remain siloed in our own perspective is thus to hinder our learning and development.
In illustrating the consequence of this idea, Zimmerman said “When we are all trying to tailor what we are saying, we inhibit our learning. Every truly great social justice hero has been a Free Speech zealot and advocate. The reason is because if you took away their free speech then they wouldn’t be able to challenge oppression.”
When asked to give an example, Zimmerman highlighted Frederick Douglass.
“Frederick Douglass called free speech ‘the great moral renovator of society.’ If Douglass wasn’t allowed to speak, he wouldn’t have been able to make his case against slavery. Lots of people have tried to silence abolitionists. Abolitionists needed free speech in order to fight slavery.”
When asked how we should handle controversial ideas, Zimmerman acknowledged that “Controversial speech claims will offend people,” but argued that such speech is necessary in order to ensure that justice prevails for all. Without the ability to voice opinions, the wants and needs of minority groups will never be acted upon.
Regarding the censorship of information, Zimmerman observed that “The terrible thing about censorship is that we never believe it will be us that is being censored.”
Alan Levinovitz, the author of “The Gluten Lie” and “Natural,” is an Assistant Professor of Religion at James Madison University. Levinovitz shared insights about the connection between religious thinking and dietary philosophy, and the importance and virtue of uncertainty as a guiding character trait.
The Connection Between Religious Thinking and Dietary Thinking
“Contemporary food taboos are similar to religious taboos,” Levinovitz told me. I was surprised to hear this, because I hadn’t considered the idea that religious thinking could be analogous to the beliefs that some people hold about diet.
Given that he is a Professor of Religion, it might not seem immediately apparent why Alan Levinovitz would write books about nutrition and naturalness. However, understanding the motivations and cognitive biases behind religious thought has enabled Levinovitz to grasp a subtle, deeper pattern behind diet trends. In his book “The Gluten Lie,” Levinovitz makes the case that we have been misled into believing several myths about diet – myths which bear only partial resemblance to the truth. The fact that these ideas persist throughout society and over time is a testament to his notion that dietary ideology can resemble religiosity.
The Importance and Virtue of Uncertainty
“The virtue of being able to say ‘I don’t know’ is underrated. I have an admiration for people who can admit they are wrong.”
Humility is key, Levinovitz points out. If we cannot admit when we are wrong and pursue the correct answers, then we miss out on the opportunity to learn and potentially harm ourselves.
Levinovitz also points out that humility and accepting uncertainty become especially difficult for individuals who are experiencing existential or physical pain. “When you are in existential or physical pain it is not as easy to say I don’t know, because you are very invested in knowing. With regards to the idea of there being ‘long Covid,’ saying ‘I don’t know’ can be very scary if you are sick from Covid.” This is true because when we are in pain, or at risk of suffering pain, we are often biased towards wanting to believe that everything will turn out OK.
In discussing how people think and make decisions, Levinovitz mentioned that “People aren’t always persuaded by logical reasons – but by emotional reasons,” remarking on the fact that much of what drives human discourse is subconscious feeling — even when we perceive ourselves to be acting out of rationality. Levinovitz argues that we should allow disagreement, because this contributes to our ability to find the truth.
Shift to Public Debate
There is great polarization within our society, especially in our politics. One of the interesting dimensions of this phenomenon is the fact that philosophical and political debate is often taking place in front of the public. Whereas historically much of this may have occurred behind closed doors, given the advent of social media it has become easier to access differing opinions.
As Levinovitz said, “There is a shift happening. There used to be back-stage expert discussions of the truth that the public didn’t see — and now we see those debates in real time. The shift from back stage to front stage (regarding truth) is interesting.”
Andy Norman is the award-winning author of “Mental Immunity: Infectious Ideas, Mind Parasites, and the Search for a Better Way to Think.” Norman is well known for coining the notion that conspiracy theories can be understood as “mental parasites.”
Norman told me, “I discovered that the mind has an immune system which protects us from infectious ideas — in the same way that the body protects us from external pathogens.” Norman notes that “This is more than mere analogy,” given the precision with which mental “parasites” fit within the general conception of pathogens and viruses.
As Norman illustrates, “The body’s immune system is a complex set of operations that protects us from dangerous substances. Similarly, the mind’s immune system protects us from harmful ideas. If we look back in history we see that the wrong idea can get you killed. Thus humans have developed the capacity to identify and remove bad ideas. Of course these systems are imperfect and sometimes we screen out good ideas.”
When asked how we can determine what constitutes a “bad” idea, Norman replied that “One type of bad idea is a falsehood. Another type is an idea that harms its host, or induces its host to behave in ways that harm others. For example, if I believe I am God’s gift to humanity then my arrogance will begin to detract from the wellbeing of others — this belief is harmful because it induces harmful behaviors.”
When asked to explain the connection between bad ideas and mind parasites, Norman replied, “Bad ideas are mind parasites. Parasites can hijack a host and induce spreading. It turns out that bad ideas of certain kinds have the same properties as parasites – they check all of the boxes. It is more than an analogy.”
Mind parasites create “antibodies” which compel the thinker to perpetuate the bad idea. “We think of our minds as containers of ideas, with the agency to act or not act on those ideas, but if you think of the phenomenon of viral ideas then you realize that ideas are not inert — they can hijack and derange and cause confusion in the minds that host them, in the same way as a virus. Biological viruses don’t think but can still manipulate their hosts – for example, sneezing copies of the virus to other hosts. This is similar to how a malicious rumor can transfer from one host to another. When you study the ways that ideas spread you learn that we are not the sole locus of ideas – ideas take on a life of their own and can compromise the interests of their hosts. The cost to civilizations has been enormous; you can see how civilizations rise and fall if you view them through the lens of cognitive immunology.”
What should we avoid?
“Avoid identifying with your beliefs. When you absorb an idea you begin to treat the idea as part of your identity. Although we all need to rely on some ideas, the trick is to place reliance on ideas that are objectively true and likely to promote human flourishing.”
What should we do?
“Meditation gives you autonomy by allowing you to separate from your thoughts.”
How does the mind typically deal with mind parasites?
“Doubts are the antibodies of the mind. Reservations, qualms, and challenges are the kind of things that a healthy mind generates when it encounters questionable information. When a very unlikely possibility comes up, a healthy mind generates doubts about it, which serves to prevent the idea from taking up residence in the mind. If the idea gets past your mind’s defenses, however, then your mind’s immune system starts to defend the belief, because the idea comes to be seen as part of your identity.”
Why do people believe in conspiracy theories?
“We do this because we all have the need to belong. When a group rallies around a set of ideas, people can develop a sense that the ideas are central to their identity. When people lack a sense of purpose and belonging, we become more susceptible to group conspiracy theories that make us feel special.”
How do we determine truth versus falsehood?
“We tend to treat indications that something is true as evidence of that thing. But indications might not be reliable. For example, let’s suppose that every time I have a glass of pink Gatorade I feel better, and I develop a theory, based on this correlation of three data points, that there is something in this pink Gatorade that promotes my wellbeing. It could in reality just be a coincidence. In this case there is ‘evidence’ but it isn’t sufficiently reliable.”
How do scientists view evidence?
“Scientists are trained to think of true evidence as a genuinely reliable indicator of something. The problem is that people don’t know how to distinguish between reliable and unreliable indicators. For example, when believers of QAnon look at ‘evidence’ or data points related to QAnon, they mistake it for reliable evidence.”
Are there beliefs that we should hold without evidence?
“Some beliefs we have great reasons for holding, yet without evidence — like human rights. Whether people have rights is irrelevant because [it is of great societal benefit for us to have] human rights.”
How do we know whether an idea is worth believing in?
“The real test of an idea is looking at the reasons for it and the reasons against it. When you look at both sides it makes it harder to jump to conclusions. This strengthens your resistance to bad ideas. On the other hand, if you believe that anything you can find reasons for is reasonable, this exacerbates confirmation bias.”
How do we determine which ideas are valid?
“The reasonableness of a belief is a function of the reasons pro and con, and of how well you can answer the questions that go against the belief.”