Why Hearing Facts Still Doesn’t Change Our Perspectives
Have you ever tried to explain a topic to someone, only to find they were so strong in their convictions that they refused to accept the facts? You can see this phenomenon play out across society: religious people deny science, scientists refuse to acknowledge spirituality, politicians spin the truth on history, and the list goes on. Even I’ve fallen victim to this numerous times; it can be incredibly difficult to let go of your beliefs and accept the truth, even when it’s right in front of you. Our beliefs are our security blankets, confirming our convictions about what’s right and what’s wrong and affirming our every decision. What would happen if we simply let go of these ideals, took a step back, and looked at the bigger picture? For starters, if we opened our minds we could actually advance collectively as a society. By abandoning old beliefs, you make space for new knowledge, though that is admittedly easier said than done.
Numerous studies have shown that even when we have all the facts, we still have difficulty accepting them and changing our minds. In 1975, researchers at Stanford University conducted a psychological study aimed at determining whether people’s beliefs would change after they received the facts. If you’ve ever had an argument with someone who refuses to accept new knowledge (or, alternatively, been the person who is too strong in their convictions), you may be able to predict the study’s findings! The researchers recruited a group of undergraduate students who were told the study would be about suicide. Each participant was given pairs of suicide notes and told that, in each pair, one note was genuine and the other had been falsified.
The students were then asked to identify which note in each pair was real and which was fake.
The researchers told some of the students that they had done very well, correctly identifying the genuine note in 24 of the 25 pairs, and told the other students that they had done very poorly, identifying the genuine note only 10 times out of 25. After this first stage was complete, the researchers revealed that they’d lied about the students’ scores.
The entire point of the first stage was to make the students believe they had done either very well or very poorly, when in reality the scores they were given had nothing to do with their actual performance. All of them had in fact performed similarly, so the students who were told they did very well did, on average, no better than the students who were told they’d done poorly. During the second stage of the experiment, the researchers asked the participants to estimate how many notes they thought they had actually gotten right, and how many notes they thought the average student guessed correctly. Interestingly, the students who had previously been told they’d done well guessed that they scored very high, despite the researchers having admitted that feedback was a complete lie. Similarly, the students who were told they did poorly guessed that they performed below average. “Once formed,” the researchers explained, “impressions are remarkably perseverant.” It’s clear that the students allowed the fake information to influence their responses, despite knowing it was incorrect. A few years later, a similar study was performed by Stanford researchers on a different set of undergraduate students.
The students were each handed one of two information packets about a pair of firefighters, which included the firefighters’ results on a “Risky-Conservative Choice Test.” In one version, one of the firefighters, Frank, was described as a successful firefighter who preferred to choose the safest option; in the other version, Frank also preferred the safest option, but was described as not being a very good firefighter. As in the previous study, the students were then informed that the information packets had been falsified. They were then asked to share their own beliefs about what a successful firefighter’s attitude toward risk-taking would be.
The students who were given the information packet that stated Frank was a successful firefighter who chose the safest options said that they thought successful firefighters would avoid risk. Conversely, the students who were given the packets that stated Frank was a poor firefighter and chose the safest options thought that a successful firefighter would embrace risk. Even though the students were told that the information was completely fake, their answers were influenced by the information they were previously given.
These two studies are very well known because they suggest that even people who seem completely rational and intelligent can act irrationally at times, especially when their beliefs are in question. Even when people are told the truth, they tend to remain closed-minded about that information and instead continue to perpetuate their old belief systems.
There are so many examples throughout society of people blatantly denying the truth, even when the facts are right in front of them. Consider the cancer industry: both the industry as a whole and numerous doctors profit from cancer patients who choose to undergo chemotherapy and radiation, even though it’s been proven that there are often better options than these conventional “treatments.” The U.S. government, Harvard University, and many other credible institutions have confirmed that cannabis kills cancer cells, yet most people still choose to undergo these “treatments,” which often make their conditions much worse or increase their chances of getting cancer again in the future.

You can see the same theme playing out in religious beliefs. Practitioners choose to believe books that were created thousands of years ago and altered significantly, despite the fact that the truth of our existence and God (or source, the universe, or whatever you choose to call it) is within each and every one of us. Likewise, many scientists refuse to acknowledge the interconnectedness of science and spirituality. We are literally made up of energy and frequency, and this connects us to the entire universe, yet many scientists cannot fathom oneness or collective consciousness even though it’s supported by quantum physics.

You could also apply this to politics, extraterrestrial disclosure, and even theories that are commonly mistaken for scientific fact. People choose their political stance, claiming that they’re “left” or “right,” without even looking at political platforms. Many people are uneducated about ETs and UFOs, largely due to a lack of exposure and the stigma perpetuated by the media, despite the plethora of evidence supporting ET life and UFOs. To learn more, visit the exopolitics section of our website here.
Theories such as human evolution and the big bang are so ingrained in our brains that we forget they’re just theories. We mistake them for scientific fact, when in reality both of these concepts have been largely disproven.
The list goes on and on, and I’m sure you can think of numerous examples yourself.
The real question here is: how do we become more open-minded? Well, if you believe someone is wrong, take a step back and try to understand where their viewpoint stems from. After all, “right” and “wrong” are relative, so how confident in our convictions can we ever be? I do believe that there are some universal truths, and if something strongly resonates with you, it’s likely “right.” However, that doesn’t mean you need to shout your opinion for all to hear, especially if it’s “falling on deaf ears.” Yes, I believe it’s our duty to spread information and the truth, but I don’t think everyone is ready to gain access to knowledge at the same time.
The best thing you can do is lead by example. If you think someone is wrong, don’t call them out for their ignorance and belittle them. Instead, try to understand where they’re coming from and, if they’re interested, educate them. You can speak your truth and challenge norms without being rude or intrusive. However, if you do offend someone while communicating your beliefs, remember that it may have more to do with them and their attachment to their belief systems than it does with you.