Seriously, that title is worded like a straight-up attack. Such a question, while open-ended as to who considers what to be the truth, still leads to the same outcome: engagement based purely on outrage and "proving the other side wrong."
I sometimes wonder if people post things like this with the intention of filtering through the comments to block people who post their political viewpoints in response. If that's the case, I would consider this a very effective and intelligent post. However, I don't think that this is the case.
On one hand, people often don't like to hear bad news or an idea that means they have to do something or face a problem. On the other hand, how a person is told the idea is a big part of a negative reaction. Often there is no reason to tell someone the thing at all.
I'll be straightforward if someone asks, but I'm not "brutally honest". OP sounds like the "brutally honest without anyone asking" type.
Because in most cases, what is assumed to be “truth” is subjective. If you’re talking politics, things are more often blurred with regard to truth, as most claims tend not to be empirically true but instead emotionally true.
For example:
“All conservatives are Nazis!”
This is inherently untrue. Yet every day I see people who believe it to be the absolute truth. Same thing with:
“All liberals want to do is make our children gay!”
Also untrue. But when you try to correct them, they will almost always entrench themselves in their own version of the truth and disregard any form of critical thinking.
This is why asking questions is important. “All conservatives are Nazis” may actually be true if the person simply equates conservatives with Nazis, which makes the proposition a mere tautology. The same goes for “liberals want to make kids gay” if the person defines anyone who would do that as a liberal.
And by asking questions, trying to understand someone else, both parties can engage in critical thinking.
I think it's wrong to think that critical thinking should spontaneously arise because someone's beliefs are challenged. That's never how it works. Rather, one person has to be vulnerable and ask, "What do you mean? Help me understand where you're coming from."
Humans are more influenced by emotions than logic, which means that critical thinking alone may not convince them. Only those who are receptive to logical reasoning can be persuaded.
When in reality, if you are faced with evidence that you are wrong and make a correction, then over time you grow. In that case, being wrong is a strength of sorts.
That's a great question! From a psychological perspective, people like to think that they are right. If they encounter a person or situation that threatens their beliefs, they have three choices:
1. Accept that they were wrong - might cause some unpleasant emotions and risks being perceived as not trustworthy/knowledgeable.
2. Assume the other party is wrong - the belief is upheld, no negative consequences.
3. Find some condition under which the belief in question does not apply - a middle ground.
Of course, there are many situational and personal qualities that affect how easily a person accepts another view as their own.
E.g., if you are a self-proclaimed expert on some topic, then naturally opinions different from yours are wrong, at least to you. However, if you approach your expertise with an attitude of trying to understand the underlying principles, it becomes easier to accommodate new, sometimes very surprising, facts or theories.
Also, humans are very susceptible to biases, meaning the world they perceive is different from what "objectively" is there. One of them is attribution bias, which causes people to assume that certain results depend on their actions, even if there is no basis for that. This bias started the whole "vaccines cause autism" belief. The research paper that started the whole thing was based on a survey directed at parents of autistic children asking whether they thought their child's autism was caused by a vaccine. It is a ridiculous belief for most people nowadays, but it provided a clear cause of the disease for those parents.
I know my writing can be confusing sometimes, so let me know if you would like some clarification.
Be honest, on a scale of 1 to 10, how much does this question have to do with your constant posting about how the maaaaan, maaaaan, is holding down all your crypto "investments" and they're due to go to the moon any day now as soon as the cabal of lizard people who run the world is eradicated?
Some people honestly believe the Earth is 6000 years old. And not a small number of people; a giant percentage of the United States of America. They believe dinosaur bones were placed by Satan. These people walk amongst us.
how are you going to reason with somebody like that??
The vast majority, virtually all, believe so because they believe that is what the Bible says.
And since they also believe their interpretation is the only correct one, and said interpretation requires everything be accepted OR they’ll go to eternal hellfire and burn forever there, as they deserve, there is basically no way to change this worldview without shattering it entirely. It benefits from being fragile because it causes so much mental anguish to depart from it, and people who walk away can turn into totally different people as a result of rejecting it and thus being rejected by their friends and family and community at large.
You, as a single, and likely, stranger to them, can’t get them to change. Alternative points of view or lifestyles are evidence of Satan’s trickery, so directed and deliberate debate with these people functions for them as a test of faith: they just have to weather the blows and they get Good Christian points and become closer to God. Nevermind that you have no intention of causing them harm or tricking them: you want to do the opposite, but it doesn’t matter.
The best you can do is be a kind person and be sure of yourself and your views. Planting a seed of doubt is much better than being used as a piece of evidence that they should not be looking for friends in worldly places.
I mean - it may be that OP finds himself constantly coming across Young Earthers - but I am interested in hearing directly about the kinds of opinions they find the rest of the world struggles with.
I think we underestimate how much normalisation is a survival mechanism. Personally I struggled to acknowledge the 'truth' about my traumatic childhood but I can see now that I did this because it was easier to get through life.
It's not simply that people believe specific things, but that they define themselves in terms of what they believe.
And in fact, it's often the case that people invest in specific beliefs not because they've reasoned their way to that conclusion, but simply because they've effectively picked it off the rack of possible beliefs as the one that most clearly represents whatever image of themselves they wish to promote - it's the position held by smart people or enlightened people or trendy people or moral people or strong people or whatever.
So if you try to argue against their belief, you face two immediate and generally insurmountable obstacles.
First, they're psychologically invested in the belief, so if you call it into question, you're not just threatening the belief - you're threatening their self-image. Anything that casts doubt on the belief by extension casts doubt on their self-affirming presumption that holding the belief demonstrates their intelligence/morality/whatever.
And second, since it's likely the case that they didn't reason their way to the position in the first place, they can't be reasoned away from it anyway. So it inevitably shifts back to their psychological investment in the position, and your attempts at reason are a distraction at best.
Because we're emotional creatures first, we default to what's familiar or comfortable. Logic/critical thinking take sustained practice and a lot of effort. There's a study that suggests that many of our conscious choices are simply post-hoc rationalizations for decisions made in the unconscious.
I absolutely no longer trust anyone who insists they're naturally and perfectly logical; they are unquestionably hiding some fixation or personal opinion which--if challenged--will make them unravel in the worst fashion.
I think you could argue that this is an example of cognitive dissonance. It is uncomfortable to come face to face with new information that contradicts your beliefs or actions, and it requires energy if you want to integrate that new information into your worldview and adjust your actions. It is much easier to deny that information, even when it is clearly true.
For example, when it came out that aspartame might cause cancer, if you (like me) have eaten/drunk a lot of products containing it or have had a strong belief that it was completely safe, then it may be more comfortable for you to criticize WHO or think "well, it's not really relevant for me because my family isn't predisposed for cancer." If you didn't care about aspartame or artificial sweeteners before, you will probably readily accept that there may or may not be a cancer link.
Is there even an objective truth though? I’d say there technically is, but I think we all have our own subjective versions of what our “truth” is that rise and fall like a sine wave around the straight line of objective Truth.
Just remember that what is popular is not always true, and what is true is not always popular.
When it comes to changing someone's mind, I believe it helps to first question whether there's even a need to do so. If there is, then asking questions is vital. You can't just hit someone with Facts & Logic™ and expect that it will immediately undo something they may have had drilled into them since childhood, or something that requires recognition that would challenge other dearly held beliefs (e.g. "if my dad did a bad thing, then is he not the great, infallible man I thought he was? If he's a bad person and people tell me I look and act just like him, does that mean I'm a bad person, too?"). Finding out why someone believes what they believe, and taking time to understand it yourself and validate their experience is instrumental in opening up people's hearts and minds. Or, at least, that's been my experience and is therefore true to me. 😉
Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values.
The older you get, the more you believe that your view of the world is right. This makes sense: children still need to find out how everything works, and they get corrected all the time because they formed wrong assumptions and opinions.
However, imagine if you checked your smartphone’s manual every time you used it. Imagine your colleague had to fetch their reference books whenever you asked them something about their job. No one would survive for more than a week.
This issue is a research point in AI: How ‘certain’ do you want an AI to be? Always second-guessing itself would render it as useless as always assuming it was right.
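A toy sketch of that trade-off (my own illustration, not from any real AI system; the `decide` helper, the threshold values, and the confidence numbers are all made up): an agent that demands near-certainty before answering says almost nothing, while one that accepts any confidence answers everything, errors included.

```python
# Toy sketch: a confidence threshold trades "always second-guessing itself"
# against "always assuming it is right". All numbers here are hypothetical.

def decide(confidence: float, threshold: float) -> str:
    """Answer only when confidence clears the threshold; otherwise abstain."""
    return "answer" if confidence >= threshold else "abstain"

# Hypothetical model confidences for five questions.
predictions = [0.55, 0.72, 0.91, 0.60, 0.98]

for threshold in (0.0, 0.8, 0.99):
    answered = sum(decide(c, threshold) == "answer" for c in predictions)
    print(f"threshold={threshold:.2f}: answers {answered}/{len(predictions)} questions")
```

With the threshold at 0.0 it "answers" everything, right or wrong; at 0.99 it abstains on nearly everything, which is the useless, endlessly second-guessing failure mode.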
People don’t like making mistakes. I don’t know if it’s innate or a cultural phenomenon, but in my experience, the immediate reaction to a mistake is a bad feeling—even for inconsequential ones in a friendly environment. Being wrong is not only making a mistake, but living by it. There’s a much greater incentive to not be wrong. The easiest way for an individual to “not be wrong” (in their view) is to assume that the other is wrong, so they reject their hypotheses in a discussion.
There was a study about something similar a while back. It was posted on Reddit, so if that site hasn't imploded yet, you might be able to find it. I don't remember the whole thing, but it said a lot of people would rather double down on their already accepted beliefs than open themselves up to new results. It wasn't everyone, of course, and it wasn't for all topics either. Maybe someone can go find that study and post it here for OP.
Most people's values and beliefs are all wrapped up with their sense of self, so if those beliefs get attacked, they feel like they're being attacked.
Avoiding this is very tricky and counter-intuitive, but there are techniques. Look up "street epistemology" if you'd like to know more. There's a guy on YouTube who goes to college campuses and has discussions with passersby regarding their beliefs. Basically, it's asking people "What do you believe?" and "Why do you believe that?" Like I said, though, it's tricky and takes a lot of practice, and it's really easy to fall back into old patterns again.
Here's a fun thought experiment: you'd think it's because people don't like being told they're wrong in public, where it makes them look stupid, their pride & ego kick in, and they'll do everything so that they don't appear wrong. But what's funny is that they have the exact same attitude online, behind anonymous pseudonyms. It should in principle be easier to just say "mea culpa, I'm wrong" online, since you're dealing with random internet strangers and no one knows who you are, but no, you will very rarely see that.
When you interact with people, you often do it on your grounds, i.e. in your area of expertise. This inherently means that you are more likely to be right in a discussion. I believe this transfers to other areas of your life – where you are not the expert. So you automatically assume you’re right even if you aren’t. However, in my experience this doesn’t apply to situations where you are very aware that you are the (intellectually) subordinate person, e.g. when talking to a doctor.
People just make up their own truth and say that you're the one in need of "truth." It's a product of the "alternative facts" era that mainly Trump ushered in and others have picked up. If your facts do not support the preferred agenda, they're just dismissed as "fake news." Easy-peasy.
Not to this extent in any way. It has been infinitely exacerbated with the dawn of social media. Those who seek to divide have the tools to do so. And we've seen since 2016 what it leads to when people are being continually lied to and they believe it because the lies fit into their own belief system. Trump is not a master at much, but he's a master at that.
Every person I know who can't be persuaded by logical arguments backed by irrefutable sources seems to have a powerful sense of ego/pride trending towards narcissism. My father is like that, as are my stepdaughter-in-law, my former best friend, and my kids' former football coach. They're all people who think they're smarter than everyone else. They still suffer from the same self-doubt we all face, but instead of facing those doubts and finding teachable moments to learn and grow from, they double down on how great they are and how much better than you they think they are.
I can only read the beginning of his tweet/story as someone who is not signed up to Twitter. He also says there is a link somewhere, but I don't see one.