


Confident, but Wrong

Entry 2029, on 2020-02-27 at 21:51:13 (Rating 4, Philosophy)

There are a lot of people out there who are extremely confident about their views. Then there are others who are a bit less certain, whose views are a bit more nuanced, or who are more prepared to compromise, or even abandon a view they currently have. You might expect that the most confident people are the ones whose views are the best supported by facts, and are the most reasonable; and that the others are less certain, because their beliefs are less well supported. But, no. The opposite seems to be true.

When I interact with someone who is totally confident about their beliefs, I always assume there is a very good chance they are wrong. Conversely, when someone is a bit less certain, I am more likely to think they are probably right. This seems odd, but I think it is a real phenomenon which I often notice, and I think I even have an explanation for it.

Imagine a person who believes something which is poorly supported by evidence, and is almost certainly false. If they were the type of person to look at contradictory facts, alternative arguments, or even just look at all sides of a debate evenly, they would probably change that view fairly quickly. But the process of change itself emphasises the fluid nature of ideas. If a person changes their mind once they are probably more open to changing it again. In other words, they are less certain.

But now imagine the opposite situation. A person with a false belief (or one which appears false, according to the best evidence) might refuse to contemplate changing their mind. That stubbornness shows an unwillingness to seek new ideas and to question existing beliefs. There's your person who is absolutely certain that they are right, but is very likely wrong.

When you understand this effect, you see it everywhere. Politicians are always confident their policies will work, but the more fanatical they are about an approach, the more likely it is that they have that view through self-deception, and the more likely they are to be wrong. A politician who changes his mind (if that ever happens) after listening to opposing views is far more likely to implement effective, balanced policies.

And this overconfidence happens equally on both sides of the political spectrum. Left-wing politicians are just as likely to be wrong about whatever plan they currently favour as those on the right. For example, the blatant ignorance and fanaticism of green politicians regarding issues like the use of nuclear energy and GMOs is something which annoys me, and I have mentioned it in several past posts.

And as the groups get more extreme, the ideas get more extreme too, as does the apparent need to defend those ideas with no chance of compromise. We see this with extreme groups on all sides, such as the Proud Boys and Antifa.

Clearly, this is a very dangerous phenomenon. The problem is that there is no easy way to escape it. When I debate people on-line I sometimes say something like "that's a good point, and it does make my argument less persuasive, but the basic facts behind what I said are still true, so my argument stands". At that point my opponent's supporters might see it as a victory. But the real victory is being able to critically examine alternative views and adjust your own view to make it better. Refusal to adjust is ultimately a defeat.

Another aspect of this is that people go beyond thinking they are right, and usually think they are not only factually right, but morally right as well. In other words, they believe they are not only correct, but also good. And that is a very dangerous combination. It is possible to imagine someone who thinks they know the facts, but realises those facts might lead to bad outcomes. In that case they might compromise, re-examine what they believe, or at least not act on it. But a person who believes they are morally superior is very unlikely to do that.

Many religious people are in this category. They might think that their god has told them what is right, what they should believe, and how they should act as a result. For example, they might think their god has told them that their religion is the only true religion and that they should help expand and protect it, and so they go out and blow themselves and others up as a result. It takes a lot of confidence in both the factual and moral correctness of a view to do that!

And yet, if that same suicide bomber had been a bit less certain of his views, he might have started taking more notice of the obvious flaws in them, and he might have started questioning the assumption that they are undeniably true. And, even if that failed, doubts about the moral correctness of killing innocent people might have started forming in his mind. But no, his total confidence equates to both factual and moral failure in this case.

Just to finish, here are some cases of this phenomenon which led me to write this post. They cover a range of beliefs, but all are strongly held and ultimately wrong - or at least, wrong according to the best evidence, because I have to avoid being one of those totally certain people myself!

1. Vegans starving their kids by refusing to feed them a balanced diet. Yes, children die because their parents think veganism is the right option, both from a factual and moral perspective. To be fair, they are partly right, but lacking a nuanced view - including the bad aspects of veganism - can lead to really bad results.

2. Religious nuts refusing to treat their kids, and using faith healing instead. This is a similar situation to my point above, except there is no good evidence that faith healing has any merit beyond the placebo effect. So while veganism has some good points, faith healing basically has none.

3. Environmentalists suppressing nuclear power or GMOs. These two technologies could offer huge benefits to the environment if they were implemented responsibly, but green extremists will not even contemplate the idea that they might be useful. Because of this, both the environment and people suffer unnecessarily. Meanwhile the environmentalists congratulate themselves on saving the world from these "evils".

4. Police, judges, etc who are convinced someone is guilty and not prepared to look at alternative possibilities. I often say the law is an ass, and that is sometimes true. I also recognise that it is a system which works quite well most of the time. But people involved in the legal process who aren't prepared to look at the accuracy of their assumptions cause a lot of harm to innocent people, including deaths in countries which have capital punishment.

So yeah, look at those examples and try to deny that the most dangerous people are always those who are convinced they are right. The more confident someone is, the more suspicious of them we should be!
