Moreover, the comprehension that other people think the same things you do is in fact one of the hallmarks of intelligence. It's called empathy. The knowledge that other people can think like you - and therefore want the same things you do, so you must protect the things you value - is essential to modern life.
Similarly, a Normalcy Bias protects us from wasting money on false fear. The Normalcy Bias is when we don't plan for something we have never experienced. We don't plan for giant tsunamis - but we also don't plan for bees to develop tiny little guns and kill us all.
Some of the biases have to do with the cost of being wrong more than with the truth behind the issue. For example, with the Bandwagon bias, people believe what is popular just because it is popular. If you are a 'heretic', others think you are strange, and that causes social problems. So, given zero additional information, it makes logical sense to believe what everyone else believes - unless, of course, you are trying to be different: to be the leader, the inventor, the discoverer.
The "Cognitive Biases" listed here are valid "guesstimate" methods, that people would be stupid to stop using. But the logicians are correct when they claim these guesstimate methods do not belong in a strict logical proof.
Most cognitive biases are heavy influences on politics. They affect who and what we vote for, and what we believe. Here are a few of the cognitive biases that particularly affect politics:
- Actor-observer - The good I/my friends do is intentional, and the bad we do is accidental. BUT the good others do is accidental, while the bad they do is intentional.
- Anchoring - We give extra weight to one specific event, fact, or piece of information - often the first one we encounter.
- Backfire - Evidence against our belief reinforces it, because we feel our belief is under attack.
- Bandwagon - People are more likely to believe something because most people around them believe it.
- Bias Blind Spot - We see ourselves as less biased than others (This one does not apply to me ;)
- Confirmation Bias - We seek out information that supports our point of view, or interpret neutral information as supporting it, while ignoring information that contradicts it.
- Distinction Bias - When we compare just two options, we see them as far more different than we would if we threw more options into the comparison. E.g., calling Obama a socialist because you never compared him to, say, Stalin, Mao, or Castro.
- False Consensus Effect - Everyone thinks most people agree with them.
- Illusion of Control - The belief that you (or someone else) have more effect on events than you really do.
- Moral Credential Effect - If he/she/I was good in the past, it lets him/her/me get away with other stuff now.
- Normalcy Bias - Refusing to plan for something that has never happened before or that you have never experienced.
- Omission Bias - Judging harmful actions as worse than failing to act, even if the harm is the same (a guy who puts poison in your cup seems worse than a guy who leaves poison in the sugar jar and never tells you before you spoon 'sugar' into your coffee).
- Outcome Bias - Judging an action by its outcome instead of the reasoning behind it. Say a man shoots and kills someone for making eyes at his grandmother: if the victim turns out to be a child rapist, most people won't care - until the shooter does it again to someone who isn't one.
- Semmelweis Reflex - Ignoring new evidence that proves an old idea you believe in is wrong.
- Zero Risk Bias - Believing that reducing a small risk to zero is better than reducing a very large risk to a small one (see the numeric sketch after this list).
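To see why Zero Risk Bias is a bias and not just a preference, here is a minimal numeric sketch. The numbers are entirely made up for illustration; they aren't from any study mentioned in this post.

```python
# Hypothetical numbers only: comparing "eliminate a small risk" vs.
# "shrink a large risk" for the same exposed population.

population = 1_000_000  # assumed number of people exposed to each risk

# Option A: reduce a small risk from 0.1% to 0% - the risk "feels" gone
harms_avoided_a = population * (0.001 - 0.0)

# Option B: reduce a large risk from 5% to 1% - the risk remains, but far more harm is prevented
harms_avoided_b = population * (0.05 - 0.01)

print(f"Option A (small risk to zero): {harms_avoided_a:,.0f} expected harms avoided")
print(f"Option B (large risk to small): {harms_avoided_b:,.0f} expected harms avoided")
# Option B prevents 40x more expected harm, yet the zero-risk bias pulls us toward Option A.
```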
There are others, but these are some of the more common ones found in political arguments. Take the anti-terrorist actions of the TSA: Illusion of Control - surely the TSA is the reason there have been no more terrorist attacks. Right? Except they haven't actually arrested a real terrorist.
Another reason I prefer "guesstimate methods" over "biases" is that I don't think the human mind has in-built flaws, but rather in-built features. We need to make decisions, even about things we lack information on. In fact, those are the primary things we argue about in politics - things we don't have enough information about to make an informed decision. Sometimes that is an artificially created situation (as when people try to prevent evolution from being taught in school), but for most of the problems in politics, we simply don't know the solution. If we knew the solution, we would test it, prove it right, and implement it.
So, politics is full of these shortcuts. Keep that in mind when you argue. Don't stick to just traditional logic; work on these shortcuts as well. Recognize why people think what they think, and work to counter those shortcuts.
Among other things, this explains the mud throwing. When a politician (or their proxy - usually a SuperPAC nowadays) engages in mud throwing/dirty politics, they are trying to counter the Moral Credential Effect. By destroying an opponent's moral authority on one issue, they hope to lower the respect that opponent gets overall. That doesn't make mud throwing OK; it just explains it.