Monday, August 19

Critical Thinking: Fallacies and Bias

Introduction

You are a good person and correct about pretty much everything. Or so we like to tell ourselves on a daily basis. I'm not being facetious: we literally do. Our inability to critically assess our own views leads to regular and serious social problems; examples include racism, the rejection of good scientific theories and the debilitation of our political system. Key to becoming more correct is understanding your own limitations. This is not a discussion of how to persuade people with fancy words or emotive devices, but of how to strive to be less flawed. (Note to reader: this is as much an exercise to help me as you.)

A fallacy is simply an incorrect argument, and fallacies come in every shape and size. A more helpful definition, and often what is meant by "fallacy x", is: an incorrect logical construction or argument that humans commonly fall into. In this context, bias means almost the same thing: a tendency for humans to think in a certain way which often leads to incorrect conclusions. Some of these biases may even seem obvious, and you may palm off their severity, saying: "I notice myself doing these all the time and stop myself." Maybe you are a cognitive superman who really does stop yourself every time, but I doubt it, because you are most probably human. Even if you are exceptional, these biases take a lot of work to find and minimise. Nor will you ever be done: it is a process of continuous improvement, and it is both unhelpful and inaccurate to think you will ever finish. Enjoy instead every time you eliminate some bias, becoming a little less wrong.

Fallacies and Bias

In general (there may be exceptions), fallacies and biases stem from a few human traits. Firstly, we view ourselves in an overly positive light and others in a negative one: we dismiss our own flaws and assume we are correct more often than we are. Secondly, we are overly prone to type 1 errors (false positives). After all, it is better to be over-sensitive and flee a harmless thing than to be under-sensitive and ignore a dangerous one. This, apart from helping us survive, means we tend to see patterns in events where there are none. Another major issue with human reasoning is our tendency to see a discussion as war. We are built to form tight-knit groups who believe broadly similar things. This is obvious in politics, where battles which should be about ideas and data become battles between groups and parties (check out Politics is the Mind-Killer on Less Wrong for more).
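To make the type 1 point concrete, here is a minimal sketch in Python (the costs and probabilities are invented purely for illustration, not taken from any study) of why, when a miss costs far more than a false alarm, the cheapest policy is a deliberately over-sensitive one:

```python
# Toy signal-detection model: when a miss (type 2 error) costs far more
# than a false alarm (type 1 error), the policy that minimises expected
# cost is deliberately over-sensitive.

COST_FALSE_ALARM = 1    # fleeing a rustle that was just the wind
COST_MISS = 100         # ignoring a rustle that was a predator
P_PREDATOR = 0.05       # fraction of rustles that really are predators

def expected_cost(p_flee):
    """Expected cost per rustle if we flee a fraction p_flee of the time."""
    false_alarms = p_flee * (1 - P_PREDATOR) * COST_FALSE_ALARM
    misses = (1 - p_flee) * P_PREDATOR * COST_MISS
    return false_alarms + misses

for p in (0.0, 0.5, 1.0):
    print(p, expected_cost(p))
# 0.0 -> 5.0, 0.5 -> 2.975, 1.0 -> 0.95: always fleeing wins, even though
# 95% of those flights are "false positives".
```

Under these (made-up) numbers, the creature that always flees outlives the careful logician, which is one plausible reading of why we see patterns that aren't there.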

Lastly, and a more general point which encapsulates the others: we aren't designed for this. Long, careful logical processes may be accurate, but they are slow. It's not all bad: as I said with type 1 errors, given our limited processing power, evolution has provided us with decent heuristics to survive. Heuristics essentially cut corners when finding solutions. They are far more efficient than proper logic, even if they are not perfect, complete or always correct. In days gone by, when our minds were preoccupied with more basic needs, this was great. In fact, it is almost tautological to say that those who survived to reproduce were better at surviving in that primitive environment. That's great; everyone loves survival. The problem arises when we focus our minds on a single problem: our heuristics cloud our vision, offering their needlessly quick and half-arsed ideas even when we have all the time in the world. Evolution is an imperfect process, and contrary to popular belief (due to our knack for type 1 errors) it did not and does not "aim" for a perfect being. In this case, it has produced a being which is not very good at logical deduction, but one which is good at surviving and reproducing.
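As a toy illustration of that speed/accuracy trade-off (my own example, not from any particular source), compare a one-pass greedy heuristic against exhaustive logic on a tiny resource-allocation problem:

```python
# Heuristic vs. "proper logic" on a toy problem: pick options to maximise
# reward without exceeding an effort budget.
from itertools import combinations

values = [10, 6, 6]   # reward of each option
costs  = [5, 4, 4]    # effort of each option
budget = 8

# Heuristic: greedily grab the best reward-per-effort option that still fits.
def greedy():
    total, spent = 0, 0
    for v, c in sorted(zip(values, costs), key=lambda vc: -vc[0] / vc[1]):
        if spent + c <= budget:
            total, spent = total + v, spent + c
    return total

# "Proper logic": check every subset (work grows exponentially with options).
def exhaustive():
    best = 0
    for r in range(len(values) + 1):
        for combo in combinations(zip(values, costs), r):
            if sum(c for _, c in combo) <= budget:
                best = max(best, sum(v for v, _ in combo))
    return best

print(greedy(), exhaustive())  # 10 vs 12: the one-pass heuristic is quick
                               # and usually decent, but here it misses the best answer.
```

The greedy rule is fast and often good enough, which is exactly what our minds needed in a primitive environment; the exhaustive check is what we should reach for when we genuinely have the time.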

Confirmation Bias

Confirmation bias is the tendency to seek out evidence that agrees with our hypotheses. We don't like the feeling of being wrong, or the effort of having to think in order to change our opinion (I'm speaking from experience). So, as a coping device, we avoid, suppress or deride such evidence. This can express itself in all kinds of ways. One example is when we seek out friends who agree with us. I'm not judging this practice as wrong; we all need down time (see the is-ought fallacy below). Nevertheless, we should be aware of it, because sometimes it can lead us to another fallacy, the false-consensus effect (which can arise in many other ways too). We can also avoid conflicting evidence by only reading news sources that agree with us. In the same regard, we can be over-critical of people who disagree with us (a little more on this later), or automatically put up barriers if we suspect they might. You may feel yourself do this: try reading two conflicting newspapers and notice how you glide over one opinion but ridicule the other. Of course, it is time-consuming to question evidence you agree with and entertain evidence you don't (maybe you can see why we evolved this bias). I am by no means saying all evidence is equal: one important conclusion of Bayes' theorem is that we should take prior observations into account, but naturally we aren't sensitive enough (hence the bias). Don't waste your time seriously appraising evidence of the boogie-man, but do occasionally ask yourself, "is my level of trust in this evidence reasonable, or am I unfairly ignoring it or giving it undue credit?" And be willing to change your hypothesis in the light of sufficient evidence.
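To see what weighting by prior observations looks like, here is a minimal sketch of Bayesian updating (the probabilities are invented purely for illustration):

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E).
# The same piece of evidence should move an unlikely hypothesis only a
# little, but shift a genuinely open question substantially.

def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

# Evidence for a claim we already think is very unlikely (the boogie-man)...
print(posterior(prior=0.001, p_evidence_given_h=0.9, p_evidence_given_not_h=0.3))
# ~0.003: weak evidence barely moves a strong prior.

# ...versus the same evidence applied to a 50/50 question.
print(posterior(prior=0.5, p_evidence_given_h=0.9, p_evidence_given_not_h=0.3))
# 0.75: the same evidence shifts an open belief a great deal.
```

The bias, on this view, is that we set the effective prior by how much we like the hypothesis rather than by how well it has done against past evidence.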

Attribution bias

Wikipedia sums this bias up very well with an example:

"For example, when a driver cuts us off, we are more likely to attribute blame to the reckless driver (e.g., “What a jerk!”), rather than situational circumstances (e.g., “Maybe they were in a rush and didn’t notice me")."


The situation reverses when it is us cutting up the other driver. Although the bias is a little broader than this (and is well worth further reading), it stems from a simple idea: we want to portray ourselves as positively as possible. To counteract it, remember that everyone else is like you.

Cultural bias

We see evidence through the lens of our current theories, as we saw with confirmation bias. However, many of these theories are first formed and learnt from our culture. By culture I mean the broadest possible definition, including the culture of your friend groups, the culture of humanity, the culture of your educational establishment, and so on. Perhaps it would be better to call it environmental bias: what are the people around you saying? Again, this is a bias of degree, not kind; don't throw out everything your culture tells you, since our ability to absorb and pass on huge amounts of culture is one of the few things which separates us from other primates. But remember that nothing is sacred and no idea is above reproach. The fact that religion practically runs in families should serve as a stark reminder to question received knowledge. Cultural bias can be much more subtle. One interesting example is the obsession of Western nations with imposing democracy on countries under authoritarian rule. We are then surprised when the country votes for a despotic leader again. "They are silly," we exclaim, "why would you not want a secular, liberal democracy?!" Culture is powerful and colours everything you think: question the accepted norm.

Ingroup-outgroup formation and bias

This is a particularly toxic bias: we love to pick sides, widen gaps and reduce proper argumentation to throwing shit around. What do I mean?

We need no encouragement to form sides, as the Robbers Cave experiment suggests. I would highly recommend reading about it, but to sum it up quickly: two isolated groups were formed and, upon discovering each other, started a constantly escalating conflict which also standardised behaviour within each group. See also the two-sided debating chambers of the House of Commons and the Cambridge Union. But what is wrong with a bit of back and forth in a debate, as long as we don't fight?

Firstly, issues are rarely two-sided or even three-sided: the best solution (and remember, that is our goal) lies somewhere on a many-dimensional map of possible solutions (x amount of this, y amount of that, and so on). Reality and people are complicated, and so are the solutions to our problems; I don't want a choice between only two. Furthermore, you and I need no encouragement to conform to a binary choice. That isn't all of it, though: our tendency to take sides often leads to poor argumentation. We grant our own side concessions and fail to question it, all the while demonising and criticising the enemy. We see the discussion as a war our side needs to win, where questioning your side is betrayal and making concessions is losing vital ground. This serves only to widen the gap as insults become personal, entrench our own views and produce a starker binary. Choices are rarely binary, so don't take sides: you are trying to find the truth, not win a war.

In politics these tendencies are magnified, with consequences too terrible to fathom; again, I would recommend reading Politics is the Mind-Killer.

Ad Hominem

A famous politician once concluded, "Rarely is the question asked: Is our children learning?" The fallacious chain runs: he was stupid => his policies were stupid => we should do the opposite.

Ad hominem is a decent heuristic: a stupid person says stupid things, a bad tree bears bad fruit. But it isn't a valid method of refuting a specific argument. The truth of a statement is entirely separate from the person saying it. If you have enough time, there is no need even to consider who is speaking. Simple statements about whether one thing implies another require no context, and with all the time and resources in the world, more complex claims could be tested against your own evidence and every logical step checked. Realistically, though, context and possible motivations are useful for a particularly complex argument: knowing where someone is coming from helps you watch out for specific flaws. Just be aware when you are doing it. In your mind, you must separate the person and the idea. (To link this to the previous bias, bear in mind that a discussion is not a battle between two people, or even between ideologies or cultures; the truth is unconcerned with the frame and participants of the debate.)
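As a small sketch of that last point (my own toy example), checking whether an argument is valid is a purely mechanical exercise in which the speaker's identity never appears:

```python
# Argument: "If it rained, the ground is wet. The ground is wet. So it rained."
# (The fallacy of affirming the consequent.) We check validity by brute force:
# the conclusion must hold in every case where all the premises hold.
from itertools import product

def implies(a, b):
    return (not a) or b

valid = all(
    rained                                # conclusion must hold...
    for rained, wet in product([True, False], repeat=2)
    if implies(rained, wet) and wet       # ...whenever both premises hold
)
print(valid)  # False: rained=False, wet=True satisfies the premises but not
              # the conclusion, whoever happened to utter the argument.
```

Nothing about the speaker enters the check; only the structure of the argument does.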

I feel I must defend poor George Bush: in my example I remarked on his grammatical slip. On this basis, and on his mannerisms, you might conclude that he was stupid. Maybe he was; to be honest, I don't care. Good grammar and spelling are probably only weakly correlated with intelligence, or at least more weakly than you think. And even if he were stupid, it wouldn't be the point: the truth could appear badly spelt on the side of a railway track, or be uttered by a pompous fool, and it would still be true.

An extension of this fallacy is the tendency to reject wholesale the views of the stupid or the evil. I don't think I can explain this any better than Eliezer Yudkowsky; see Reversed Stupidity Is Not Intelligence.
The stupid and the evil occasionally get it right.

Is-ought fallacy

The is-ought fallacy is the move between statements of reality and statements of ethics without suitable justification. How things are (is) and how things ought to be (ought) are separate questions. We see one specific type a lot, the appeal to nature: flogging food with all "natural flavours", or accusing homosexuality of being unnatural and therefore wrong. It's not that either conclusion must therefore be false; it just requires justification (you'll probably struggle more to justify the latter, I hope).

The connection between reality and morality should be reasoned and not simply implied.


Conclusion

These are only a tiny sample; check out the List of cognitive biases on Wikipedia (Less Wrong also has some very good articles on such things). I have tried to pick particularly common ones, but the selection is of course shaped by my experience and personal biases. Perhaps you would have picked different ones, but that's not the point: I just wanted to point out to you (and to me) how often and how badly we are wrong. The answer to "why?" may lie in our culture and genes, but don't blame your ancestors too much; wondering whether we are in the Matrix isn't very helpful when a tiger is bearing down on you. If your ancestor had been having an existential crisis at that point, you might not be here to read this (maybe you wish s/he had). Corner-cutting heuristics are much quicker, but take your time and don't skip the logical steps; there's often no need to. If a decision is important enough to take your time over, keep the logical steps short to avoid falling into such holes.
