Important Note

As you read this web site, be sure to apply “analytical thinking” or “critical thinking” to everything you see here.

Failing to apply a logical method of thinking when studying new ideas can leave a person unable to learn, because fixed ideas cannot be changed.

Definition:

Critical Thinking is the process of using reasoning to discern what is true, and what is false.

Critical Thinking is a higher order thinking that clarifies goals, questions assumptions, discerns hidden values, evaluates evidence, and assesses conclusions and decisions.

 

 

Introduction to Critical Thinking Tools:

 

First, recognize that thinking critically does not mean simple criticism. It means not accepting information at face value in an uncritical, unevaluating way.

The essence of critical thinking centers not on answering questions but on questioning answers, so it involves questioning, probing, analyzing, evaluating.

Beware of groupthink. This can seduce you at any time, but especially if you are a member of a close-knit group of people where there is a strong sense of loyalty to the group, deference to an authority figure within the group, and a demonized view of those outside the group.

Remember the story of the little boy who was the only one brave enough to point out that the emperor had no clothes? It is amazing how very senior and very intelligent people can collectively delude themselves about something – for instance, that a particular policy is working when so much evidence shows that it is not (the war on drugs?).

Groupthink is further defined by the psychologist Irving Janis as “a mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members’ strivings for unanimity override their motivation to realistically appraise alternative courses of action.”

Groupthink symptoms are as follows:

○ Feelings of invulnerability creating excessive optimism and encouraging risk taking;

○ Unquestioned belief in the group’s morality, causing members to ignore the consequences of their actions;

○ Stereotyped views of enemies and enemy leaders;

○ Pressure to conform applied to dissenting, “disloyal” group members;

○ Shutting down of ideas that deviate from the apparent group consensus;

○ Illusions of unanimity;

○ “Mindguards” – members who shield the group from dissenting opinions.

Remember that prominence does not equate to importance. A newspaper may have made its lead story the rumour of a break-up between Britney Spears and her latest boyfriend, but that does not necessarily make it the most important news item that day. Conversely, in 1914 that tiny story about the assassination of an obscure nobleman in some backwater called Sarajevo proved to have rather more repercussions than most readers first appreciated.

Check the source. Who wrote the article or scripted the program? How knowledgeable is the source? Does the source have a particular interest or ‘angle’ or prejudice? Do you know the source by reputation or previous work?

Use different sources. For example, if there is a dispute over the ecological impact of oil exploration, check out the views of the ‘green’ pressure group and the oil company and other, more independent, sources such as scientists and commentators.

Always prefer primary sources. A personal, eyewitness account is to be preferred to the statement from the politician who was told by a journalist who read it on a news wire that obtained it from a company spokesman who was briefed by a senior manager on the basis of an eyewitness report from a colleague.

Check the date. Generally speaking, the more recent the material, the more accurate it is likely to be, and the more useful it is.

Older material could also show that a current condition has been a long-term trend, and so point to the possible starting time or source of the condition.

Check the publisher or promoter or funder. Many newspapers, magazines and television stations have a definite orientation and can be expected to push a particular ‘line’ or interpretation.

Seek out assumptions. Sometimes assumptions may be implicit and therefore hard to discern. For instance, a political opinion poll may assume that everyone polled is telling the truth about their likely voting intentions. This sort of assumption is unlikely to be spelled out anywhere in a report.

Question assumptions. For instance, does everyone polled tell the truth about their likely voting intentions? Maybe supporters of racist parties are reluctant to be honest about their true voting intentions.

Be especially skeptical about surveys and polls. Who is funding the project; how the questions are chosen, worded and posed; how those questioned are selected and the context in which the questions are put to them; how the statistical analysis is carried out and the statistics are interpreted; how the findings are presented and reported (or misreported) – all these factors can have a massive influence.

Look out for exceptions. There is a popular saying that: “It’s the exception that proves the rule.” In fact, in scientific terms, it is the exception that disproves the rule. So, for instance, for many centuries it was assumed that there could not be a black swan and therefore that ‘All swans are white’. However, in the 17th century, the discovery of black swans in Australia forced a change in that thinking. The identification of exceptions or black swans requires us to rethink the current orthodoxy.

Look out for trends. If a consistent method of measurement is used (that is, over time one is comparing like with like), then trends may well be apparent, so that one can see a rise or a fall or a cycle.

Make time comparisons. If a company announces that it has increased revenues by 10% in the last two years, look at the rate of growth in revenues in the two previous years. This will indicate whether recent performance is impressive or merely continuation of a trend.
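A minimal sketch of this kind of time comparison, using invented revenue figures purely for illustration:

```python
# Hypothetical annual revenues in millions -- figures invented for illustration.
revenues = {2019: 100.0, 2020: 110.0, 2021: 121.0, 2022: 133.1}

# Year-over-year growth rate for each consecutive pair of years.
years = sorted(revenues)
for prev, curr in zip(years, years[1:]):
    growth = (revenues[curr] - revenues[prev]) / revenues[prev] * 100
    print(f"{prev} -> {curr}: {growth:.1f}%")
```

With these numbers, a headline claim of “10% growth this year” turns out to be nothing special: every earlier year grew at the same rate, so recent performance is merely the continuation of a trend.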

Consider extrapolations. If you have discovered something not quite right in your geographical area, it may well be that you are not isolated, and that this extends to other areas also, especially if they are being governed by the same “authorities” or principles.

On the other hand, just because one part of the thing you are dealing with works and makes sense is no reason to believe that the rest does too. Just because one thing a person says is true is no reason to believe that everything that the person says is true.

People often find it hard to see the dual nature of others, and usually view them as only good or only bad, and cannot conceive that a person who does bad most of the time can also do good, or that a person who has done seemingly good deeds can also do some very bad things.

Always look for evidence. What is the evidence? It is tempting to seize on evidence that confirms one’s original view or the prevailing orthodoxy and to dismiss evidence that challenges it, but one needs to be unbiased about all the evidence and equally rigorous about establishing its authenticity.

Be ready to change your mind if the evidence changes. The British economist John Maynard Keynes once said: “When the facts change, I change my mind – what do you do, sir?” Before the US invasion of Iraq, many people thought that Iraq possessed weapons of mass destruction based on the then available evidence and the interpretation of it by the intelligence services. Following the invasion and extensive searches, the evidence changed, but many were reluctant to change their minds.

Always consider alternative explanations. For example, the fall in crime levels could be the result of more police, better detection procedures, social changes or simply new methods of reporting.

Beware of making assumptions. Someone once said: “Never assume, as assume makes an ass out of u and me”. So, just because a particular source is usually accurate doesn’t necessarily make it accurate this time. Just because the facts can be explained by one particular scenario doesn’t mean that another scenario isn’t possible and maybe even more likely.

Don’t jump to conclusions. Although the currently available facts may suggest a particular conclusion, other conclusions may be possible. Further facts may support an alternative conclusion and even invalidate the original one. Even when this is not the case, it is always helpful to have further evidence supporting the original conclusion.

Remember Occam’s Razor. This rule-of-thumb states that when two or more explanations are possible on the basis of the same facts, always prefer the simplest possible explanation.

Look for cause and effect. Correlation does not necessarily mean causation. For example, when I get up from bed, the sun comes up – but there is obviously no causality. Yet some tribes used to believe that particular rituals were essential to ensure the rising of the sun. On the other hand, when I go to bed and sleep, I feel refreshed – and there clearly is a causal relationship.
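The correlation-without-causation trap can be made concrete with a small sketch. The two series below are invented for illustration (say, annual ice-cream sales and annual sunglasses sales): both happen to trend upward, so they correlate strongly, yet neither causes the other – a third factor (sunny weather) plausibly drives both.

```python
# Invented figures for illustration: two series that both trend upward.
ice_cream = [100, 120, 140, 160, 180]   # ice-cream sales per year
sunglasses = [50, 55, 62, 70, 75]       # sunglasses sales per year

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(round(pearson(ice_cream, sunglasses), 3))
```

The coefficient comes out above 0.99 – about as correlated as two series can be – which shows how easily a strong correlation can arise with no causal link between the variables themselves.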

Don’t rest on authority. The scientist Albert Einstein once remarked: “Foolish faith in authority is the worst enemy of truth”. There is a popular saying that: “Great minds think alike” – but in fact the greatest minds (such as that of Einstein) frequently think very differently from their contemporaries and peers. Remember that another popular saying is: “Fools seldom differ.”

In the early 1990s, a lecturer gave a presentation to a group of Russians using slides in Russian. At one point, he realized that he had been speaking to the wrong slide for the last five minutes. When he asked his audience why no one had told him this, he was advised that in Communist Russia no one challenged the teacher! Just because the management or the government states something does not necessarily mean that it is true.

Closely related to this, don’t necessarily rest on the received wisdom. Galileo was condemned by the Inquisition for challenging the Church’s view that the sun, the planets and the stars revolved around the earth – but he was right. Today even the most fundamental rules of modern physics are being challenged. Many management styles and political policies are the received wisdom for a time, but frequently deserve to be challenged. The important thing is to marshal the evidence and subject it to review and analysis.

Be challenging of the seemingly seductive comment “It works”. There are two problems here: agreeing a definition of what ‘works’ means and establishing a cause and effect relationship between action and outcome. If I perform a traditional Indian rain dance in my back garden, it may rain in an hour, a day or a month. Over what period are we going to assume the dance may have had an influence? Then, can we reasonably infer causality here? It may be that my neighbor was performing a different, more effective rain dance in her garden; it may be that the rain clouds had been seeded by a specially chartered aircraft to ensure good weather for a sports event tomorrow; it may be that I am in India in the monsoon season and it usually rains at this time of day at this time of year anyway.

Be challenging of the seemingly convincing comment “There is no alternative”. There is always an alternative – even if it is simply doing nothing and waiting to see what happens. In fact, usually there is more than one alternative – in which case technically you do not have alternatives but choices.

Beware of anecdotes. People sometimes talk of “anecdotal evidence”, but really this is an oxymoron. An anecdote is not evidence – at best, it is one person’s experience and, at worst, it is simply unsubstantiated rumor.

On the other hand, trust your instincts. If something doesn’t ‘feel’ right, even if it is in a newspaper or a television program, check it out. Strange though it may seem, the media can make mistakes and corrections rarely achieve the prominence of the original story.

Be aware that, when observing a situation, the observer can sometimes change the situation. For example, researchers were trying to establish what change in working conditions would lead to an increase in productivity. To the astonishment of the researchers, they found that every change in conditions – and even a return to the original conditions – resulted in an increase in production. They concluded that this was because the workers were being motivated by the interest shown in them by the researchers (the famous ‘Hawthorne effect’).

Make sure that the statistics you are viewing are actually relevant. Are the statistics measuring the most important things, or are they being used to deflect attention away from more important statistics – statistics that actually measure what really matters at the end of the day?

Have the statistics been artificially boosted for PR purposes in order to create a good impression for a specific occasion? Do the statistics continue once the fanfare is over?

Are the statistics entirely false and untrue?

‘Translate’ statistics. So, convert a percentage into an absolute figure. A claim to have increased customers by 100% might simply mean an increase from two to four. Conversely, a 2.5% increase in a nation’s economic growth could – in the case, for instance, of the UK or the USA – mean the availability of billions more pounds or dollars.

Similarly, convert absolute numbers into percentages.
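Both ‘translations’ are one-line arithmetic; the quick sketch below uses invented figures to show how a large percentage can hide a tiny absolute change, and vice versa:

```python
# 'Translating' statistics both ways -- all figures invented for illustration.

def pct_increase(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

# A '100% increase in customers' may just mean going from two to four:
print(pct_increase(2, 4))    # 100.0

# Conversely, a small percentage of a large base is a huge absolute figure:
economy = 2_500_000_000_000  # a hypothetical 2.5-trillion economy
print(economy * 0.025)       # 2.5% of it is tens of billions
```

The same growth claim can therefore sound dramatic or trivial depending on which form it is reported in – which is exactly why it is worth converting to the other form yourself.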

Think about what is not there. When invited to respond to material, most people confine their comments or their thinking to what they can see or hear. Sometimes what is not there is just as important. You might want to ask: Why are certain arguments missing? Why have certain sources not been used? Is this the full picture? A political manifesto will inevitably mention achievements but not failures and will often criticize another party’s policy or performance but fail to offer a constructive alternative. A company’s annual report will put the most favorable possible ‘gloss’ on activities and may not mention at all financial difficulties or threats from competitors.

Learn to think ‘out of the box’. Albert Einstein once said that: “Problems cannot be solved by thinking within the framework in which they were created”.

If you dare, go beyond thinking ‘out of the box’ to thinking the ‘unthinkable’. What does this mean? It means considering variations to the most basic of parameters and entertaining the most radical of possibilities.

The unthinkable may mean that what you once believed and argued about is, in fact, wrong. This may mean personal embarrassment or the realization that one has been wasting one’s time and resources, or missing out on other opportunities. This may be unpalatable, but it may yet be true. Truth is what it is, not what you want it to be.

Take your bias out of the equation. Do not defend a standpoint just because you always have defended it. Look at it as if you were getting involved for the first time. How would others view the situation for the first time? Don’t get attached to a hypothesis just because it is yours.

Practice critical thinking. Alfred Mander asserted in his book “Logic For The Millions”: “Thinking is skilled work. It is not true that we are naturally endowed with the ability to think clearly and logically – without learning how or without practicing”. The British philosopher Bertrand Russell bemoaned that: “Many people would die sooner than think; in fact, they do.”

Finally, remember that ‘thinking critically’ ends in ‘why?’ The word ‘why?’ is the most powerful tool in your mental toolbox. Keep asking ‘why?’ Why is this person writing this story in this particular newspaper? Why is this politician making this statement now? Why has the author of this paper quoted this source and not that one? Why has she used a percentage instead of an absolute figure? Why am I asking all these questions?!?

By Roger Darlington

 

 

Emotional Barriers to Critical Thinking:

 

  • We often overreact to potential losses, focusing more on the short-term consequences than on the longer-term effects.
  • The more meaningful a loss is, the more loss-averse we become: we cling to what we stand to lose, even when holding on is economically, emotionally or otherwise foolish.
  • The more meaningful a potential loss is, the more likely we are to make irrational decisions.
  • We hold on to the pervasive pull of commitment. When we are committed to a relationship, decision, or position in our lives, it can be very difficult for us to see the better, healthier alternatives available.
  • Humans have a tendency to imbue someone or something with certain qualities based on its perceived value rather than objective data. This is called value attribution.
  • If we see something labeled a certain way, we’ll take that label at face value.
  • Humans have a propensity to label people, ideas or things based on our initial opinions of them. The authors term this the “diagnosis bias,” and it includes our inability to reconsider those initial value judgments once we’ve made them.
  • A single word or label can color our entire perception of a person, closing off avenues of shared experience and preventing us from seeing people for who they really are. Once a person is given a label (or, even more directly, a diagnosis), it is hard to see them in a way that is not biased by that label.
  • “Mirror, mirror” effect – we like and look for people like us.
  • We constantly influence others and are constantly being influenced by our expectations and labels — what the authors call the “Chameleon effect.”
  • We can either approach a task altruistically or from a self-interested (or pleasure) perspective, but usually not both at the same time.
  • It’s not that rewards for specific tasks or behavior are bad, it’s the possibility of a reward dangled ahead of time that can potentially result in destructive, unintended effects.
  • Dissent is invaluable – you need a dissenter, even if you don’t agree with the specific dissent itself. Dissenters open up discussion and expose problems you may not even be aware exist.
  • People sometimes choose to attack the arguer and not the argument itself.
  • Assuming that only two extremes can exist: “you are either with us, or you are against us.”
  • We more readily choose short-term rewards rather than real long-term solutions.

“Cognitive Dissonance” as a Barrier to Critical Thinking:

Cognitive Dissonance: This is the feeling of uncomfortable tension that comes from holding two conflicting thoughts in the mind at the same time.

People strive to reduce dissonance. They do this by changing their attitudes, beliefs, and actions. Dissonance is also reduced by justifying, blaming, and denying.

 

Examples:

 Leon Festinger first developed this theory in the 1950s to explain how members of a cult – persuaded by their leader that the earth was going to be destroyed on 21st December, and that they alone were going to be rescued by aliens – actually increased their commitment to the cult when this did not happen.

The dissonance of the thought of having been so stupid was so great that the members instead revised their beliefs to fit the facts: the aliens, out of concern for the cult, had saved the world after all.

 Dissonance is aroused whenever individuals voluntarily engage in an unpleasant activity to achieve some desired goal. Dissonance can be reduced by exaggerating the desirability of the goal.

Aronson & Mills had individuals undergo a severe or mild “initiation” in order to become a member of a group. In the severe-initiation condition, the individuals engaged in an embarrassing activity. The group turned out to be very dull and boring, yet the individuals in the severe-initiation condition evaluated the group as more interesting than those in the mild-initiation condition.

 One situation that may create dissonance is when someone does a favor for a person that they dislike. Here, the dissonance is between those negative feelings for the other person, and the awareness of having expended effort to help them. Cognitive dissonance theory predicts that people will try to resolve this dissonance, by adopting a more positive attitude towards the other person.

A counterpart to this effect is when someone’s actions hurt another person, whom they regard positively or neutrally. In this case, one way to resolve the dissonance is to think more negatively about that person, so that they seem to deserve what happened to them.

 Consider someone who buys an expensive car but discovers that it is not comfortable on long drives. Dissonance exists between their beliefs that they have bought a good car and that a good car should be comfortable. Dissonance could be eliminated by deciding that it does not matter since the car is mainly used for short trips (reducing the importance of the dissonant belief) or by focusing on the car’s strengths, such as safety, appearance and handling (thereby adding more consonant beliefs). The dissonance could also be eliminated by getting rid of the car, but this behavior is a lot harder to achieve than changing beliefs.

 A classical example of this idea is expressed in the fable The Fox and the Grapes. In the story, a fox sees some high-hanging grapes and wishes to eat them. When the fox is unable to think of a way to reach them, he surmises that the grapes are probably not worth eating – they must be unripe or sour.

 
