Clinical biases in diagnosis


In the abnormal psychology option, what is the difference between cognitive biases and clinical biases?

Clinical biases are cognitive biases that take place when a psychiatrist or psychologist is trying to make a diagnosis and label the behaviour. They can arise from experience (‘these symptoms nearly always mean this mental health problem’) and result in misdiagnosis when other explanations for the behaviour are too readily discarded. This is a confirmation bias – symptoms are interpreted to confirm the mental health professional’s original, swift diagnosis. It was demonstrated in Rosenhan (1973), when admitting medical staff interpreted the deliberately vague symptoms described by the pseudo-patients as schizophrenia, and even more clearly when the pseudo-patients’ normal behaviour on the ward was interpreted by medical staff as confirming the validity of the original diagnosis.

They can also arise from an existing societal and/or personal bias, such as an ethnic or gender bias. Jenkins-Hall and Sacco (1991) found that a sample of white US psychotherapists showed an ethnic bias against black clients: they evaluated depressed black clients more negatively than depressed white clients. While both groups were diagnosed with depression, the black clients were seen as less socially capable and were rated as more seriously depressed on a standardised scale. A larger study by Bertakis et al. (2001) demonstrated that women were much more likely than men to be diagnosed as depressed by their primary care physicians, even with a similar number of visits. This showed a gender bias in diagnosis.

So, there is a clear link between this material and the study of cognitive biases, especially confirmation bias.

References

Bertakis, K.D., Helms, J., Callahan, E.J., Rahman, A., Leigh, P. & Robbins, J.A. (2001). Patient Gender Differences in the Diagnosis of Depression in Primary Care. Journal of Women’s Health & Gender-Based Medicine, 10(7), pp. 689-698.

Jenkins-Hall, K. & Sacco, W.P. (1991). Effect of Client Race and Depression on Evaluations by White Therapists. Journal of Social and Clinical Psychology, 10(3), pp. 322-333.

Rosenhan, D.L. (1973). On being sane in insane places. Science, 179(4070), pp. 250-258.

Planning your course effectively – more overlaps


As with the biological approach, there are many overlaps between the cognitive approach and the options of abnormal psychology, development, health and human relationships. For example, the psychology of cognitive processes and their reliability can explain clinical biases in the diagnosis of disorders, contribute to debates regarding the etiology of disorders, and inform their treatment.

Watch out for more of these!

Can we learn to love anything or anyone if we just hang around them long enough?

One of the cognitive approach studies that we cover in our fabulous book, ‘Psychology Sorted, Book 1’, is Slovic et al. (2017), which concerns the affect heuristic. The affect heuristic is a cognitive bias composed of several dimensions, one of which is:

  • The ‘mere exposure effect’: this may be a factor in the affect heuristic. It involves participants making a more favourable (‘good’) judgement of stimuli they have been presented with several times than of less familiar material. In other words, the participants in the study preferred the stimuli they had simply seen, or been exposed to, more often than the other stimuli.

So, this finding shows us human beings to be fairly simple creatures: we like something on the grounds that it is more familiar than the alternative choice. This obviously saves us a lot of time and effort in trying to compare the relative merits and demerits of two possibly similar items or people. For example, imagine I am interviewing two candidates for a job. One of the candidates already works at my company and I have known her for two years now. She’s a good enough worker, doesn’t cause any trouble and, well, let’s face it, she’s a known quantity.

The other candidate is someone I don’t know. On paper they seem far more interesting than the candidate I already know: they have some good ideas for the role and they may bring a breath of fresh air to the company. But… what if they aren’t as good as they seem? What if they don’t get on with the team? What if their ideas never actually see the light of day? Can I be bothered training up someone new? Maybe the candidate I already know is actually the best person for the job. Hmm, yes, maybe the familiar person is best – I’m used to their face, they fit in, etc.

This choice may, in fact, turn out to be the best choice but it is still an example of the mere exposure effect guiding someone’s behaviour rather than a fair and unbiased assessment of the evidence. Could the mere exposure effect explain seemingly baffling phenomena such as particular politicians becoming less reviled and more accepted the longer they are in office? Could it explain you humming along to a song you detest simply because it is constantly being played on the radio? Be aware of this in your own life – we all do it and it’s not necessarily the best way to make decisions as to what is good and valuable in our lives.


Cognitive biases – don’t let them confuse you.

Studying the reliability of thinking and decision-making leads us into the slightly complex world of System 1 (fast) and System 2 (slow) thinking and heuristics. Teaching cognitive biases is straightforward, and less is more. The key point is that we are inclined to base our current thinking and decision-making on past experiences and present perceptions. Our memories distort the past, and the media and our selective attention distort our present, especially if we are being pushed into a fast decision.

Tversky & Kahneman (1974) review a range of research in which they tested different heuristics, looking for evidence of the ways System 1 thinking (effortless, fast, a short-cut to the answer) may operate under specific conditions. They describe three different heuristics, each leading to cognitive bias.

The representativeness heuristic is based on the idea that one event is representative of other events very similar to it: how probable something seems depends on how well it matches the individual’s prior knowledge. Even though participants knew that 70% of the descriptions of people they had been given referred to engineers, while 30% referred to lawyers, when faced with a description of a man who could have been either, they judged there to be an equal chance of his being an engineer or a lawyer. Similarly, when given a description of a shy, quiet person, participants immediately judged that person most likely to be a librarian, even though the list of possible occupations included some that were much more statistically probable. This can be seen as the basis for stereotypes – taking a shortcut based on prior knowledge and assumptions.
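To see why ‘equal chance’ is a biased judgement, it helps to work through the calculation the participants skipped. Here is a minimal sketch in Python (illustrative only, not from the 1974 paper), assuming the vague description fits engineers and lawyers equally well:

```python
# Base rates: participants were told 70% of the descriptions were of
# engineers and 30% were of lawyers.
p_engineer, p_lawyer = 0.70, 0.30

# Assumption: the description is equally likely for either profession,
# i.e. it carries no diagnostic information at all.
likelihood = {"engineer": 0.5, "lawyer": 0.5}

# Bayes' theorem: with an uninformative description, the posterior
# probability should simply follow the base rates.
posterior_engineer = (likelihood["engineer"] * p_engineer) / (
    likelihood["engineer"] * p_engineer + likelihood["lawyer"] * p_lawyer
)
print(round(posterior_engineer, 2))  # 0.7 - not the 50/50 participants judged
```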

The availability heuristic involves judging the probability of an event by how easily examples of it come to mind: e.g. a middle-aged man with chest pains might be assumed to be having a heart attack, but a four-year-old child with similar pains would not elicit the same response, as four-year-old children do not tend to have heart attacks. This can lead to bias in diagnosis, as clinicians base their diagnoses on previous examples that come readily to mind; they are cognitively available.

The anchoring bias involves an initial value, or starting point, in an information-processing task determining how the final value is arrived at. The researchers tested high school students, asking them to quickly estimate one of the following products: 8x7x6x5x4x3x2x1 or 1x2x3x4x5x6x7x8. Of course, both answers are the same, as the two lists contain identical numbers. What Tversky and Kahneman found was that the descending list (8x7x6 etc.) produced a much higher estimate than the ascending list (1x2x3 etc.), and they concluded that the first few numbers anchored the estimate as either high or low. This is related to our first judgements about people: if we judge them in a positive light because of their friendly behaviour, this can ‘anchor’ our appraisal of their subsequent behaviour.

Use these examples as the basis for discussing how stereotypes are developed or how diagnoses can lack validity; they are also useful for discussing the laboratory experiment method. I am sure students can think of many more examples of how these heuristics can occasionally (not always) work to distort our thinking and decision-making in real life. But that might take some time and some logical, patient reasoning, using System 2 thinking!

Cognitive Biases

Cognitive biases – distortions of reality

Cognitive biases arise from heuristics (shortcut thinking) and systematically distort the way we think and affect our decision-making. Three of the most common cognitive biases are confirmation bias, anchoring bias and cognitive dissonance. All three arise from our tendencies to focus on part of a story, especially the part that confirms our pre-existing beliefs, and to avoid holding two inconsistent beliefs at the same time. That is, we are selectively attentive, and filter out what we don’t agree with.

Confirmation bias – this is when we seek out information that tells us we were right all along! It affects the type of information we look for, and also how we interpret neutral information that we meet along the way. If I post on Facebook or Instagram and don’t get immediate positive feedback, I feel that people dislike what I’ve posted and are ignoring me. I think we can all identify with this. We look for evidence that supports our pre-existing beliefs about ourselves, or others, rather than for genuinely useful information.

Wason (1960) demonstrated the confirmation bias by giving British university students the three ascending numbers 2-4-6 and asking them to work out the rule he had used to devise the series. (It was simply ‘any three ascending numbers’.) Students had to generate their own sets of three numbers and ask whether they conformed to the rule. Of course, they usually did, as any three ascending numbers conformed, but this stopped the students from isolating the rule: they were seeking evidence that confirmed the rule they had in mind (e.g. ‘numbers increasing by two’), not evidence that refuted it. This is a mistake, as an attempt to refute their hypothesis would have given far more clues to the rule’s true identity.
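To make the logic of the task concrete, here is a minimal sketch in Python (hypothetical – Wason’s task was, of course, run with pencil and paper) showing why confirming tests are uninformative, while a single test designed to fail can refute a too-narrow hypothesis:

```python
# The experimenter's hidden rule: simply 'any three ascending numbers'.
def conforms(triple):
    a, b, c = triple
    return a < b < c

# A participant hypothesising 'numbers increasing by two' and testing
# only triples that fit that hypothesis gets 'yes' every time...
confirming_tests = [(2, 4, 6), (10, 12, 14), (100, 102, 104)]
print([conforms(t) for t in confirming_tests])  # [True, True, True]

# ...whereas a triple the hypothesis predicts should FAIL is far more
# informative: it also gets 'yes', refuting 'increasing by two' outright.
print(conforms((1, 7, 20)))  # True - so the real rule must be broader
```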

Anchoring bias – this is when we make estimates of a total or of a probability by starting from an initial idea, which ‘anchors’ us to it.  An experiment described by Tversky & Kahneman (1974) illustrates this:

Two groups of high school students estimated the total of a calculation written on a board. Group 1 estimated the product of 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1, while Group 2 estimated the product of 1 x 2 x 3 x 4 x 5 x 6 x 7 x 8. The earlier numbers in each sequence acted as anchors for the expectations: Group 1’s median estimate was 2,250, while Group 2’s was 512. (The correct answer is 40,320 – I had to check it to believe it!)
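If, like me, you need to see it to believe it, a couple of lines of Python (an illustrative aside, nothing to do with the original study) confirm the total:

```python
import math

# The product of 1 through 8 - the same in either order - is 8 factorial.
print(math.prod(range(1, 9)))  # 40320
```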

Cognitive dissonance – if people cannot change their behaviour, then they change their beliefs about that behaviour, so there is less mismatch between belief and behaviour. The best example of this is the young smoker who feels unable to give up. They know that smoking causes cancer, but choose to believe it doesn’t do so in people as young and fit as themselves, who will, anyway, give up in the future. They know it makes their breath and clothes smell, so they keep to a social circle of other smokers and choose to believe that ‘smokers have more fun’.

Festinger et al. (1956) tested this theory empirically by conducting a covert participant observation of a cult that believed the world would end at midnight on 21 December 1954. The members believed that they alone would survive and start a new civilization, and that a spacecraft was coming to pick them up earlier that night. When the spacecraft did not come and the world continued, the cult members came to believe that the world had been saved through their prayers.
