Cognitive bias in school-based research
In developing a culture of research at Veritas MAT, I have been thinking about how we can best support our staff team in spotting bias in their own research and in the research of others.
Cognitive bias
Heuristic – providing a simple and often imperfect answer to a difficult question (Kahneman, 2011)
I love the work of Daniel Kahneman, and if you haven’t read his book, Thinking, Fast and Slow, it is a worthy read for anyone reviewing literature or indeed working in school leadership. Here is a link to a previous blog I wrote on slow leadership.
Research is riddled with heuristics. The Cambridge English Dictionary defines heuristic as a method of learning or solving problems that allows people to discover things for themselves, but there is a wider caution about heuristics in school-based research. Kahneman describes the heuristic as providing ‘a simple and often imperfect answer to a complicated question’. An example would be responding to a colleague’s question, ‘How are you?’, with the standard and benign answer, ‘Fine.’ If we are to use some slow thinking, we would delve further and ask a question that would prompt a fuller response. We could encourage the respondent to dig deeper by asking, ‘Are you equally happy about all aspects of your work life?’ This may push the responder beyond their initial heuristic of ‘Fine’, and they may then be prompted to speak more widely about their job role, their relationship with their team leader, their frustrations about appraisal, their joy of teaching music to a group of pupils, and so on. Challenging our school-based researchers to go beyond the heuristic is key to broadening their understanding of their field of study.
In Thinking, Fast and Slow, Kahneman talks of lazy thinking. His book is truly worth a read for any researcher considering the effects of bias on their own research and the research of others (and genuinely, I’m not making money on sales of his book). He outlines two systems of thinking. The first system depends on heuristics. This lazy thinking allows someone to answer a complex question swiftly without engaging in deep thought. It is in this system of thinking that our cognitive bias runs riot and we come to decisions based on our current thinking, or on little thought at all, just like the respondent above who answered ‘Fine’ when asked ‘How are you?’ In becoming increasingly aware of cognitive bias, and in particular in recognising bias in your own thinking and in that of others, your researcher will become an increasingly deep thinker. They will move from lazy System 1 thinking into deep System 2 thought, where the multifaceted threads of their field of study will start to present themselves.
I share these cognitive biases with my early career teacher research group to help them develop an awareness of their own bias and to spot bias with increasing accuracy in the research they encounter. Knowledge of cognitive bias is also invaluable for a school leader, as a shared understanding of cognitive bias in a senior team opens out the breadth of discussion you can have with one another. I will now draw on some of the cognitive biases outlined in Kahneman’s book.
Availability Bias
Availability bias is rooted in our natural response to answering a difficult question. It operates on the assumption that we already know all there is to know on the topic being discussed. Kahneman puts it in simple terms with the acronym WYSIATI: ‘What You See Is All There Is’. This sums up the availability bias in us all: we often assume, when forming a response, that we know all there is to be known on a given subject.
An example of availability bias operating in a school can be seen in this scenario. One senior teacher, observing the behaviours in a teacher’s class, states, ‘Jane’s teaching seems to be deteriorating – she has a tough class – she clearly isn’t coping well with the pupils.’ Jane’s team leader is trapped by some lazy thinking, and their availability bias is blocking them from thinking more deeply about what is not known. When questioned, Jane’s work colleague states, ‘Jane has changed her teaching strategies recently and is off curriculum as she needs to plug concept gaps for the students, who have now entered Year 10 after a Year 9 in which six supply teachers led to poor progression.’ Further to this, when Jane herself is questioned about the issue, the leader finds that Jane has noted the students have poor collaborative skills, so she has mixed up previously established groupings to help build resilience in the students prior to their accelerated programme for GCSE. While this has unsettled the students in the short term, causing low-level disruption, it is a strategy to support progress in the long term.
As you can see, under the assumption that what you know is all there is, Jane’s rationale for her classroom management would go unseen. By working through this cognitive bias, the senior leader is in a far better position to support Jane to get the very best from her students. This also works in the field of research, in particular when forming the research question and reviewing the evidence relating to your researcher’s field of study. We need our researchers to be aware of availability bias both in themselves as researchers and in the research and ideas presented by others. We do this by asking: is what you see really all there is?
Confirmation Bias
Confirmation bias is where we seek out only that which confirms what we already know or think. It is seen in the associations we unconsciously make: if you buy a new car, suddenly the road seems full of drivers in that same car. Our brains are hard-wired to recognise and make sense of patterns and connections, and because of this we often subconsciously ignore anything that does not support our view of the world. Confirmation bias is important to recognise in the research of others as well as in our own research and our professional lives.
So, how does confirmation bias appear in schools and in research? Consider the following statement: ‘Mrs Jones gets great results in her maths GCSE by grouping students by gender.’ Being aware of confirmation bias, what are the assumptions in the statement? One assumption is that gender segregation has produced the great results. A second could be that all students in Mrs Jones’ class achieve these great results. A third may relate to the quality of teaching delivered by Mrs Jones, who may have a range of interpersonal skills that lead to high engagement for all her pupils. I could go on. Without thinking more deeply about these wider factors, a researcher could use this statement simply to confirm what they already think about gender grouping, asserting that gender segregation in the teaching of maths is the key factor in the results achieved by the students, as the sketch below tries to show.
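To make the point practical, here is a minimal sketch in Python of the kind of quick check a researcher might run before accepting such a statement at face value. The grades, the class and the pass threshold are all hypothetical; nothing here comes from a real dataset.

```python
# A minimal sketch with invented data: before accepting "Mrs Jones gets great
# results by grouping students by gender", ask what the data actually supports.
from statistics import mean

# Hypothetical GCSE grades (9-1 scale) for one class
grades = [9, 9, 8, 8, 7, 7, 6, 4, 3, 3]

average = mean(grades)
below_pass = [g for g in grades if g < 4]  # grade 4 treated here as a standard pass

print(f"Mean grade: {average:.1f}")
print(f"Students below a standard pass: {len(below_pass)} of {len(grades)}")

# A strong mean can coexist with students who are not achieving "great results",
# and nothing in this summary isolates gender grouping from teaching quality,
# prior attainment or any other factor.
```

The point of the sketch is not the arithmetic but the habit: look at the spread as well as the headline figure, and ask which of the statement’s assumptions the data can actually speak to.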
Sunk Cost Fallacy
Sunk cost fallacy describes a bias where you continue blindly doing something because you have invested time or money in it. You see this in someone who goes to the theatre to see a play and finds it to be the most tedious experience ever. They reach the interval but head back to their seat afterwards because they have paid for the ticket and, although they are not enjoying the experience, they continue because they don’t want to waste the money spent. Taking this bias into the school environment, we may hear a school leader say, when questioned by their staff on the value of the digital assessment system in place, ‘We have been using this assessment system for five years. We have invested time and money in the system and it produces super graphs of students’ progress.’
Then a keen member of the senior team asks, ‘Is there a better solution to manage the time our staff are spending entering data into the system?’ With sunk cost fallacy engrained unconsciously in the school leader’s mind, they respond, ‘We have been using this assessment system for five years. We have invested time and money in the system and it produces super graphs of students’ progress.’
‘But,’ retorts the keen teacher, in a desperate attempt to get the school leader to see the folly of their thinking, ‘the current assessments on the system are no longer in line with the new exam board’s expectations.’ The senior leader then responds (yes, you’ve guessed it), ‘Seriously, do you know how much time it took us to train our teachers to use this system and what the set-up costs were? We have been using this assessment system for five years. We have invested time and money in the system and it produces super graphs of students’ progress.’ The school leader is resolute, even though the wider evidence suggests the current system is no longer fit for purpose.
Sunk cost fallacy is an unconscious bias in us all, and you will find it in research articles, books and in the practice of colleagues. You need to alert your researcher to listen for evidence of sunk cost fallacy, as it is often used as a key argument for opposing organisational change in schools. As such, research can sometimes be wrongly used to validate the status quo and obstruct the evolution of new practice within our schools.
Group Bias – if all around you agree, it must be true.
Group bias is an unconscious bias that relies on the social convention that if all around you agree, it must be true. An example of group bias is seen in the story of the Emperor’s New Clothes, where the Emperor is convinced by his aides that, while his new robes are invisible to him, they are visibly the finest and most luxurious stately robes to all around him. As his subjects were all too scared to admit that the Emperor was in fact naked, the Emperor himself believed what was blatantly not true. Group bias is a powerful and convincing bias and has caused atrocities to take place across the world as groups of people and nations hold biased yet unfounded viewpoints.
In the field of research, group bias can be seen where groups of people with similar viewpoints gather. This is often the case on social media, where like-minded or like-opinionated people meet and form a bubble of understanding. In order to dispel this bias, a lone voice in the group needs to ask a challenging question: ‘But what would others who disagree with our perspective say, and on what information would their opinions be founded?’ This is an important bias for your researcher to challenge when gathering evidence relating to their research question.
Over Confidence Bias
‘In the modern world the stupid are certain while the intelligent are full of doubt.’ Bertrand Russell
Find someone who has an absolute view on an issue and therein you are likely to find over confidence bias. This is also linked to the Dunning-Kruger effect: that stupid people are too stupid to recognise their own stupidity (Kahneman, 2011). It is over confidence bias that can lead a selection panel to appoint the most enthusiastic candidate and then find they are not capable of fulfilling the roles and responsibilities of the job. To counter over confidence bias, we need to help our researchers understand the simple principle that the more we know, the more we know we don’t know, and to watch out for over confidence in the evidence they use to deepen their understanding of their field of study.
The Anchoring Effect
Your researcher also needs to be alert to the dangers of being blinded by big numbers. The anchoring effect (Kahneman, 2011) is a cognitive bias that attaches a false validity to research outcomes because a number appears significant. Remember the importance of the p value and Cohen’s d in academic research, which quantify the significance and the size of an effect. An example of the anchoring effect is seen when shopping: you spot a pair of shoes for £120, then spot a similar pair for £60, and you immediately feel you have found a bargain. In using your initial reference point of £120, you naturally assume that this is all the information you need and conclude that £60 is a great price to pay. However, there is a danger in anchoring your conclusion to the first or only piece of information you have, as it may well be that the shoes at £60 are also over-priced, and a wider search of other shops would give a much broader picture of their value.
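To show what looking past the headline number can involve, here is a minimal sketch in Python with invented scores for two groups. The data, the group names and the choice of a two-sample t-test are my assumptions for illustration, not part of the original example.

```python
# A minimal sketch with hypothetical data: rather than being anchored by a
# headline difference, report both the p value and the effect size (Cohen's d).
from statistics import mean, stdev
from scipy import stats

# Hypothetical test scores for two groups of students
group_a = [62, 58, 71, 65, 60, 68, 63, 59, 66, 64]
group_b = [57, 61, 55, 60, 58, 62, 56, 59, 63, 54]

# Two-sample t-test: how likely is a difference this large by chance alone?
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# Cohen's d: how big is the difference, measured in pooled standard deviations?
pooled_sd = (((len(group_a) - 1) * stdev(group_a) ** 2 +
              (len(group_b) - 1) * stdev(group_b) ** 2)
             / (len(group_a) + len(group_b) - 2)) ** 0.5
cohens_d = (mean(group_a) - mean(group_b)) / pooled_sd

print(f"p value: {p_value:.3f}, Cohen's d: {cohens_d:.2f}")
```

A big raw difference is the £120 shoes; the p value and effect size are the wider search of other shops.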
Another example of the anchoring effect is seen in the staff room when introducing a new school initiative to colleagues. It is often the voice of caution or dissent that speaks out first in a discussion, and the response usually starts with the word ‘but…’. The anchoring effect comes into play and it is then very hard to argue against the first comment made, particularly if it is a negative comment presented with conviction. The negative comment becomes the ‘anchor’ for all other thinking on the issue, and the conversation, along with the wider possibilities of a deeper discussion considering all the available perspectives, is hindered. Your researcher needs to be alert to the anchoring effect, as they may come across a convincing and compelling argument from a member of staff when investigating their field of study. If they are convinced by the first strong opinion they meet, they may not deepen their understanding of the wider perspectives of others.
The Halo Effect
The halo effect is similar in some ways to over confidence bias. It rests on an unconscious tendency to believe someone who is deeply enthusiastic about their viewpoint or has a track record of performing well. Teachers are more likely to grade a student’s essay favourably if the student has a track record of writing well. David Didau puts it this way when writing about the achievement of boys in English exams: ‘Do we expect girls to be more compliant and achieve better than boys in school? Are boys and girls treated differently in school and wider society? We expect girls to be made of sugar and spice and all things nice while boys are unwashed louts. Might we be making it easier for girls to achieve in schools because of the expectation we have of them?’ (Didau, 2015).
Be on your guard
As we strengthen our researchers’ capacity to review evidence relating to their research questions, we need to draw their attention to the powerful influence of unconscious cognitive bias. In learning more about bias in others’ work, they become increasingly adept at spotting it in themselves and, as such, strengthen their objectivity when engaging in their own research.
References:
Didau, D. (2015) What If Everything You Knew About Education Is Wrong? Crown House Publishing.
Kahneman, D. (2011) Thinking, Fast and Slow. Penguin.