Thursday, July 9, 2020

Analysing data with maps


Using thinking maps to analyse your written data.

Graham Chisnell

In developing a culture of research at Veritas MAT, I have been thinking about how we can best support our staff team in analysing their qualitative data by using visual representations. Adapting the use of Thinking Maps we use with pupils, we can help the researcher to filter their thoughts as they analyse their findings.

Mind Maps were popularised by Tony Buzan, who developed them as a way to categorise learning. Here is an example of the chap himself inside a mind map.

Thinking Maps were developed by David Hyerle and Yeager in 2007 for use in schools to help students reason and recall. They comprise a range of map templates that help students to develop their thinking. Here are some examples from Lyndsay Pryslak.


The first strategy to use in analysing your data is the constant comparative method. This is used when you have collected evidence in words and don’t want to convert the data to numbers through further analysis. The process involves reading through the evidence collected and comparing similarities, differences and anomalies to spot patterns in the words. The researcher is encouraged to read their data over and over, comparing and contrasting, allowing the information to come together into patterns they can analyse.

Analysis of words in your evidence base


Gary Thomas once again provides a helpful checklist for the researcher analysing written evidence. I have simplified it into these steps:
    
1. Read through your written data.

2. Mark up your data, highlighting points of interest and similarities across the data, and make notes about parts you find relevant or interesting.

3. Draw initial thoughts together from your first read-through, linked to your research question.

4. Use your initial thoughts or emerging findings to re-read the data, seeking further patterns.

5. Make notes of key patterns and helpful quotes from the data that can be used to answer your research question.

6. Draw out the key themes from your analysis, map these out and illustrate them with the quotes you have highlighted.
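For staff who like to tinker, the marking-up and theme-drawing steps above can even be sketched in code. This is only a toy illustration of steps 2 to 6, not part of Gary Thomas’s checklist; the excerpts and theme keywords below are invented for illustration.

```python
from collections import defaultdict

# Invented example data: short excerpts from a researcher's interview notes
excerpts = [
    "Pupils said they felt more confident after the weekly quiz.",
    "One pupil found the quiz stressful at first.",
    "Staff noticed confidence growing across the term.",
    "Parents reported pupils talking about history at home.",
]

# Initial codes from the first read-through (step 3), chosen by the researcher
codes = {
    "confidence": ["confident", "confidence"],
    "anxiety": ["stressful", "worried"],
    "home engagement": ["home", "parents"],
}

# Steps 4-5: re-read the data, collecting quotes under each emerging theme
themes = defaultdict(list)
for excerpt in excerpts:
    for theme, keywords in codes.items():
        if any(word in excerpt.lower() for word in keywords):
            themes[theme].append(excerpt)

# Step 6: map out the key themes with their supporting quotes
for theme, quotes in sorted(themes.items()):
    print(f"{theme}: {len(quotes)} supporting quote(s)")
```

In real qualitative analysis the codes emerge from repeated reading rather than a keyword list; the sketch simply shows how marked-up quotes accumulate under themes.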

Mapping out your findings


Creating a visual map of your analysis of written evidence can help your researcher to visualise the patterns in their analysis. There are a few ways this can be done; let’s look at them.

Concept maps


I first came across thinking maps when lecturing in primary education at Canterbury Christ Church University in the late 1990s. My students were grappling with how to make effective notes and I used the mind maps of Tony Buzan (Buzan, The Mind Map Book, 2009) to help them to connect their thinking during lectures. This process involved recording key words, images or ideas and making connections through link arrows to show how these ideas were interrelated.

David Hyerle (Hyerle, 2011) took Buzan’s idea of the concept map and applied it to structure a range of thinking maps. These maps can also be really helpful when analysing written data. Here are a few.


Flow map


The flow map helps the researcher to define cause and effect within their research. The researcher will have reviewed the raw data and, if there is a clear sequence of consequences in the data, this can be recorded in a flow map. A teacher researching the impact of a new retrieval practice initiative in history could record their analysis using this template. The teacher’s research question is ‘Does retrieval practice affect pupils’ confidence in history?’ Each new action leads to a new observation in the study over the course of three terms. Each term the teacher introduces a new element of the strategy and reflects on the impact this has on the pupils’ perceived confidence. The map produced is shown here.



Double Bubble Map


A double bubble map allows the researcher to compare and contrast two things. For example, a researcher may be pursuing the research question, ‘How does teaching style affect student motivation in history?’. Their observations have been recorded in narrative form and the researcher has also held discussions with staff and pupils using the pyramid method. In reading through the narratives, the researcher uses the double bubble map to map out the similarities and differences between students’ motivation under a didactic, textbook-based approach to teaching and under a discussion-based approach. The double bubble could look like this:



The researcher has used this double bubble to start to focus on the key similarities and differences they have noted in the written data. With this clarity, the researcher is now in a position to draw together their analysis of their key findings.
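For the more technically minded, the compare-and-contrast at the heart of a double bubble can be sketched with simple set operations: the centre of the bubble is an intersection, the outer bubbles are differences. The coded observations below are invented for illustration, not drawn from a real study.

```python
# Invented codes drawn from narrative observations of the two approaches
didactic = {"quiet classroom", "note taking", "low questioning", "clear structure"}
discussion = {"high questioning", "peer talk", "clear structure", "off-task chatter"}

# Centre of the double bubble: what both approaches share
similarities = didactic & discussion

# Outer bubbles: what is distinctive to each approach
only_didactic = didactic - discussion
only_discussion = discussion - didactic

print(sorted(similarities))
print(sorted(only_didactic))
print(sorted(only_discussion))
```

The same intersection-and-difference thinking applies however the researcher records their codes, on paper or otherwise.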

Tree map


The purpose of the Tree Map is to help the researcher classify their findings into key branches. It does not help the researcher consider the interconnectivity of their findings, but it does clarify the groupings or common threads within them. An example would be the research question, ‘What are the most effective strategies to encourage reading at home for my reluctant readers?’. The researcher may have used The Pyramid as a strategy to speak with students, reached hidden voices by sharing it with parents too, surveyed the pupils in their class with a questionnaire, and held discussions with colleagues across schools to review thoughts, feelings and actions further. Reading through the evidence base then led to the following summary in a Tree Map.



The researcher can then unpack the common themes drawn from the evidence base. This researcher went on to produce a second Tree Map outlining the main themes from the parents’ responses. This allowed the researcher to compare and contrast the two viewpoints in their analysis. The invaluable insight into how both the parents and the pupils of reluctant readers responded led to a strengthening of practice to support this group of pupils, increasing their confidence and motivation to read.

And finally...


When analysing your qualitative data, you can use these maps to crystallise your thinking and to present your key findings, seeking out similarities, differences and points of interest in your evidence base.


Thursday, June 25, 2020

Slow Leadership - the art of thinking slow when making important decisions




Working in the often frenetic environment of a school means senior leaders are frequently faced with the task of making swift and decisive decisions.  When out of school, I often play 'email Top Trumps' with colleagues to see who has the highest number of unanswered emails. My average email count is in the high sixties after an afternoon away from my inbox, and I am often trumped by colleagues with in excess of one hundred unanswered emails.  When walking the corridors in school, I am often met with several demands for decisions or questions about unresolved issues that demand a timely and decisive response.  As school leaders face this myriad of questions and problems, it is tempting to fire out solutions to complex issues or make swift decisions on developmental practice without asking deep and searching questions about the issues at hand.  It is therefore our challenge as school leaders to slow down and develop the art of thinking slow when making important decisions.

I came across the concept of 'slow thinking' in a book by Daniel Kahneman called Thinking, Fast and Slow. The book explores the cognitive bias of the 'heuristic', namely providing a simple and often imperfect answer to a difficult question. He explores two systems of thinking: one that is described as 'lazy thinking', open to influence and emotion, and one that is conscious of the heuristic influences at work.

Kahneman explains that the lazy thinker will answer a complex question by presuming they know everything there is to know about the problem.  This is the concept of 'WYSIATI':

W  hat
Y  ou
S  ee
I   s
A  ll
T  here
I   s


Slow thinking - resolving a parental complaint


I have used Kahneman's WYSIATI concept when approaching complex problems in school.  It is easy to be drawn in by a passionate and heartfelt problem explained by a staff member or parent and to presume that this is the only information you require in order to resolve the issue.  For example, a parent may raise a concern about bullying in the classroom.  Taken at face value, and thinking fast, the issue can be resolved by speaking with the bullies and sanctioning them with a stern warning to abate any future bullying.  Oh, but if it were always so simple!  Let us now presume WYSIATI is not the complete picture.  With WYSIATI in mind, we then speak to the other children involved and to impartial witnesses in class.  At this point we ascertain that there is no perceived intent to bully and the allegation appears spurious.  Once again, we could leave the issue here as resolved, but once more WYSIATI may have clouded our understanding and made any resolution at this point trivial.  When we dig further, we speak with the child again and discover that one of the children she was accusing was absent on the day of the alleged bullying; at this point the child alleged to have been bullied breaks down and admits she is concerned about an incident at home relating to online bullying from a penpal in the USA on a social media site, who has threatened to fly over to the UK and 'beat her up' (yes, this actually happened).  The girl in question was not able to admit to her mother why she was crying, as she was not allowed on the social media site, and fabricated a bullying incident to explain her tears.  By using the simple presumption that there may be more information about a problem than you first presume, the understanding of the problem and the consequential action become far deeper.

Kahneman teaches us that a presumptuous confidence that you have a deeper understanding of a situation than those around you can restrict thinking and therefore lead to poor decision making.  Applying this concept to decisions made about curriculum, assessment, teaching and learning opens out a research-based approach to school improvement and strategic planning.

Slow thinking - strategic development


The concept of slow thinking can also be applied to strategic development.  Our Senior Team applied slow thinking to the introduction of our system of assessment without levels.  Our initial thoughts, now over two years ago, were to stick with levels to assess progress and attainment, as we understood them and had invested much time in refining our teachers' understanding of the levels.  David Didau has written a super book entitled What if everything you knew about education was wrong? in which he explores a range of heuristics that lead us to make judgements as mental shortcuts, echoing Daniel Kahneman's research on cognitive bias.  Didau describes the 'sunk cost fallacy': this occurs when you continue with an action or decision because you have invested significant time or money in it, regardless of whether it is the right action or decision.  This was a concept our Senior Team grappled with deeply in moving to assessment beyond levels, as our time investment in the current assessment system was significant and we found it very hard to relinquish.  Understanding that we needed to move to a different assessment system, while not jumping at an easy solution, allowed our Senior Team to ask deep and searching questions about the best route ahead for our children, staff, governors and parents.  As a result of deep thinking, the school has adopted an assessment system that has integrity and purpose, with a process, carefully managed by the Senior Team (@primaryreflect, @MisterHackett, @AnneMarieMiddle and @KS1Rocks), that has given ownership to all.  Slow thinking allowed us to work around the majority of pitfalls before we tumbled in.

David Didau @learningspy also shares a very useful checklist in his book, devised by a surgeon, Atul Gawande, who wrote questions to slow thinking in junior doctors facing the immense pressure of making quick but at times flawed decisions on the A&E ward.  This checklist is worth a read as it acts as a check on your cognitive bias, allowing you and your team to confirm that you haven't fallen into Kahneman's WYSIATI trap.


Go forth and think slow


Thinking slow has deepened our Senior Team's ability to understand complex problems and provide solutions that have a deep and lasting impact on the quality of provision at Warden House.  The concept of slow thinking can also be applied to the classroom environment, as students face complex problems and engage in lazy thinking because of their cognitive biases. It can also underpin a culture of research-based learning for staff, leading to deep changes in practice.

I challenge you to have a go at slow thinking and build in time to ask yourself whether what you see really is all there is.  I would love to know how you get on.


References:

Didau, D. (2015) What if everything you knew about education was wrong? Crown House Publishing
Kahneman, D. (2012) Thinking, Fast and Slow. Penguin Books





Evidence Informed Education - The affirming power of research





Staff in a busy school can appear to race across the landscape like a startled herd of gazelles, darting every which way as the educational terrain shifts: swiftly changing direction as the needs of their children change, as the leadership of the school evolves, as each new framework from Ofsted is introduced, as each new curriculum test poses fresh challenges for our most vulnerable pupils, as the government introduces another white paper; I could go on.

Evidence based practice allows us as professionals to stop for a moment, to look around at the landscape beneath us, to enjoy its beauty, to learn from those around us and to deepen our understanding of what truly works in education. Is it possible in the midst of such educational flux for evidence based practice to give teachers and schools a renewed authority to create their own destiny and to provide an individualised environment in which teachers and pupils thrive?
 

Creating space for research


The key to engaging school staff in evidence based practice is to ensure there is a time when the noise of everyday life in school is stilled.  While we continue with our myriad of roles and responsibilities in school, our minds are filled with stuff to do.  Evidence based practice forms when we still this noise and spend time reading and researching, take time to reflect deeply on our own practice, ask challenging questions and engage in deep observations that allow us to reflect on what truly works in education. As a school leader, I am also acutely aware that I need to make time for myself, as well as my staff, to be reflective in order for research to take place.

After reflecting on the impact of the year's staff training, I looked at the impact of the training days in particular.  Although the days provided valuable time for staff to be together to discuss practice and learn key skills, their impact appeared more beneficial for specific groups than for the whole staff. In order to create space for my staff, I had to take something away; I therefore gave over three training days, equating to fifteen hours of research time for each teacher across the school. The days were placed at the end of the school year, but the teachers were charged with accruing fifteen hours of research across the year to equate to the three training days given.  The payback was to ask each teacher to publish or present their findings formally through a research paper, leading a staff meeting, writing a blog or presenting at a Teach Meet.

Formal Research Networks

Graham receiving the
Evidence Based Leadership Award

In order to provide rigour to our evidence based practice, we joined the South East Region Cambridge Primary Review Trust as a partner school.  The CPRT, led by Vanessa Young from Canterbury Christ Church University, gathers together research-active schools to share practice and link with other schools and research bodies nationally. This group provided our school with a powerful model of research based on the key priorities of the CPRT.

As a group of eleven primary schools in our collaboration, we developed a more formal understanding of research based practice with Canterbury Christ Church University.  Teachers from the schools met across the year to learn about research methodology and were given an opportunity to put the methodology into practice in an action research project.  The outcomes of the research projects were then published by the university and a celebration event was held to share them. The research empowered staff to deepen their pedagogical understanding and share their new learning with colleagues.


Using Appraisal to Develop a Culture of Evidence Based Practice

Appraisal offers a powerful tool with which to target evidence based practice.  We trained senior staff as Mentor-Coaches and used the principles of Mentor-Coaching and appreciative enquiry to allow teachers to devise a research based target that would develop their practice and enhance pupils' learning. Appraisal discussions led to a range of exciting and meaningful targets that encouraged teachers to develop their research based practice.  Evidence based targets were varied and included research on the impact of parents on early reading, the use of Google Docs to enhance learning and the impact of Twitter on professional development. Some amazing blogs have been produced by both teaching and non-teaching staff across the school, and many have been re-tweeted by +ukedchat. Blogs have included ones by our stunning staff, including @primaryreflect, @KS1Rocks, @AlisonMoon, @MisterHackett and @Annemariemiddle.

 The Learning Ticket

I gave each teacher a 'Learning Ticket'; each ticket had a cash value of £150 and was to be spent on their research based appraisal target.  In addition to the Learning Ticket, three Research Bursaries were made available for teachers to bid for.  Each Research Bursary had a cash value of £500 and teachers could bid collectively for these to enhance their research.  One teacher bid for a research bursary to research the impact of Lego in story writing, while another undertook an international research project into the teaching of phonics in the USA, Japan and Finland.

Blue Sky

We adopted a digital appraisal and CPD tracking system called Blue Sky. This system allowed appraisers to input appraisal targets, link them to the school's key priorities and track each staff member's appraisal and training activity.  Once trained, staff were able to upload appraisal evidence, record CPD courses and their impact, and upload relevant evidence linked to their appraisal targets.  The programme also allowed staff to track their research time, while their reviewer was able to give a gentle nudge to staff who had been less than active over a period of time.  As a result, appraisal reviews became truly owned by each member of staff and there were no surprises at the end of the appraisal cycle, as there had been a regular conversation through Blue Sky between appraisee and appraiser throughout the year.

Teach Meets

With an evidence based appraisal target in place for each teacher and Blue Sky tracking progress towards the targets, teachers developed a variety of new practice based on the research undertaken.  We needed a forum to share this practice and celebrate success across the school and beyond.  We therefore used the Teach Meet model to provide a platform for staff to present their research outcomes.  A Teach Meet is a meeting of teachers to share their practice in short 'micro-presentations'.  Each presentation at our Teach Meets lasted no longer than seven minutes.  Our first Teach Meet focussed on 'Irresistible Writing' and the second on 'Irresistible Learning', sharing a range of practice across the school that resulted from the research based practice in appraisal. The Teach Meet has been an exciting and engaging way of celebrating the success of research based practice and sharing practice that makes a positive difference to children's learning. Teach Meets have also encouraged the sharing of practice across schools locally and beyond.

Where to now?

Working in a school and multi academy trust that has evidence based practice embedded in its pedagogy is a real thrill. Engaging with the CPRT has allowed our staff to deepen the rigour and effectiveness of their research and has led us as a school to develop a culture of evidence based practice that helps engage our staff, raise standards for our pupils and draw high quality applicants to our appointments. The CPRT has now ended its tenure as a research body and handed the flame on to the Chartered College of Teaching, which will act as custodian of all CPRT research.  We will now lead on with the principles that underpinned the CPRT as we move this research network forward with the CCoT.


If we are to create an exciting and engaging education system, we must continue to ask questions that encourage us to gently push boundaries and give us the conviction to create our own path into the horizon.  By providing our staff with the space to engage in evidence based research amid our exponentially busy life within school, the benefits to our school, our staff and our children are palpable.  It allows us to stop for a moment, look around and breathe before creating our path ahead. Enjoy the journey!






Cognitive bias in school-based research



In developing a culture of research at Veritas MAT, I have been thinking about how we can best support our staff team in spotting bias in their own research and in the research of others.



Cognitive bias
Heuristic – providing a simple and often imperfect answer to a difficult question (Kahneman, 2011)

I love the work of Daniel Kahneman and, if you haven’t read his book Thinking, Fast and Slow, it is a worthy read for anyone engaging in reviewing literature or indeed working in school leadership. Here is a link to a previous blog I wrote on slow leadership.

Research is riddled with heuristics. While the Cambridge English Dictionary defines heuristic as a method of learning or solving problems that allows people to discover things for themselves, there is a wider caution around heuristics in school-based research. The heuristic is described by Kahneman as providing ‘a simple and often imperfect answer to a complicated question’. An example of this would be to respond to the question posed by a work colleague, ‘how are you?’, with the standard and benign response ‘Fine.’. If we are to use some slow thinking, we would delve further and ask a question that would proffer a fuller response. We could encourage the respondent to dig deeper by asking, ‘are you equally happy about all aspects of your work life?’. This may take the responder beyond their initial heuristic of ‘Fine’; they may then be prompted to speak more widely about their job role, their relationship with their team leader, their frustrations about appraisal, their joy of teaching music to a group of pupils, and so on. Challenging our school-based researchers to go beyond the heuristic is key to broadening their understanding of their field of study.


In his book Thinking, Fast and Slow, Kahneman talks of lazy thinking. His book is truly worthy of a read for any researcher considering the effects of bias on their own research and the research of others (and genuinely, I’m not making money on sales of his book). He outlines two systems of thinking. The first system depends on heuristics. This lazy thinking allows someone to answer a complex question swiftly without engaging in deep thought.  It is in this system of thinking that our cognitive biases run riot and we come to decisions based on our current thinking, or little thought at all, just like the respondent above who answered ‘fine’ when asked ‘how are you?’. In becoming increasingly aware of cognitive bias, and in particular in recognising bias in themselves and in others, your researcher will become an increasingly deep thinker.  They will move their thinking from lazy system 1 thinking into deep system 2 thought, where the multifaceted threads of their field of study will start to present themselves.

I share these cognitive biases with my early career teacher research group to help them develop an awareness of their own bias and to spot bias with increasing accuracy in the research they encounter. Knowledge of cognitive bias is also invaluable for a school leader, as a shared understanding of cognitive bias in a senior team opens out the breadth of discussion you can have with one another.  I will now draw on some of the cognitive biases outlined in Kahneman’s book.

Availability Bias – WYSIATI

Availability bias is rooted in our natural response to answering a difficult question. It operates on the assumption that we already know all there is to know on the topic being discussed. Put in simple terms by Kahneman, the acronym WYSIATI represents ‘What You See Is All There Is’. This sums up the availability bias in us all: we often assume that we know all there is to be known on a given subject when forming our response.

An example of availability bias operating in a school can be seen in this scenario. A senior teacher, observing the behaviours in a teacher’s class, states, ‘Jane’s teaching seems to be deteriorating – she has a tough class – she clearly isn’t coping well with the pupils.’. Jane’s team leader is trapped by some lazy thinking, and their availability bias is blocking them from thinking more deeply about what is not known. When questioned, Jane’s work colleague states, ‘Jane has changed her teaching strategies recently and is off curriculum as she needs to plug concept gaps for the students, who have now entered Year 10 after a Year 9 where six supply teachers led to poor progression.’ Further to this, when Jane is questioned about the issue, the leader finds that Jane has noted the students have poor collaborative skills, so has mixed up previously established groupings to help build resilience in the students prior to their accelerated programme for GCSE. While this has unsettled the students in the short term, causing low level disruption, it is a strategy to support progress in the long term.

As you can see, under the assumption that what you know is all there is, the understanding of Jane’s rationale for her classroom management would go unseen. By working through this cognitive bias, the senior leader is in a far better position to support Jane to get the very best from her students.  This also works in the field of research, particularly when forming the research question and reviewing the evidence relating to your researcher’s field of study. We need our researchers to be aware of availability bias both in themselves and in the research and ideas presented by others. We do this by asking: is what you see all there is?


Confirmation Bias

Confirmation bias is where we seek out only that which confirms what we already know or think. It is seen in the associations we unconsciously make. For example, if you buy a new car, suddenly the road is full of drivers in that same car. Our brains are hard-wired to recognise and make sense of patterns and connections. Because of this, we often subconsciously ignore anything that does not support our view of the world. Confirmation bias is important to recognise in the research of others as well as in our own research and our professional lives.

So, how does confirmation bias appear in schools and in research? Let us consider the following statement, ‘Mrs Jones gets great results in her maths GCSE by grouping students by gender’. Being aware of confirmation bias, what are the assumptions in the statement? One assumption would be that gender segregation has resulted in the great results. A second assumption could be that all students in Mrs Jones' class achieve these great results.  A third assumption may link to the quality of teaching delivered by Mrs Jones, who may have a range of interpersonal skills that lead to high engagement for all her pupils. I could go on.  Without thinking more deeply about the wider factors, the researcher could use this statement to simply confirm what they already think about gender grouping, asserting that gender segregation in the teaching of maths is the key factor for the results achieved by students.

Sunk cost fallacy

Sunk cost fallacy describes a bias where you continue blindly doing something because you have invested time or money in it. You see this in someone who goes to the theatre to see a play and finds it to be the most tedious experience ever. They reach the interval but head back to their seat afterwards because they have paid for the ticket; although they are not enjoying the experience, they continue because they don’t want to waste the money spent. Taking this bias into the school environment, we may hear a school leader say, when questioned by their staff on the value of the digital assessment system in place, ‘we have been using this assessment system for five years. We have invested time and money in the system and it produces super graphs of students’ progress.’.

Then a keen member of the senior team states, ‘Is there a better solution to manage the time our staff are spending entering data into the system?’. With sunk cost fallacy engrained unconsciously in the school leader’s mind, they respond by saying,  ‘We have been using this assessment system for five years. We have invested time and money in the system and it produces super graphs of students’ progress.’. 

‘But,’ retorts the keen teacher, in a desperate attempt for the school leader to see the folly of their thinking, ‘The current assessments on the system are no longer in line with the new exam board’s expectations.’. The senior leader then responds (yes you’ve guessed it) ‘Seriously, do you know how much time it took us to train our teachers to use this system and what the set up costs were? We have been using this assessment system for five years. We have invested time and money in the system and it produces super graphs of students’ progress.’. The school leader is resolute, even though the wider evidence suggests the current system is no longer fit for purpose.

Sunk cost fallacy is an unconscious bias in us all and you will find it in research articles, books and rooted in the practice of colleagues. You need to alert your researcher to listen for evidence of sunk cost fallacy, as it is often used as a key argument for opposing organisational change in schools. As such, research can sometimes be wrongly used to validate the status quo and obstruct the evolution of new practice within our schools.

Group Bias – if all around you agree, it must be true.


Group bias is an unconscious bias that relies on the social convention that if all around you agree, it must be true. An example of group bias is seen in the story of the Emperor’s New Clothes, where the Emperor is convinced by his aides that his new robes, while invisible to him, are visibly the finest and most luxurious stately robes to all around him. As his subjects were all too scared to admit that the Emperor was in fact naked, the Emperor himself believed what was blatantly untrue. Group bias is a powerful and convincing bias and has caused atrocities to take place across the world as groups of people and nations hold biased, yet unfounded, viewpoints.

In the field of research, group bias can be seen where people with similar viewpoints gather. This is often the case on social media, where groups of like-minded or like-opinionated people meet, forming a bubble of understanding. To dispel this bias, a lone voice in the group needs to ask a challenging question: ‘But what would others who disagree with our perspective say, and on what information would their opinions be founded?’ This is an important bias for your researcher to challenge when gathering evidence relating to their research question.


Overconfidence Bias

‘In the modern world the stupid are certain while the intelligent are full of doubt.’ Bertrand Russell
Find someone who holds an absolute view on an issue and you are likely to find overconfidence bias. It is linked to the Dunning-Kruger effect: those with the least competence are often the least able to recognise their own incompetence (Kahneman, 2011). It is overconfidence bias that can lead a selection panel to appoint the most enthusiastic candidate, only to find they are not capable of fulfilling the roles and responsibilities of the job. To counter overconfidence bias, we need to help our researchers understand the simple principle that the more we know, the more we know we don’t know, and to watch out for overconfidence in the evidence they use to deepen their understanding of their field of study.

Anchoring effect

Your researcher also needs to be alert to the dangers of being blinded by big numbers. The anchoring effect (Kahneman, 2011) is a cognitive bias that attaches false validity to research outcomes simply because a number appears significant. Remember the importance of the p value and Cohen’s d in academic research, which quantify the statistical significance and the effect size of a finding. An example of the anchoring effect is seen when shopping: you spot a pair of shoes for £120, then spot a similar pair for £60 and immediately feel you have found a bargain. Using your initial reference point of £120, you assume this is all the information you need and conclude that £60 is a great price to pay. However, there is a danger in anchoring your conclusion to the first or a limited piece of information, as the shoes at £60 may also be over-priced; a wider search of other shops would give a much broader picture of their value.
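For readers who want to see what an effect size actually measures, here is a minimal sketch in Python that computes Cohen’s d, the standardised difference between two group means. The test scores below are invented purely for illustration; they are not drawn from any real study.

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: the difference between two group means divided by the
    pooled standard deviation. By convention, d of roughly 0.2 is a small
    effect, 0.5 a medium effect and 0.8 a large effect."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a = sum(group_a) / n_a
    mean_b = sum(group_b) / n_b
    # Sample variances (dividing by n - 1)
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
    # Pooled standard deviation across both groups
    pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical test scores for an intervention group and a control group
intervention = [68, 72, 75, 70, 74, 71]
control = [65, 66, 70, 64, 68, 67]
print(round(cohens_d(intervention, control), 2))
```

The point for the researcher is that a headline number ('scores rose by five marks!') means little on its own; the effect size puts the difference in context by scaling it against how spread out the scores already were.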

Another example of the anchoring effect is seen in the staff room when introducing a new school initiative with colleagues. It is often the voice of caution or dissent that speaks out first in a discussion, usually starting with the word ‘but…’. The anchoring effect then comes into play, and it is very hard to argue against the first comment made, particularly if it is a negative comment presented with conviction. The negative comment becomes the ‘anchor’ for all other thinking on the issue, hindering the conversation and the possibility of a deeper discussion that considers all available perspectives. Your researcher needs to be alert to the anchoring effect, as they may come across a convincing and compelling argument from a member of staff when investigating their field of study. If they are convinced by the first strong opinion they meet, they may not deepen their understanding of the wider perspectives of others.



Halo Effect

The halo effect is similar in some ways to overconfidence bias. It is an unconscious bias by which we are drawn to believe someone who is deeply enthusiastic about their viewpoint or has a track record of performing well. Teachers, for example, are more likely to grade a student’s essay favourably if the student has a track record of writing well. David Didau puts it this way when writing about the achievement of boys in English exams: ‘Do we expect girls to be more compliant and achieve better than boys in school? Are boys and girls treated differently in school and wider society? We expect girls to be made of sugar and spice and all things nice while boys are unwashed louts. Might we be making it easier for girls to achieve in schools because of the expectation we have of them?’ (Didau, 2015).

Be on your guard

As we strengthen our researchers’ capacity to review evidence relating to their research question, we need to draw their attention to the powerful influence of unconscious cognitive bias. In learning more about bias in others’ work, they become increasingly adept at spotting bias in themselves and, as such, strengthen their objectivity when engaging in their own research.

References:
Didau, D. (2015) What If Everything You Knew About Education Was Wrong? Crown House Publishing.
Kahneman, D. (2011) Thinking, Fast and Slow. Penguin.