What Is Implicit Bias, And How Might It Affect Teachers And Students? (Part II - Solutions)
This is the second in a series of three posts about implicit bias. Here are the first and third parts.
In my first post on this topic, I argued that teachers are better positioned than, say, doctors or judges, to learn specifics about the individuals they serve. This strategy – called “individuating” – has proven effective in reducing implicit biases (related to race, gender, ethnicity, etc.). This post first offers additional thoughts on how we might support teachers' orientation toward getting to know their students. Second, it discusses additional strategies that have proven effective in mitigating the effects of implicit biases.
A couple of weeks ago, a colleague asked a great question during the Shanker Institute’s Good Schools Seminar on "Creating Safe and Supportive Schools." His question was motivated by a presentation on implicit bias by Kirwan Institute director Sharon Davies. The question was: Wouldn’t you expect more conscious, systematic decision-making (and fewer automatic, snap judgments) from teachers who, after all, see their students every day and get to know them well? (See here, minute 50:55.)
As I related in the previous post, individuating (or learning about the particulars of a person, his/her interests, skills, family, etc.) can be a very effective "de-biasing" tool.* So, how might we leverage and support teachers' natural inclination to get to know students well? How might a potential de-biasing intervention build on this feature of teachers' work?
The reason I ask this question is that cognitive biases come in all shapes and sizes – stereotypes are just one source. For example, we tend to remember and pay more attention to information that confirms our preexisting beliefs – a.k.a. "confirmation bias." We also tend to give more weight to information that is presented to us earlier rather than later – a.k.a. the "primacy effect." Very important, too, is the "fundamental attribution error": the belief that, while our own actions can be explained by circumstances (i.e., I yelled at a colleague because I had a stressful day), others' behaviors are explained by their personalities and dispositions (i.e., he yelled at a colleague because he is a bully). The list goes on and on – for an overview of heuristics and biases, see Thinking and Deciding and Thinking, Fast and Slow. My point is that tools and strategies that can guide and scaffold the way we gather and weigh information are essential if we hope to arrive at decisions that are more objective and less biased – think of structured analytic techniques as "reins for the (often unruly) mind."
A teacher might be in a better position to get to know his/her students, but this process is still very complex; all kinds of biases, such as those mentioned above, could get in the way. For example, the primacy effect suggests that something a student does on the first day of class may have more influence on the teacher than subsequent student behavior. But if the teacher has a good way of documenting behavior throughout the year, then he/she might be less prone to remembering more vividly (and thus weighting disproportionately) what the student did or didn't do on that first day. So, the question becomes: What kinds of tools might help teachers collect, record, and evaluate information about their students in a way that is more systematic and less prone to error or bias?
This is not a trivial question and I don't have all the answers, but let me offer a couple of thoughts. I was curious about ClassDojo, an app designed to help teachers collect and share data on student behavior. It made me wonder whether teachers could use this or similar tools to collect information about students’ particular strengths, talents, and interests. I personally like apps because they can facilitate labor-intensive processes, such as gathering, aggregating, and sharing data, or even tasks that require additional expertise, such as data analysis. But, let's face it, low-tech approaches could be just as effective. For example, Getting To Know My Child is a (paper and pencil) mechanism that enables parents to share information with their child's kindergarten teacher, including questions about the child's background, health, abilities, preferences, and so forth.
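To make the idea of "systematic, year-long documentation" a bit more concrete, here is a minimal sketch of what such a tool might look like under the hood. This is purely illustrative – the class and field names are hypothetical and are not part of ClassDojo or any existing app. The point it illustrates is simple: when every dated observation is recorded and counted equally, a first-day impression carries no more weight than an April one.

```python
from dataclasses import dataclass, field
from datetime import date
from collections import Counter

@dataclass
class Observation:
    """One dated, tagged note about a student (e.g., a strength or interest)."""
    day: date
    tag: str   # e.g., "strength", "interest", "behavior"
    note: str

@dataclass
class StudentLog:
    """Collects observations across the whole year so no single day dominates."""
    name: str
    observations: list = field(default_factory=list)

    def add(self, day: date, tag: str, note: str) -> None:
        self.observations.append(Observation(day, tag, note))

    def summary(self) -> Counter:
        # Count observations by tag; an entry from September counts
        # exactly as much as one from May.
        return Counter(obs.tag for obs in self.observations)

# Hypothetical usage with made-up entries:
log = StudentLog("Student A")
log.add(date(2014, 9, 2), "behavior", "Restless during the first lesson")
log.add(date(2015, 2, 10), "strength", "Explained a math problem to a peer")
log.add(date(2015, 4, 20), "interest", "Brought in a book about insects")
print(log.summary())  # Counter({'behavior': 1, 'strength': 1, 'interest': 1})
```

Whether the tool is an app or a paper notebook matters less than the structure it imposes: a running record that can be reviewed as a whole, rather than a memory that privileges first impressions.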
Individuating is an important strategy because it sets the stage for a second, more complex but even more powerful type of intervention: Assigning competence to low status students, a strategy that takes advantage of the power of the teacher as an evaluator. "Assigning competence is a public statement that specifically recognizes the intellectual contribution a student has made to the group task" - more here. Students tend to believe and respect the evaluations that teachers make of them. Thus, if the teacher publicly commends a low status student for being strong on a particular (and real) ability, that student will tend to believe the evaluation. At the same time, the other students in the classroom are likely to accept the evaluation as valid. Once this happens, the expectations for the student’s competence – as well as his/her relative status in the classroom – can rise dramatically, which is likely to result in increased activity and influence of the low status student as well as increased success in future classroom tasks.
In Designing Groupwork (3rd edition, in press), Cohen and Lotan specify that an effective assignment of competence has three critical features: 1) evaluations must be public, 2) evaluations must be specific, referring to particular intellectual abilities and skills, and 3) the abilities/skills of the low status student must be made relevant to the rest of his/her classmates. Thus, learning about students' particular strengths and skills (as well as detecting when these abilities are being demonstrated) is a very important feature of effective status treatments.**
***
What additional strategies might help mitigate implicit biases? Experimental psychologist Patricia Devine has argued that biases are like "habits"; with effort and practice, they can be broken. According to Devine, three conditions must be met for individuals to successfully counteract their biases:
- Acknowledgement that we all harbor unconscious biases and motivation to change.
- Attention to when stereotypical responses or assumptions are activated.
- Time to practice strategies designed to break automatic associations.
Once these conditions are in place, Devine and her colleagues recommend practicing specific strategies, including:
- Stereotype replacement: recognizing when one is responding to a situation or person in a stereotypical fashion, and actively replacing the biased response with an unbiased one.
- Counter-stereotypic imagining: detecting one's stereotypical responses and visualizing examples of people – famous or personally known – who prove the stereotype inaccurate.
- Individuating: gathering specific information about a person, so that the particulars of that person replace generic notions based on group membership.
- Perspective taking: adopting the perspective of a member of a stigmatized group. This strategy can be useful in assessing the emotional impact on individuals who are often stereotyped in negative ways.
- Increasing opportunity for positive contact: actively seeking out situations that expose us to positive examples of stereotyped groups.
I am a big believer in changing contexts (not individuals) and making it easier for people to "do the right thing" or, in our case, engage in structured thinking and ditch the tendency to respond to other people and situations automatically. So, in addition to training and practicing strategies that can help break the habit, what features of the local context may support structured decision-making?
Accountability: "Holding individuals accountable for their decisions has helped to reduce bias in hiring and promotion decisions," argue Correll and Benard (see also here, here, here and here). "When managers know they will be required to justify their actions (particularly to an impartial authority), they tend to engage in more complex thought processes and fewer snap judgments (here and here)." A study by Foschi (1996) found that participants were less likely to hold women to a higher standard than men when they were required to explain their responses to a partner in a subsequent task. "Requiring those responsible for making decisions to explain those decisions to a disinterested third party helps preempt the introduction of bias into decision making."
Transparency: When criteria are objective and explicit, it is easier to ensure that everybody is held to the same standard. Researchers Uhlmann and Cohen (2005) found that listing job requirements immediately prior to selecting a candidate constrained opportunities to use subjective criteria during candidate selection. Subjective criteria allow bias to be hidden because the standards by which decisions are made are unclear.
Time: Allowing sufficient time to make decisions is another important contextual element. In a field study, Ayres et al. (2004) found that African-American cab drivers received lower tips than white drivers. The authors concluded that decisions made quickly, when one is preoccupied with other things, can result in unconscious discrimination.****
Diversity & Messaging: As mentioned earlier, intergroup contact is one of the best researched means of reducing both explicit (here and here) and implicit bias (here and here). Exposure to a male pre-K teacher, to a black school principal, or to a female mathematician can help challenge and expand our assumptions (conscious or unconscious) about who is good and bad at certain tasks – and even the nature of those jobs and the skills required to do them well. This is important because it helps us break free of the strong cultural associations that often place needless limitations on the aspirations and achievements of women and minorities. Because of its importance, I will address the role of diversity and messaging as "institutional-level de-biasing strategies" in my next post.
- Esther Quintero
*****
* I am not suggesting that teachers, simply because they are teachers, are immune to implicit biases – in fact, there is research suggesting that they are not (see the Kirwan Institute's recent review, pp. 30-35).
** There are other important preconditions that must be met for status treatments to work. For example, multiple-ability tasks are a "necessary condition for teachers to be able to convince their students that there are different ways to be 'smart.' Students who do not excel at paper-and-pencil tasks often do excel when academic content is presented in different ways. Tasks that require multiple abilities give teachers the opportunity to give credit to such students for their academic and intellectual accomplishments." More here.
*** See also, Rudman et al., 2001; Legault et al., 2011; Mann & Kawakami, 2012; Plant & Devine, 2001; Valian, 1999; Wilson & Brekke, 1994. While the motivation to be non-prejudiced may lead to reduced discrimination (Plant, Devine, & Peruche, 2010), thinking of oneself as non-prejudiced may increase discrimination – e.g., instructing people to assert that they are objective decision-makers prior to a hiring decision increases gender discrimination (Uhlmann & Cohen, 2007).
**** For more on the importance of time see Beattie, 2013; Bertrand, Chugh, & Mullainathan, 2005; Richards-Yellen, 2013.
Esther, thank you for continuing to write on this important topic and moving the focus towards actionable steps. Too often these topics remain stuck on the question, "do we have bias?" Now we're working on practical ways to overcome bias.
My first comment, on what teachers can do to reduce bias, concerns a critical point in the process: when they are introduced to a student on paper for the first time, often via a list of some sort. In my experience, the information that is passed along to the new teacher can, in and of itself, reinforce a negative bias toward a child. For example, as a special education teacher, I am often told information like the following BEFORE I'VE EVER MET THE CHILD: "The student is in the 1st percentile for reading and math. The student has had 20 behavioral referrals in the past year and is now off his medications. The parent is very difficult to reach and never comes to meetings. The student has a tendency to slap other kids in a play-like manner, but it turns into confrontations."
Imagine getting that report on a child and, only after all of this negative information, being told the following: "He has been very sweet at times. He really benefits from having a trustworthy adult read with him in a one-on-one context. When he is given a leadership opportunity, he has taken it very seriously and has thrived. He is very active and needs lots of opportunities to move about the classroom to help him."
My point is simple: through our own practices in learning about new students, we not only start out with a strong bias, but we reinforce that negative bias by STARTING with the behavioral/academic challenges a student has had with staff in the previous year.
Perhaps the first step we can take as teachers to overcome implicit bias is to practice the art of seeking out everything positive we can about a student prior to reading anything at all that might predispose us towards some negative perception.
I know there are other practical factors that might lead toward bias (such as grouping students instructionally in appropriate ways) prior to meeting them, but I need the help of other teachers to suggest how we might prevent some of these necessities from forming/reinforcing bias among us.
Jim Barnhill
MN Board of Teaching