The human mind is wired to think in stereotypes, but deliberate practices can protect against bias.
Rick Larrick’s research focuses on individual, group, and organizational decision making, as well as “debiasing”: techniques for helping people make decisions free from the influence of unconscious bias. In a LinkedIn Live series on fairness, justice, and race, Larrick used this lens to describe stereotypes in society and the workplace, along with tools for reducing their influence. The full video of the talk is above, with shorter clips below.
The human mind is wired to think in terms of categories, and stereotypes are one example. “The word ‘stereotype’ tends to evoke negative images, that somehow it sounds inherently wrong and bad. And certainly the consequences of stereotypes can be that, depending on how we act on them. I want to look at stereotypes differently, as a natural outcome of the human mind.”
Larrick points out that the problem lies in overly simple assumptions, which can play out across many categories in organizational life, including demographics and job functions. For example, marketers might assume they’re creative and engineers aren’t, which isn’t always the case.
This kind of categorical thinking happens very quickly and automatically. A well-known test called the Implicit Association Test (IAT) measures the automatic associations people hold about race, age, and gender. Both young and old people, for example, tend to associate youth with positive words and old age with negative words.
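The IAT infers these associations from reaction times: people respond faster when paired categories match their automatic associations, and slower when the pairing conflicts with them. As a rough illustration, here is a minimal sketch of that idea in Python, using a simplified version of an IAT-style difference score. The function name, the scaling choice, and the sample reaction times are all hypothetical, not taken from the talk.

```python
import statistics

def iat_d_score(compatible_ms, incompatible_ms):
    """Simplified IAT-style score: the difference in mean response
    times between the two pairing conditions, scaled by the pooled
    standard deviation of all trials. A positive score means slower
    responses in the 'incompatible' pairing, suggesting a stronger
    automatic association in the 'compatible' direction."""
    all_trials = list(compatible_ms) + list(incompatible_ms)
    pooled_sd = statistics.stdev(all_trials)
    return (statistics.mean(incompatible_ms)
            - statistics.mean(compatible_ms)) / pooled_sd

# Hypothetical reaction times in milliseconds for one participant:
compatible = [620, 580, 640, 600, 610]     # e.g. "young" paired with positive words
incompatible = [750, 800, 770, 820, 760]   # e.g. "old" paired with positive words
score = iat_d_score(compatible, incompatible)  # positive: slower when pairing conflicts
```

The key point the sketch illustrates is that the measure never asks people what they believe; it reads the association off the speed of their responses.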
U.S. culture and media are full of images of vigorous young people having parties on the beach and older people needing to buy a variety of health products. This kind of automatic thinking becomes a challenge when we can’t “turn it off” – when we’re tired or distracted or overloaded with work.
Larrick advocates an approach called “blinding,” which allows work to be evaluated without knowledge of who performed it. He uses blinding as a best practice in his own teaching, valuing the ability to judge performance purely on its merits, without preconceived ideas about which students are stronger than others.
“When I grade our MBA students on their exams, I'm a big believer in not knowing the name of the student as I'm grading. To the extent that there are different parts of the exam, like an open-ended essay or a multiple choice section, I also don’t want to know the student’s score on the multiple choice section while I'm grading the essay.”
Larrick’s reasoning is that he wants the words to speak for themselves, without any prior expectation about who the person is or how well they have performed on the exam so far. This precaution is meant to eliminate any potential influence on his judgment of the work’s quality.
As in academia, Larrick argues there is a time and place for companies and organizations to use this technique. He believes blinding should be implemented during hiring and, later, in performance evaluations.
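The mechanics of blinding are simple to operationalize: strip identities before evaluation, keep a private key, and re-attach names only after scores are final. Here is a minimal sketch of that workflow in Python; the function names, the sample submissions, and the stand-in grading step are all hypothetical illustrations, not a system Larrick describes.

```python
import random

def blind(submissions):
    """Replace author names with opaque numeric IDs so evaluators
    see only the work. Returns the anonymized work keyed by ID,
    plus a private key mapping IDs back to authors."""
    ids = list(range(len(submissions)))
    random.shuffle(ids)  # shuffle so IDs don't reveal submission order
    anonymized, key = {}, {}
    for blind_id, (author, work) in zip(ids, submissions.items()):
        anonymized[blind_id] = work
        key[blind_id] = author
    return anonymized, key

def unblind(scores, key):
    """After grading is complete, re-attach author names to scores."""
    return {key[blind_id]: score for blind_id, score in scores.items()}

# Hypothetical usage:
submissions = {"Alice": "essay text A", "Bob": "essay text B"}
anonymized, key = blind(submissions)
# Grading sees only IDs and work; word count stands in for a real rubric.
scores = {bid: len(work.split()) for bid, work in anonymized.items()}
final = unblind(scores, key)
```

The design choice worth noting is the separation of concerns: whoever grades holds only `anonymized`, while the `key` stays with someone who never evaluates, mirroring the curtain between the violinist and the committee.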
A classic example of blinding in practice is how orchestras use it to audition new members. The current best practice is to have a violinist audition behind a curtain, while the conductor and other decision makers sit in the audience, completely unaware of who’s playing the instrument.
“The assumption is we want the best musician, which means it's the quality of the music that matters. Historically, if those who ran orchestras had prior beliefs that certain people are better at playing the violin than others, their prior belief can leak into and affect how they hear the music.”
Larrick says techniques like blinding are just the beginning of working to achieve racial equity. He worries that simply knowing about biases isn't enough to undo them. Implementing the right practices is the best protection against bias.