
Addressing Unconscious Bias in the Tech Industry

The tech industry has come under scrutiny for its lack of diversity. While there are some prominent women in the industry, such as Sheryl Sandberg (COO of Facebook) and Marissa Mayer (CEO of Yahoo!), the industry is dominated by men. Moreover, few people in tech are Latino or African American.

Major tech companies have woken up to this and are taking steps to become more inclusive. Recently, Sheryl Sandberg announced that Facebook is sharing part of its new training program on unconscious bias. In her announcement, Sandberg wrote, “Managing bias is an essential part of building diverse and high-performing organizations. We know we still have a long way to go, but by helping people recognize and correct for bias, we can take a step towards equality – at work, at home and in everyday life.”

The presentation portions of Facebook’s training program are available online. The training program is divided into seven short modules, which should be watched in sequence:

1. Welcome
2. Introductions and First Impressions
3. Stereotypes and Performance Bias
4. Performance Attribution Bias
5. Competence/Likeability Tradeoff Bias
6. Maternity Bias
7. Business Case for Diversity & Inclusion and What You Can Do

Facebook recommends that before you view the presentation modules, you take an implicit association test (IAT) to explore your unconscious biases. Project Implicit at Harvard has over a dozen different tests you can take, on anything from gender, race and sexual orientation to weight, religion and disability. Each test takes less than 10 minutes. To take an IAT, go to the Project Implicit website. On the opening screen, select the second option on the left to continue as a guest.

Google has also been addressing unconscious bias with training of its own. In the following video, Dr. Brian Welle, Google’s Director of People Analytics, discusses how unconscious bias works at Google and how the company is interrupting it.

Warner Norcross Issues 8th Annual Diversity and Inclusion Report

2014 DIAR

Warner Norcross & Judd has issued its eighth annual report regarding the firm’s initiatives to become a more diverse and inclusive organization. The 2013 Diversity and Inclusion Annual Report includes a letter from Managing Partner Doug Wagner in which he reviews the firm’s progress. It also includes brief articles regarding some of the firm’s diverse professionals and a demographic profile of the firm. To see a copy of the report, click on the image above. Copies of the firm’s annual reports for 2006 and 2012 may be found on the firm’s website by clicking here.

Warner Leaders Participate in Inclusive Leadership Workshop

Warner Norcross & Judd’s Managing Partner, Doug Wagner, was among 11 leaders from the firm who participated in the first annual Inclusive Leadership Workshop sponsored by the Managing Partners Diversity Collaborative. The workshop, conducted on June 3 and 4, explored the difference between diversity and inclusion and the business case for both in law firms. Participants discussed implicit biases and learned how to identify impediments to inclusion in their firms.

The workshop was conducted by Dr. Arin Reeves of Nextions LLC. Dr. Reeves is one of the foremost consultants in the area of law firm diversity and inclusion. Her book, The Next IQ: The Next Level of Intelligence for 21st Century Leaders, was published in 2012 by the American Bar Association. Dr. Reeves has worked with law firms and legal departments on diversity and inclusion for nearly 20 years. She is an advisor to the Center for Legal Inclusiveness in Colorado and the co-author of its manual, Beyond Diversity: Inclusiveness in the Legal Workplace.

The Managing Partners Diversity Collaborative was formed in 2011 by 12 of the largest law offices in Grand Rapids, Michigan, in association with the Grand Rapids Bar Association, to promote diversity and inclusion in the member firms and the profession. Conducting the annual workshop is one of 45 action steps in the Collaborative’s Action Plan, adopted in 2012. Over 40 leaders from the 12 member firms participated in the workshop.

WNJ Diversity Partner’s Remarks at the 4th Annual Justice Initiatives Summit

The 4th Annual Justice Initiatives Summit of the State Bar of Michigan, held on April 29, featured keynote speaker Kimberly Papillon who spoke about the neuroscience of bias in an address titled “Why Did I Do That? The Science Behind our Decisions.” Warner Norcross & Judd’s Diversity Partner, Rodney Martin, was asked to offer a brief reflection following Ms. Papillon’s address. Here are his remarks.

In some ways, I think the topic of today’s talk is a “Good News – Bad News” topic.

The good news is that understanding the science of bias helps explain some things that we struggle with. For example, it provides an answer to an often-heard assertion: “I don’t see color. I treat everyone the same.” Besides the fact that treating everyone the same usually means “I expect others to be like me,” rather than “I treat everyone fairly with respect for who they are and what they bring to the table,” today’s topic suggests another reason why the “I don’t see color” defense is wanting. Consciously, we may try to be color blind, or gender blind, or blind to people’s sexual orientation, but our subconscious minds don’t go along with that. They harbor biases that we fail to recognize. And if we fail to recognize those biases, we have no ability to control them.

Understanding the science of bias may also explain why, after years of focusing on diversity, our profession still lags other professions when it comes to persons of color and women. Despite our efforts, African Americans, Asian Americans, American Indians, Arab Americans, and Hispanic/Latino Americans collectively comprise just 10% of the active resident members of our bar association. (State Bar of Michigan, “Commentary to the Michigan Pledge to Achieve Diversity and Inclusion,” p. 2.) Could it be that unconscious bias helps explain why – despite our good intentions and our affirmations of equal opportunity – the percentage of women in our partnerships has been stuck at a ceiling of around 19%, notwithstanding that for over 20 years women have made up nearly half of law school graduates? So, I think it is very good news that science is expanding our understanding of how bias works and how it can impede the progress we so desire in making the profession and our justice system more inclusive.

But there is also some bad news in this understanding. For me, the first bit of bad news is that I have had to come to grips with my own bias. I have taken the Project Implicit test from time to time over the past six years. I have also taken the University of Chicago test called “The Police Officer’s Dilemma.” As much as I want these tests to show that I have no preference, every time I have taken them, they have shown me to have an unconscious bias in favor of white people as opposed to African American people. I have found this very troubling.

As I have tried to better understand the workings of unconscious bias, I read a book that I commend to you: Thinking, Fast and Slow, by Nobel Laureate Daniel Kahneman. In it, Kahneman summarizes years of research on how our brains work. He describes two systems, which psychologists refer to as System 1 and System 2.

System 1 is the intuitive brain. It “operates automatically and quickly, with little or no effort and no sense of voluntary control.” (Kahneman, Thinking, Fast and Slow 20 (2011).) It makes quick decisions and acts upon them. Ninety percent of our thinking is done with System 1, and in most cases we don’t even know it is at work. System 1 is making it possible for me to stand before you today, adjusting my muscles to make sure I don’t fall on my face. I don’t have to give any thought to how to stay standing; System 1 does it automatically, in the background, responding to stimuli without any conscious input. System 1 has a lot to do with our survival. It is instinctual and cannot be turned off.

System 2 is where we do our conscious, deliberate thinking. It uses reason, rather than instinct, to solve problems. Unlike System 1, System 2 is slower to judgment and works hard to make the right decisions. Let me illustrate the difference between System 1 and System 2 in a simple fashion:

  • If I asked you to tell me what 1 + 1 equals, the answer would immediately come to your mind. It takes no brain power to answer the problem. Your System 1 learned the answer long ago and instinctually calls it out.
  • But if I asked you what 42 x 27 equals, you would almost certainly not have an immediate response. Your instinctual thinking – your System 1 – can’t handle this problem and has to turn it over to the rational System 2.

One of the functions of System 2 is to monitor System 1 and keep it in check because, psychologists tell us, System 1 is instinctual and cannot control itself. But Kahneman explains that there is a problem with System 2: it is very lazy. Most of the time, our rational System 2 is content to let System 1 operate without any interference. This is especially true when we are stressed or tired. And this is a real problem, because System 1, though fast, is imperfect.

Research has shown that System 1 relies upon a number of biases and rules of thumb to help it make snap judgments. One of these is the “In-Group Bias.” (Arin N. Reeves, The Next IQ: The Next Level of Intelligence for 21st Century Leaders, 129 (2012).) Your System 1 has a bias for people who are just like you. We find comfort and safety in sameness. System 1 does not like to be uncomfortable.

A second bias is the “Halo Effect.” (Kahneman at 82-85.) When we find one thing we like about someone, System 1 tends to infer that the person has other good traits too, even though it has no evidence for doing so. For example, if I know that someone gives to my favorite charity, I am more likely to think that they share other values with me as well. System 1 does not like to deal with ambiguity.

Yet another example is the “Confirmation Bias.” (Kahneman at 80-81.) When System 1 observes something, it tends to sort what it sees in a fashion that confirms what it already believes; System 1 wants the world to work according to its own world view. And when System 1 sees something that confirms its belief, System 2 usually stays uninvolved. System 1 provides impressions that turn into beliefs and become the impulses behind our choices and actions.

How might these implicit biases manifest themselves in a law firm?

  • The in-group bias may lead to an inequitable distribution of career-enhancing assignments and opportunities.
  • The halo effect may lead us to prematurely anoint someone as the next superstar.
  • The confirmation bias may lead us to prematurely write someone off after just one mistake.

The associations that System 1 builds are not necessarily based in fact. System 1 accepts associations as true without investigating them. Unless System 2 gets involved to challenge the association, which it does not usually do, we act upon our System 1. And, because the bias is unconscious, we don’t know that we are doing so.

It should concern all of us that our brain is content to rely on our unconscious thoughts and biases most of the time and that our lazy System 2 may be oblivious or slow to challenge the System 1 biases that we have. This is especially true when we are tired or under stress. How often does that occur in the legal profession?

And there is more bad news: because it is instinctual, System 1 is not easily changed. Moreover, it is really hard for us to see the flaws in the associations made by our own System 1.

There is more good news, though. There are things we can do to make our System 2 thinking alert to the potential for System 1 bias. The first is to acknowledge that implicit biases exist. I encourage you to try the tests at Project Implicit. As I said before, the results can be disheartening.

My testing shows a preference for white people. Does that make me a bigot? I don’t think so. Consciously, I believe that discrimination is wrong, and that is the belief I choose to live by. But like everyone else, I have to acknowledge that I am subject to unconscious biases that result from the cultural messages to which I have been exposed. It concerns me that, left unchecked by my conscious self, my unconscious self may jump to conclusions that my rational self would abhor. And it concerns me even more when I read the research showing that System 2 is lazy and content to rely on System 1 most of the time, so my lazy System 2 may be oblivious or slow to challenge the System 1 biases that I have.

There are ways we can address this. We can recognize situations where System 1 bias would be especially costly and put in safeguards against rash judgments. For example, in my work on the Recruiting Committee, I have to guard against my biases and strive to keep an open mind. Studies have shown that if identical resumes are shown to attorneys on recruiting committees with the only difference being that one has a “white” sounding name while the other has a “black” sounding name, the white sounding candidate is much more likely to get a call back.

In those circumstances that may be affected by unintended bias, we have to slow our thinking down and ask whether the conclusions we are reaching are supported by objective facts or behaviors. We have to try to turn those situations over to our rational System 2. But that is not easy to do. If we could easily recognize our biases, most of us would address and correct them. But, unless challenged, our System 2 is willing to rely upon good old dependable System 1, which is always there with an opinion.

If we can’t rely upon ourselves to see our biases, then we need to be willing to let others help us see them and call them to our attention. We can’t be defensive when they do so. We have to learn to be open to constructive observations about unintended bias.

Since biases grow out of our experiences, another thing we can do is consciously address our biases by expanding our experiences. System 1 takes comfort in situations with which it is familiar. So, if we expand our experiences, we can expand our comfort zone and tame some of our unconscious biases.

Finally, I think it is very good news that we are here today to take on the issue of unconscious or implicit bias. Working together, we can take real and concrete steps to address the effects of implicit bias, and make our profession and our system of justice more equitable and diverse.

In his book Blink, Malcolm Gladwell explores our intuitive judgments and discusses the Implicit Association Test. Gladwell writes that “[t]aking rapid cognition seriously – acknowledging the incredible power, for good and ill, that first impressions play in our lives – requires that we take active steps to manage and control those impressions.” Malcolm Gladwell, Blink, 97-98 (2005). We must actively guard against the potential that our implicit responses will infect our decisions with bias. It is especially important that we do so when those decisions involve fairness and justice. That is our mission for today. I thank you all for being part of this day and look forward to this afternoon’s sessions.