
“Bias-related technical tools fall into two categories. The tech group assumes that things work better when humans are not involved. The human group assumes that people should be the decision-makers when lives and careers are affected.” – John Sumser

 

What We Learned From Amazon’s Sexist AI Recruiting Tool

 

In October 2018, Amazon terminated a project they hoped would automate and transform their recruiting process. They had trained the system on ten years of hiring history, teaching it to select the kinds of people who had already been hired. The goal was a system that could whittle 100 resumes down to 5 ideal candidates.

 

But there was a fly in the ointment: the majority of the historical resumes came from men, so the system favored male candidates. It penalized resumes from all-women’s colleges and resumes that included female descriptors such as “women’s decathlon champion.” No matter how Amazon adjusted the algorithm, as long as the tool was trained on historical data, it favored male candidates.
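
To make the mechanism concrete, here is a minimal sketch in Python on synthetic data. Every feature, number, and hiring rule below is a hypothetical illustration, not Amazon’s actual system. A model trained on historical decisions that disfavored female-coded resumes learns that penalty as a negative weight:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000
    skill = rng.normal(0, 1, n)             # a generic merit signal
    female_coded = rng.binomial(1, 0.2, n)  # a minority of the resume pool

    # Historical hiring decisions: skill mattered, but past hiring also
    # disfavored female-coded resumes. The bias lives in the labels.
    hired = (skill - 1.5 * female_coded + rng.normal(0, 0.5, n)) > 0

    X = np.column_stack([skill, female_coded])
    model = LogisticRegression().fit(X, hired)
    print(model.coef_)  # the second weight is negative: the model has
                        # faithfully learned the historical penalty

No amount of tuning removes that negative weight while the labels themselves encode the old decisions, which is exactly the trap Amazon hit.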

 

One way of understanding this result is to conclude that machines designed to mimic existing decision processes will always do exactly that – warts and all. From this perspective, the degree of historical bias is an inherent characteristic of the organization. It also suggests that bias permeates data in deep and subtle ways and can’t be scrubbed out without destroying the data itself.

 

Teaching Your Predictive Tool with Someone Else’s Data

 

The generally accepted view is more practical: it takes this example as proof of the value of using more than historical data. To fix the system’s output, you change the historical record by adding more of the kind of data that is missing. In other words, you teach the system with someone else’s data. As you might imagine, that introduces a whole slew of unknowns.
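
Continuing the toy sketch above, “teaching the system with someone else’s data” might look like the following. Every added number and distribution is an assumption, which is precisely the point:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 1000
    skill = rng.normal(0, 1, n)
    female_coded = rng.binomial(1, 0.2, n)
    hired = (skill - 1.5 * female_coded + rng.normal(0, 0.5, n)) > 0

    # "Fix" the record: invent 300 female-coded resumes labeled as hired.
    # What skill distribution should they have? Whatever we choose
    # rewrites history and imports a new, unexamined bias.
    aug_skill = rng.normal(0.5, 1.0, 300)   # an assumption, not a fact
    aug_X = np.column_stack([aug_skill, np.ones(300)])

    X = np.vstack([np.column_stack([skill, female_coded]), aug_X])
    y = np.concatenate([hired, np.ones(300, dtype=bool)])
    print(LogisticRegression().fit(X, y).coef_)  # the penalty shrinks, at
                                                 # the cost of invented data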

 

It’s one thing to use the historical record as a point of departure for decision making. It’s an entirely different thing to rewrite history in order to achieve a desired outcome for today or tomorrow. By introducing data that skews the machine’s learning to reach a desired outcome, you introduce new and unknown biases.

 

It may be the case that direct influence over outcomes is the only real way to solve longstanding problems with bias and discrimination (think quota systems). That’s an easy-to-make but difficult-to-implement human decision that shouldn’t be masked by blaming the machine.

 

Intelligent Tools and Bias — Who Should Decide?

 

Bias-related technical tools fall into two categories. The tech group assumes that things work better when humans are not involved. The human group assumes that people should be the decision-makers when lives and careers are affected. The tech group tries to eliminate or bury bias so it can’t be seen. The human group illuminates bias so that people can make the choice. One group assumes that people are the problem; the other sees them as the solution.

 

The tech group includes:

  • Assessment tools that ‘reduce or eliminate’ hiring bias
  • Search tools that redact information that can trigger bias (see the sketch after this list)
  • Matching tools that improve talent pool diversity
  • Structured interviewing tools that constrain and drive questions
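
As an illustration of the redaction item above, masking gender-signaling terms before a recruiter or matching engine sees the text can be surprisingly simple. This is a minimal sketch; the term list is purely illustrative, not any vendor’s actual lexicon:

    import re

    # Hypothetical list of gender-signaling terms; real tools use much
    # richer lexicons and may also redact names, photos, and dates.
    GENDERED = re.compile(r"\b(she|he|him|her|his|hers|women's|men's)\b",
                          re.IGNORECASE)

    def redact(resume_text: str) -> str:
        return GENDERED.sub("[REDACTED]", resume_text)

    print(redact("Women's decathlon champion; she coached her team."))
    # -> "[REDACTED] decathlon champion; [REDACTED] coached [REDACTED] team."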

 

The human group includes vendors whose tools illuminate bias in behavior and offer alternative choices.

 

Both kinds of tools can be applied as part of a larger cultural transformation process. While no individual tool may solve the issue on its own, the fact that the organization is using these tools and taking action to mitigate bias sends a powerful message that can be the beginning of a larger cultural change.

 

The Outcomes Delivered by Intelligent Tools May Be Less Important Than the Questions They Raise

 

The ‘can technology solve bias’ question is an important part of a larger conversation. Using intelligent tools can help us more clearly understand the variables involved in building businesses that value merit and contribution over monoculture. Interestingly, it may be that the outcomes delivered by intelligent tools are less important than the questions that they raise.



 