
HRx Radio – Executive Conversations: On Friday mornings, John Sumser interviews key executives from around the industry. The conversation covers what makes the executive tick and what makes their company great.

HRx Radio – Executive Conversations

Guests: Judd Kessler & Corinne Low, Associate Professors, Business Economics and Public Policy Department, Wharton School
Episode: 342
Air Date: October 11, 2019




Important: Our transcripts at HRExaminer are AI-powered (and fairly accurate) but there are still instances where the robots get confused and make errors. Please expect some inaccuracies as you read through the text of this conversation. Thank you for your understanding.

Full Transcript with timecode

John Sumser 0:15
Good morning and welcome to HRExaminer’s Executive Conversations. I’m your host, John Sumser and today we’re talking with Judd Kessler and Corinne Low, two professors at the Wharton School who have been working on research into bias and hiring.

Hi, guys, how are you?

Corinne Low 0:35
Good. Thanks for having us on.

John Sumser 0:37
Yeah, welcome.

John Sumser 0:38
Would you take a moment and introduce yourselves and tell us a little bit about how you came to study this question? Why don’t you go first, Corinne?

Corinne Low 0:50
So, I am a professor in my sixth year working at Wharton, and I think this was really a project that combined interests that Judd and I had. Judd has worked on various matching problems: he has worked with the MBA program office on how we match students to classes, and he has worked on how teachers get assigned to teaching jobs across America. And I had worked on a study that was about detecting preferences in dating. So we put our heads together, and we looked at this big matching problem facing seniors graduating from UPenn, which is: how do they get matched with employers for jobs? And we thought about how we could study this process and maybe make this matching process better in some ways.

John Sumser 1:49
Judd, how did you come to this? When you were a little guy, this couldn’t be what you wanted to do when you grew up. So how did you wind your way into studying matching inside of the job market?

Judd Kessler 2:05
So I think I knew I wanted to be an economist at an early age. As for the research that brought me here, I think Corinne should give herself even more credit. I’ve been at Wharton a few years longer than her, and when she started, she gave her dissertation paper, which was investigating dating preferences. But she did something very clever, which I’d never seen before in an experiment: she had people rate hypothetical dating profiles for how much they liked them, but under a real incentive, so people had a reason to be thoughtful and truthful. The incentive was that they were going to get advice from an expert who would help them optimize their dating profiles to find the kinds of matches they revealed themselves to be interested in through their ratings. When I saw that presentation, I thought about my own work. Corinne and I got to talking: this would be perfect for studying the preferences of employers for candidates like the Penn seniors we were thinking about. Because before that point, to study preferences for candidates, what researchers did was mail out fake resumes to employers, in what’s called a resume audit study. The resumes would be identical except for one variable, say the name of the candidate, which would reflect race and gender.
And then the researchers would look to see what fraction of the resumes they sent out got a callback, via a fake phone number and fake email address the researchers would put on the fake resumes. They’d see whether, when the resume had a black candidate name, it got fewer callbacks than the same resume with a white candidate name. That resume audit study had been around for a long time, and nobody had really innovated on it. Corinne’s innovation was creating a platform where people could rate hypothetical candidates, in this case Penn seniors, but where we could give real incentives: we were going to match the employers who evaluated these fake resumes with real Penn seniors based on their preferences. Once we had that as an idea, it created the opportunity to do this research.

John Sumser 4:28
That’s interesting. You know, one of the things I think Amazon discovered when they tried to shed bias from their hiring process with automated tools, using the history of the company, was that maleness or some other attribute occurs all the way through the resume rather than just being a header or footer on it. So I wonder, when you make the hypothetical resume, do you have the kind of person you’re trying to understand in the demographics create that resume?

Corinne Low 5:16
Yeah, that’s a great question, because that’s exactly what’s innovative about this technique. The reason you can’t use real resumes to study bias in hiring is exactly what you said: there might be correlations between the traits you’re interested in, such as gender or race, and the experience that’s on the resume. So if you see that people tend to rate female resumes lower, you don’t know whether that’s because they’re biased against women or because there’s some earlier-stage issue, where, say, fewer women are computer science majors and that’s why the big tech firms aren’t getting as many female resumes, right? So in order to study this, we needed to create resumes that were exactly identical except for the name. To do that, we actually built a software tool, a brand new tool, where we create a bank of thousands of resume characteristics: work experiences, leadership skills, majors, all of the things that appear on a resume. Our tool goes through, pulls randomly from that bank of characteristics, and creates a resume. We call it a Franken-resume, like Frankenstein. It pulls these characteristics, creates a resume, and then randomly assigns a name to the top of that resume. That means, statistically, that resume is identical to another resume that somebody else sees that’s been randomly assigned a different name, one that connotes a different race or gender. So that’s the key reason why we have to create fake resumes. And then, as Judd mentioned, there’s the second problem: okay, once the resumes are fake, how do you get companies to be honest and really tell you what their gut is telling them about that resume?
Because you don’t want them telling you the fake story: “Oh yeah, of course we would hire somebody even if they went to a bad school and had a low GPA. We’re very open-minded here,” right? You want the real story of who you really, actually hire. The way we did that is we said: go through and rate these fake resumes for us, but we’re going to put some teeth into that exercise, because when you rate those fake resumes, our tool is going to use a machine learning algorithm to apply your preferences over these hypothetical resumes to a bank of hundreds of real candidates. So by screening our 43 fake resumes, we apply the screening you did to hundreds of real resumes, and we pick out the best matches based on your preferences. So if you don’t want candidates from X and Y majors, then you shouldn’t rate those candidates highly among the hypothetical resumes, because whoever you rate highly, that’s who we’re going to pick out for you. That’s why we call our tool incentivized resume rating: you’re rating these hypothetical resumes, but you’re incentivized to do it carefully and in line with your real hiring practices, because it’s going to create a pipeline of real candidates that match those preferences.
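The Franken-resume construction described above can be sketched in a few lines. This is a minimal illustration, not the study’s actual tool: the component bank, the field structure, and the names below are all invented for the example.

```python
import random

# Hypothetical component bank; the real tool draws from thousands of
# characteristics harvested from actual student resumes.
COMPONENT_BANK = {
    "major": ["Economics", "Computer Science", "English", "Biology"],
    "gpa": [3.2, 3.5, 3.75, 4.0],
    "internship": ["Top investment bank", "Local nonprofit", "Regional firm"],
    "leadership": ["Debate society president", "Varsity athletics", "Student newspaper editor"],
}

# Illustrative names meant to connote race/gender (the study validated its names).
NAMES = ["Greg Baker", "Emily Walsh", "Jamal Washington", "Lakisha Robinson"]

def franken_resume(rng: random.Random) -> dict:
    """Build one hypothetical resume: independently sample each component,
    then randomly attach a name. Because the name is assigned independently
    of the components, any systematic rating gap between names is
    attributable to the name alone."""
    resume = {field: rng.choice(options) for field, options in COMPONENT_BANK.items()}
    resume["name"] = rng.choice(NAMES)
    return resume

rng = random.Random(0)
for r in (franken_resume(rng) for _ in range(3)):
    print(r)
```

The key property is the independence of the name draw from the component draws: two raters can see statistically identical resumes topped with different names.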

Judd Kessler 8:38
Sorry, I just wanted to add, because I think there’s another key point that Corinne didn’t necessarily underline, but it’s important: when we went through and took those components from real resumes, which we could then mix and match to create the hypothetical resumes, we made sure to sanitize out anything that might be indicative of race or gender. So if I was previously on the men’s varsity basketball team, on a resume that would not appear as “men’s varsity basketball,” because that would reveal something about gender. Instead it would say “Penn varsity basketball team.” So that would be a little different than it might have originally appeared on a resume, but that way we could very carefully control what information people had when they were rating the fake resume.
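The incentive step Corinne described earlier, applying an employer’s ratings of hypothetical resumes to rank a bank of real candidates, can also be sketched. The paper’s actual machine learning model is more sophisticated; this toy version simply learns an average rating per attribute value, and all names and data below are invented.

```python
from collections import defaultdict

# Hypothetical data: one employer's ratings (1-10) of a few Franken-resumes.
rated = [
    ({"major": "CS", "internship": "prestigious"}, 9),
    ({"major": "CS", "internship": "regular"}, 7),
    ({"major": "English", "internship": "prestigious"}, 6),
    ({"major": "English", "internship": "regular"}, 4),
]

def fit(rated):
    """Learn the employer's revealed preferences: mean rating per (field, value)."""
    sums, counts = defaultdict(float), defaultdict(int)
    for resume, rating in rated:
        for field, value in resume.items():
            sums[(field, value)] += rating
            counts[(field, value)] += 1
    return {k: sums[k] / counts[k] for k in sums}

def score(prefs, resume):
    """Score a candidate as the mean learned rating over their attributes."""
    vals = [prefs[(f, v)] for f, v in resume.items() if (f, v) in prefs]
    return sum(vals) / len(vals)

prefs = fit(rated)

# Bank of "real" candidates; surface the best matches for this employer.
candidates = [
    {"name": "A", "major": "CS", "internship": "prestigious"},
    {"name": "B", "major": "English", "internship": "regular"},
    {"name": "C", "major": "CS", "internship": "regular"},
]
ranked = sorted(candidates,
                key=lambda c: score(prefs, {f: c[f] for f in ("major", "internship")}),
                reverse=True)
print([c["name"] for c in ranked])  # CS + prestigious internship ranks first
```

The incentive works because the ranking over real candidates is driven entirely by how the employer rated the hypothetical resumes, so careless ratings produce a worse candidate pipeline for that employer.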

John Sumser 9:31
What was the diversity composition of the resume source? Right, because the question I started to ask last time was about language patterns. I believe there are companies out there designed to discover the language patterns that are indicative of various elements of diverse populations. Even fairly sanitized language still gives off signals of gender and that sort of thing. So the composition of the people who created the resumes that you then dissected and sanitized would be a pretty important variable, wouldn’t it? And so the question is: did you do something special to get the right mix of people in the source of the resume slices that you used?

Corinne Low 10:36
Yeah, I want to get to that. The first response is that we just used the UPenn student population; that’s where the real resumes came from, so it was a mix of the whole population. The second response is that, of course, this would matter if we were giving people, let’s say, anonymized resumes; then you could ask, could they detect, “Oh, actually this resume belongs to X or Y?” But that’s not what we’re doing. We take the components and deconstruct them, then we reconstruct them in a random fashion, mixing and matching between different resumes, and we add a name on top. The name tells you which candidate you’re looking at. And all of those characteristics are going to be used across multiple resumes. So let’s say there’s a certain leadership experience that you think is indicative of a certain background. Let me make a joke and be a little stereotypical: let’s say it’s lacrosse, right? And you’re like, “I know what kind of person plays lacrosse.” That component, though, is not just going to be shown on one resume. That component is going to be shown multiple times, interacted with names of different genders and races. So again, we should be able to separate out the impact of the actual experience itself versus the impact of the gender and race that we’re attaching to the resume.
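The point about reusing each component across many randomly named resumes is what lets the analysis separate the two effects. A small simulation, with invented effect sizes, shows why: because the name is assigned independently of the components, a simple difference in means by name recovers the name effect on its own.

```python
import random

rng = random.Random(42)

# Simulate the design: each hypothetical resume gets a component (say,
# lacrosse or not) and a name group, assigned independently at random.
# The true effect sizes below are invented for illustration.
TRUE_COMPONENT_EFFECT = 0.5   # lacrosse raises the rating by 0.5
TRUE_NAME_GAP = -0.25         # a female/minority name lowers it by 0.25

ratings = []
for _ in range(20000):
    lacrosse = rng.random() < 0.5
    female_name = rng.random() < 0.5          # independent of lacrosse
    rating = (3.0
              + TRUE_COMPONENT_EFFECT * lacrosse
              + TRUE_NAME_GAP * female_name
              + rng.gauss(0, 0.5))            # evaluator noise
    ratings.append((lacrosse, female_name, rating))

def mean(xs):
    return sum(xs) / len(xs)

# Independence of name and components means each simple difference in
# means estimates its own effect, uncontaminated by the other.
name_gap = (mean([r for _, f, r in ratings if f])
            - mean([r for _, f, r in ratings if not f]))
comp_gap = (mean([r for l, _, r in ratings if l])
            - mean([r for l, _, r in ratings if not l]))
print(f"estimated name gap: {name_gap:.2f}")       # close to -0.25
print(f"estimated component gap: {comp_gap:.2f}")  # close to 0.50
```

With real, non-randomized resumes the lacrosse indicator and the name would be correlated, and the two differences would each pick up a mix of both effects; random assignment is what breaks that correlation.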

John Sumser 12:13
I’m not so sure. But it’s a nit; that’s a nit. So, what did you find out?

Judd Kessler 12:24
So, yeah, thanks. It was quite interesting. I’ll give you the top-line results on discrimination. The first thing to note is that in this method we have employers, hiring managers, and recruiters from firms who are doing on-campus recruiting at Penn, so they care about the incentive of getting real Penn seniors to match with. We survey them as part of the tool, and besides rating the resumes, we asked them about things like their interest in diversity. Employers very systematically say that getting a diverse pool of candidates is very important to them; for many of them, it’s one of the most important things they’re looking for when they’re recruiting. So that gives us some hope: okay, maybe we’ll see those preferences reflected. For employers who were specifically recruiting for humanities, social sciences, and business, we actually see no evidence of discrimination on this question of how much do I like the candidate. For employers recruiting for STEM in our sample, we do see that they rate resumes that were randomly assigned female and minority names lower on how much they like the candidate. The magnitude of that is that a white male candidate with a 3.75 is rated about equivalent to a female or minority candidate with a 4.0, so it’s about a quarter of a GPA point as the measure. Also, across all employers, we’re seeing less credit given for prestigious internships for female and minority candidates. And then one thing that was new, which we had not seen in the literature before: we asked the employers who were evaluating these resumes both how much they liked the candidate and also how likely they think the candidate would be to accept the job if offered.
So we’re trying to tease apart these two questions: how much do I like somebody versus how much do I think they would like to join my firm? And one thing we found, which was a bit of a surprise to us, is that on average, across all employers, employers think that women in particular are less likely to join their firms. So again, with the randomly assigned name at the top: if it’s a female name, they rate the candidate as less likely to accept the job if offered.

John Sumser 15:08
So that’s interesting. I’m not sure: is that a good thing or a bad thing, that your study shows that women are choosier?

Corinne Low 15:19
Yeah, that’s such a great insight, in that we think it’s one of those things that maybe is a compliment, but it cuts against you. Because our theory of how employers would actually pursue candidates is that they pursue them based on how much benefit am I going to get for the recruiting energy I put into this candidate. And that benefit is made up of two components: the skills that this person actually brings to my firm if I actually hire them, and my likelihood of being able to actually hire them. An example of this comes from the other literature we talked about, the resume audit literature, where people send fake resumes: when researchers sent fake resumes of unemployed versus employed candidates, firms called back the unemployed candidates more. Why would they call back the unemployed candidates more? Well, because they’re putting their energy where it’s likely to have a higher return. They know that with the employed candidates, possibly this person is just trying to get a counteroffer to get a raise in their current job; maybe they’re not really serious about their search; maybe I’m not going to get a big return from this effort. And so the reality is, even if the firms are saying, “Oh, this is because it’s good for women; so many people are going after them, so they’re going to be choosier,” if it means that those firms then put less effort into hiring women, it could be a very subtle form of discrimination. It’s not intended as discrimination at all, but at the end of the day it results in women getting fewer calls, fewer offers.

Judd Kessler 16:58
To add to that: it would have been different if we saw in our ratings data that women were getting much higher ratings on how much I want to go after a candidate, and then lower ratings on how likely they are to come. If we had seen that, those two might fit together in a consistent way. But because we’re seeing no benefit for women on how much I like the candidate, and in fact, for STEM employers, we’re seeing in our data that women are getting lower ratings on average, that doesn’t mesh well with “they’re less likely to come because everybody’s going after them.” That’s empirically not the case in our data, which means that, yeah, it’s a compliment that cuts against you.

John Sumser 17:47
Such an interesting and complicated set of things to tease apart. I really appreciate the fact that you’re doing it, and you’ve discovered a couple of things. You know, I’ve been watching this space for 25 years, and I don’t think I’ve seen anybody articulate so well the notion that people hire and make a decision that’s partly influenced by what they think is the easiest path. So much of the literature suggests that what people are after is a perfect match at the qualifications level, and I haven’t heard that particular insight before. That’s pretty interesting. So the next question might be: if you solved all of the bias at the resume-sifting point of the process, if you could do that, does it actually have an effect on hiring?

Corinne Low 18:56
Implicit in your question is: are people just going to be biased later, right? So if we eliminate the bias at this upfront stage, does it just mean that later the candidate is going to get excluded anyway, possibly for reasons of bias? I think one of the key takeaways from our research, at least the way we viewed it, was that we took these firms at face value when they said to us, “We want diverse candidates; that’s one of our major hiring priorities.” They actually told us in the survey we did with them, which came after they had done this resume rating exercise, that this was one of the top things they were looking for in candidates; one of their most important hiring priorities was actually expanding diversity. And I think for many of these firms that’s absolutely true. But what we were finding are the subtler forms of bias that you may not even be conscious of. In fact, we had some evidence that they were not conscious of it, in that the form of bias I just described, where the employers from STEM firms rated women lower, appeared more when the raters were fatigued, when they’d rated more resumes in a row; that’s when the bias popped up more. That suggests it’s about switching to this automatic system where you’re making decisions based on heuristics, based on mental shortcuts, and maybe not according to what you would say are your true conscious preferences or objectives. So suppose firms really do want to hire diverse candidates, but unfortunately, because of the nature of sifting through hundreds and hundreds of resumes, they naturally have to switch to this more automatic process where the brain processes things quickly, because you can’t spend an hour debating every single resume you get; you have to make a snap decision.
Then what we’re saying is that they may be shooting themselves in the foot. They might want diversity, but they’re handicapping themselves through that process where this unconscious bias creeps in. If that’s the case, then by getting rid of that bias, or at least diagnosing it, they have the chance to make a longer, more reasoned, more thoughtful decision, where they’re not operating on the automatic snap judgment and mental process where you might expect implicit bias. Now, if firms are just purposely excluding women and minorities, then getting rid of the resume screening bias is not going to have an impact.

John Sumser 21:35
Maybe. I think about bias in a slightly different way, and that is that organizations, in and of themselves, regardless of the individuals involved, have some biases, right? It’s not really that they’re conscious or unconscious, but there’s a difference between what the policy people wish were the case and what the organization is actually capable of. I mean, this is the standard management problem of any substantive organization: what you want it to do and what it will actually do are two different things. And it may or may not be identifiable as specific people doing specific behaviors. So there’s this sort of cultural and institutional bias, or cultural values, that’s very, very difficult to change. And, you know, in my experience, what you see are really solid efforts at certain points in the process, with the assumption that solving those process points actually addresses the larger cultural problem, and the two may not have anything to do with each other. And so the question I was asking is: can you use this input into the pipeline moment as a way to really shift the culture? Because it’s the culture that has the bias rather than the individual raters.

Corinne Low 23:14
You’re preaching to the choir on that. For me, I always say the same thing: we’re always eager to make cheap and showy reforms that don’t actually have an impact, you know. But the one thing we think this tool can do is make explicit something that’s implicit. Because it’s implicit bias, it’s hidden, and firms are saying, “We’re doing everything we can to recruit diverse candidates; why can’t we get them?” By undergoing this diagnosis, the firms may realize there actually are these deeper problems; it means we’re not doing everything that we can.

John Sumser 23:56
Yeah, I think so. I saw a Vice President of Talent Acquisition at United Airlines the day that they merged the diversity and inclusion and recruiting departments into a single entity, right? That’s the kind of change that’s probably closer to doing it.

Judd Kessler 24:18
I will say, you know, one thing: as academics we often use the metaphor of standing on the shoulders of giants. Basically the idea is that we like to believe incremental improvements, at least in knowledge, can help solve the bigger problem. So it might be that it really does take these big changes. But I think one thing that any firm that cares about this issue could do is take the first step, which, in the case of incentivized resume rating, would be to have the frontline staff who are doing the recruiting and the resume screens take the diagnostic tool, and see, on average across the people doing recruiting at your firm: do they display these subconscious biases against certain demographic groups? And in addition, what do they care about? In our sample, we talked about the discrimination, but we also saw a bunch about what the employers cared about: they cared about prestigious internships much more than a regular internship, they cared about high GPAs, as you might expect, and about certain interactions between those variables. So one thing is to see to what extent the people doing these screenings at your firm care about different characteristics, and to what extent they’re displaying discrimination, if any. Presumably you don’t want discrimination at all, but also the leadership might disagree with the recruiters on the ground about which characteristics are most relevant. So those are the kinds of things the diagnostic tool can let you do.
And I think, on the discrimination point, it’s a first step to see if these things are sneaking in at this first recruiting stage because of the subconscious bias that Corinne mentioned.

Corinne Low 26:19
Well, yeah, exactly. To your point, it can potentially reveal that mismatch, as Judd said: hey, the firm leadership thinks it’s doing one thing, but something different is happening on the ground.

John Sumser 26:31
Yeah, this is the place where the next layers of great research from places like Wharton are going to happen: in this difference between what the leadership wants and what the organization will do. This has been a great conversation, and I really appreciate you taking the time to do it. Would you reintroduce yourselves and tell people where they might learn more about the study?

John Sumser 26:56
Why don’t you go first?

Judd Kessler 26:56
Sure. Yeah, so I’m Judd Kessler, Associate Professor of Business Economics and Public Policy at the Wharton School. If you’re interested in learning more, you can email me: Judd Kessler at Wharton.

Corinne Low 27:13
And I am Corinne Low. I’m an assistant professor of Business Economics and Public Policy at the Wharton School. Judd and I have both taught in the MBA program at the Wharton School, so of course that’s one way you can learn more; we hope we’re teaching this in our classrooms. But you can also go to both of our websites. If you just Google our names, you should be able to find our websites, and the study is available on there. There is also a paper forthcoming in the November issue of the American Economic Review, so if you look at the AER website, you can also find a copy of the study there and read all of our results. Of course, there’s a lot more technical detail that we didn’t want to bore you with, but if you’re interested in learning more, you’re also welcome to reach out to us; our emails are easy to find. We’d be happy to talk about how the lessons from our study can be applied to actually help firms solve their real hiring problems.

John Sumser 28:12
Thanks so much. I really appreciate you guys taking the time to do this. It’s very interesting work. You’ve been listening to HRExaminer’s Executive Conversations, and we’ve been talking with Judd Kessler and Corinne Low, who are Associate Professors at the Wharton School, looking into bias and hiring. Thanks for tuning in and we will see you back here same time next week.

Bye Bye now.

