
HRx Radio – Executive Conversations: On Friday mornings, John Sumser interviews key executives from around the industry. The conversation covers what makes the executive tick and what makes their company great.

HRx Radio – Executive Conversations

Guest: Eric Sydell, EVP of Innovation, Modern Hire
Episode: 344
Air Date: October 25, 2019

 

Transcript

 

Important: Our transcripts at HRExaminer are AI-powered (and fairly accurate) but there are still instances where the robots get confused and make errors. Please expect some inaccuracies as you read through the text of this conversation. Thank you for your understanding.

Full Transcript with timecode

John Sumser 0:13
Good morning and welcome to HRExaminer’s Executive Conversations. I’m your host, John Sumser, and today we’re going to be talking with Eric Sydell from Modern Hire. Modern Hire is the new company that’s made out of the old Shaker and Montage, the video interviewing company. So it’s an interesting combination, and they are brand new as the entire entity. And Eric is the Executive Vice President of Innovation. There are not very many people in the industry who are as acute and sciencey as Eric is. It’s going to be a great conversation. Hang in there. Hi, Eric.

Eric Sydell 0:50
Hi, John. Good morning, and thanks for the intro. I love the intro music.

John Sumser 0:56
Laughs. Me too. Thank you. We’ve been doing this for so long that I can’t figure out what the theme music is anymore, and Shazam doesn’t recognize it. But take a moment and introduce yourself if you would.

Eric Sydell 1:08
Sure. Yeah. Well, as John said, I’m Eric Sydell, and I’m an industrial organizational psychologist by background and by training.

IO Psych is the common abbreviation there, and I came out of a graduate program in the late 1990s and started doing work with what was FHL at the time, doing assessment work and consulting work around hiring, helping companies hire people using fairly traditional assessments at that point. Shortly after that I founded Shaker, which became Shaker International, which is now Modern Hire. And we created the virtual job tryout, with our day-in-the-life simulations of jobs, for a lot of big companies and smaller ones as well over the years — very custom, very engaging solutions, designed to really immerse a person in the job and role, and then collect a ton of data and use that data to predict job performance and turnover and outcomes that a client cared about. So we still do all that stuff, but now in the context of a much larger organization that has a lot more workflow automation tools and interviewing capabilities as well. And so I stopped doing real consulting work years ago and kind of went into the R&D and innovation side of the house, which is where I sit now. And I am fortunate to be able to work with a team of data scientists, and together with them, we are focused on trying to figure out where we can go and how fast and how hard we can push the capabilities of assessment and selection, to try to get better outcomes for our clients using all of the most cutting-edge sciencey stuff that is out there.

John Sumser 2:51
Sure. If you were to describe Modern Hire in a few words, what does Modern Hire do?

Eric Sydell 2:56
Modern Hire is a platform that allows us to collect a lot of data about a candidate — including interview data, video, audio, and text interviews, virtual job tryout data, and over time other data sources as well — and then use that data to help companies, our clients and their individual hiring managers, make good decisions about their candidates. And so whereas Shaker was just one tool, a point solution — the virtual job tryout — now what we’re doing is much more platform oriented, because we’re recognizing that the power of our science, of data science, of IO psychology, goes beyond just a point solution. And really what our value is, I think, is helping companies to understand all that data. It’s enormously complex — helping companies to understand it. What does it predict? What is it good for? What is it not good for? And how can we make the best possible decision based on everything we know, not just one piece of the process? And at the same time, there’s a lot of workflow automation stuff that’s designed into our platform to make the hiring process smoother and easier for candidates, and to save time as well.

John Sumser 4:09
That’s a broad array of stuff. Now, it sounds to me like these are tools that work in high volume settings. Is that right? And if that’s right, are you working to reduce the scale of the job so that you can deliver this in tinier increments?

Eric Sydell 4:29
To reduce the scale of the job?

John Sumser 4:31
So what I mean is, as I listened to you talk about what your tools do, it sounds like they work in jobs where you hire a lot of people to do the same job. Right? And so first, is that accurate? And then, if that’s accurate, talk a little bit about what you’re doing to reduce the size requirements for your stuff.

Eric Sydell 4:53
Yeah, yeah, definitely. I mean, we started almost 20 years ago studying the highest volume jobs we could study, because those are the ones that have all the data — so much data to study and learn from. So we did that with, you know, Bank of America tellers and all kinds of big, big positions for big companies out there. And over the years, we have learned a lot of things about what predicts success in those jobs. And so then the task is to filter that down into other jobs that are lower volume. Lower volume jobs are certainly on the table, and certainly we do a lot of work in lower volume jobs. But the challenge often is creating ways to measure the things that matter in those jobs, but which don’t require a lot of local validation and study, because the company might not have enough people to do that. And so we take the learnings from the bigger jobs that we’ve done and filter them down. And I mean, over the years — like I said, almost 20 years — we’ve done research on so many different jobs out there that we do have a lot of knowledge about what predicts for lower volume jobs. But I do think that we were never going to have all the answers to that question. The challenge is just collecting more data all the time, always crunching it, always refining it, making it better.

John Sumser 6:06
So I’d imagine that as you collect that data and crunch all of that data, you’re starting to accumulate some broad understanding of what work is and isn’t, and what being good at a particular job is and isn’t. Can you generalize about that sort of stuff? Is it possible to generalize about it?

Eric Sydell 6:28
Well, I mean, you know, our field as a whole has been mostly focused on doing that. And so academics in IO psychology have done a million studies on basic characteristics of people that predict results in jobs — like conscientiousness, as a personality construct, is very predictive overall of job performance in general. It’s just a good thing to have. But what’s interesting, I think, is as you go more micro, as you get more specific with the job or the company itself — I’ll tell you, one of the things that we’re excited about being able to do in the future is to go even more micro than the job and look at the work group, look at the manager, look at specific fit with the people that you’re actually going to be working with. And that’s not something that’s done commonly today, but I think it’s the direction that we’re investigating. So the more specific we can be, we might find out that, yeah, sure, conscientiousness is good in general, but it doesn’t do much for this particular job or this particular work group. Or we might find that everybody else in this work group is very conscientious, and therefore we don’t need more of it — we need something else, we need more diverse skill sets for a particular group. And I think that so much of our field in the past has been focused on benchmarks and simple linear correlation that we’re missing a lot of detail, and now, in the kind of modern big data era, we can get more specific and we can find out more specific things, complementary things. Everybody likes to talk about how diversity is good for organizational performance, but in a lot of selection systems, they’re just hiring people that are carbon copies of the previous person that was hired and performed well. So we need to consciously be designing in diversity to make sure that we’re getting the proper mix. And I think in the past that was very hard, because data was much harder to come by. These days, data is becoming more prevalent, and it’s our task and our opportunity and our job to collect it and to cultivate it and figure out what it means, so that we can be more targeted and more specific.

John Sumser 8:37
So it’s an observation of mine that what constitutes work varies broadly — that the exact things that might be work at one organization are the opposite of work at another organization. Do you see that?

Eric Sydell 8:54
Yeah, I do think so. You know, we’re not all robots that are designed to just come in and produce some outcome. And the higher level you go, the more knowledge-worker types of positions you’re dealing with, it becomes a lot more about how you get work done in the context of the organization than your specific intelligence or skills, I think. And I think that is something that has oftentimes gone missing in how we select people, because it’s so generic — selecting them based on a job description, versus selecting them based on an ability to really understand what they’re doing and get work done in a specific context. And to me, that’s just a big opportunity in the future that we really weren’t able to pursue in the past, because, again, the data wasn’t there. The ability to crunch it at scale wasn’t always there.

John Sumser 9:45
Got it. So there’s an army of people, a literal army of companies, moving in and around assessment and talking about the use of AI in and around assessment. Would you describe the stuff that you do as using AI? And if so, what is it? And if not, what do you think about AI in general?

Eric Sydell 10:10
Well, AI to me — what it means is almost nothing. I mean, you could argue that a calculator from the 1960s is an AI device. So what is AI? It’s largely just a marketing term to me at this point. What I like to focus on specifically is machine learning — and not all machine learning, but more current, modern machine learning, because machine learning has been around a long time too. And then really, what I’m excited about is deep learning, and of course deep learning hasn’t been around very long at all — just the past decade or two since it’s been really described and become operationalized to the point where companies like ours can use it. And deep learning is transformative. That is the thing that is transforming the world around us, with self-driving cars and, you know, speakers that you can talk to — I can’t say the name of a speaker or it’ll talk to me, because it’s right beside my head here. So that stuff is amazing. And we’re using that in our research, and we’re beginning to integrate that into our solutions — very carefully, I might say, not all over the place and willy-nilly, because it’s very important to study how this stuff works and you really have to understand the consequences of using it. Does it lead to more validity in the hiring process? Great. But does it also lead to more bias? That has to be handled as well, I think. So deep learning, anyway, is great. Basically, if you’re not familiar with it, it is a way to make sense of data, especially unstructured and complex data. Unstructured data is like freeform data — my voice right now is unstructured data; I’m not answering a multiple choice question, which would be structured data. It was traditionally very difficult to analyze that type of data, like video and audio and things like that, but deep learning can do that, and it’s advancing super quickly. It’s very exciting stuff. So that, to me, is what is really exciting about the field. And most of the rest of the stuff out there is just a ton of hype. As you know, AI itself is just such a general term that I don’t even really know what it means these days.
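As a concrete illustration of the structured-versus-unstructured distinction Eric draws, here is a minimal, hypothetical sketch that turns freeform text answers into numeric features and fits a simple classifier. It uses TF-IDF plus logistic regression as a classical stand-in; the deep learning systems he describes would instead learn representations of text, audio, or video directly, and the data, labels, and model below are invented for illustration and do not reflect Modern Hire’s actual methods.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical freeform (unstructured) interview answers with made-up outcome labels.
answers = [
    "I stayed late to help a customer resolve a billing problem",
    "I prefer to work alone and avoid talking to customers",
    "I organized the team schedule so nobody missed a shift",
    "I usually wait for my manager to tell me exactly what to do",
]
performed_well = [1, 0, 1, 0]  # invented labels, purely for illustration

# Convert text into numeric features (TF-IDF), then fit a simple predictive model.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(answers, performed_well)

# Score a new, unseen freeform answer.
print(model.predict(["I helped a coworker cover a busy shift"]))
```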

John Sumser 12:30
Yeah, it’s an interesting problem, because people are talking about something that’s reasonably important there under the umbrella of AI. You have the tension, right? Assessment methods had adoption problems, I think, in large part because of how long it takes to go through an assessment process. And so there’s a tremendous amount of energy being spent on reducing the time and cost associated with delivering assessment results, and that often gets labeled as AI because it’s a big data project that does a lot of correlation and results in perhaps usable stuff. But it doesn’t sound like you’re running down that path.

Eric Sydell 13:25
Well, we certainly are trying to shorten the process whenever and wherever we can, but we’ve also done a lot of research on opt-outs — on candidates that drop out of our system. A couple of years ago, we published some research in the Journal of Applied Psychology, which is, you know, one of the preeminent journals of our field, where we looked at candidates that dropped out. And what we found is that most of the candidates who do drop out — and it’s just a small percentage — drop out at the very beginning of the experience, and hardly anybody drops out once they’re really into the process. Whether it’s 20 minutes long or whether it’s an hour long doesn’t really make that much difference if the candidate feels like they’re getting something out of it. And in our systems, as I said, they’re day-in-the-life experiences, so they’re very immersive and very engaging. That’s the whole point — we want the candidate to get something out of it, to learn about the job as much as they can in that timeframe. So I will say that we’ve always had a little bit different view on the whole shortening-the-process thing, because of what we found with our own process. That said, I do think the hiring process overall needs to get vastly shorter, for sure. It would be great if we could go from two months to hire a person to two hours, and I think that is doable in the future. It really just comes down to collecting enough data all in one place and being able to make sense of it with algorithms, quickly. So I think there’s a lot that machine learning and AI can help with to get to that point. But really, the challenge isn’t machine learning or AI — it’s data: getting good, meaningful data into one place so that algorithms can figure out what it means, and using it to help companies make fast decisions.
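For readers who want to picture the kind of drop-out analysis Eric refers to, here is a tiny sketch that computes attrition between stages of a hypothetical assessment funnel. The stage names and counts are invented, and the published study’s data and methodology are of course far more involved; the point is simply that most attrition shows up between the first stages.

```python
# Hypothetical funnel: how many candidates reached each stage of an assessment.
stages = [
    ("opened_invitation", 1000),
    ("started_assessment", 820),
    ("finished_first_section", 790),
    ("finished_simulation", 775),
    ("completed", 770),
]

# Drop-out rate between consecutive stages; in this made-up data,
# attrition is concentrated at the very beginning of the experience.
for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
    drop = (prev_n - n) / prev_n
    print(f"{prev_name} -> {name}: {drop:.1%} dropped out")
```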

John Sumser 15:14
That’s interesting. One of the things that happens when you have all of these tools for crunching data floating around is that — I’ll say — science stops being the operating objective, and the generation of results that look like science starts to take its place. Are you seeing that?

Eric Sydell 15:37
No, I think there’s this whole interesting battle kind of going on between traditional psychology, or the social sciences, and data science. In the traditional sciences, you start with a hypothesis, and then you test it and you see if the data supports it or not. In data science, you just take a whole bunch of data and you can find things without any hypotheses a lot of times, so it’s very kind of atheoretical. And I guess my view is somewhere in the middle — that as we collect more data, it actually becomes a lot more difficult to come up with reasonable hypotheses, because the world is actually extremely complicated, and the simplistic little hypotheses that our minuscule human brains come up with are oftentimes not that accurate. You know — hey, conscientiousness is linearly related to job performance. That is an example of a hypothesis that has been studied over and over in our field, and it’s super simple. It’s not complicated at all. That’s not a very advanced hypothesis for a bunch of academic PhDs sitting around to come up with. And I think that in reality, conscientiousness is good, but it’s probably modulated by, and probably interacts with, other variables in really complicated ways that we can’t always figure out. And so that’s where having big data sets allows us to understand that reality much better. You know, I don’t understand how consciousness arises in the human brain, but I don’t want to give it up because I don’t understand it. And I think there’s a tremendous power in big data and machine learning and deep learning to explain things in the world, and we don’t want to give that power up just because we don’t always understand how it comes about and how it arises.

John Sumser 17:31
Got it. So what do you think the big ethical issues are in your work?

Eric Sydell 17:34
Yeah, that’s a great question. You know, I think there are a number of ethical issues, obviously, around the use of AI — and I know I use that term even though I don’t like it. But I think the biggest thing is making sure that our science is rigorous and that we are trying to understand the black box that is inside the computer: why are these predictions being made? That’s a huge, huge challenge for everybody out there that’s using AI, and it’s particularly important in our space, in hiring, because at the end of the day we are dealing with humans — and, I mean, a third of our company is psychologists. So we want to do right by the individual; we care about the person and not just the corporate machine trying to fill seats. We have to be able to explain back to the candidate, give them information back that helps them understand who they are and what they’re all about and why they fit or didn’t fit the role. And that is something that has been slow to come about in our field because of legal concerns. For most of the history of modern selection, lawyers have been saying, don’t give your candidates any feedback, because we don’t want them to understand anything about the process that might allow them to make a claim, or to have more questions that we have to respond to. And I think that tide has to change. There are a lot of people focused on that now, but it’s still a big thing that has to be addressed — where we can really support the individual and give them back information that’s helpful. Because otherwise, with the power of AI and all the crazy stuff we’re talking about, it runs the risk of becoming dehumanizing, to the point where we’re just slotting people into roles and not giving them a say in the process and not helping them understand whether it’s right for them or not. To me, that’s a huge challenge, and we’re really focused on trying to create things that can be helpful at the person level, not just to the company. And then there are certainly other big ethical issues around diversity and bias, and making sure that we aren’t creating unintended consequences from using AI technology in ways that we don’t understand. So that’s a huge focus. It also used to be the case that bias and diversity were sort of an afterthought for a lot of companies. Their first concern was hiring people who fit, and yeah, we’ve got to make sure that there’s no adverse impact, but it was a little bit of a secondary concern. We used to have clients — some would feel very strongly about that, and some would not care as much about it, and it had to do with their legal counsel a lot of the time. So now I feel like we’re in a much better position, because that stuff is primary and important to most of our clients these days. Diversity and fairness and equity issues are just as important, if not more important, than the predictive power of the system when it comes to predicting job performance. And that’s pretty cool, I think, because as we do a better job of collecting data on systems, we can find more and more types of biases, and we can either back them out of the scoring, or we can alert hiring managers to them — we can do different things like that to try to highlight and eliminate these sorts of bias events.

And it goes beyond the 1960s — the Civil Rights Act of 1964, Title VII, prohibited discrimination based on race, color, religion, sex, and national origin, and that was the focus of much of our work on bias up until a few years ago. And now I think, wow, we can not only solve those problems, but we can also look at all the other myriad human cognitive biases that get in the way of making good decisions. By collecting more data, we’re going to chip away at those things and be able to help weed them out, so that we can hire people who really can do the job, regardless of them maybe not being similar to us, or thinking like we do, or us hitting it off in a friendly way. Maybe it’s not about that. It’s about doing the job and fitting in on a team in a way that is much more, I think, high fidelity and precise than we were able to be just a few years ago. So those are some of the things on my mind that are, I think, very exciting. And I have to pinch myself, because we’re in such a crazy time right now where we actually have the ability to collect this broad and descriptive and meaningful data, and to use super advanced analytical tools that we did not have when I started my career around the year 2000, to make advances that we could not make then. So, I mean, that’s pretty cool.

John Sumser 22:23
That is pretty cool. Do you think that it’s possible to eliminate bias in the hiring process?

Eric Sydell 22:28
Hey, you know, I have said that I think we should have that as a goal. But when you think about bias, it is such an onion — it is such a multi-layered, deep issue. I think that, for example, if we know the demographic makeup of the candidates in our applicant pool, then we can use algorithms to make sure that one group isn’t scoring higher or lower than another. That’s pretty simple. And if they are, then we can stop that, and we can fix it algorithmically to make sure that all of these groups are scoring in a similar manner. So in that sense, we can eradicate it — we can eradicate those differences in the hiring process. But then you get to, well, how are humans using that information? Right? And can we eliminate it from a human decision? I don’t know. I mean, we can eliminate it from the data that the humans are using to make a decision — most of it. But then also, when we get even beyond the protected class stuff — I always use the example of height: I don’t think I could be president because I’m five-six, not over six feet tall. So there are all these kinds of human cognitive biases that we all carry around that go way beyond the protected classes. How do we get at those things, though? I think that’s challenging. I think we’re going to get much closer to it with all the data and all the analytical tools we have now than we ever could have in the past.
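To make the group-level scoring check Eric describes more concrete, here is a minimal sketch of the classic four-fifths (80%) adverse impact rule of thumb applied to hypothetical pass/fail screening data. The groups, counts, and threshold are illustrative only — one common heuristic from selection practice, not a description of Modern Hire’s actual algorithms.

```python
from collections import defaultdict

# Hypothetical (group, passed_screen) records for an applicant pool.
candidates = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
]

# Selection rate per group: share of candidates who pass the screen.
totals, passes = defaultdict(int), defaultdict(int)
for group, passed in candidates:
    totals[group] += 1
    passes[group] += int(passed)

rates = {g: passes[g] / totals[g] for g in totals}
best = max(rates.values())

# Four-fifths rule of thumb: flag any group whose selection rate falls
# below 80% of the highest group's rate (possible adverse impact).
for group, rate in rates.items():
    ratio = rate / best
    flag = "POSSIBLE ADVERSE IMPACT" if ratio < 0.8 else "ok"
    print(f"{group}: rate={rate:.2f}, ratio={ratio:.2f} -> {flag}")
```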

John Sumser 23:57
It raises an interesting question that we might do a longer conversation on, which is: where is bias useful? Right? If there’s some sort of reason that people over six feet tall are president, we’d hate to just get rid of that just to get rid of it. Yeah.

Eric Sydell 24:16
Well, I wouldn’t, I’m sure. Yeah, yeah. Yeah.

John Sumser 24:20
Yeah. Well, you know, I would be in favor of it because I’m tall.

Eric Sydell 24:25
You have a lot of hair too, which I don’t.

John Sumser 24:29
Yeah. So we should talk about that later. This has been a great conversation. Is there anything you want somebody listening to this show to be sure to take away from it?

Eric Sydell 24:38
Yeah, that’s great. I think that the big thing I want to communicate to people is, when you’re looking at all these HR tools out there and thinking about how to improve your HR process — your hiring process, specifically, I guess — don’t be taken in by hype, by claims from specific tools that they do this or they do that. You have the right and the ability to collect data and to understand how a tool works. And if a vendor that you’re talking to can’t help you do that, and can’t show you proof of how they studied their tool and how well it works, then I’m very skeptical that it does work. And I think so much of our industry is just focused on: here’s a great idea, let’s create a company around this great idea — without ever really understanding whether it works or not. That’s my challenge for people out there, and the thing that I want people to take away. And I say that even about our own tools: do they work? Do they not work? Look at the data, and if they don’t work, then we need to improve them or replace them. So that’s the message, I guess.

John Sumser 25:43
Fantastic. Thanks for taking the time to do this, Eric. Would you reintroduce yourself and tell people how they might get a hold of you?

Eric Sydell 25:49
Sure. Absolutely. Thanks, John, it’s been a lot of fun. My name again is Eric Sydell, and I am the EVP of Innovation at the new company formed by Shaker and Montage called Modern Hire. And my email address is Eric Sydell — that’s S-Y-D-E-L-L — at modern hire dot com.

John Sumser 26:08
Thanks. And thanks again for doing this, Eric. It was a great conversation.

You’ve been listening to HRExaminer’s Executive Conversations, and we’ve been talking with Eric Sydell, who is the EVP of Innovation at Modern Hire. Thanks for joining us this week. We’ll see you back here next week. Have a good weekend. Bye-bye now.

 



 