“The differences between human and machine views of a particular task are complementary. The machine can see patterns and rules with increasing accuracy. But its picture is emotionless and limited to things that have already happened.” - John Sumser
“Bias is the noisiest of the ethics questions facing HR and it all kicks off with two conflicting viewpoints. One group assumes that people are the problem; the other sees them as the solution.” - John Sumser
In part two of his series, John Sumser digs into the categories of HR data in AI and intelligent tools. John covers PII data, machine data, text and language data, behavioral and transactional data, rates of change, data flows, survey data, network analysis, and more.
There’s a saying, ‘data makes its own gravy.’ Using data creates data about usage. Interestingly, the metadata created by data is often more useful than the data itself. John Sumser delves into the types and attributes of HR data in AI and intelligent tools.
“Recognizing AI means trying to remember, in the onslaught of machine opinion, that by accepting the machine’s opinion you are making a decision. ‘The machine suggests but the fault is yours’ will necessarily replace ‘The machine told me to do it.’” - John Sumser
“The entire organization, probably led by the HR Department, is entering a time of deep examination of organizational ethics. The questions that underlie the development of an ethics process are about the ways in which we care for and protect the organization and its members.” - John Sumser
Outdated personnel information is a key roadblock to realizing the next leap in productivity. Fara Rives from HiringSolved looks at the three most common reasons employees and contractors don’t keep their information up to date and how AI can help.
“Every time we generalize about people or information or ideas, bits of detail get lost. To define something is to limit it. The very act of putting something complex and nuanced into a more general description diminishes what is possible to convey about the thing itself. Representations are never complete.” - Heather Bussing
“Bias-related technical tools fall into two categories. The tech group assumes that things work better when humans are not involved. The human group assumes that people should be the decision-makers when lives and careers are affected.” – John Sumser
“Intelligent software can tell you a lot about the past and nothing at all about the future. That should be the thing you notice most about the score given a recommendation. It should be marked ‘based on historical inputs.’” - John Sumser