[Graphic: The 2018 Index of Predictive Tools in HRTech: The Emergence of Intelligent Software]



“Any large system, especially as it ramps up, is going to be operating much of the time in failure mode. (Even if it looks like it’s working.)” – Chris Havrilla

There was once a farmer who needed to drain a marshy field. He was advised to dig a ditch to drain it. So, the farmer bought shovels, told his farmhands that he wanted to drain the field, and trained them to use the shovels by digging holes.

The next day, he found them digging holes all over the field. Dumbfounded, he asked why. The leader showed him how they were all using the new tools to dig beautiful well-made holes, just like he had taught them. So, the farmer showed them how to dig a ditch.

Moral: Train for the outcomes and what success looks like, not just the tools – a fool with a tool is still a fool.

AI Moral: Know that training AI is like training a fool that will do exactly what it was trained to do. It needs data, time, feedback, and performance management – even more than people do.


Chris Havrilla, HRExaminer Editorial Advisory Board Member

The next day, the farmer arrived to find a long ditch filled with water along a half-drained field. But since the ditch went nowhere, the field could not completely drain. The farmer realized he had spent all his time teaching his workers how to use the new tool and what to do with it, when he should have been teaching them why and how to drain a field.

Moral: Don’t get mesmerized by the tool – it is not the solution on its own. Keep your eyes on the prize.

AI Moral: Like people, AI needs data, time, training, feedback, and performance management – even more so. People want to do a good job (the ones you want to keep, that is); machines want to do what you – or the data, so it had better be good – tell them.

Several years later, the farmer was sold an automated backhoe that could do the work of 30 workers in less time. He planned to dig ditches all over his farm to keep his fields productive while reducing the number (and cost) of his ditch diggers. He taught his leader how to operate the backhoe and told him to let the extra farmhands go. The next week he realized that many of the farmhands were still at work, and he asked why. His leader showed him that every farmhand had a set of ditches to maintain and that all of them had been taught how to use the backhoe to do their work. Now each could finish his work much faster. But since only one person could use the backhoe at a time, many were still using their shovels to maintain their ditches while they waited. And the new backhoe needed a lot of support, so many spent their time cleaning, maintaining, and assisting with the machine.

Moral: New systems often fail to displace old tools and often generate more and/or different work than expected.

AI Moral: Automation does not mean Automagical. There are always new considerations (audit, maintenance and management, speed, ethics, compliance, risk, new and different skills to deal with what’s different and what’s next, etc.) that may necessitate new and different work – work that must be accounted for before making business decisions such as cuts in labor or spending.

The farmer decided to centralize the ditch maintenance work with specialized jobs: one leader, one backhoe operator, one mechanic, and a small team of assistants. Now they could focus the work and eliminate the extra farmhands. This worked for the first few months, but then the farmer realized that some of his fields were flooding again. The leader explained that the machine could only do one job at a time, and the work was beginning to pile up into a backlog.

Moral/AI Moral: New systems can and will generate new problems.

The farmer decided to pay the workers overtime to clear the backlog. Yet often, when he went to inspect the work, they were all watching one person set up or fix the machine. The leader explained that with the additional workload the machine was breaking down more often and requiring more maintenance downtime.

Moral: Any large system, especially as it ramps up, is going to be operating much of the time in failure mode. (Even if it looks like it’s working.)

AI Moral: AI needs time to learn, and it learns from the data you give it – which means your data needs to be good, clean, deep, and thorough. With more and constant data, training, auditing, and feedback, it is a continuous process to ensure you are getting the outcomes and value AI can bring. And machines don’t unlearn: if you aren’t on top of incorrect guidance, recommendations, or decisions you may be letting it make, the fix will no longer be a simple configuration change.

The farmer shook his head, and in a gut-check moment he just had to laugh and wonder whether anyone ever really understood that their real purpose was simply to raise crops.

Moral: Complex systems create their own goals, and you can’t assume you know what they are really doing.

AI Moral: Learn from the mistakes made with complex enterprise software, where people rarely did the proper upfront work to bring that intelligent software to its full potential.

If you take the time to identify your real problem, align the organization, and configure the system to solve that problem, you will truly be innovating – taking a holistic approach focused on the kind of breakthrough change this effort can bring. This is not meant to scare people away. But the outcomes needed to meet the demand, demographic, technology, cultural, and organizational forces that leaders are facing will require massive change – and everybody will likely be blazing trails at the same time, or be forced to play catch-up with those who do. So it is critical to take the time to do it right and to get help with the innovation process, because you are not just installing a tool – you are innovating.

