
“From data quality to regulatory compliance, there are 28 key parameters to evaluate when considering a purchase of intelligent tools for HR. We’ve also provided 40 key questions to ask to evaluate an AI or Intelligent Software vendor solution.” - John Sumser


HR Tech: AI and Intelligent Software Implementation – Part II


How To Evaluate an AI or Intelligent Software Vendor Solution

Readiness; Functionality; Change and Improvement; Maintenance; Support; and Compliance, Liability, and Warranties.


From data quality to regulatory compliance, there are 28 key parameters to evaluate when considering a purchase of intelligent tools for HR.

Readiness
  • What are the format and volume requirements that your data must meet? What are the differences in the vendor’s output based on the volume and quality of your data?

  • What is the process for acquiring and integrating your data into the solution?

  • Ask the vendor how they think about improvements in their models and algorithms. How close are they to perfection? What are the looming obstacles and limits? (Quickly eliminate vendors who don’t have clear notions of the collaboration and investment required to finish their work.)

  • How many people does the vendor believe are required to manage the process (account relations, maintenance, change management, exception reports, additional service people)?

  • How much time is required to change existing processes and train team members in the new technology?

  • How does this particular tool fit into the larger HR Department intelligent tools strategy and vision?

  • What are the processing demands this purchase places on internal systems (API calls, data submittals, security)?

  • Can the vendor operate at the scale of your company? Ask for examples.
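The first question above, about format and volume requirements, usually becomes concrete during a proof of concept. One way to make it concrete is a pre-flight check of your own data export against the vendor's stated intake requirements. A minimal sketch, assuming hypothetical requirements (the column names, row minimum, and CSV format here are illustrative, not any vendor's actual spec):

```python
import csv

# Hypothetical intake requirements a vendor might publish for your data.
REQUIRED_COLUMNS = {"candidate_id", "job_family", "outcome"}
MIN_ROWS = 1000  # below this volume, the vendor's model output may degrade

def check_export(path: str) -> list[str]:
    """Return a list of problems found in a CSV export, empty if it passes."""
    problems = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            problems.append(f"missing columns: {sorted(missing)}")
        row_count = sum(1 for _ in reader)
        if row_count < MIN_ROWS:
            problems.append(f"only {row_count} rows; requirement is {MIN_ROWS}")
    return problems
```

Running a check like this before the vendor ever sees your data turns "what are your format and volume requirements?" from a slideware answer into a testable contract.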

Functionality
  • What, exactly, does the new tool do? How does that improve current operations? Where did the data that trained the model come from? How is it updated or replaced?

  • What becomes possible once the tool is installed? What can we do with the tool that we haven’t done before? What can we do with the tool that we’ve never imagined doing?

  • How does the tool handle privacy and security concerns? Is it GDPR compliant? What about other standard security issues? How does the vendor manage the security of our data? Have they had breaches? What is their policy about breach notifications? How do we know if our data has been compromised?

  • What is the product roadmap? How are we kept abreast of changes to the roadmap?

  • Do you store all of our user transactions and their outputs so that we can survive an audit?
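The audit question above implies a concrete record-keeping requirement: every transaction and its output should be reconstructable later, tied to the model version that produced it. A minimal sketch of what one such audit entry might contain (the field names are illustrative, not any vendor's actual schema):

```python
import hashlib
import json
from datetime import datetime, timezone

def audit_record(user: str, request: dict, response: dict, model_version: str) -> dict:
    """Build an append-only audit entry tying a tool's output to its inputs."""
    payload = json.dumps(
        {"user": user, "model_version": model_version,
         "request": request, "response": response},
        sort_keys=True,
    )
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "model_version": model_version,  # which model produced this output
        "request": request,
        "response": response,
        # digest lets an auditor verify the entry was not altered after the fact
        "digest": hashlib.sha256(payload.encode()).hexdigest(),
    }
```

If the vendor cannot show you something at least this granular, "surviving an audit" will mean reconstructing decisions from memory.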


Change and Improvement
  • Technical debt is the backlog of promises a vendor has made to its customers but has yet to keep. How does the vendor communicate the details of its technical debt? How do they plan to balance paying down technical debt with forward progress?

  • If the vendor pivots in a new direction, what are our rights to preserve the service we originally contracted for?

  • How are we informed when the user interface changes? Do we have the capability to defer changes in order to maintain operational consistency?

  • How are we informed when the API changes? Do we have the capability to defer changes in order to maintain operational consistency?

  • How are we informed when the data models and algorithms change? Do we have the capability to defer changes in order to maintain operational consistency?

  • How do we ensure results consistency?
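The consistency question can be made testable: keep a small "golden" set of inputs with scores recorded under a validated model version, and re-score it after every vendor update. A sketch under assumptions (the tolerance, and the `score` callable wrapping the vendor's API, are placeholders you would supply):

```python
def drift_report(score, golden: list[tuple[dict, float]], tolerance: float = 0.05) -> list[str]:
    """Compare a scoring function's output against recorded baseline results.

    `score` is whatever callable wraps the vendor's API; `golden` pairs each
    input with the score recorded under the previously validated model version.
    Returns a description of every case that moved more than `tolerance`.
    """
    drifted = []
    for i, (inputs, baseline) in enumerate(golden):
        new = score(inputs)
        if abs(new - baseline) > tolerance:
            drifted.append(f"case {i}: {baseline:.3f} -> {new:.3f}")
    return drifted
```

An empty report after an update is evidence of consistency; a non-empty one is the conversation-starter for the "how are we informed when models change?" questions above.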

Maintenance
  • How do we monitor the tool’s performance at a top level? How do we get visibility on the way that changes are affecting our results?

  • What is the process for identifying and fixing flaws in the models, the algorithms, or the NLP processes?

Support
  • What are the parameters and limits of customer support? Is there a response time guarantee?

  • Is there routine reporting on the status of open support tickets?


Compliance, Liability, and Warranties
  • Do you have experience with regulatory compliance audits? Give examples.

  • Do you have evidence that your system is compliant with all of the regulations we have to comply with?

  • Exactly what is the vendor’s liability when the system makes an error?

  • What is warranted and how is it covered?

  • At what point (where is the line) does the vendor expect the customer to take responsibility for results?

  • Who owns the data? If we terminate the arrangement, is our data removed from the system? How about the things that were learned from our data?


40 Key Questions for Evaluating Your Prospective AI or Intelligent Software Vendor
  1. Describe how your AI tools work. Describe the functions of the fundamental components.
  2. Describe the data used to train the model.
  3. You will be constantly improving the quality and output of your tool. How will you keep us apprised of the changes?
  4. Will we be able to get consistent results? How often will there be changes to your underlying processes and models?
  5. What databases does it access? Does it gather from open source material?
  6. Do you ensure and monitor data quality and consistency? Do we get to see that?
  7. What happens when one of your data sources expires, changes, or disappears?
  8. Are you a certified Partner with our Talent Acquisition vendors?
  9. How would your AI interface with the ATS? One-way or two-way?
  10. What are the top two things that set you apart from your competitors in the AI space?
  11. Can you operate at the scale of our company? Are there examples of similar companies in similar industries of similar size and complexity?
  12. Does your tool measure candidate receptiveness? (Whether they are open to engagement, their likelihood of responding to or ignoring outreach, and their probability of making a move?)
  13. By what method do you rank candidates?
  14. Do you use APIs for integrations?
  15. What do you do when an API integration breaks?
  16. When tied to the ATS, does your tool provide capability to search for internal candidates?
  17. Does your tool provide the capability to email candidates directly or would it need to be tied to our email platform?
  18. Can messages be sent to candidates on behalf of someone else?
  19. Does your tool offer an Admin License for managers?
  20. What tools can managers use to measure effectiveness of recruiters?
  21. What tools can managers use to measure effectiveness of recruiting messages?
  22. How are licenses/seats managed? Are they shared or tied to a specific individual?
  23. Understanding that this technology is evolving quickly, can you share your roadmap for future upgrades/capabilities?
  24. How do we get our needs prioritized in your roadmap?
  25. Would we have the ability to utilize “Drip Campaigns” using this tool?
  26. Does your firm have a mobile app?
  27. Can the database be exported?
  28. Do we own our data? Is it proprietary?
  29. How do you share data between customers?
  30. What does the analytics dashboard look like?
  31. What reporting capabilities does your firm provide?
  32. What predictions do you make about candidates? How are the models developed, maintained, and improved?
  33. Can fields be customized?
  34. Can soft collateral be uploaded for Drip Campaigns?
  35. Does your tool have access levels?
  36. What does your firm's support infrastructure look like? Is it US-based? What is the support-to-customer ratio?
  37. Does your firm offer automation for repetitive tasks (RPA)?
  38. Are alerts linked to social media comments?
  39. What search engines does your firm use?
  40. Does your firm offer a “chat” interface? Can internal collaboration be “real time”? Can it be integrated with Outlook? Is there an open API? A Chrome extension for indexing candidates?
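Questions 14 and 15 point at an operational reality: API integrations tend to break silently, and the buyer usually notices before the vendor does. A hedged sketch of the kind of health check a customer might run against its side of an ATS integration (the endpoint URL and expected response fields are placeholders, not any vendor's real API):

```python
import json
import urllib.request

def validate_payload(body: dict, expected_fields: set[str]) -> list[str]:
    """Report fields the integration contract expects but the response lacks."""
    missing = expected_fields - set(body)
    return [f"response missing fields: {sorted(missing)}"] if missing else []

def check_integration(url: str, expected_fields: set[str], timeout: float = 5.0) -> list[str]:
    """Probe a vendor endpoint and report anything that looks broken."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = json.loads(resp.read())
    except Exception as exc:  # timeout, DNS failure, HTTP error, bad JSON
        return [f"request failed: {exc}"]
    return validate_payload(body, expected_fields)
```

Scheduling a check like this, and asking the vendor how quickly they detect and fix the same failures, gives you a concrete answer to "what do you do when an API integration breaks?"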


While these question sets are not comprehensive, they provide a solid base to create a set that is specific to your circumstances and intelligent tool strategy. I hope you enjoyed this series on AI and Intelligent Software Implementation in HR.


Read the entire two-part series