Q: How is a specific line of business / business unit using your predictive decisions? How is your product deployed into operations?

A: Our clients use our predictions of performance and attrition every day in the process of recruiting new job candidates. At PAW, I will be talking about rolling these up into a full network or tree of predictions that forecasts performance in the candidate’s likely future roles, after a promotion or two. For this, we’ve built several dynamic job maps and composite benchmarks, and will soon deploy the fully integrated solution that rolls all of this together.

Q: If HR were 100% ready and the data were available, what would your boldest data science creations do?

A: I’d create a living, interactive, visual map that shows everyone at the company who is going where, and how it affects corporate value. I’d populate the map with hundreds of predictive models to optimize employee changes, and keep it fresh with ongoing research. The result would be a living guide with tangible directives for hiring, terminating, and promoting employees to grow the organization right.

Q: When do you think businesses will be ready for “black box” workforce predictive methods, such as Random Forests or Neural Networks?

A: Not for many years, if they are kept opaque. HR and Hiring Managers want to know what is driving selection, and that the drivers fit a management narrative they can follow. Sometimes there is a tug-of-war between plausibility and accuracy, but fortunately the human ability to form a narrative is strong.

This doesn’t mean you are limited to simple regressions. Variable selection is everything for regression models, and we often use random forests or lasso/elastic-net methods to find a set of regression variables that predicts robustly.
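As a minimal sketch of that selection-then-regression workflow, assuming scikit-learn: an L1-penalized model prunes weak predictors, and a plain, explainable regression is refit on the survivors. The data and the regularization strength here are synthetic stand-ins, not the actual production setup.

```python
# Sketch: use a regularized (lasso-style) model to pick a robust subset of
# predictors, then fit a simple regression on that subset.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for candidate data: 20 variables, 5 truly informative.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, random_state=0)

# L1-penalized logistic regression zeroes out weak predictors.
selector = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1))
selector.fit(X, y)
keep = selector.get_support()  # boolean mask of surviving variables

# Refit a simple, explainable regression on the surviving variables only.
model = LogisticRegression().fit(X[:, keep], y)
print(f"kept {keep.sum()} of {len(keep)} variables")
```

The same pattern works with a random forest as the selector (via its feature importances) in place of the lasso.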

Also, you can use black-box models like Random Forests, Support Vector Machines, or Neural Nets for better accuracy, as long as you do the extra work to isolate the key variables and trends within them and build a story for your model users. There are methods to probe a winning black-box model to identify variable importance and general directionality. This can strip enough opacity off a model to give it “face validity” with its users.
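One such probe, sketched below under the assumption of a scikit-learn random forest: permutation importance ranks the variables by how much shuffling each one hurts the model, and the sign of the correlation between a top variable and the predicted probability gives a rough directionality for the narrative. The data are synthetic.

```python
# Probing a "winning black box": permutation importance for ranking,
# plus a simple correlation check for general directionality.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

X, y = make_classification(n_samples=400, n_features=8,
                           n_informative=3, random_state=1)
rf = RandomForestClassifier(n_estimators=100, random_state=1).fit(X, y)

# Importance: how much does shuffling each variable degrade the score?
result = permutation_importance(rf, X, y, n_repeats=10, random_state=1)
ranked = np.argsort(result.importances_mean)[::-1]

# Directionality: sign of the correlation between each key variable and
# the predicted probability, for the model user's story.
proba = rf.predict_proba(X)[:, 1]
for i in ranked[:3]:
    sign = "+" if np.corrcoef(X[:, i], proba)[0, 1] > 0 else "-"
    print(f"feature {i}: importance={result.importances_mean[i]:.3f} ({sign})")
```

Partial-dependence plots are the usual next step when a single sign is too coarse for the trend.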

Q: Do you have suggestions for data scientists trying to explain the complexity of their work, to those solving workforce challenges?

A: We often use examples from other domains, preferably the client’s own. If I’m explaining a hiring model to a banker, we’ll show how it’s like using a credit score before offering someone a loan. You don’t extend the loan (or job offer) unless there is a good probability that the person will pay the loan back (or stay on the job, or perform on the job). They get that.

Some of the technical graphs, like survival curves, work well with management users, and you can explain them to most. Other constructs, like an ROC curve or cluster silhouette plots, are just not going to work with most audiences. We try to win a lot of trust by the time we get to that stage.
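For readers unfamiliar with the construct: a survival curve here is just the share of a hiring cohort still employed after each month. A minimal Kaplan-Meier sketch on made-up attrition data (not client data):

```python
# Minimal Kaplan-Meier estimator: the share of a cohort still employed
# at each month, handling people who are still employed (censored).
def kaplan_meier(durations, observed):
    """Return (time, survival probability) points for right-censored data."""
    at_risk = len(durations)
    surv, curve = 1.0, []
    for t in sorted(set(durations)):
        # terminations at time t (censored cases don't count as events)
        events = sum(1 for d, o in zip(durations, observed) if d == t and o)
        surv *= 1 - events / at_risk
        curve.append((t, surv))
        at_risk -= sum(1 for d in durations if d == t)  # leave the risk set
    return curve

# Months until termination (observed=True) or censoring (still employed).
months = [2, 3, 3, 5, 6, 8, 8, 12, 12, 12]
observed = [True, True, False, True, True, True, False, False, False, False]
for t, s in kaplan_meier(months, observed):
    print(f"month {t:2d}: {s:.2f} still employed")
```

The resulting step curve is the graph a management audience sees: it only ever goes down, and the gap between two cohorts’ curves is the attrition story.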

Q: What is one specific way in which predictive analytics actively is driving decisions?

A: As mentioned above, hiring decisions – choosing candidates who are unlikely to terminate early, and who are likely to overachieve (or at least not underachieve) on real business KPIs. We work with clients to identify the KPIs – they are tangible things, like sales per month, food safety scores, or cash drawer entries. We don’t try to drive mushy middle values like engagement or happiness.

You don’t put graphs or anything fancy in front of a recruiter for a hiring or promotion decision. We just deliver calibrated, color-coded bands – blue candidates are likely to over-perform, red candidates are likely to under-perform. Recruiters pick up that color from the system, maybe get some auto-generated behavioral interview questions and talking points tailored to the candidate, and move to the next step in recruiting.
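That last delivery step can be sketched as a mapping from a calibrated probability to a band. The thresholds and the middle band below are illustrative assumptions, not the actual cutoffs:

```python
# Hypothetical sketch: turn a calibrated probability of over-performance
# into the color band a recruiter sees. Thresholds are illustrative.
def color_band(p_over_perform: float) -> str:
    if p_over_perform >= 0.7:
        return "blue"     # likely to over-perform
    if p_over_perform >= 0.4:
        return "neutral"  # no strong signal either way (assumed middle band)
    return "red"          # likely to under-perform

for p in (0.85, 0.55, 0.20):
    print(p, color_band(p))
```

The point of calibration is that the probability feeding this function means what it says, so the bands stay honest across roles and cohorts.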

Real analytics ultimately doesn’t show graphs or paragraphs of “insights” to the user – it just helps them make the decision.

Q: How does business culture, including HR, need to evolve to accept the full promise of predictive workforce?

A: A predictive analyst needs to earn HR’s trust, and HR users need to come to understand that their “gut” is just another form of decision-making. They need to be shown, gently, that the gut can sometimes be more biased and less fact-driven than rigorous analytics.

They don’t need to learn Chi-Square tests or Receiver Operating Characteristic analysis – though we’ve converted several into junior analysts and gotten a few back into graduate school. Managers do need to understand that models are just trying to make decisions from the facts as they stand, and that the models need to be forever learning.

Q: Do you have specific business results you can report?

A: In one case we reduced annual attrition from 84% to 48% in 5 months. That saves the banking client over $1 million a year in replacement cost and employee lifetime value.

In another case we increased the ability to hire successful candidates (as measured by passing a Series 7 exam) by 12% – that high-volume situation saved the client over $4 million a year in replacement cost alone.