The growing appetite for human capital analytics is drawing more players into the wider “market,” including some who ought to have done a little more homework.

We hereby announce the Nelson Touch Consulting Awards for Human Capital Unalytics – for the most egregious displays of half-baked notions or products. If analytics represent the use of data, analysis and systematic reasoning to make human capital decisions, then these awards are for the opposite – the misuse of data, poor analysis and fuzzy reasoning to make human capital decisions. Hence: “unalytics” – short for un-analytics.

Nominees will be named throughout the year and a 2011 winner will be chosen from among 10 finalists through a reader poll.

We certainly don’t want to embarrass or discourage thought-leaders and vendors who are nominated. After all, it’s the “mad” inventors who eventually end up with game-changing ideas! But at this nascent stage in the evolution of human capital analytics, HR professionals should not be misled by half-baked models that stifle quality and dilute standards.

We welcome defense from the nominees (who may choose to remain anonymous) and debate from the blog’s readership.

Nominee #1 is a human capital analytics software vendor. The company offers customers a dashboard that seeks to warn management of a broad variety of talent management issues, based on analytics calculated from the customer’s employee database. In addition to the analytics, which are presented in graphical format, the dashboard provides an associated color-coded risk assessment.

One of the dashboard items is an exhibit that depicts differences in pay between men and women. On this count, Nominee #1 is to be commended for attempting to throw light on a very important issue. Gender pay disparities have persisted far too long without adequate redress.

The passage of the Lilly Ledbetter Fair Pay Act (2009) renewed interest in the issue, which is probably why pay disparity analysis is showing up in human capital analytics products. Of course, due to the sensitive nature of the topic, analysis is done privately and there is little scope for benchmarking.

However, Nominee #1 doesn’t give us a practical or accurate approach to investigate gender pay disparities. There is no happy intersection between the surge in human capital analytics (the tool) and the addressing of gender wage dynamics (the issue). Instead, here is Nominee #1’s ham-handed solution, depicted below.

The graphic is confusing and, stunningly, both overly complicated and overly simplified.

It is overly complicated because what’s important here is the differential between male and female pay. This can easily be shown as one number: average female pay as a proportion of average male pay. This happens to be 60% in this example. In one number, 60%, you immediately see the overall problem (it’s not close to 100%) and the extent of it (it’s not even close). No need for a chart at all!
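To underline just how little computation the exhibit actually requires, here is the entire “analysis” in a few lines, using hypothetical averages chosen to be consistent with the 60% figure in the example:

```python
# Hypothetical averages, chosen to match the 60% ratio discussed in the text
avg_male_pay = 100_000
avg_female_pay = 60_000

# The one number that matters: average female pay as a share of average male pay
pay_ratio = avg_female_pay / avg_male_pay
print(f"Female-to-male pay ratio: {pay_ratio:.0%}")
```

One ratio, no chart needed.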

The graphic is overly simplified because comparing gross averages between men’s and women’s pay does not provide much useful information. Certainly, knowing the gross extent of the gap is a start. However, the gap needs to be decomposed into what is explainable and what is not. Differences in labor market experience, educational qualifications and specialized skills (i.e., individuals’ human capital stock) might account for some of the gap. The unexplained portion of the gap is the real problem, and it is commonly attributed to gender wage discrimination.
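The decomposition described above is, in spirit, an Oaxaca-Blinder-style exercise. A minimal sketch follows, using synthetic data and a pooled regression with a gender dummy; all figures, variable names and the single human-capital control (experience) are illustrative assumptions, not anything from the nominee’s product:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic workforce: a gender flag and one human-capital variable (experience)
female = rng.integers(0, 2, n)
experience = np.clip(rng.uniform(0, 20, n) - 2 * female, 0, None)  # women average slightly less experience here

# Illustrative pay process: returns to experience plus an unexplained gender penalty
pay = 40_000 + 2_000 * experience - 8_000 * female + rng.normal(0, 3_000, n)

# Gross (raw) gap between average male and average female pay
raw_gap = pay[female == 0].mean() - pay[female == 1].mean()

# Pooled regression: pay ~ intercept + experience + female.
# The coefficient on the female dummy estimates the unexplained portion of the gap.
X = np.column_stack([np.ones(n), experience, female])
beta, *_ = np.linalg.lstsq(X, pay, rcond=None)
unexplained_gap = -beta[2]                  # close to 8,000 by construction
explained_gap = raw_gap - unexplained_gap   # attributable to experience differences
```

A full Oaxaca-Blinder decomposition fits separate regressions by group; the pooled-dummy version above is the simplest variant that separates “explained by human capital” from “unexplained.”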

The actual problem of unjustified pay disparity might be different from what one might assume looking at just the gross difference. Furthermore, there might be important insights to be drawn from the more detailed analysis. It could be that men and women earn different rates of return on their individual human capital (which needs further exploration). The main point here is that it is misleading to try and boil down this important issue into one simplistic dashboard statistic.

And then there are some minor irritants in terms of the graphical representation. Why are we burdened with two decimal places of significance when the gross difference between the categories is so large? Reporting no decimals or at most one decimal place would be sufficient. Why is the pay differential denominated in 22% increments on the y-axis? Not adjusting what seems to be an automatic scaling setting shows a disregard for the numbers and our analytic sensibilities. What elements are included in “Pay”? Is this for a specific position or is it an average over all positions? Why are we not provided trend information on a statistic that we want to improve over time?

Finally, we come to the risk assessment, represented alongside the graphic as follows.

The differential represented here is that between average female pay and overall average pay, when the crux of the matter is the differential between average female pay and average male pay. One explanation for this approach might be that builders of dashboards need to provide a baseline or target and, in this case, overall average pay appears to have fit the bill.
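The dilution this causes is easy to see with hypothetical numbers (equal headcounts assumed for simplicity, and averages chosen to match the 60% ratio discussed earlier):

```python
# Hypothetical averages and headcounts, consistent with the 60% example above
avg_male, avg_female = 100_000, 60_000
n_male = n_female = 50  # equal headcount assumed for simplicity

overall_avg = (n_male * avg_male + n_female * avg_female) / (n_male + n_female)

female_vs_overall = avg_female / overall_avg  # 0.75: the dashboard's comparison, looks milder
female_vs_male = avg_female / avg_male        # 0.60: the comparison that actually matters
```

Because overall average pay already includes the lower female pay, comparing against it systematically understates the disparity.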

The differential, again to an unnecessary two decimal places of significance, is flagged as a “Severe Risk.” Once more, I think the dashboard construct has trumped thoughtfulness about the issue, the metric and the risks posed.

Have I missed anything? Please join the discussion and stay tuned for Nominee #2 for the Nelson Touch Consulting Awards for Human Capital Unalytics. You are invited to submit “unalytic” gems that you come across for deconstruction and debate.