Simplifying employee surveys… how about using two or three questions to find out everything?

Employee surveys are common in companies that are too big for CEOs to talk to every employee individually. Most don’t produce meaningful action that makes a difference to the employees. I believe the root cause is simple: we don’t have a simple way of asking employees what they think and what they would like to see improved. We have complex ways of going about it, involving asking employees to rate many factors on some sort of scale, such as from 0 to 10 or from 1 to 5. The surveys have so many questions that we don’t dare ask employees to answer more than once a year.

Typically, the results of such long surveys are compiled and published weeks to months after the survey takes place. Two things often follow:

Managers discount any negative input, saying things like “Things were special at the time of the survey. Several months have gone by, and things are better now. Let’s wait to see what next year’s survey brings.”

More positively, multiple regression analysis is used to work out what individual questions seem to have the strongest relationship with some overall metric, such as “employee engagement”. Since we only have numbers, some sort of working group is then formed. This can of course just be a single individual. The task is to decide what can be done to impact the particular metric or metrics.

Why not just ask the employees what they want to see improved?

Up to now, the reason for not doing so has been the lack of any automated tools that allow employee input to be analyzed without human bias. The good news is that this has changed. We now have a free tool that allows such bias-free analysis in any of 14 languages. We can now ask employees what they think, then meaningfully group the answers without human intervention. This lets us get directly to considering suggested actions. The short survey format makes it realistic to run the survey monthly, or on some other sort of ongoing basis.

Rather than getting into lots of technical detail, why not just try it out? I have prepared a sample survey for which I personally made up 100 imaginary employee answers.

The free software is the Haven OnDemand Net Promoter analysis tool that you can find here. To try it out successfully, you need to pay attention to these points:

You have to sign up first (by clicking the Sign Up button) before selecting your preferred authentication provider; otherwise you will get an authentication error. Personally, I use my Gmail/Google+ details to log in.

The expected survey format is what is referred to as Employee NPS (eNPS). It expects an answer to the question “How likely are you to recommend your company as a place to work?” on a 0 to 10 scale. People who give a 9 or 10 are called Promoters, scores from 0 to 6 count as Detractors, and the remainder (7s and 8s) are Passives. The eNPS score is the percentage of Promoters less the percentage of Detractors. The second question is simply “Why?”, and an optional third question, “What should we improve?”, can also be used. If you don’t have eNPS numbers and just want to analyze text, create an NPS column and fill it with numbers between 0 and 10. If you have eNPS numbers, but they are on a 1 to 5 scale, just multiply each by 2 and the categories will work correctly.
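The scoring arithmetic above can be sketched in a few lines of Python. This is just an illustration of the eNPS definition, not part of the tool itself:

```python
# Illustrative sketch of the eNPS arithmetic described above.
def enps(scores):
    """Compute eNPS from a list of 0-10 scores: %Promoters - %Detractors."""
    promoters = sum(1 for s in scores if s >= 9)   # 9s and 10s
    detractors = sum(1 for s in scores if s <= 6)  # 0 through 6
    return round(100 * (promoters - detractors) / len(scores))

# A 1-5 scale can be mapped onto 0-10 by doubling each score, as suggested above.
scores_1_to_5 = [5, 4, 2, 5, 3]
scores_0_to_10 = [2 * s for s in scores_1_to_5]  # [10, 8, 4, 10, 6]
print(enps(scores_0_to_10))  # 2 Promoters, 2 Detractors, 1 Passive -> 0
```

Note that after doubling, a 5 becomes a Promoter (10), a 4 becomes a Passive (8), and 1 through 3 become Detractors (2 to 6).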

The input file must be in .CSV format (Comma Separated Values). It is easy to use Excel and other spreadsheet software to save files in this format. Note that the default setting for CSV files in the French and some other versions of Excel uses semi-colons (;) instead of commas, and that format will not work with the NPS software. You must have commas as the separators.

There must be a column headed “NPS” and a column headed either “Why” (without a question mark) or “Improve”. Having both works as well. You can include other columns too, and the software will ignore them.
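To make the column layout concrete, here is a sketch that writes a minimal valid file. The rows are invented examples; only the header spellings matter:

```python
# Write a minimal survey file with the required "NPS" column plus the
# optional "Why" and "Improve" text columns.
import csv

rows = [
    {"NPS": 9, "Why": "Great colleagues", "Improve": "More flexible hours"},
    {"NPS": 4, "Why": "Too many meetings", "Improve": "Fewer status meetings"},
]
with open("survey.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["NPS", "Why", "Improve"])
    writer.writeheader()
    writer.writerows(rows)
```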

After signing up, logging in and accessing the page with the software, the first step is to create an empty index. The “Clear index” button removes any files from the index, but does not delete the index, so you only need to create the index once. The other buttons should be intuitive. Selecting a file does not automatically upload it. You have to click the upload button.

The most common reasons for error messages when uploading are (1) the file is not a .CSV file, (2) the file is missing the NPS and/or a Why or Improve column, or the headers are not spelled correctly, and (3) the file is over 10 MB in size. Personally, I find loading files with over about 5,000 rows takes too long, and I prefer to break them up.
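Breaking a large export into smaller pieces is easy to script. A sketch, assuming a plain CSV file whose first line is the header (the 5,000-row default just reflects my own preference above):

```python
# Split a large survey export into chunks of at most rows_per_chunk data
# rows, repeating the header row at the top of each chunk file.
import csv

def _write_chunk(src_path, index, header, rows):
    path = f"{src_path}.part{index}.csv"
    with open(path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)
    return path

def split_csv(src_path, rows_per_chunk=5000):
    with open(src_path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)
        chunk, paths = [], []
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                paths.append(_write_chunk(src_path, len(paths), header, chunk))
                chunk = []
        if chunk:  # write any leftover rows
            paths.append(_write_chunk(src_path, len(paths), header, chunk))
    return paths
```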

Selecting a language must be done before the file is uploaded. The software does not detect the language, but does understand the structure of the twelve languages listed. In essence, when you select a language, you are telling the software to treat the entire file as though it is in that particular language. This can produce some odd results in multi-lingual files where the languages have similar structures, such as French and Italian. On the other hand, it is very unlikely to confuse Mandarin with Spanish.

Click on any bar in the bar chart to see actual survey responses.

There is an “i” symbol close to the top of the page with more information about the file format.

The sample survey only contains text in the “Why” column. Selecting “What should we improve” won’t give you a result.

Finally, the software currently works well, but is not complete. The first chart below is something you should be able to reproduce with the current version and the sample survey. The second one is on a development machine and will be released in early January, after we complete a lot more testing. The results of the first chart are a subset of the second.

I suppose I should also mention the usual… the opinions I express here are my own and do not necessarily reflect those of my employer, Hewlett Packard Enterprise.