Diving into customer and employee surveys

This is a compilation of various past articles on customer and employee surveys, presented in no particular order:

Getting real change with an employee survey: stick to the process

Employee surveys—a tool for change (the full process, including action planning)

Customer surveys—

Tips for building a valid survey

Oversurveying, overpolling, and diminishing returns

Norms!

Getting real change with an employee survey: stick to the process

For many consultants, summarizing the data and providing recommendations is standard operating procedure, but it often leads to little action. That makes sense: most employees get little (if any) feedback, and the top managers are given a set of recommendations they can safely ignore or forget about. Some limited action may follow, but far more can be done.

To get the most out of a survey, try using this method:

At the first meeting, the survey leader (whom we’ll just call “the consultant”) asks the audience to draw conclusions and come up with action steps. The consultant’s role, after about twenty minutes of presenting information and some guidelines for using it, is to switch the group into decision-making mode, with clear deadlines, lines of responsibility, and follow-through dates. The consultant is not just saying “this is what we would do”; they are bringing people in, getting them emotionally involved, and making the process theirs. Advice from consultants is easily forgotten or ignored; one’s own ideas are not.

Working with small, cross-functional groups to help them present survey data at a staff retreat can be even more rewarding. After early qualms and doubts are addressed, they can have an astonishing amount of enthusiasm, and many make plans for using the information in their everyday work. They also learn enough to make educated criticisms of the surveys themselves.

These teams leave with a better understanding of how they could use survey data to make their organization better. More important, they tend to use their authority and knowledge to address long-standing issues.

In many ways, the ideal is for a trained facilitator to sit with each team and go over the results with them, and then guide them through action planning and follow-through. Not every organization can do that, but the next best thing is to use a cascading process, where the senior team goes through the process, then helps the teams beneath them to do it, and so on throughout the organization until everyone has understood and acted on the survey information.

How to do feedback/action planning sessions for other levels

Telling people what happened to that sheet of paper they filled out shows respect for their time, cooperation, and feelings; this respect will be repaid with more carefully completed surveys the next time around. Still, most employees want more than a summary. First, they want to know what changes have been made; but more importantly, they want a chance to use the findings to improve the organization.

That brings us to the feedback cascade.

After the first group, managers can hold their own feedback sessions for people in their areas. This is an opportunity for managers to show how they, personally, have used the information. Speaking of broad intentions may damage credibility; people usually want to hear about specific changes.

Feedback sessions also ensure that people understand the information and can use it to answer their own questions and make decisions.

The purpose of a survey is usually to spur action and guide decisions. It makes sense, then, to use the feedback session to create action plans. Doing this can not only kick off a project with a bang, but can make the feedback session the most valuable part of the entire project.

When scheduling the session, tell everyone its purpose: to review survey findings, ask people for their views of the implications of the information (causes, problems, strengths, trends, etc.), and create some action plans to address issues raised by the survey.

It is best to set up feedback sessions with plenty of time; a morning or afternoon, or a day or two, depending on the scope and how much you want to change. Prevent interruptions, possibly going off-site.

Set up some ground rules. The session should be genuinely open, and people should be able to participate without fear of retribution or attack. (Don’t make any promises you can’t keep! It helps to have an outsider present to warn when managers become defensive). The manager must create a feeling that people can freely ask questions, discuss issues, propose ideas, and take on new responsibilities.

Briefly review the goals of the survey, how surveys were sent out, and who analyzed the data. Goals may include increasing effectiveness, learning customer or staff needs, spotting minor problems before they become large, continuous improvement, etc. (Having small teams of other people get this information together and present it may help to increase participation and interest, especially if you refrain from presenting your point of view during or after their talks).

Questions about the validity of the numbers might be raised. A natural reaction is to quickly dismiss the questions; even experienced consultants do this. That said, addressing validity issues is important, and it is worth the effort of seeing from the employee’s perspective and having to review one’s own rationale (or to defer to an expert, or even to admit to a flaw!).

If one person openly questions the data, others might silently do the same. More importantly, if one person’s concerns are brushed aside, it sets a negative tone for the entire session — and beyond.

One way to start the action planning session is by asking for help in solving problems. Other people may be closer to the situation. Asking for help may increase the others’ respect for you, because it shows that you have some respect for them; and because people tend to like those who they have helped. (Benjamin Franklin once wrote of making an enemy into a friend by borrowing a book from him). People are also much more likely to accept and to actively support solutions which they had a part in creating.

If people at the meeting do not have the power to make decisions and implement plans, be honest about these limitations and tell them that you will be using their input to make these plans yourself, or to bring them up to a higher level. However, if you (or the people at the higher level) are not really serious about implementing the proposals unless they were what you were planning to do anyway, forget about the action planning session. It is better to have an open, honest feedback session without action planning than a session that raises expectations and then dashes them.

Action planning: key to success

When the action planning session starts, notes can be taken on large sheets of paper instead of formal minutes, so they are visible to everyone and can be transcribed later.

Usually, some people say they cannot change anything, that other areas must be changed first. This can be countered by asking, “Well, what can we do? … What’s stopping us from doing that right now?” This has a tremendous motivating effect. If we cannot do everything we want, we can certainly do some of the things we want.

If the people from the other area or department are in the room, you have a tremendous bonus. We have seen people pointing fingers across the room, and simply brought them together: “Who can do this, if you can’t? Are they here today? … John, how do you feel about doing this? Can you work together on it?” Long-standing communications problems have been resolved that easily, in a single meeting, with lasting (over a year and counting) effects.

It is essential for everyone to feel that their opinions and suggestions are valued, and that they are taken seriously. Watch yourself for condescending, authoritative, and defensive actions or words, while making positive comments about useful suggestions and contributions. Practice “active listening” — the art of intentionally concentrating on what people are saying, and considering how it can work and help rather than any problems it might cause or any difficulty in implementing it. Often, people can find a way around problems and barriers if they really believe in something and have a reason to invest their time and energy in it.

Seemingly trivial issues can be important, partly because of their symbolic value, partly because they are a daily nuisance: the drip in the faucet, the sign-off process for magazine subscriptions. If the survey spotlights small problems that can easily be fixed, immediately fix them, no matter how small. (Delegation helps.) When you visibly and immediately use a survey, you show respect for your employees, and increase energy and enthusiasm.

Responding to a suggestion by rewording and summarizing it shows you have heard and understood it, and gives other people a chance to clarify or add to it. It also helps to write suggestions on posters, which comes in handy later when writing down what was suggested and accomplished at the meeting.

If ideas are not usable, but the group believes they are, present your points not as absolutes, but as barriers. For example, “This is a good idea, but…” Someone in the group might have thought of a way around it, or might be able to come up with a similar plan that is do-able. I have often been surprised by how quickly seemingly insurmountable obstacles can be overcome; and I have often surprised other people by getting around them myself!

After a number of possible solutions have been created, they should be prioritized and discussed. If there are a large number of ideas, try asking the group to help in sorting them out. I usually use a table with four cells, with importance on one side and ease (or speed) on the other: items can be easy and very important, easy and less important, etc. Afterwards, have the group discuss the pros and cons of each idea, starting with the “easy-very important” group, then the “easy-less important” group, then the “not easy-very important” group. It is essential to tackle some of the “easy” issues, even if they are not as important as some others, to gain momentum and quickly, visibly use the work of the group.
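The four-cell table above can be sketched in code. This is a hypothetical illustration only (the idea names and 1-5 ratings are invented), assuming “high” means a rating of 4 or 5:

```python
# Hypothetical sketch: sort action ideas into the four cells of an
# importance-by-ease grid, then list them in the discussion order the
# text suggests. The ideas and ratings below are invented examples.

def quadrant(importance, ease):
    """Classify one idea; a rating of 4 or 5 on a 1-5 scale counts as high."""
    imp = "very important" if importance >= 4 else "less important"
    ez = "easy" if ease >= 4 else "not easy"
    return f"{ez} / {imp}"

ideas = [
    ("Fix dripping faucet",         2, 5),  # (name, importance, ease)
    ("Streamline sign-off process", 4, 4),
    ("Restructure shift schedule",  5, 2),
    ("Repaint break room",          1, 2),
]

# Discussion order: easy/very important first, then easy/less important,
# then not easy/very important, and the hardest cell last.
order = ["easy / very important", "easy / less important",
         "not easy / very important", "not easy / less important"]

grid = {q: [] for q in order}
for name, imp, ez in ideas:
    grid[quadrant(imp, ez)].append(name)

for q in order:
    print(q, "->", grid[q])
```

Tackling the “easy / very important” cell first gives the group a quick, visible win before it turns to the harder items.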

People should be able to come up with workable actions to address problems and enhance strengths, some of which can be put into effect immediately. To maintain momentum, it is essential to follow through and enact at least a few of them within the week. The ideal is to come out of a feedback session with several ad hoc committees working on specific tasks which are within their authority, and with several decisions ready to be put into effect.

The final part is deciding who will do what, and when. Many people find just coming up with recommendations to be sufficient; and that is fine, if the group does not have the power to make changes on its own. Frequently, the group includes people who do have power, if not over everything, then at least over their own areas. It is fine to mix recommendations to higher authorities with specific actions that group members can accomplish on their own.

Go through the list of action steps which people have agreed on, and ask the group who will volunteer to handle the first one. Ask that person when they think they can have it done, and mark their name and the date next to the item. Then ask when they will be able to report on preliminary progress, even if it’s only considering exactly how they will do it, or discussing it with someone else. Mark that as a follow-through date. Repeat this for each of the actions, until everyone’s plate is full. Do not force anyone to volunteer through intimidation or peer pressure, and do not let anyone take on too many tasks; the important thing is not to get people to agree to do things, it is for them to actually carry them out!

Before people leave, schedule a meeting for a later date, two weeks or a month afterwards, to follow through and check on progress. This reinforces commitments and ensures that stalled projects are re-examined.

At this point, a small celebration is in order. People are generally tired but enthusiastic if all has gone well, and need some time to wind down and also to discuss any “leftover” concerns or ideas with other people.

Following through

The next day, get all the action steps, volunteers, deadlines, and such distributed to everyone in the group. Ask for input, clarifications, etc. Give them a chance to revise it.

You may want to informally speak with people who seem to have unreasonably short (or long) deadlines, remembering that the final word is theirs.

If goals were set, make sure they can be measured. It is often good to start measuring key indicators before the survey or feedback session, so you can measure progress. Using performance indicators is also a way to measure the effectiveness of particular changes.

It helps to divide the report into:

Actions the team has already put into effect.

Actions it will be putting into effect.

Actions for which it needs approval.

Actions that need to be taken by other groups.

If the report includes actions that must be taken by other people (different areas, higher levels of management), put them last, and specify what the group will do to encourage those actions. The most important changes, in terms of motivation, are the ones which have already taken place! These are often forgotten, but they are a key in empowering people to do more.

The report should include a summary of the survey results, such as key strengths and problem areas, most frequently given suggestions or comments, and common answers to open-ended questions.

Take advantage of surveys to bring people together, make decisions, and implement them. That’s what surveys are for.

Employee surveys—a tool for change (the full process, including action planning): Step by step

Surveys can be used as a change tool in several ways —above and beyond the usual goal of gathering information.

First, simply having the survey tells people that change is coming, and that something will happen. That is a key part of the change process, known as “unfreezing,” which is needed for people to consider doing things differently.

One other subtle way that surveys affect change is by telling people what is considered most important. In short, what you measure becomes important; or, as the adage goes, “you get what you measure.” That’s one more reason to choose your questions carefully.

The way the survey is done sends a clear message, and not necessarily the one you want. People react to the language used in announcements, to the nature and type of questions, to the process of taking the survey, and, most of all, to the speed and quality of communication and action once it’s over.

Not telling employees the results of a survey is frustrating, but, worse, it tells them their input is not really wanted. That can result in disengagement, apathy, and “working by the rules,” not positive outcomes in an increasingly competitive and dynamic world.

One way to provide the results is to summarize them into a paragraph or two and send them out as an item in a newsletter; these write-ups often say that actions are being considered, but usually don’t point to concrete examples. This is better than nothing, but tells employees their role is small, and finished.

A better way is to use “feedback sessions” to tell respondents what the results of the survey were, often in some detail and providing information specific to each unit. However, if the actions of organizational leaders are not described, people may still feel the survey is a waste of time – even if actions are being taken (because people don’t have any way to link management decisions to the survey information.)

The solution

Surveys can actively engage people if three principles are used:

Quickly provide feedback to employees so they don’t forget the survey by the time they are given the findings.

Tell people specifically how their surveys have had an impact.

Use the survey as an opportunity for change, since giving a survey raises expectations of change.

For example, after reporting the findings, ask the audience to draw conclusions and come up with action steps. The survey expert’s role, after about twenty minutes of presenting information and ways to use it, is to guide the team through data-based decisions, ensuring that they set clear deadlines, responsibilities, and follow-through dates. The results may surprise veteran consultants.

One interesting approach is to work with small, cross-functional teams and coach them to present the survey data. This can lead to an astonishing amount of enthusiasm, and many plans for using the information in their everyday work and planning. People can also learn much more about surveys and the survey process this way.

Though this may well be the ideal way to present data, not everyone has an internal consultant who has the time to work with small groups on process and survey details. For that reason, the rest of this article is dedicated to the more traditional feedback session.

Purposes of feedback sessions

Feedback sessions are needed to provide closure to the survey project and to show that the time spent on the surveys was not wasted. Executives can gain credibility and spur action (by example) by showing specifically how they have used the survey to make decisions.

The feedback session

When scheduling the session, describe its purpose: to review survey findings, ask for views of the implications (causes, problems, strengths, trends, etc.), and create action plans to address issues raised by the survey.

It is best to set up feedback sessions with plenty of time; a morning or afternoon, or a day or two, depending on the scope of the project. Prevent interruptions; many experts suggest having the meetings off-site, where people can think “out of the box” and avoid interruptions. Cell-phones should be off.

Set up clear ground rules. For example, the session should be genuinely open, and people should be able to participate without fear of retribution or attack. (Don’t make any promises you can’t keep! It helps to have an outsider present to warn when managers become defensive). The manager must create a feeling that people can freely ask questions, discuss issues, propose ideas, and take on new responsibilities.

Briefly review the goals of the survey project and how it was conducted, including when; how surveys were distributed, and who they were given to; and who analyzed the data. The goals may include increasing effectiveness, learning customer or staff needs, spotting minor problems before they become large headaches, initiating continuous improvement, etc. (Having small teams of other people get and present this information may help to increase participation and interest, especially if you refrain from presenting your point of view during or after their talks).

Along the way, there may be questions about the validity of the numbers. A natural reaction is to quickly dismiss the questions; even experienced consultants do this. Addressing validity issues is important, and is worth the effort of seeing from the employee’s perspective and having to review one’s own rationale (or to defer to an expert, or even to admit to a flaw!). If one person questions the data, others might quietly do the same, even if they don’t speak up. If one person’s concerns are brushed aside or steamrollered, it sets a negative tone which is hard to overcome.

Starting the action planning session

One good way to start the action planning session is by asking for help in solving problems. Acknowledge that other people may be closer to the situation, or may have more experience with different parts of it. Asking for help may increase the others’ respect for you, because it shows that you have some respect for them; and because people tend to like those who they have helped. (Benjamin Franklin once made an enemy into a friend by borrowing a book from him). People are also much more likely to accept and to actively support solutions which they had a part in creating.

If people at the meeting do not have the power to make decisions and implement plans, be honest about these limitations and tell them that you will be using their input to make these plans yourself, or to bring them up to a higher level. However, if you (or the people at the higher level) are not really serious about implementing the proposed actions unless they were what you were planning to do anyway, forget about the action planning session. It is better to have an open, honest feedback session without action planning than a session that raises expectations and then dashes them.

Action planning

Instead of having formal minutes, notes should be taken on large sheets of paper so they can be visible to everyone, but can still be transcribed later.

Usually, some people say they cannot change anything, that other areas must be changed first. I have countered this by asking, “Well, what can we do? … What’s stopping us from doing that right now?” This has a tremendous motivating effect. If we cannot do everything we want, we can do some of the things we want.

If the people from the other area are in the room, you have a tremendous bonus. I have seen people pointing fingers across the room, and simply brought them together: “Who can do this, if you can’t? Are they here today? … John, how do you feel about doing this? Can you work together on it?” Long-standing communications problems have been resolved that easily, in a single meeting, with lasting (over a year and counting) effects.

It is essential for everyone to feel that their opinions and suggestions are valued, and that they are respected and taken seriously. Watch yourself for condescending, authoritative, and defensive actions or words, while making positive comments about useful suggestions and recognizing contributions. Practice “active listening” — the art of actively, intentionally concentrating on what people are saying, and considering how it can work and help rather than any problems it might cause or any difficulty in implementing it. Often, people can find a way around problems and barriers if they really believe in something and have a reason to invest their time and energy in it.

Seemingly trivial issues can be important. Many employees are most annoyed by the problems that never go away: the drip in the faucet, the sign-off process for subscriptions. If the survey spotlights problems that can easily be fixed, immediately fix them, no matter how small. When you visibly and immediately use a survey, you show respect for your employees and increase enthusiasm.

Responding to a suggestion by rewording and summarizing it shows you have heard and understood it, and gives other people a chance to clarify or add to it. It also helps to write down suggestions on poster sheets; this also comes in handy when the time comes to actually write down what was suggested and accomplished at the meeting.

If ideas are not usable, but the group believes they are, present your points not as absolutes, but as barriers. For example, “This is a good idea, but…” Someone in the group might have thought of a way around it, or might be able to come up with a similar plan that is do-able. I have often been surprised by how quickly seemingly insurmountable obstacles can be overcome; and I have often surprised other people by getting around them myself!

After a number of possible solutions have been created, they can be discussed and prioritized. If there are a large number of ideas, ask for the group to help in sorting them out. A common aid is using a table with four cells, with importance set against ease or speed. The group can then discuss the pros and cons of each idea, starting with the “easy-very important” group. It is essential to tackle some of the “easy” issues, even if they are not very important, to gain momentum and quickly, visibly use the work of the group.

People should be able to come up with workable actions to address problems and enhance strengths, including some which can be put into effect right away. It is essential to follow through on these items as soon as the session is over, and enact at least a few of them within the week. The ideal is to come out of a feedback session with several ad hoc committees working on specific tasks which are within their authority, and several decisions ready to be put into effect.

The final part of the feedback/action planning session is actually deciding who will do what, and when. Many people find coming up with recommendations to be sufficient; and that is fine, if the group does not have the power to make changes on its own. Frequently, though, the group includes people who do have power, if not over everything, then at least over their own areas. It is fine to mix recommendations to higher authorities with specific actions that group members can accomplish on their own.

Go through the list of agreed-upon action steps, and ask the group who will volunteer to handle the first one. Ask that person when they think they can have it done, and mark their name and the date next to the item. Then ask when they will be able to report on preliminary progress, even if it’s only considering exactly how they will do it, or discussing it with someone else. Mark that as a follow-through date. Repeat this for each of the actions, until everyone’s plate is full. Do not force anyone to volunteer through intimidation or peer pressure, and do not let anyone take on too many tasks; the important thing is not to get people to agree to do things, it is for them to actually carry them out!

Finally, schedule a meeting for two weeks or a month afterwards. This both heightens commitment and ensures that stalled projects are re-examined.

A small celebration is in order. People are tired but enthusiastic if all has gone well, and need time to wind down and to discuss any leftover concerns or ideas with other people.

Following through

The next day, distribute the action steps, volunteers, deadlines, and follow-through dates to everyone in the group. Provide a chance to comment on and revise them.

Make sure actions can actually be carried out within the deadlines. Informally speak with people who have unreasonable deadlines, remembering that the final word is theirs.

If goals were set, make sure they can be measured. It is often beneficial to start measuring key performance indicators before the survey or feedback session, so you can measure progress. Using key performance indicators is also a way to measure the effectiveness of particular changes.

If the report includes actions that must be taken by other people, put them last, and specify what actions are to be taken by the group to accomplish them. These are often forgotten, but they are a key in empowering people to do more.

The report should include a summary of the survey results, such as key strengths and problem areas, most frequently given suggestions or comments, and common answers to open-ended questions.

Take advantage of surveys to have people gather together, make decisions, and implement them. That’s what surveys are for.

Tips for building a valid survey

First, some warnings about employee surveys:

Simply running a survey tells people that change is coming, and that something will happen. Therefore, you must make sure something does happen and that people can see it happening.

Surveys subtly affect change by telling people what is considered most important; “you get what you measure.” Choose your questions carefully.

The way the survey is done sends a clear message. People react to the language in announcements, to the nature and type of questions, to the process of taking the survey, and to the speed of communication and change once it’s over.

You have to tell employees the results of the survey, or face disengagement, apathy, and “working by the rules.” Ideally, employees will get feedback sessions with action planning built in.

Validity: measure what you want to measure

Questions must be clearly understood by everyone; the language should be as simple as possible, to avoid literacy issues.

Unless there are social expectations to fight, be direct.

The shorter the sentence, the more likely people will read it (rather than scanning it).

Phrase statements in a positive way, or the phrasing effect will drown out the content.

Doing statistics (and taking the survey) is far easier when you have a single scale.

Rank-orders are hard for the respondent, and even harder for reporting and statistics.

Each concept gets its own question (avoid “double barreled” questions).

Questions should be behavioral and concrete rather than conceptual, wherever possible.

As a general rule to remember, people do not read instructions.

Large print and frequent paragraph breaks increase the likelihood that adults will read the full text.

Follow survey conventions so people don’t get confused. For employee surveys, this means go from left to right, negative to positive, with the most positive items having the highest numbers (e.g. strongly agree = 5 and strongly disagree = 1). On Web surveys, they normally aren’t numbered. Help respondents to fill out the survey using the right scale. One person using the wrong scale can wreak havoc — and it’s very common.

Work hard to get as many people in the sample as possible to complete it, to avoid nonresponse bias.

Avoid “binary” questions that “lose” information (“Are you satisfied?” should be “How satisfied are you?” and “Do you want this service?” should be “How much would you pay for this service?”)

When necessary, define the anchors completely. While this creates a statistical violation (you can no longer simply assume the distance between each number is identical in size), the effects may be minimal, and you may be able to avoid a great deal of bias and guessing whether respondents are interpreting the scales the same way. (Rather than simply asking “How well does the organization’s mission guide your actions? — Completely to Not at all,” define each step, e.g. “I refer to it each time I make a decision,” to “I never use the mission to make real decisions,” with intermediate steps also filled in.)
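One of the tips above warns that a single respondent using the wrong scale can wreak havoc. A rough way to screen for this is to check whether any respondent’s answers run opposite to everyone else’s item averages. This is a hypothetical sketch with invented data, not a substitute for a proper screening procedure:

```python
# Hypothetical sketch: flag respondents who may have used the scale
# backwards (e.g. treating 1 as "strongly agree" on a 1-5 agree scale).
# A respondent whose answers correlate negatively with the per-item
# averages of the whole group is a candidate for a reversed scale.
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

# Rows are respondents, columns are items; all values are invented.
responses = [
    [5, 4, 5, 4, 2],
    [4, 4, 5, 3, 1],
    [5, 5, 4, 4, 2],
    [1, 2, 1, 2, 4],   # looks like the scale was used backwards
]

item_means = [mean(col) for col in zip(*responses)]
for i, row in enumerate(responses):
    r = pearson(row, item_means)
    if r < 0:
        print(f"Respondent {i} may have reversed the scale (r = {r:.2f})")
```

A flagged respondent is not proof of error, only a prompt to look at that survey by hand (or, on a Web survey, to label the scale clearly enough that the problem never arises).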

A basic process for reliability and validity testing

This is a relatively fast process for basic reliability and validity testing. For the “more proper” method you can see the APA, AMA, or other research-organization Web sites.

Reliability is whether the survey gives you the same answers at different times, and whether the questions within it measure the same thing (only applicable if you’re doing a set of questions to measure a single issue, e.g. engagement, involvement, satisfaction, depression, etc.)

Validity is whether the survey measures what it’s supposed to measure. If a survey is not reliable over time, it cannot be valid, because it will vary depending on when it’s taken.

Ideally you’d check a new instrument against an older one that measures the same thing and has been validity tested already. However, most often people are developing something new because nothing exists already.

Develop the survey after doing a literature search and gathering needed information.

Use extra items where possible partly to deal with items that are struck out, and partly to provide some degree of internal validity testing via interitem correlations (that is, by seeing if any items within the survey tend not to change with the others.)

Circulate to local experts for their opinion, for face validity.

Pilot test.

Ask 5-10 people to take it

Ask them to tell you immediately if anything is confusing or hard to answer

Watch where they “get stuck”

Ask where things could be easier to understand, or better in general

Ask for any criticisms.

Give participants the survey, wait as long as you can, then give it to them again with the questions in a different order. Check whether the correlation between each person’s two sets of answers is stronger than the correlation between different people’s answers.

Find another method to compare the survey against, e.g. face-to-face interviews, and compare the results of both; or give the survey in a different form, e.g. open-ended questions / fill in the blanks.

Repeat as needed.

Ideally, when the survey is administered the first few times, include extra open-ended questions so you can do a “quality check” on the numerical data.

Overpolling, oversurveying, and diminishing returns

Over the years, our response rate kept dropping. To a degree the mailing list was diluted, which accounted for part of it; but even among people we knew to be current, valid customers, we found it hard to get a response rate over 25%. The returns dwindled, and it became harder to justify taking action on the results.

What happened?

I recently came across a 2010 article in the Harvard Business Review lamenting the same issue, some eight years ago, before it had hit my little part of the world. The writer briefly discussed some of the reasons for the drop in response rates, but his main concern was its impact (making survey data invalid). I’d like to talk a little about the reasons here.

If you are like me, your phone is constantly being spammed with telemarketing calls. When you do get asked to join a survey, half the time it’s a political survey fishing for a particular set of answers or setting you up for a donation. Your physical mailbox, for a time, was probably full of surveys from political and issue groups designed for the same reasons: talking points and donations. At the same time, all our contact points (phone, physical mail, email, Facebook, LinkedIn) are prone to constant phishing and spam attacks. Even legitimate-looking emails, half the time, turn out either to be phishing or to direct you to a malware site with a URL similar to a real one.

When you do get real surveys, they often seem to have no effect, or worse. Your car company sends you a survey about your experience at the dealership; if you’re brutally honest, your service advisor or the manager may show you a copy of your survey when you get there, which is a betrayal of the trust you had in the company. Anything less than a rating of “10 of 10” can result in a dealership or an individual losing their bonus, which prevents some people from giving honest answers. Pointing out problems rarely seems to result in any change, and when companies or governments do change things as a result of the surveys, they rarely announce that link.

It’s no surprise to me that response rates have dropped!

There are still ways to get responses, and I’ll talk about those later this month.

Norms!

While clients almost invariably ask for, and sometimes demand, norms for employee surveys and even 360° feedback items, we have found a significant cultural impact on responses: scores vary based solely on the organizational or regional culture where the questions are asked. In some regions and companies, people simply tend to answer more positively than in others, which is a substantial source of bias when comparing against norms.

When we do employee surveys, we do have norms; but we try to convince people not to buy them, and, if they do, not to use them. Part of the problem is that norms “allow failure” in areas where other organizations do poorly as well. Most organizations have poor communications and power distribution; a norms-based approach excuses shortfalls in those areas because others also do poorly there, whereas a non-norms-based approach addresses weaknesses in those critical areas.

This commentary is limited to employee surveys and the like, and is not applicable to personality assessments and such, where norming is essential and handled quite differently than in the typical employee-survey process.
