Learning about learning

As an elearning manager, I have reviewed a lot of elearning during my career. Much of it dumps content on the screen, followed by multiple-choice questions. If training is about learning new skills, why aren't we testing the skills learners are supposed to gain from our programs? What is the difference between what we do and what marketers do in content marketing? As Harold Stolovitch would say, "Tellin' ain't trainin' and trainin' ain't performance."

After considerable noodling, I've come up with four reasons why we assess learning so poorly.

Hypothesis 1

We don't know what we are supposed to measure

It's easy to write multiple-choice questions. They are quick to craft and program. But they usually test only remembering, which sits at the bottom of Bloom's Taxonomy.

If we spend our time crafting outcomes and objectives, and supposedly build our training around them, why aren't we measuring against those goals?

Maybe we don't know we should.

Hypothesis 2

It's hard to measure things in elearning

Yes, it requires a little more thinking to find creative ways to measure skills online. But that doesn't mean you can't. If I am teaching a course to students learning to use a piece of software, shouldn't they be able to use it at the end? I can't tell you how many times I have seen text-based, multiple-choice questions about buttons at the end of these courses, with no context and no pictures. How is that helping?

Hypothesis 3

There isn't enough time to build those kinds of assessments

Nope. There often isn't. I am the first to admit having whipped together 10 multiple-choice questions at the 11th hour, just before a course had to go for final review. But that's not the best way to go about it.

Good design is about planning for the intended outcome, which means always asking: "In the end, what do you want your learner to know and do?" and "How will you know that they know?"

Defining that up front, in a single sentence, gives you a mantra to repeat to the client and to design to. I always design assessments not only for the intended goal but for each enabling objective. If the learner cannot do all of the substeps of your process, how will they be successful in the end?


Hypothesis 4

The business only wants a passing score and a smile sheet

You did not just say that. You mean to tell me that your business partners don't care whether their staff meet the goal of the training? This is a case of not knowing what they don't know. They think they only want a passing score, and they may tell you that's all that matters. What they actually want is motivated employees who work hard to make the company more successful, as measured by their key performance indicators (KPIs). Trust me on this one. If they want their staff to learn new software, they want them to USE that software efficiently. If it's selling a new product, they want staff who can listen to customer needs and align the product's features to those needs. We need to know that they can do that.

How do we fix it?

The first step in effective problem solving is acknowledging that an issue exists. The second is defining it. We know we could do better with our assessments, and we have lots of excuses for why they aren't measuring what we intend to teach. Now that we know, let's look at a few ways we could authentically assess our learners so that

we know that they know

they know that they know

the business knows that they know

the learners are successful in the end.

Authentic assessment is testing learners in the situation, or a close approximation of it, in which they must perform the intended activity.

Software Course

What if... we give our learners a word problem and ask them to solve it using the software and submit a screenshot of the results?

What if... we give our learners a task to complete and have them give the results to their supervisors for assessment? For example: "Use a Word document to format this text. Be sure to use three levels of headers, page breaks in the appropriate places, and Calibri 12 for the paragraph text."

Soft Skills Course

What if... we give our learners a scenario in which to apply the skills they have learned? We can build a branched scenario with consequences during learning, but we can also assess learning the same way in a final exam.

What if... we give our learners a scenario and ask them to use their smartphones or laptops to record a response that they must upload for peer review? Coursera, edX, and other MOOC platforms use peer review for grading. Five reviewers tend to give an accurate assessment.

For more ideas, check out the presentation I did for the eLearning Guild at Learning Solutions 2016, or connect with me and let's noodle together.

Your Turn

What other ways can we authentically assess our learners? I'd love to hear your ideas. Share your thoughts in the comments section below.