Let’s say we’ve taken strides with the previous incremental innovations — how do we know this effort moves the needle?

We know because we have prepared an evaluation strategy.

Not just a program and session evaluation, but a strategy for evaluating whether our objectives have been met. Which naturally means we must define our objectives. I recommend grouping measures in three tiers: Strategy, Portfolio Management, and Learning Design.

Strategy

Select metrics to track across all learning programs to assess progress toward your organization’s strategic objectives. “More” is a common generic goal; dig deeper to reveal the specific targets you want to hit. Here are a few examples.

If your strategic objective is…

revenue, then track sales and profit/loss

member participation, then track registrants, unique participants as a share of your total membership, and returning registrants

volunteer engagement, then track who is contributing to what, when, and in what capacity, across products, and note the patterns

business efficiency, then track staff hours against programs so you can calculate how much time and money is devoted to each, noting areas that are running smoothly and areas that should be examined for greater efficiency.
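That last metric is simple arithmetic once the hours are tracked. Here is a minimal sketch of the calculation; the program names, hours, and blended hourly rate are illustrative assumptions, not real figures.

```python
HOURLY_RATE = 55.0  # assumed blended staff cost per hour (illustrative)

# Hypothetical tracked staff hours per program
staff_hours = {
    "Annual Conference": 420,
    "Webinar Series": 85,
    "eLearning Library": 160,
}

def cost_per_program(hours_by_program, rate):
    """Convert tracked staff hours into a dollar cost per program."""
    return {name: hours * rate for name, hours in hours_by_program.items()}

costs = cost_per_program(staff_hours, HOURLY_RATE)

# List programs from most to least staff investment
for name, cost in sorted(costs.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {staff_hours[name]} hrs = ${cost:,.0f}")
```

Even a spreadsheet version of this view makes it obvious which programs consume the most staff time relative to what they return.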

Portfolio Management

Select metrics that help you monitor whether each learning program is on track or whether tweaks are required. There’s no need to wait until the end of a product cycle to determine whether it was a success or a failure – monitor whether your expectations are being met so you can respond to issues as they arise. Many associations employ robust pre-conference tracking that yields valuable year-to-year comparison data but don’t apply the same rigor to other learning programs. What could we achieve if we established program-level objectives for each of the learning products in our portfolio?

If your program-level objective is…

webinars that reach young professionals, then track registrants by segment so you can see whether you’re hitting your targets (if not, switch up your marketing and ask questions about your program design)

eLearning utilization, then track registrations, downloads, and completion rates (if the program isn’t performing as expected, find out why now so you can be responsive), and assess open-ended evaluation questions for issues you can address to improve performance

growing referral registrations, then track the program’s Net Promoter Score (if you’re not seeing referrals, follow up to assess what needs to change to achieve that goal)

understanding when to introduce new programs, then track when members register and claim credits throughout the year to test your assumptions about buying cycles.
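The Net Promoter Score mentioned above reduces to a single formula: the percentage of promoters (ratings of 9–10) minus the percentage of detractors (0–6). A minimal sketch, with made-up sample responses:

```python
def net_promoter_score(ratings):
    """NPS = % promoters (ratings 9-10) minus % detractors (ratings 0-6)."""
    if not ratings:
        return 0.0
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical 0-10 responses to "How likely are you to recommend this program?"
responses = [10, 9, 8, 7, 9, 6, 10, 5, 9, 8]
print(f"NPS: {net_promoter_score(responses):.0f}")  # prints "NPS: 30"
```

Note that 7s and 8s (passives) count toward the total but neither add nor subtract, which is why a room full of “pretty satisfied” learners can still yield a modest score.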

Key: Define what a high-functioning learning portfolio looks like and measure your progress toward that target.

Learning Design

The vast majority of course evaluations focus on satisfaction – Kirkpatrick’s Level 1. This measure tells us nothing about the effectiveness of the learning. Consider ways to elevate your evaluations to measure at least Levels 2 and 3.

Level 2: Learning. Measure whether the intended knowledge and skills have been achieved. Ask whether learners intend to apply what they learned.

Level 3: Behavior. Measure whether learners actually applied what they learned. What might they need to support continued improvement?

Kirkpatrick Level 4 is the ultimate measure of training effectiveness: Results. This measure moves well beyond “the event happened and people seemed to like it” to “this event made a positive impact in members’ context of practice.” How could your association partner with employers within your industry to measure the results of your programs against their intended outcomes? Imagine the powerful feedback you could glean and the incredible value you could articulate with these insights in hand.

Key: Define the expected learning outcome and then design measures that will evaluate success – learning, behavior change, and results.

Measure your success by clarifying your vision for your learning programs. Then assess the opportunities to collect data before, during and afterward to map your efforts against your target.

Ready to bump your evaluation strategy to the next level? Get in touch.

Meet Tracy King

As Chief Learning Strategist & Founder of InspirEd, Tracy leverages her more than 17 years in the education industry for organizations interested in increasing their relevance and revenue with meaningful live, online, and mobile learning programs. Tracy specializes in the intersection of learning science and technology. She's a thought leader in education strategy and learning experience design. Learn more at www.inspired-ed.com