In fact, research by LEO and Watershed found that while 47 percent of L&D professionals believed data played a significant role in their organization in 2017, 83 percent do now. Furthermore, 70 percent list measuring business impact as a top priority (up from 58 percent last year), and 60 percent say they’re being pressured by leadership to justify investment in L&D (up from 35 percent last year).

Now, thanks to technology, it’s becoming easier. Many learning platforms now include built-in reporting and analytics functions to support L&D performance and the evaluation of ROI. “You’ve got to have the systems to be able to collect and display [data],” says Piers Lea, chief strategy officer of LEO, “and then you can start making meaningful correlations. That way, you start to move from lagging data to predictive indicators.”

He recommends using a learning record store (LRS), which will enable the collection of data from a variety of learning activities and communication among systems, including the LMS. Using these systems enables personalized learning. “Nobody has enough time these days,” he says, “and that includes not having enough time to learn. This means learning needs to be deeply relevant.” Using data enables organizations to provide that “very precise learning.”
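An LRS is built around the xAPI (Experience API) specification, which records learning activity as actor-verb-object “statements.” As a minimal sketch, here is how such a statement might be constructed before being sent to an LRS endpoint; the learner email, course URL, and score are illustrative placeholders, not real identifiers:

```python
import json

# A minimal xAPI-style actor-verb-object statement, the unit of data an LRS ingests.
# All names and URLs below are illustrative placeholders.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",
        "name": "Example Learner",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/onboarding-101",
        "definition": {"name": {"en-US": "Onboarding 101"}},
    },
    "result": {"score": {"scaled": 0.85}, "success": True},
}

# An LRS exposes an HTTP API that accepts this JSON payload.
payload = json.dumps(statement)
```

Because every system (LMS, mobile app, simulation) reports in this one shared vocabulary, the LRS can correlate activity across all of them, which is what makes the “very precise learning” Lea describes possible.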

“A strategic, goal-oriented approach will guide your choices of data sources and help keep your efforts easy to manage,” says Shevy Levy, co-founder and CEO of Lambda Solutions. Identify and understand your goals, and map them to the data your platform is able to track. “And avoid the shiny-object syndrome – don’t collect data you aren’t going to use.” Those data will likely come from one or more of three contexts, she adds:

Engagement statistics: Many e-learning platforms can track metrics such as frequency of logins and time spent in training. These metrics enable you to determine how engaged learners are in a given course or module.

Performance statistics: To evaluate how good your content is, track metrics like assessment scores, learner feedback, course participation and use of different types of content (videos, interactive modules, quizzes, etc.).

Help desk statistics: Patterns in how often learners request help as well as frequently asked questions can help you evaluate the technical aspects of your e-learning program.
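The three contexts above can be derived from a single activity export. The sketch below assumes a hypothetical per-session log with illustrative field names; any real platform’s export will differ, but the aggregations are the same:

```python
from collections import Counter
from statistics import mean

# Hypothetical platform export: one record per learner session.
# Field names are illustrative, not from any specific product.
sessions = [
    {"learner": "a", "course": "onboarding", "minutes": 25, "score": 80, "help_tickets": 0},
    {"learner": "b", "course": "onboarding", "minutes": 5,  "score": 55, "help_tickets": 2},
    {"learner": "a", "course": "onboarding", "minutes": 30, "score": 90, "help_tickets": 0},
]

# Engagement: login frequency per learner and total time spent in training.
logins_per_learner = Counter(s["learner"] for s in sessions)
total_minutes = sum(s["minutes"] for s in sessions)

# Performance: average assessment score across sessions.
avg_score = mean(s["score"] for s in sessions)

# Help desk: how often learners needed technical support.
tickets = sum(s["help_tickets"] for s in sessions)

print(logins_per_learner["a"], total_minutes, avg_score, tickets)
# → 2 60 75 2
```

Note how Levy’s advice applies even here: each computed value maps to a stated goal, and nothing else is collected.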

Of course, outside the e-learning itself, it’s also important to track actual performance on the job and other higher-level key performance indicators to determine business impact and ROI. Lea recommends “everything from sales data and retention to right-first-time and profitability.” Make sure you talk to other leaders to find out what the organization’s business goals are, and then align metrics and learning to those goals.
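When those business KPIs are in hand, ROI itself reduces to the standard formula: net benefit as a percentage of program cost. A sketch with illustrative numbers (the dollar figures are assumptions, not from the article):

```python
def training_roi(program_benefit: float, program_cost: float) -> float:
    """Standard ROI formula: net program benefit as a percentage of cost."""
    return (program_benefit - program_cost) / program_cost * 100

# Illustrative only: a program costing $50,000, credited with $120,000 in
# measurable benefit (e.g., sales uplift or retention savings attributed to it).
roi = training_roi(120_000, 50_000)
print(f"{roi:.0f}%")  # → 140%
```

The hard part is not the arithmetic but the attribution: agreeing with other leaders which sales, retention, or right-first-time gains can credibly be credited to the training.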

To implement a new learning analytics program, Levy recommends a systematic seven-step approach:

Map the context by identifying the environmental, political and cultural factors that will affect the implementation. Predict the barriers you’ll face, and identify potential “allies” that will support the program. Plan how you will present your case to stakeholders and skeptics.

Identify the stakeholders, including the people who will benefit the most from the program. Then, decide which stakeholders have the most influence, such as department heads, and start developing an approach to “involve, inform, support and train key personnel.”

Identify the purposes for using learning analytics. Examples include learner awareness, monitoring and tracking, research, evaluation and planning, and reporting and communication. Align these purposes to your stakeholders, and prioritize them.

Develop a strategic plan that includes every step you’ll need to take to meet your goals. Develop a timeline, and review and (if needed) update the strategy throughout the process.

Develop an evaluation system. Monitor your progress continually. Revisit your original objectives and vision to make sure you’re still headed in the right direction. Finally, after implementation, “conduct a review of the overall process and make notes for future efforts.”

One of the ultimate objectives of a learning analytics program is to make sure learning is effective and aligned with business goals. Lea says that organizations that “are good at strategic alignment get between three and five times higher performance output.” Making data-driven decisions, informed by everything from individual learner goals to strategic organizational goals, will ultimately yield a successful training program.

Want to learn more? Our Midwinter Month of Measurement is leading up to our next virtual conference, TICE Virtual Conference: Metrics Matter, a Focus on Strategic Planning, Analytics and Alignment. Learn more and register for the free event here.