There are many reasons to start developing online training. For one, your audience can take the training whenever they want. For another, you can deliver training to anyone in the world. So, all of your employees who work in six different offices can complete the same training course. But if you dive into online training, how will you know whether it is an effective way to deliver training in your organization?
There is a method for finding this out: the Kirkpatrick Model for evaluating training effectiveness. Below is a short summary of how to use the model to determine whether your online training is effective.
Level 1: Reacting
According to the Kirkpatrick Model, Level 1 addresses the degree to which trainees “react favorably to the training.” This is the customer satisfaction of your online training. In other words, when an online training course is completed, you ask learners to complete a survey to tell you how they liked it. Training surveys are not perfect, but if written effectively, you can gain important insight into whether people reacted positively to your training.
Level 2: Learning
Level 2 addresses the degree to which “participants acquire the intended knowledge, skills, attitudes, confidence, and commitment based on their participation in a training event.” In practice, this is where assessments, quizzes, and tests come into play. There are many ways to test people, from un-graded multiple choice quizzes to scenario-based assessments. The purpose here is to find out whether people actually learned what was taught in the online training course.
Level 3: Behavior
Do people “apply what they learned during training when they are back on the job?” This is an important question. It is one thing to like the training (Level 1) and learn what was taught (Level 2); it is another thing altogether to apply what was learned. If people do not apply what they learned, there was really no point in designing and delivering the training, was there?
There are three ways to determine whether people are applying what they learned. You can observe people working and record their actions, you can interview people and their managers to ask whether behaviors have changed, or you can study data that was entered into systems (assuming the online training was about how to use a new work tool).
It takes some effort to collect this data, but it can be worth it to learn whether people are using what you taught them.
Level 4: Results
Just because people like the training (Level 1), learned what was taught (Level 2), and are actually applying what was learned (Level 3) does not mean the desired results will be achieved. At Level 4, the point is to address whether the desired outcome of the online training course was achieved.
Measuring success at Level 4 assumes you had a desired outcome in the first place. If you thought your team needed conflict management training, for example, did you expect to see complaints to HR fall by a certain amount? This is what is meant by results. Ultimately, training is conducted to achieve some result.
Level 5: Return on Investment
Building on Kirkpatrick’s Model, Jack Phillips added a fifth level to measure the return on investment (ROI) of training. At this level, the objective is to attach financial numbers to the results. For example, if complaints to HR fell by 50% as a result of the conflict management training, was that result worth the investment made in the training?
The answer might be “No!”
“How could that be?” you ask. Well, think of it this way. If the number of complaints dropped from four per week to two per week, was it worth it to have spent $10,000 on the training? $100,000? $1,000,000?
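To make that arithmetic concrete, here is a minimal sketch in Python of a Phillips-style ROI calculation. The dollar figures are invented for illustration (the article does not assign a monetary value to an avoided complaint), so treat the numbers as placeholders, not benchmarks.

```python
def training_roi(benefit: float, cost: float) -> float:
    """Phillips-style ROI: net benefit as a percentage of training cost."""
    return (benefit - cost) / cost * 100

# Hypothetical figures: complaints drop from 4 to 2 per week,
# and we assume each avoided complaint saves $500 in staff time.
complaints_avoided_per_year = (4 - 2) * 52   # 104 avoided complaints
benefit = complaints_avoided_per_year * 500  # $52,000 in assumed annual savings

print(f"$10,000 course:  ROI = {training_roi(benefit, 10_000):.0f}%")
print(f"$100,000 course: ROI = {training_roi(benefit, 100_000):.0f}%")
```

Under these assumed savings, the $10,000 course pays for itself several times over, while the $100,000 course loses money on the same behavioral result, which is exactly why Level 5 can turn a "successful" training into a "No!"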
Putting It All Together
The secret to evaluating online training is to put all five levels together and determine whether the training was effective. No single level by itself is enough to know whether the training worked.
How do you measure the success of your online training? How many levels do you use?
Bill Cushard, author, blogger, and learning experience (LX) designer, is a human performance technologist (HPT) with extensive, in-the-trenches experience building learning organizations at companies like E*TRADE, Accenture, and ServiceRocket. You can follow him on Twitter or on Google+.