I’m often asked by our clients and subject matter experts, “How do we know whether our training is effective?” That’s not an easy question to answer. Training measurement (measuring training effectiveness) has always been, and continues to be, a challenge.
When we think about measuring training effectiveness, it helps to consider the following questions:
- Do your training programs meet their objectives?
- If they do, how do you know the objectives were met?
- How do you measure concepts or competencies in training?
- Do your objectives tie to key competencies, measurements, and exercises?
Wouldn’t it be great if you could ensure that you are designing training programs that identify the correct objectives and then meet them? You can! It’s as easy as determining four components: competencies, objectives, exercises, and evaluation.
In this article, I’ll provide you with some basic information about how I use this formula to develop training programs, and how the formula can help provide guidelines for subject matter experts. It’s important to remember, though, that meeting your training measurement objectives only matters when they are the correct objectives.
Just to make sure we’re on the same page when it comes to evaluation terminology, I’ll be referring to Kirkpatrick’s Four Levels of Evaluation. Donald Kirkpatrick, PhD, developed this model in 1959, and it remains the most widely recognized model for training evaluation. Kirkpatrick’s four levels:
- Level 1 – Reaction
- Level 2 – Learning
- Level 3 – Behavior
- Level 4 – Business Results
The current state of training measurement (evaluation)
If you had trouble answering the questions I posed at the beginning of this article, you’re not alone. Unfortunately, many training programs and training departments fall short of identifying and meeting their objectives. In fact, in many cases, training programs don’t have formal objectives at all. And the training programs that do have objectives often do not determine if they’ve been reached (Level 2 evaluation). Add to this the fact that many organizations are unable to identify the exact results achieved through training, and it’s pretty easy to see why we have confusion when it comes to evaluating training effectiveness.
Chances are also good that the programs that do measure at Level 2 aren’t measuring the appropriate learning. According to ASTD’s 2005 State of the Industry Report, only 54% of the companies surveyed measured training results at Level 2. So even before we consider the quality of the measurement, only about half of the companies in the survey measure at Level 2 at all, and even fewer measure at Level 3.
A simple formula: competencies, objectives, exercises, and evaluation
Often the biggest challenge is that the training isn’t designed to measure competencies in the classroom at Level 2 or on the job at Level 3. I advocate using a simple formula that matches competencies, objectives, exercises, and evaluation in an integrated fashion to measure Level 2 training results of the appropriate competencies. Additional information about these four components follows:
- Competencies – what learners need to learn for their job.
- Objectives – specific, measurable, and observable statements of what will be learned in the session.
- Exercises – activities in the training session that will be used for evaluation purposes.
- Evaluation – the tool used to measure the competencies.
The word “competencies” has a variety of different definitions. For the purpose of this model, I define competencies as the tasks or skills that the learner must complete on the job. Competencies should be identified as a first step in developing your training materials.
Obviously, if we can’t identify the competencies or tasks the individual needs to complete on the job, we cannot develop effective training. The identified competencies will be matched to objectives, exercises, and the evaluation.
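To make the matching concrete, here is an illustrative sketch (not the author’s actual tooling) of how a training module could link each competency to its objectives, exercises, and evaluation items; all names and details below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Objective:
    """Mager-style objective: behavior, condition, standard."""
    behavior: str   # specific, observable action
    condition: str  # circumstance under which the behavior occurs
    standard: str   # acceptable level of performance

@dataclass
class Module:
    """One training module: a competency plus everything that measures it."""
    competency: str
    objectives: list = field(default_factory=list)
    exercises: list = field(default_factory=list)
    evaluation_items: list = field(default_factory=list)

# Hypothetical module based on the technical competency example
module = Module(
    competency="Use SAP to complete the system transactions for the job",
    objectives=[Objective(
        behavior="post a goods issue",
        condition="using the On-Line Quick Reference (OLQR)",
        standard="within 10 minutes",
    )],
    exercises=["Perform a goods issue for two covers using movement type 201"],
    evaluation_items=["Posted a goods issue within 10 minutes?"],
)

# The formula's integration check: every objective should be backed by
# at least one exercise and at least one evaluation item.
assert module.objectives and module.exercises and module.evaluation_items
```

The point of the structure is simply that nothing exists in isolation: an objective with no exercise cannot be practiced, and an exercise with no evaluation item cannot be measured.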
For example, if we were to write simple statements regarding technical and soft-skill competencies, they might look like this:
- Technical competency – The inventory control specialist will be able to use SAP to complete the identified system transactions associated with their job.
- Soft-skill competency – The customer service specialist will be able to answer a customer complaint by following all of the elements of an effective customer service call.
I’ve found that not all instructional designers match the training program’s objective to the competencies. By tying objectives to the competencies, the instructional designer:
- Describes how a learner will demonstrate their knowledge, comprehension, and ability to perform a specific task.
- Communicates an intended instructional result to learners by conveying a picture of what a successful learner will be able to do.
For anyone not familiar with objectives, Robert Mager’s book Preparing Objectives for Programmed Instruction (1962) remains the standard for writing objectives today. By Mager’s definition, an objective has three components:
- Behavior – should be specific and observable
- Condition – sets the circumstance of the behavior
- Standard – level of performance that is considered acceptable
Here are some examples of technical and soft-skill objectives:
Technical objective – The participant will be able to use the On-Line Quick Reference (OLQR) to post a goods issue using movement type 201 and cost center 4010591 within 10 minutes.
Soft-skill objectives – The participant will:
- Get the customer’s name during the first 45 seconds of the phone call.
- Use the customer’s name when confirming/restating the customer complaint.
- Confirm/restate the customer’s issue within two minutes.
The role of exercises
Many training programs end with a post-test on the knowledge the learner has gained. While this is an effective practice for assessing knowledge, it has little benefit when the competency is actual performance rather than knowledge alone.
This is why exercises in training courses are so important. Exercises give the learner the opportunity to practice and demonstrate competence at the level of the objective. By developing appropriate exercises, the instructional designer provides the link to the objectives and competencies.
When the competency involves performing a task (as most competencies do), the exercise should take the form of a work-related scenario. Again, using our technical and soft-skill training categories, here are a few examples of different exercises:
- Technical exercise: You are filling an order for a bank and need to issue two covers (00C 502Z01) for the order. Perform a goods issue for the covers using posting date 1/5/2005, movement type 201, and cost center 4010591. You have 10 minutes to complete the scenario.
- Soft-skills exercise: You receive a call from a customer complaining about an item on their credit card bill. Complete the first three steps of an effective customer service call within two minutes.
The final part of the formula is the evaluation. The evaluation ties together the competencies, objectives, and exercises with measurement of the performance. The evaluation tool measures the competencies that are defined by the objectives and completed during the exercise. Examples of technical and soft-skill evaluation tools are shown below.
Technical: example evaluation
Inventory movement: Did the participant…
- Post a goods issue within 10 minutes?
- Use the OLQR to post a goods issue?
Soft skill: example evaluation
Customer service call: Did the participant…
- Get the customer’s name during the first 45 seconds of the phone call?
- Use the customer’s name when confirming/restating the customer complaint?
- Confirm/restate the issue accurately within two minutes?
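The pass/fail logic behind a checklist evaluation like this can be sketched in a few lines. The function and the sample results below are hypothetical illustrations, not part of the author’s toolkit:

```python
def objective_met(checklist: dict) -> bool:
    """An objective is met only when every observed checklist item passes."""
    return all(checklist.values())

# Hypothetical Level 2 results for a soft-skill evaluation
results = {
    "Got the customer's name within the first 45 seconds": True,
    "Used the customer's name when restating the complaint": True,
    "Restated the issue accurately within two minutes": False,
}

print(objective_met(results))  # False: one item was not demonstrated
```

Because each checklist item traces back to one objective, a single failed item pinpoints exactly which behavior needs more practice, rather than yielding an undifferentiated test score.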
Module development checklist
I use a module development checklist as a self-check and as a tool when I’m coaching subject matter experts or non-training professionals. The checklist follows the formula described in this article; the link below is a PDF version that I find especially helpful:
Download a training module development checklist based on this model.
By following a formula that integrates the evaluation with competencies, objectives, and exercises, both trainers and training departments can dramatically improve the success of their training. If you’re like me, you are probably saying, “But how do I know the knowledge and skills transfer to the workplace?” Good question — we’ll discuss behavior on the job (Level 3) in future articles.
Jay Kasdan is a project/account manager at Fredrickson Learning. He has helped numerous training programs succeed through effective training measurement. Contact us to learn more about our training measurement services.