The CEO Refresher

A Practical Approach for Measuring the Business Impact of Training

by Dr. Perry Alter and Dr. Shelley A. Kirkpatrick

Companies invest in training and development in order to improve their business results. Kirkpatrick's four-level training evaluation approach is often used to discern whether the training achieved its goals. Evaluation Levels 1 through 3, which measure student reaction, learning, and behavioral application, respectively, are relatively straightforward to carry out. However, many companies stop there. They fail to measure business impact and/or the return on investment (ROI) of that training. And when they do attempt to measure it, they often have difficulty calculating and interpreting it.

The result is that the true business impact of the training is never known. As long as students enjoy the training, earn an acceptable end-of-course test score, and display some new on-the-job behaviors, the training is considered to have achieved its goal. Without measuring business results and/or ROI, companies don't know whether the business needs that prompted the training were met, and they don't know whether the improvements justified the cost of the training.

How can a positive evaluation occur at Levels 1 through 3 but fail to impact productivity or other business results? Consider a company that, like many others, relies on IT projects to make its workforce more productive. In short, the company wants to get the same amount of work done with fewer staff by using technology to improve efficiency. As with many companies, its IT projects are often behind schedule and over budget.

An initial group of 20 project managers is sent to a weeklong introductory project management class, with the goal of improving their project management competencies in order to improve project outcomes and the company's productivity. The project managers, although extremely busy, attend the training. They give the course high marks on the end-of-class evaluation form, and they score well on the test given at the end of the course. A follow-up evaluation three months later reveals that they are applying what they learned; this is confirmed by their managers.

But the evaluation stops there. Project outcomes are not re-examined to see if fewer projects are behind schedule or over budget. The Chief Learning Officer (CLO) spends some time trying to compute ROI but isn't sure how to calculate the costs and benefits. And even if projects are getting done earlier, are the project managers investing their newly freed-up time in profitable ways?

In this example, the company invested in a training course. But it doesn't know whether it recouped the cost of that investment, including the project managers' time and the price of the class itself. The CLO doesn't know if she should send additional project managers to the training or purchase more advanced courses for her project managers.

The Future-Focused Approach

Perhaps the most common approach to measuring the value of training is to survey the training participants immediately following a class and ask them hypothetical, "Level 4-like" questions about how they expect to use the training once they get back to the job. These often include questions such as, "Will you be able to apply what you learned? Will your job performance be impacted as a result of this class? How much do you expect the training to impact your productivity?"

In fact, we often include questions like these in our own course evaluations. They provide a quick indication of the potential impact of the training and are meant to identify any immediate issues with the course. However, they are not meant to serve as a final measure of training's ultimate impact. To properly evaluate actual training impact, sufficient time must be allowed for employees to apply what they learned and for their new behaviors to affect the outcomes of interest. That calls for a more rigorous approach to measuring impact.

The Quantitative Approach

The quantitative approach to computing ROI begins with a straightforward formula:

% Return on Investment = (Benefits / Costs) x 100

ROI seems easy to compute at first—add up the costs, add up the benefits, and then do the math. But, in actuality, it is difficult to accurately identify and measure costs and benefits.  Costs can include:

  • Design and development costs, including time for course designers and developers, resources and tools needed to develop the courses (such as copyrights or development software), and travel and expenses incurred by designers and developers.

  • Administrative costs to promote the training, schedule employees to take the training, and identify a qualified instructor.

  • Employee costs, including employees’ time to travel to the course and take the course; even harder to measure is the productivity loss incurred when the employee is pulled away from the job.

  • Direct course costs, including instructor time to prepare and teach the course, overhead or room rental costs, instructor travel costs, and the cost of supplies or meals.

Similarly, there is more to the identification of benefits than meets the eye.  Benefits can include:

  • Direct productivity increases—more output for the same level of effort—of the employees who attended the training.

  • Savings—the same output for less effort—through fewer errors, more focus, or increased motivation.

  • Indirect productivity increases—such as improving a team’s productivity by finding efficiencies or improving quality.  For example, increased sales revenue can be a meaningful and measurable benefit of a sales training course.

  • Indirect savings, which may include more efficient use of supplies, improved safety, or better utilization of tools or other resources.

Our point is not to list all possible training costs and benefits, but rather to illustrate that it is easy to get bogged down in the details of the measures. For some jobs, productivity is easy to quantify, such as production jobs (number of widgets per hour) or sales representatives (dollar amount of sales per week). In those cases, the quantitative approach may be a good one to use.
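
For those readily measurable cases, the arithmetic itself is simple. The sketch below shows the formula applied to itemized costs and benefits in Python; every line item and dollar figure is a hypothetical placeholder, not data from any actual course.

    # Quantitative ROI sketch -- all dollar figures are hypothetical placeholders.
    costs = {
        "design_and_development": 15_000,   # designer/developer time, tools, travel
        "administration": 2_000,            # promotion, scheduling, finding an instructor
        "employee_time": 40_000,            # attendees' salaries and lost productivity
        "direct_course_costs": 18_000,      # instructor fees, room, travel, supplies, meals
    }
    benefits = {
        "direct_productivity": 35_000,      # more output for the same effort
        "direct_savings": 20_000,           # fewer errors, less rework
        "indirect_productivity": 25_000,    # team-level efficiencies, quality gains
        "indirect_savings": 5_000,          # supplies, safety, better use of tools
    }

    total_costs = sum(costs.values())
    total_benefits = sum(benefits.values())

    # % Return on Investment = (Benefits / Costs) x 100
    roi_percent = total_benefits / total_costs * 100
    print(f"Costs ${total_costs:,}, benefits ${total_benefits:,}, ROI {roi_percent:.0f}%")

With these made-up numbers the calculation is trivial; as the preceding lists suggest, the hard part is deciding what belongs in each line item and how to measure it.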

However, performance is not easily quantified for many jobs, such as management positions and non-production jobs. In our experience, most training initiatives apply to several jobs in a company or across companies. If we train 30 people in a class, they often represent as many as 10 or 20 different jobs. Except in the rare (for us) instances where most training participants hold the same job and that job is readily measurable, it is very difficult to measure impact using simple, standardized measurements.

A Balanced Alternative

To quantify ROI in a way that is easily understandable and inexpensive to compute, we have developed a balanced alternative. This alternative approach, like other approaches to ROI, is best carried out after employees have had a chance to apply their new knowledge and skills to their job, which is typically six to nine months following training.

Our Balanced ROI approach collects data from those closest to the productivity improvements, namely the employees who took the training. We rely on employees to tell us how the training has directly impacted them. As a reminder, this is most appropriate when the training participants have different jobs or difficult-to-measure jobs (which, in our experience, is almost always the case). Our approach asks employees to identify specific examples, so that their estimates are based on real data, not hypotheticals. Although it doesn't sum up ROI in a single number, it does provide enough evidence of training impact to make a reasonable determination of whether the training is returning more benefit than it cost.

We follow up with a short survey six to nine months after the training and ask participants these questions:

  1. Describe a specific example of when you used the skills you learned in the class. What was the situation, what did you do, and how did it work out? Please make sure to explain how this example illustrates skills learned in the class.

  2. Estimate the dollar benefits that accrued to your company due to the actions you took. These dollar benefits can reflect increased revenue, reduced costs, reduced turnover, reduced time-to-market, reduced fees, and so on. Please be as specific as you can. If in doubt, be conservative.

  3. Describe exactly how you calculated the dollar amount of the benefits.

The balanced approach provides concrete examples that can be communicated to and understood by company leaders. Rather than presenting impact as a percentage, it offers real-life examples of success that executives can relate to, use to reinforce change and innovation, and share as lessons learned and best practices across the company. Most importantly, the evidence for success is fairly concrete. Granted, many examples are not well substantiated; leaders can use their own judgment to determine which examples to accept as adequately justified.

Below is a hypothetical illustration of what such an example might look like. In our experience, about 10 to 20% of the examples are as articulate as the one below. This percentage is likely a function of many factors, including how well employees understand the impact of their work on the bottom line and the nature of the skills taught.

 
Question 1: Describe a specific example of when you used the skills you learned in the Project Management class. What was the situation, what did you do, and how did it work out? Please make sure to clarify how this example illustrates skills learned in the Project Management class.

Response: The Project Management class taught me the importance of follow-up. During the programming phase of the ABC project, I called the lead programmer once a week just to check in and see if everything was okay. We discussed issues as they arose and kept the project continuously moving. In the past, he had just made decisions in a vacuum, and when I later saw the deliverable, I often had to send it back for revisions. That added many weeks to the life of the project. On this project, he finished on time, and I didn't have to send it back for revisions.

Question 2: Estimate the dollar benefits that accrued to your company due to the actions you took. These dollar benefits can reflect increased revenue, reduced costs, reduced turnover, reduced time-to-market, reduced fees, etc. Please be as specific as you can. If in doubt, be conservative.

Response: $30,000

Question 3: Describe exactly how you calculated that amount.

Response: The additional functionality due to the programming from the ABC project enabled us to raise our prices by $5,000 per implementation. Because of my new follow-up skills, we shaved 4 weeks off the usual project time. During those 4 weeks, we sold 6 more implementations.

6 implementations x $5,000 extra per implementation = $30,000.

These examples, and the calculations within them, are easy to understand and make a compelling case about the impact of the training. The answers to the Balanced ROI questions also give trainers and course designers unique feedback and insight into how and why the course material impacts performance at the company at hand. Each company is somewhat unique, so how employees apply what they learned varies from company to company, and it may even vary within the same company. Trainers who have taught the same class a dozen times can still learn about training impact, despite never having seen the employees in action.

Learning Point for Us

In an early version of this approach, we used a program to add up the dollars across examples. We quickly learned that such a step, while mathematically doable, was not a good idea, because many of the examples provided by employees contained inadequate substantiation. Nevertheless, we could still identify examples that provided very clear and credible substantiation.
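
To illustrate that learning point, here is a minimal sketch, using entirely hypothetical example records, of summing only the responses that a reviewer has judged to be credibly substantiated rather than every dollar figure submitted:

    # Balanced ROI aggregation sketch -- the example records are hypothetical.
    # A reviewer flags each response as credibly substantiated before it is counted.
    examples = [
        {"employee": "PM 1", "claimed_benefit": 30_000, "credible": True},    # clear calculation given
        {"employee": "PM 2", "claimed_benefit": 250_000, "credible": False},  # no substantiation
        {"employee": "PM 3", "claimed_benefit": 8_000, "credible": True},     # conservative, well explained
    ]

    credible_total = sum(e["claimed_benefit"] for e in examples if e["credible"])
    print(f"Conservative benefit estimate from credible examples: ${credible_total:,}")

The resulting total is deliberately conservative: it ignores unsubstantiated claims and anything reported by employees who did not respond.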

One additional tip for using the Balanced ROI approach is to provide respondents with a structured format or a sample response that displays the desired level of detail; this may reduce the number of unusable responses. Another tip is to ask employees to optionally provide their names and contact information so that an experienced training professional can follow up to verify the accuracy of the example.

Finally, asking employees to provide multiple examples could yield additional usable responses. However, keep in mind that the goal is not necessarily to generate a large number of examples; the goal is to provide enough examples to paint a compelling picture of the impact the training had.

Closing Thoughts

First, we have applied this method with selection, development, inclusion, negotiation, and communication training offerings. It is adaptable to any training offering and arguably to any intervention. Its flexibility is perhaps its greatest strength.

Second, a critical benefit of this approach is that it shows what training impact actually "looks like." The examples are often easy to understand and relate to, and as stated earlier, can be used to illustrate best practices of what the training is intended to do.

Third, Balanced ROI questions can be easily included in a broader survey that asks standard training evaluation questions about obstacles, improvement needs, relevance of exercises, and other staples of training evaluation surveys.

Fourth, we should add that our discussion above only addresses the "benefits" side of the ROI equation. The company should still measure costs as a comparison point, using whatever approach it deems appropriate.

Fifth, by asking participants for one example, are we placing an arbitrary upper cap on the measurable benefits? What if someone really had two examples? Or five? And what about all the people who don't respond to the evaluation survey? Do their benefits go unmeasured? These are legitimate points, leading to the conclusion that the total of the resulting estimates is conservative. We have found that our clients resist asking for extra examples, in the interest of user-friendliness. At the same time, the information that we do obtain is very concrete and telling.

Sixth, there will be cases where the dollar benefits indicated by the credible examples obtained may be less than the identified costs. In such a case, you may ask yourself whether you have all the information, and whether the comparison is fully informed.

Despite this last point, the approach provides a great deal of information about how effective your training is and lets you increase the number of situations where you can collect business-impact data.



The Authors

Perry Alter, Ph.D., is Principal of Lighthouse Consultants of the Southeast, based in Florida. He has provided talent management consulting for many years to companies such as Amgen, Royal Bank, FedEx, Motorola, Chevron, and many others. His consulting practice focuses on employee selection, job analysis, and competency modeling. He is passionate about assessment for the sake of continuous development: it is good for the individual, good for the organization, and it makes work more engaging.

Shelley A. Kirkpatrick, Ph.D., is the Director of Assessment Services for Management Concepts (www.managementconcepts.com), a professional services company that specializes in training, publishing, and consulting, and is located in Vienna, Virginia. She has over 20 years of experience in developing individual and organizational assessments for the private sector as well as national security and defense organizations. A former professor at Carnegie Mellon University and The American University, Dr. Kirkpatrick has authored numerous articles on leadership, motivation, and corporate espionage that have appeared in academic journals as well as practitioner-based publications.

Copyright 2011 by Perry Alter and Shelley A. Kirkpatrick. All rights reserved.
