Since 1998, money spent on learning and development as a retention tool -- from leadership-development training to management-skills seminars -- has increased 15 percent, from $221 per employee to $252, according to Hackett Benchmarking and Research, a firm that tracks best practices in HR, finance, and other areas of "knowledge work." The company has an 11-year, ongoing study of 1,600 companies, a list that includes 80 percent of the Dow Jones Industrials and two-thirds of the Fortune 100. But can that level of investment continue during the economic downturn?
"People will spend money on training, in good times or bad," says Tom McKenzie, vice president of Boston-based Provant, which provides companies with training services and products to improve performance. "The money might go up or down, but training still goes on." People "just bury the costs," he says.
But for those HR professionals and training managers who aren't adept at hiding training in their pared-down budgets, now is the time of reckoning. In companies where training was considered to be a nice perk, like free doughnuts on Friday or on-site dry-cleaning service, programs are likely to be on the chopping block. Even in companies where training was viewed as very important, it might now be perceived as less important than other areas of the business.
If training programs are going to survive the budget knives during this economic downturn, they have to prove their worth. Did training bring the company new customers? Did it reduce turnover? Did it increase sales? And can HR actually show how training affects the bottom line?
Absolutely, says Richard Roth, managing director of Hackett Benchmarking and Research.
Companies that spend more than the average amount on training have a higher placement of internal hires, and that reduces, in real dollars, recruiting costs and downtime, he says.
"The other thing we're able to show is that companies that spend more on training have lower annual turnover."
Companies that spend $218 per employee in training and development have more than 16 percent annual voluntary turnover, Roth says. Companies that spend $273 per employee have less than 7 percent annual voluntary turnover. "To me, it's pretty compelling," he says.
Jack J. Phillips, an expert in measurement, evaluation, and return on investment, agrees with Roth about the opportunities that HR has to demonstrate the link between training and business results. "It can be done, and done accurately, but not as easily as some people would like," he says. "You can get that connection with credibility and accuracy, but it involves a disciplined methodology to do that." Here's a short course in how to do it.
The steps that show the bottom-line value of training
Determine how the training is connected to a business need. Too often, Phillips says, training is put in place without enough forethought. "Management says, 'I want these 10 core competencies in all our employees.' It may be the right thing to do, but we would like to precede that with an understanding of how it helps the business if we do that."
Make sure the program has clear objectives. Training programs should have a learning objective: some observable and measurable behavior at the end of the process, Phillips says. There are three types of learning objectives: awareness -- a familiarity with terms, concepts, and processes; knowledge -- a general understanding of concepts, processes, or procedures; and performance -- an ability to demonstrate skills on at least a basic level.
Good training should have two more objectives: application -- "What do we expect you to do differently?" -- and impact -- "What business measure will you drive if you do this?" Impact objectives are often such hard data as output, quality, cost, and time, Phillips says in Human Resources Scorecard: Measuring the Return on Investment (Butterworth-Heinemann, 2001), co-authored with Ron D. Stone and Patricia Pulliam Phillips. Soft-data impact objectives include customer service, work climate, and work habits. What Phillips often sees missing from training programs are the application and impact objectives.
NAPCO, a company whose case study Phillips uses in Return on Investment in Training and Performance Improvement Programs (Butterworth-Heinemann, 1997), used a needs assessment, application objectives, and impact objectives in its program, "Motivating Employees for Improved Performance."
At the time of the training, the company supplied the automotive industry with rubber and plastic parts. Now, as NAPCO International, it designs and markets replacement parts and upgrade kits for U.S.-made military wheeled and tracked vehicles, helicopters, and planes. At the time of the case study, NAPCO had declining sales that were traced, in part, to front-line management's lack of leadership skills. A needs assessment confirmed the problem through such hard- and soft-data indicators as percentage of shipments met or missed (productivity), parts rejected (quality), turnover, and absenteeism. The company could measure how effective the training was by monitoring changes in those areas.
Training focused on teaching managers how to understand and motivate employees for improved performance, inspire teamwork, and demonstrate leadership. The pilot class, a 24-hour, six-module program taught over the course of a month, asked participants to apply the skills they'd learned, so that training transferred immediately to the job.
Determine the return on investment. The idea of evaluating training on four levels -- reaction and implementation; learning; application; and impact -- was developed nearly 30 years ago by D. L. Kirkpatrick, Phillips notes. But even a program that had a measurable business result might have been delivered at too great an expense. A fifth element should be added, he says: return on investment (ROI).
The steps for determining the return on investment don't have to be complicated, he says.
Collect data to demonstrate the change in behavior. You need material to show you both the situation before training and the situation after training. This is another important area that is typically overlooked in evaluations, Phillips says.
The data you're collecting to reflect how training has changed a behavior can include surveys, questionnaires, on-the-job observation, post-program interviews, focus groups, performance monitoring, and performance contracts (in which a participant, the instructor, and the participant's supervisor agree on specific outcomes from training). The data collection at NAPCO came in the form of an action plan, designed to show how the new skills were applied to affect productivity, quality, turnover, and absenteeism. Data collection has to be "sensible," Phillips says.
"If it consumes too many resources, or busies the organization too much, sample a small number of participants, and keep the disruption, cost, and time to a minimum. But plan on that follow-up collection process, so that participants will know they're going to be involved."
Isolate the effect of training. The question that arises over and over again, Phillips says, is whether a change that's seen in an organization comes about because of training or other factors. It's the hardest thing to determine, but it's critical, he says.
"If we ignore this issue altogether," there's no point in pursuing an ROI process, he says. "The whole management team wonders, 'Was it the training? Or something else?'"
Phillips has identified 10 strategies that can be used to isolate the effect of training, including control groups, trend lines (to project the values of specific output variables if training had not been undertaken), and forecasting models. One simple measure, he says, is to ask the participants themselves how much of the change in their behavior is attributable to training. "They often know more about influences than we give them credit for," he says.
United Petroleum International used the participant rating method to evaluate the effect on its 117 sales engineers and 8 sales managers of an e-learning program designed to improve sales skills such as client partnerships, product pricing and contracting, and selling more profitable products. It was important to identify the training program's effect by itself, because the company had instituted a new incentive plan at about the same time.
Sales at the company increased after the training and incentive plans were put into place. The sales engineers and managers said that the training program was responsible for about 37 percent of the increase. The new incentive plan was responsible for about 36 percent. The effect of coaching by sales managers was put at 17 percent, with the remaining 10 percent of influence due to executive management's input, market changes, new products, and other factors. When the ROI calculation was made, the role of training in the sales increase was taken into account.
Convert the data to monetary value. This is often the part that is the most difficult for many HR pros and training managers, Phillips says. "Their training hasn't been in quantitative data and hard numbers," he says. One participant in a measurement and ROI workshop told Phillips that he'd gone into training specifically to escape the numbers. "That underscores the issue," Phillips says. "Many people are not willing, or don't have the desire, to deal with these issues, but we are forced to."
The steps in the conversion process are:
Focus on a unit of measure. It could be hard data, such as units produced, error rates, or overtime; or soft data, such as tardiness or requests for transfers.
Determine a value for the unit. This is more complex for soft data, but Phillips says it can be done. If there is a project on reducing turnover at a company, for instance, and there's not a standard acceptable value for the cost of a turnover, Phillips will offer several studies of the cost of turnover among engineers and ask the organization to consider what most reflects their experience. In United Petroleum's case, the measurement was hard data: additional sales closed.
Calculate the change in performance data. Figure the change in output data after the effects of training have been isolated from other influences. In United Petroleum's case, sales of petroleum products improved by 2.65 additional closes per month, but only 37 percent of that could be attributed to the effect of the sales-skills training program, leaving a factor of .98 closes per month. At the company's net profit margin per close, this amounted to $1,323 per month attributable just to the training program.
Determine an annual amount for the change. For United Petroleum, this came out to $15,876 per sales engineer.
Calculate the total value of the improvement. When the annual amount was multiplied by 117 sales engineers, United Petroleum saw an increase in sales of $1,857,492 per year, directly attributable to the training program.
Tabulate program costs. Phillips recommends that this be the "fully loaded cost of training." That is, the value of the cost of taking people away from their jobs for the training -- including salary and benefits. "That's what the company lost by not having people on the job. If they're willing to pay for the job, that should be our standard." For United Petroleum, the cost of the training program was $606,600.
Calculate the return on investment. It's the net benefits, divided by the costs, times 100 percent.
United Petroleum's ROI worked out this way: $1,857,000 (the sales increase, rounded) minus $606,600 (the cost of the program) yields the net benefit of the program ($1,250,400). That amount is divided by the cost, $606,600, and multiplied by 100 percent. The result is a 206 percent return on the investment in the training program for the company. Considering that most companies regard an ROI of 25 percent as a good outcome, this training program was an excellent one, according to Phillips.
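The arithmetic behind these steps can be sketched in a few lines. In this sketch, the $1,350 net profit per close is an assumption inferred from the case's own figures ($1,323 per month divided by .98 closes); it is not stated in the article. Every other number comes directly from the case study.

```python
# Sketch of the United Petroleum ROI chain, using the figures from the case.
# NOTE: profit_per_close is an inferred assumption ($1,323 / 0.98), not a
# number stated in the article; the other inputs are taken from the text.

closes_gained_per_month = 2.65   # additional closes per engineer per month
training_share = 0.37            # portion participants attributed to training
profit_per_close = 1350.00       # assumed net profit margin per close
engineers = 117
program_cost = 606_600           # fully loaded cost of the program

training_closes = round(closes_gained_per_month * training_share, 2)  # 0.98
monthly_value = training_closes * profit_per_close    # $1,323 per engineer
annual_value = monthly_value * 12                     # $15,876 per engineer
total_benefit = annual_value * engineers              # $1,857,492 total

net_benefit = total_benefit - program_cost
roi_percent = net_benefit / program_cost * 100        # about 206 percent

print(f"Total benefit: ${total_benefit:,.0f}")
print(f"ROI: {roi_percent:.0f}%")
```

Working from the unrounded $1,857,492 rather than the article's rounded $1,857,000 nudges the result only slightly; either way, the ROI rounds to 206 percent.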
Can this be done even when your program is targeted for elimination?
Phillips says he's often called in as a consultant to analyze programs that are on the chopping block. "We probably will uncover a lot of things that are not working -- programs that had no up-front analysis, no objectives, no plans for collection of data -- all the classic issues and problems, but they have now got to show value.
"We do the same process: collect data, isolate the training effects, capture the ROI. The problem is the program is often not adding the kind of value it should because it was flawed from the beginning. This analysis shows the problem. The good thing is you can learn a lot of lessons."
If there's time and commitment, some programs can be fixed, he says. Sometimes, though, a training program just doesn't have what it takes.
"Usually, if the perception is that it's not adding value, more than likely it's not," he says. "We have caused a lot of programs to die peaceful, graceful deaths with this process. We don't like to use it to kill programs, though. We like to see it as process improvement. We can adjust, change, and redirect a program so that it can be useful."
Employee Development Links
- Best Practices, LLC: Benchmarking Reports in Human Resources
- Includes summaries of several reports, including "Best Practices in Employee Performance Management and Development"
- AMA Survey and report - Facing the Future: Challenges and Dynamics for the Office Professional
- Secretaries' and managers' views of the skills needed in the secretarial role, and the difference in perception between the two groups
- Study: "The Effective Use of Multimedia Distance Learning Technology"
- Marcie A. Cavanaugh and George T. Milkovich, Cornell University, School of Industrial and Labor Relations
- Report: "Human Capital - A Self-Assessment Checklist for Agency Leaders"
- 44-page report from the General Accounting Office's Office of the Comptroller General, includes sections on employee development and creating a performance culture
Workforce, September 2001, pp. 52-56