Evaluating HRD Programs
CHAPTER 7
Werner & DeSimone
Effectiveness
The degree to which a training (or other HRD) program achieves its intended purpose
Measures are relative to some starting point
Measures how well the desired goal is achieved
HRD Evaluation
Textbook definition:
“The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.”
Werner & DeSimone (2006)
In Other Words…
Are we training:
the right people
the right “stuff”
the right way
with the right materials
at the right time?
Evaluation Needs
Descriptive and judgmental information is needed
Objective and subjective data
Information gathered according to a plan and in a desired format
Gathered to provide decision-making information
Purposes of Evaluation
Determine whether the program is meeting the intended objectives
Identify strengths and weaknesses
Determine cost-benefit ratio
Identify who benefited most or least
Determine future participants
Provide information for improving HRD programs
Purposes of Evaluation – 2
Reinforce major points to be made
Gather marketing information
Determine if training program is appropriate
Establish management database
Evaluation Bottom Line
Is HRD a revenue contributor or a revenue user?
Is HRD credible to line and upper-level managers?
Are benefits of HRD readily evident to all?
How Often are HRD Evaluations Conducted?
Not often enough!!!
Frequently, only end-of-course participant reactions are collected
Transfer to the workplace is evaluated less frequently
Why HRD Evaluations are Rare
Reluctance to have HRD programs evaluated
Evaluation requires expertise and resources
Factors other than HRD can cause performance improvements – e.g.:
Economy
Equipment
Policies, etc.
Need for HRD Evaluation
Shows the value of HRD
Provides metrics for HRD efficiency
Demonstrates a value-added approach for HRD
Demonstrates accountability for HRD activities
Everyone else has it… why not HRD?
Make or Buy Evaluation
“I bought it, therefore it is good.”
“Since it’s good, I don’t need to post-test.”
Who says it’s:
Appropriate?
Effective?
Timely?
Transferable to the workplace?
Evolution of Evaluation Efforts
1. Anecdotal approach – talk to other users
2. Try before buy – borrow and use samples
3. Analytical approach – match research data to training needs
4. Holistic approach – look at the overall HRD process, as well as individual training
Models and Frameworks of Evaluation
Table 7-1 lists six frameworks for evaluation
The most popular is that of D. Kirkpatrick:
Reaction
Learning
Job Behavior
Results
Kirkpatrick’s Four Levels
Reaction
Focus on trainees’ reactions
Learning
Did they learn what they were supposed to?
Job Behavior
Was it used on the job?
Results
Did it improve the organization’s effectiveness?
Issues Concerning Kirkpatrick’s Framework
Most organizations don’t evaluate at all four levels
Focuses only on post-training
Doesn’t treat inter-stage improvements
WHAT ARE YOUR THOUGHTS?
Other Frameworks/Models
CIPP: Context, Input, Process, Product (Galvin, 1983)
Brinkerhoff (1987):
Goal setting
Program design
Program implementation
Immediate outcomes
Usage outcomes
Impacts and worth
Other Frameworks/Models – 2
Kraiger, Ford, & Salas (1993):
Cognitive outcomes
Skill-based outcomes
Affective outcomes
Holton (1996): Five Categories:
Secondary Influences
Motivation Elements
Environmental Elements
Outcomes
Ability/Enabling Elements
Other Frameworks/Models – 3
Phillips (1996):
Reaction and Planned Action
Learning
Applied Learning on the Job
Business Results
ROI
A Suggested Framework – 1
Reaction
Did trainees like the training?
Did the training seem useful?
Learning
How much did they learn?
Behavior
What behavior change occurred?
A Suggested Framework – 2
Results
What were the tangible outcomes?
What was the return on investment (ROI)?
What was the contribution to the organization?
Data Collection for HRD Evaluation
Possible methods:
Interviews
Questionnaires
Direct observation
Written tests
Simulation/Performance tests
Archival performance information
Interviews
Advantages:
Flexible
Opportunity for clarification
Depth possible
Personal contact
Limitations:
High reactive effects
High cost
Face-to-face threat potential
Labor intensive
Trained observers needed
Questionnaires
Advantages:
Low cost to administer
Honesty increased
Anonymity possible
Respondent sets the pace
Variety of options
Limitations:
Possible inaccurate data
Response conditions not controlled
Respondents set varying paces
Uncontrolled return rate
Direct Observation
Advantages:
Nonthreatening
Excellent way to measure behavior change
Limitations:
Possibly disruptive
Reactive effects are possible
May be unreliable
Need trained observers
Written Tests
Advantages:
Low purchase cost
Readily scored
Quickly processed
Easily administered
Wide sampling possible
Limitations:
May be threatening
Possibly no relation to job performance
Measures only cognitive learning
Relies on norms
Concern for racial/ethnic bias
Simulation/Performance Tests
Advantages:
Reliable
Objective
Close relation to job performance
Includes cognitive, psychomotor, and affective domains
Limitations:
Time consuming
Simulations often difficult to create
High cost to develop and use
Archival Performance Data
Advantages:
Reliable
Objective
Job-based
Easy to review
Minimal reactive effects
Limitations:
Criteria for keeping/discarding records
Information system discrepancies
Indirect
Not always usable
Records prepared for other purposes
Choosing Data Collection Methods
Reliability
Consistency of results, and freedom from collection-method bias and error
Validity
Does the device measure what we want to measure?
Practicality
Does it make sense in terms of the resources used to get the data?
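Reliability in particular can be checked empirically rather than assumed. Below is a minimal sketch, in Python, of a test-retest reliability estimate: administer the same written test twice and correlate the two sets of scores. All scores are hypothetical, and this is one common option, not a method prescribed by the text.

```python
# Test-retest reliability sketch (hypothetical scores).
# A Pearson r near 1.0 suggests the instrument gives consistent results;
# a low r signals collection-method error or instability.
from statistics import correlation  # Python 3.10+

time1_scores = [72, 85, 64, 90, 78, 69]  # same trainees, first administration
time2_scores = [70, 88, 61, 93, 75, 72]  # second administration, weeks later

r = correlation(time1_scores, time2_scores)
print(f"Test-retest reliability estimate: r = {r:.2f}")
```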
Type of Data Used/Needed
Individual performance
Systemwide performance
Economic
Individual Performance Data
Individual knowledge
Individual behaviors
Examples:
Test scores
Performance quantity, quality, and timeliness
Attendance records
Attitudes
Systemwide Performance Data
Productivity
Scrap/rework rates
Customer satisfaction levels
On-time performance levels
Quality rates and improvement rates
Economic Data
Profits
Product liability claims
Avoidance of penalties
Market share
Competitive position
Return on investment (ROI)
Financial utility calculations
Use of Self-Report Data
Most common method
Pre-training and post-training data
Problems:
Mono-method bias
Desire to be consistent between tests
Socially desirable responses
Response shift bias: training changes trainees’ frame of reference, so pre- and post-training self-ratings are not made on the same internal scale
Research Design
Specifies in advance:
the expected results of the study
the methods of data collection to be used
how the data will be analyzed
Research Design Issues
Pretest and Posttest
Shows the trainee what training has accomplished
Helps eliminate pretest knowledge bias
Control Group
Compares the performance of the trained group against that of a similar group without training
Recommended Research Design
Pretest and posttest with control group
Whenever possible:
Randomly assign individuals to the test group and the control group to minimize bias
Use a “time-series” approach to data collection to verify that performance improvement is due to training
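One simple way to use this design: compare the trained group’s pretest-to-posttest gain against the control group’s gain, so that improvement caused by outside factors (economy, equipment, policies) is netted out. A minimal sketch with hypothetical scores; the arithmetic is illustrative, not a procedure prescribed by the text.

```python
# Pretest/posttest control-group sketch (hypothetical scores).
# The control group's gain estimates improvement NOT caused by training.
def mean(xs):
    return sum(xs) / len(xs)

trained_pre,  trained_post = [60, 55, 70, 65], [80, 78, 88, 82]
control_pre,  control_post = [62, 58, 69, 66], [66, 60, 72, 70]

trained_gain = mean(trained_post) - mean(trained_pre)   # 19.5
control_gain = mean(control_post) - mean(control_pre)   #  3.25

# Estimated training effect: gain beyond what the untrained group achieved.
effect = trained_gain - control_gain
print(f"Trained gain {trained_gain:.2f}, control gain {control_gain:.2f}, "
      f"estimated training effect {effect:.2f} points")
```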
Ethical Issues Concerning Evaluation Research
Confidentiality
Informed consent
Withholding training from control groups
Use of deception
Pressure to produce positive results
Assessing the Impact of HRD
Money is the language of business.
You MUST talk dollars, not HRD jargon.
No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.”
HRD Program Assessment
HRD programs and training are investments
Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers
You must prove your worth to the organization –
Or you’ll have to find another organization…
Two Basic Methods for Assessing Financial Impact
Evaluation of training costs
Utility analysis
Evaluation of Training Costs
Cost-benefit analysis
Compares the cost of training to the benefits gained, such as improved attitudes, reductions in accidents, reductions in employee sick days, etc.
Cost-effectiveness analysis
Focuses on increases in quality, reduction in scrap/rework, productivity, etc.
Return on Investment
Return on investment = Results/Costs
Calculating Training Return On Investment
Operational results area: Quality of panels
How measured: % rejected
Results before training: 2% rejected (1,440 panels per day)
Results after training: 1.5% rejected (1,080 panels per day)
Difference (+ or –): .5% (360 panels per day)
Expressed in $: $720 per day; $172,800 per year

Operational results area: Housekeeping
How measured: Visual inspection using a 20-item checklist
Results before training: 10 defects (average)
Results after training: 2 defects (average)
Difference (+ or –): 8 defects
Expressed in $: Not measurable in $

Operational results area: Preventable accidents
How measured: Number of accidents
Results before training: 24 per year
Results after training: 16 per year
Difference (+ or –): 8 per year
How measured: Direct cost of the accidents
Results before training: $144,000 per year
Results after training: $96,000 per year
Difference (+ or –): $48,000 per year
Expressed in $: $48,000 per year

Total savings: $220,800.00

ROI = Return / Investment = Operational results / Training costs
    = $220,800 / $32,564
    = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
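The arithmetic above is easy to reproduce. A minimal Python sketch using the figures from the table; the housekeeping improvement is excluded from the total because it was not measurable in dollars.

```python
# ROI sketch reproducing the Robinson & Robinson (1989) table above.
panel_savings    = 172_800  # quality of panels: $720/day, expressed per year
accident_savings =  48_000  # preventable accidents: $144,000 - $96,000 per year
# Housekeeping improved (10 -> 2 defects) but was not measurable in dollars.

total_savings  = panel_savings + accident_savings  # $220,800
training_costs = 32_564

roi = total_savings / training_costs
print(f"ROI = ${total_savings:,} / ${training_costs:,} = {roi:.1f}")  # -> 6.8
```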
Types of Training Costs
Direct costs
Indirect costs
Development costs
Overhead costs
Compensation for participants
Direct Costs
Instructor
Base pay
Fringe benefits
Travel and per diem
Materials
Classroom and audiovisual equipment
Travel
Food and refreshments
Indirect Costs
Training management
Clerical/Administrative
Postal/shipping, telephone, computers, etc.
Pre- and post-learning materials
Other overhead costs
Development Costs
Fee to purchase program
Costs to tailor program to organization
Instructor training costs
Overhead Costs
General organization support
Top management participation
Utilities, facilities
General and administrative costs, such as HRM
Compensation for Participants
Participants’ salary and benefits for time away from the job
Travel, lodging, and per-diem costs
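Summing the five categories gives the training-cost figure that serves as the ROI denominator. A minimal sketch; every line item below is hypothetical (the text does not break down the $32,564 used in the earlier example), chosen only so the roll-up matches that total.

```python
# Hypothetical roll-up of the five training cost categories.
costs = {
    "direct":        12_000,  # instructor pay/benefits, materials, travel, food
    "indirect":       4_500,  # training management, clerical/administrative
    "development":    8_000,  # program purchase, tailoring, instructor training
    "overhead":       3_000,  # facilities, utilities, general support
    "compensation":   5_064,  # participants' salary/benefits while training
}
total_training_cost = sum(costs.values())
print(f"Total training cost: ${total_training_cost:,}")  # -> $32,564
```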
Measuring Benefits
Change in quality per unit, measured in dollars
Reduction in scrap/rework, measured in the dollar cost of labor and materials
Reduction in preventable accidents, measured in dollars
ROI = Benefits / Training costs
Utility Analysis
Uses a statistical approach to support claims of training effectiveness:
N = number of trainees
T = length of time benefits are expected to last
dt = true performance difference resulting from training (in standard deviation units)
SDy = dollar value of untrained job performance (in standard deviation units)
C = cost of training
U = (N)(T)(dt)(SDy) – C
Critical Information for Utility Analysis
dt = difference in units produced between trained and untrained workers, divided by the standard deviation in units produced by the trained group
SDy = standard deviation of job performance in dollars, reflecting the overall productivity of the organization
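Putting the formula and these definitions together, a minimal sketch of a utility calculation; every input value below is hypothetical.

```python
# Utility analysis sketch: U = (N)(T)(dt)(SDy) - C  (hypothetical inputs).
N   = 50       # number of trainees
T   = 2.0      # years the benefits are expected to last
C   = 40_000   # total cost of training, in dollars
SDy = 10_000   # SD of job performance in dollars

# dt: trained/untrained difference in units produced, divided by the
# standard deviation in units produced by the trained group.
trained_mean, untrained_mean = 520.0, 480.0  # units per period
trained_sd = 80.0
dt = (trained_mean - untrained_mean) / trained_sd  # = 0.5

U = N * T * dt * SDy - C
print(f"Estimated utility: ${U:,.0f}")  # -> $460,000
```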
Ways to Improve HRD Assessment
Walk the walk, talk the talk: MONEY
Involve HRD in strategic planning
Involve management in HRD planning and estimation efforts
Gain mutual ownership
Use credible and conservative estimates
Share credit for successes and blame for failures
HRD Evaluation Steps
1. Analyze needs.
2. Determine explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute evaluation strategy.
Summary
Training results must be measured against costs
Training must contribute to the “bottom line”
HRD must justify itself repeatedly as a revenue enhancer