Evaluation of Training
Chapter 4
Effectiveness
A relative term
Effectiveness is determined with respect to the achievement of a goal or a set of goals, and must be judged against the goals of the specific program or programs being examined.
Training Effectiveness
Training effectiveness refers to the benefits that the company and the trainees receive from training. Training outcomes, or criteria, refer to the measures that the trainer and the company use to evaluate training programs.
Effectiveness of Training Programs
A T&D program can be effective in meeting some goals (e.g., staying within budget, increasing a participant's skills) and ineffective in meeting others (e.g., improving customer satisfaction).
How do you ensure effectiveness?
Training and HRD Process
Assessment: assess needs; prioritize needs
Design: define objectives; develop lesson plan; develop/acquire materials; select trainer/leader; select methods and techniques; schedule the program/intervention
Implementation: deliver the HRD program or intervention
Evaluation: select evaluation criteria; determine evaluation design; conduct evaluation of the program or intervention; interpret results
Purpose of HRD Evaluation
HRD evaluation:
The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.
HRD Evaluation
Both descriptive and judgmental information may be collected
Descriptive information provides a picture of what is happening or has happened. Judgmental information communicates some opinion or belief about what has happened.
HRD Evaluation
Evaluation involves the systematic collection of information according to a predetermined plan, which helps ensure that the information is appropriate and useful.
Evaluation is conducted to help make informed decisions about particular programs and methods
HRD Evaluation Can Help
Determine whether a program is accomplishing its objectives
Identify the strengths and weaknesses of HRD programs
Determine the cost-benefit ratio of an HRD program
Decide who should participate in future HRD programs
Identify which participants benefited the most or least from the program
Gather data to assist in marketing future programs
Establish a database to assist management in making decisions
Training Evaluation
Training evaluation refers to the process of collecting the outcomes needed to determine whether training is effective. Evaluation design refers to from whom, what, when, and how the information needed to determine the effectiveness of the training program will be collected.
Reasons for Evaluating Training
Companies are investing millions of dollars in training programs to help gain a competitive advantage. Training investment is increasing because learning creates knowledge, and that knowledge differentiates the companies and employees that are successful from those that are not.
Training evaluation provides the data needed to demonstrate that training does provide benefits to the company.
Formative Evaluation
Formative evaluation: evaluation conducted to improve the training process. It helps to ensure that:
the training program is well organized and runs smoothly
trainees learn and are satisfied with the program
It also provides information about how to make the program better.
Summative Evaluation
Summative evaluation: evaluation conducted to determine the extent to which trainees have changed as a result of participating in the training program. It may also measure the return on investment (ROI) that the company receives from the training program.
Why Should a Training Program Be Evaluated?
To identify the program's strengths and weaknesses
To assess whether the content, organization, and administration of the program contribute to learning and to the use of training content on the job
To identify which trainees benefited most or least from the program
To gather data to assist in marketing training programs
To determine the financial benefits and costs of the programs
To compare the costs and benefits of training versus non-training investments
To compare the costs and benefits of different training programs in order to choose the best program
The T&D Evaluation Process
Conduct a Needs Analysis
Develop Measurable Learning Outcomes and Analyze Transfer of Training
Develop Outcome Measures
Choose an Evaluation Strategy
Plan and Execute the Evaluation
Evaluation Frameworks

Model: Kirkpatrick (1967, 1987, 1994)
Training evaluation criteria: four levels: Reaction, Learning, Job Behavior, Results

Model: CIPP (Galvin, 1983)
Training evaluation criteria: four levels: Context, Input, Process, Product

Model: Brinkerhoff (1987)
Training evaluation criteria: six stages: Goal Setting, Program Design, Program Implementation, Immediate Outcomes, Intermediate or Usage Outcomes, Impacts and Worth

Model: Kraiger, Ford, & Salas (1993)
Training evaluation criteria: a classification scheme that specifies three categories of learning outcomes (cognitive, skill-based, affective) and the evaluation measures appropriate for each category

Model: Holton (1996)
Training evaluation criteria: identifies five categories of variables and the relationships among them: Secondary Influences, Motivation Elements, Environmental Elements, Outcomes, Ability/Enabling Elements

Model: Phillips (1996)
Training evaluation criteria: five levels: Reaction and Planned Action, Learning, Applied Learning on the Job, Business Results, Return on Investment
Saratoga Institute Approach
1. Training satisfaction: the degree to which participants are satisfied with the training they have received
2. Learning change: the actual learning that has occurred, measured with pre- and post-course instruments
3. Behavior change: the on-the-job change in behavior as a result of the training program
4. Organizational change: did the change in behavior positively affect the organization?
Kirkpatrick's Framework
Reaction: Did trainees like the program? Did they think it was valuable?
Learning: Did they learn what the objectives said they should learn?
Job Behavior: Did they use what they learned back on the job?
Results: Has HRD improved the organization's effectiveness?
Kirkpatrick and Industry
Most organizations do not collect information on all four types of outcomes. About one-third of organizations use Kirkpatrick's model. Some feel it only measures outcomes after training; others feel it is more of a taxonomy of outcomes.
Brinkerhoff's Six Stages
Goal Setting:
What is the need?
Program Design:
What will work to meet the need?
Program Implementation:
Is it working? (The focus is on the implementation of the program.)
Immediate Outcomes:
Did participants learn?
Intermediate or Usage Outcomes:
Are the participants using what they learned?
Impacts and Worth:
Did it make a worthwhile difference to the organization?
Classification of Learning Outcomes

Cognitive outcomes
Verbal knowledge (declarative knowledge). Focus of measurement: amount of knowledge; accuracy of recall; speed and accessibility of knowledge. Potential evaluation methods: recognition and recall tests; power tests; speed tests.
Knowledge organization (mental models). Focus of measurement: similarity to ideal; interrelationships of elements. Potential evaluation methods: free sorts; structural assessment.
Cognitive strategies (self-insight, metacognitive skills). Focus of measurement: self-awareness; self-regulation. Potential evaluation methods: probed protocol analysis; self-report; readiness for testing.

Skill-based outcomes
Compilation (composition, proceduralization). Focus of measurement: speed; fluidity of performance; error rates; chunking; generalization; discrimination; strengthening. Potential evaluation methods: targeted behavioral observation; hands-on testing; structured situational interviews.
Automaticity (automatic processing, tuning). Focus of measurement: attentional requirements; available cognitive resources. Potential evaluation methods: secondary task performance; interference problems; embedded measurement.

Affective outcomes
Attitudinal (targeted object). Focus of measurement: attitude direction; attitude strength; accessibility; centrality; conviction. Potential evaluation methods: self-report measures.
Motivation (motivational disposition). Focus of measurement: mastery versus performance orientations; appropriateness of orientation. Potential evaluation methods: self-report measures.
Motivation (self-efficacy; goal setting). Focus of measurement: perceived performance capability; level of goals; complexity of goal structures; goal commitment. Potential evaluation methods: self-report measures; free recall measures; free sorts.

SOURCE: Kraiger, K., Ford, J. K., & Salas, E. (1993). Application of cognitive, skill-based, and affective theories of learning outcomes to new methods of training evaluation. Journal of Applied Psychology, 78, Table 1, 323. Copyright 1993 by the American Psychological Association. Adapted with permission.
Shortcomings of Kirkpatrick's Model of Training Evaluation
Lack of explicit causal relationships among the different levels
Lack of specificity in dealing with different types of learning outcomes
Lack of direction concerning which measures are appropriate for assessing which outcomes
Expanded Framework of Kirkpatrick Model

Reaction
Perceived usefulness/utility: What was the perceived relevance/usefulness of this training?
Post-training attitudes: How well did trainees like the training?

Learning
Cognitive learning: How much did trainees learn from the training?
Post-training learning: How much learning does the trainee demonstrate immediately after training?
Retention: How much learning does the trainee demonstrate back on the job?

Behavior
Behavior change: What behavior change occurred as a result of training?
Training performance: How well can trainees demonstrate the newly acquired skills at the end of training?
Transfer performance: How well can trainees demonstrate the newly acquired skills back on the job?

Results
What tangible outcomes or results occurred as a result of training?
What was the return on investment (ROI) for this training? (See the ROI and utility sections below; this is Phillips' Level 5.)
What was the contribution of this training program to the community or larger society?
Outcomes Used in Evaluating Training Programs
Cognitive Outcomes
Skill-Based Outcomes
Affective Outcomes
Results
Return on Investment
Outcomes Used in Evaluating Training Programs:
Cognitive Outcomes
Determine the degree to which trainees are familiar with the principles, facts, techniques, procedures, or processes emphasized in the training program. Measure what knowledge trainees learned in the program.
Skill-Based Outcomes
Assess the level of technical or motor skills. Include both the acquisition or learning of skills and the use of skills on the job.
Outcomes Used in Evaluating Training Programs:
Affective Outcomes
Include attitudes and motivation, as well as trainees' perceptions of the program, including the facilities, trainers, and content.
Results
Determine the training program's payoff for the company.
Outcomes Used in Evaluating Training Programs:
Return on Investment (ROI)
Comparing the training's monetary benefits with the cost of the training:
direct costs
indirect costs
benefits
Good Outcomes: Relevance
Criteria relevance: the extent to which training outcomes are related to the learned capabilities emphasized in the training program
Criterion contamination: the extent to which training outcomes measure inappropriate capabilities or are affected by extraneous conditions
Criterion deficiency: the failure to measure training outcomes that were emphasized in the training objectives
Criterion deficiency, relevance, and contamination (figure): outcomes measured in evaluation are compared with outcomes identified by the needs assessment and included in the training objectives. Outcomes that appear in both represent relevance; outcomes measured but not related to the training objectives represent contamination; outcomes in the training objectives that are not measured represent deficiency.
Good Outcomes (continued)
Reliability: the degree to which outcomes can be measured consistently over time
Discrimination: the degree to which trainees' performance on the outcome actually reflects true differences in performance
Practicality: the ease with which the outcome measures can be collected
Training Program Objectives and Their Implications for Evaluation

Objective: Learning
Outcomes:
Reactions: Did trainees like the program? Did the environment help learning? Was the material meaningful?
Cognitive: pencil-and-paper tests
Skill-based: performance on a work sample
Affective: trainees' motivation or job attitudes

Objective: Transfer
Outcomes:
Skill-based: ratings by peers or managers based on observation of behavior; performance on work equipment
Results: Did the company benefit through sales, quality, productivity, reduced accidents, and complaints?
A Stakeholder Scorecard for Training Evaluation (figure): for each stakeholder group surrounding the training effort (senior management, trainees, trainees' managers, and trainers), the scorecard lists several measures of that group's contributions to the training and of the inducements it receives from it.
Major Goal of Training Evaluation
Treating HRD programs as investments that lead to measurable payoffs in the future
Two practical methods:
Evaluation of training costs, including return on investment (ROI)
Utility analysis
Types of Cost Analysis
Cost-benefit analysis: comparing the monetary costs of training to the benefits received in non-monetary terms, such as improvements in attitudes, safety, and health
Cost-effectiveness analysis: comparing the monetary costs of training to the financial benefits accrued from training, such as increases in quality and profits or reductions in waste and processing time
Return on Investment (ROI)
The most common business ratio for determining performance:
ROI = Results / Training Costs
If ROI < 1, the training costs more than the benefits it produces.
If ROI > 1, the benefits exceed the costs.
The greater the ratio, the greater the benefit.
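For example, using purely hypothetical figures: a program that produces $120,000 in measured results and costs $80,000 to deliver has ROI = 120,000 / 80,000 = 1.5, meaning every dollar invested returns $1.50 in results.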
Types of Costs in Training
Direct Costs
Directly associated with the delivery of learning activities:
Course materials (reproduced or purchased)
Instructional aids
Equipment rental
Travel
Food
Instructor's salary and benefits
Types of Costs in Training
Indirect Costs
Incurred in support of learning activities, but not directly tied to delivery:
Instructor preparation
Clerical and administrative support
Course materials already distributed (and therefore not recoverable if the program is cancelled)
Marketing the program
Types of Costs in Training
Development Costs
Development of videotapes, DVDs, or computer-based instruction (CBI)
Design of program materials
Piloting the program
Any necessary redesign after piloting
Types of Costs in Training
Overhead Costs
Not related directly to any one training program, but essential to the overall training effort:
Maintaining equipment
Heat and light
Cost of dedicated resources not in use for a specific program
Types of Costs in Training
Compensation for Participants
Salaries and benefits paid to participants for their time in the training program
If individual data are not available, HR should provide an average for all participants
Increasing the Credibility of ROI Estimates
Use conservative cost estimates (err on the high side)
Find reliable sources for estimates
Explain all assumptions and techniques used to calculate costs
Rely on hard data whenever possible
Use the stakeholder scorecard shown earlier
Training Cost Analysis
Calculate direct costs
Calculate indirect costs
Calculate development costs
Determine overhead costs
Determine compensation for participants
Sum the total costs
Divide by the number of trainees to get the cost per participant
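As a rough illustration of these steps, the following Python sketch (hypothetical cost figures and category labels, not taken from the text) sums the five cost categories and computes the cost per participant:

# Hypothetical training cost analysis: sum the cost categories described
# above and divide by the number of trainees to get cost per participant.
costs = {
    "direct": 23_000,        # materials, instructional aids, travel, food, instructor salary
    "indirect": 6_500,       # instructor prep, clerical/admin support, marketing
    "development": 12_000,   # program design, piloting, post-pilot redesign
    "overhead": 3_000,       # equipment maintenance, heat, light
    "compensation": 35_500,  # participants' salaries and benefits during training
}

total_cost = sum(costs.values())                    # sum the total costs
num_trainees = 40
cost_per_participant = total_cost / num_trainees    # cost per participant

print(f"Total training cost: ${total_cost:,.2f}")
print(f"Cost per participant: ${cost_per_participant:,.2f}")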
Calculating ROI on Training
ROI = Return / Investment = Operational Results / Training Costs
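Continuing the same hypothetical figures used above, a minimal ROI calculation divides the dollar value of operational results by total training costs:

# Hypothetical ROI calculation using the cost total from the sketch above.
operational_results = 120_000  # assumed dollar value of measured operational results
total_cost = 80_000            # total training costs (sum of the five cost categories)

roi = operational_results / total_cost
print(f"ROI = {roi:.2f}")      # values above 1 mean benefits exceed costs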
Goal of Using Cost-Benefit Analysis
Put HRD on an equal footing with other managers:
The language of business is money
Results must be quantifiable
Results need to be expressed statistically
HRD needs to:
Demonstrate the expected gains of HRD programs
Compete with the needs of other managers for equipment, facilities, personnel, etc.
Increasing Managerial Acceptance of Utility Analysis for Training
Involve senior management in determining the utility model and procedures to be used
Train HR professionals and managers in the details of utility analysis
Offer an explanation of the components of the utility model
Focus on utility information as a communication tool to aid in decision making
Involve management in arriving at estimates
Use credible and conservative estimates
Admit that the results of utility analysis are often based on fallible but reasonable estimates
Use utility analysis to compare alternatives, rather than to justify individual programs
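The text does not present a specific utility formula here; as an illustration only, the sketch below uses the Brogden-Cronbach-Gleser utility model commonly cited in HRD texts, with all input values assumed rather than drawn from this chapter:

# Sketch of the Brogden-Cronbach-Gleser utility model often used in HRD texts
# to estimate the dollar value of a training program:
#   delta_U = N * T * d_t * SD_y - N * C
# All figures below are hypothetical assumptions, not from the text.
N = 40          # number of employees trained
T = 2.0         # expected duration (in years) of the training effect
d_t = 0.5       # true effect size of training on job performance (in SD units)
SD_y = 10_000   # dollar value of one standard deviation of job performance
C = 2_000       # cost of training one employee

delta_U = (N * T * d_t * SD_y) - (N * C)
print(f"Estimated utility (dollar gain): ${delta_U:,.2f}")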
Impediments to Effective Training
There are many impediments that can make a training program ineffective:
Management commitment may be lacking or uneven
Educational institutions award degrees, but graduates lack skills
Aggregate spending on training is inadequate
Large-scale poaching of trained workers
No help for workers displaced because of downsizing
Employers and business schools must develop closer ties
Organized labor can help
Fig: Model for Feedback on Training. The model traces the employee from measured performance on the present job, through identification for training and completion of the training program, to measurement of training performance, and then to measurement of actual performance and stability in the new assignment.
Factors That Influence the Type of Evaluation Design
Change potential: Can the program be modified?
Importance: Does ineffective training affect customer service, product development, or relationships between employees?
Scale: How many trainees are involved?
Purpose of training: Is training conducted for learning, results, or both?
Organization culture: Is demonstrating results part of company norms and expectations?
Expertise: Can a complex study be analyzed?
Cost: Is evaluation too expensive?
Time frame: When do we need the information?