ABET Lectures
[Figure: The IE continuous improvement process. Constituents' expectations/requirements and ABET criteria are inputs to assessment data collection and analysis and to curriculum design/revision/updates via faculty, department, and college; the resulting outcomes (product/service) are evaluated by constituents and administration/faculty. The PDCA (Plan-Do-Check-Act) cycle drives continuous improvement over time against inertia, complexity, and resistance.]
[Figure: Two feedback loops for continuous improvement. The university mission, college mission and strategic plan, and program mission drive the program educational objectives; assessment (tools, data, analysis; rubrics; educational strategies, timeline, and C/E diagram; evidence collection and analysis) feeds evaluation (interpretation of data and actions), which returns feedback and actions to constituents for continuous improvement.]
Continuous improvement plan status (each category below carries a RATING):

Stakeholder Involvement (those who have a vested interest in program success):
- Stakeholders are identified
- Primary stakeholders are involved in identifying educational objectives
- Primary stakeholders are involved in periodic evaluation of educational objectives
- Sustained partnerships with stakeholders are developed

Performance Objectives (graduates' performance 3-5 years after completing program):
- Objectives are defined
- Stakeholders provide input to development of objectives
- Number of objectives is manageable
- Objectives are periodically assessed
- Objectives are periodically evaluated for relevancy

Learning Outcomes (desired knowledge, skills, attitudes, behaviors at graduation):
- Outcomes are identified
- Number of outcomes is manageable
- Outcomes aligned with educational practice
- Practices/strategies are systematically evaluated using assessment data
- Educational practices are modified based on evaluation of assessment data
- Outcomes are publicly documented

Program and/or Institutional Assessment:
- Assessment is systematic at the program/institutional level
- Evaluation of results is done by those who can effect change
- Evaluation of assessment data is linked to practices
- Assessment processes are reviewed for effectiveness and efficiency
- Evaluation leads to action
- Assessment methods are modified based on evaluation processes

Evaluation (RATING)
IE PROGRAM OBJECTIVES vs. University Mission Elements
(M1 Broad Education; M2 BS Programs in Arts, Sci., etc.; M3 BS Programs in Educ., Eng., etc.; M4 Grad. Programs; M5 Dist. Educ.; M6 Farm; M7 Scholarly Activities; M8 Student Services; M9 Resource to WI)

Objective                                        M1   M2   M3   M4   M5   M6   M7   M8   M9
1 Foundation                                     Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes
2 Team Skills                                    Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes
3 Ethical, Professional, Social, Global Issues   Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes
4 Solve Problems                                 Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes
5 Professional Growth                            Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes
IE PROGRAM OBJECTIVES vs. College Strategic Plan Themes
(T1 Quality Education; T2 Culture; T3 Assessment; T4 Outreach; T5 Faculty; T6 Funds)

Objective                                        T1   T2   T3   T4   T5   T6
1 Foundation                                     Yes  N/A  Yes  N/A  Yes  Yes
2 Team Skills                                    Yes  N/A  Yes  N/A  Yes  Yes
3 Ethical, Professional, Social, Global Issues   Yes  N/A  Yes  N/A  Yes  Yes
4 Solve Problems                                 Yes  N/A  Yes  N/A  Yes  Yes
5 Professional Growth                            Yes  N/A  Yes  N/A  Yes  Yes
[Matrix relating IE program objectives (including 4 Solve Problems and 5 Professional Growth) to IE program outcomes (1 Foundation, 2 Communication, 3 Responsibility, 4 Problem Solving, 5 Growth).]
[Table: IE outcomes (1 Foundation, 2 Communication, 3 Responsibility, 4 Design, 5 Growth) covered in required courses (IE 2130, IE 3430, IE 3530, IE 3630, IE 4030, IE 4230, IE 4430, IE 4730, IE 4930, ME 3040) and elective courses (IE 4130, IE 4330, IE 4630, IE 4750, IE 4780, IE 4830, ME 4230).]
[Matrix: assessment tools (Alumni Survey, Employer Questionnaire, Employer Assessment of Academic Preparation, Student Portfolio, Direct Assessment of Course Activities by Students, Indirect Assessment of Course Activities by Faculty) mapped to outcomes 2 Communication, 3 Responsibility, 4 Problem Solving, and 5 Growth.]
Assessment Tools
[Table excerpt: ASSESSMENT TOOL; RESPONSIBILITY FOR ADMINISTRATION; SCHEDULE / FREQUENCY. Tools run from 1. Alumni Survey (Department Chair / IE Program Coordinator) through 11. College of EMS Advisory Board & Alumni Board (Department Chair / IE Program Coordinator / IE Faculty Volunteer); the full table appears in Section 7.]
[Slides: Rubrics; Rubrics IE Program at UW-P; Rubrics IE Program at UW-P (continued); Acceptable Performance; Rubrics: Advantages.]
[Charts: planned use of assessment data from rubrics. Percent Acceptable or Exceptional per metric (Metric 1, Metric 2, Metric 3) is plotted against an acceptable performance band of 60%-100% for IE 2130 Intro to IE Outcome (g) and IE 4930 Capstone Design Outcome (g), tracked across the 2006-2007 through 2010-2011 academic years.]
Score interval   Frequency f   Midpoint m   f*m
[0,4]            2             2            4
[5,6]            5             5.5          27.5
[7,11]           41            9            369
12               20            12           240
Total            68                         640.5

Mean = 640.5 / 68 = 9.42
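As a minimal Python sketch of the same arithmetic (this is an illustration, not part of the original manual; the values are taken from the table above), the grouped mean is Sum(f*m) / Sum(f):

    # Grouped mean from the frequency table above: mean = sum(f*m) / sum(f).
    rows = [
        ("[0,4]", 2, 2.0),
        ("[5,6]", 5, 5.5),
        ("[7,11]", 41, 9.0),
        ("12", 20, 12.0),
    ]
    total_f = sum(f for _, f, _ in rows)        # 68
    total_fm = sum(f * m for _, f, m in rows)   # 640.5
    print(f"mean = {total_fm / total_f:.2f}")   # 9.42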
[Histogram: frequency of rubric scores, rated 1: Unacceptable, 2: Marginal, 3: Acceptable, 4: Exceptional.]
[Table excerpt: YEAR / POTENTIAL CAUSES / ACTION rows for 2000 and 2000 & 2001; see Table 7 in Section 8.2.]
References
1. ISO 9000:2005, Quality management systems - Fundamentals and vocabulary, International Organization for Standardization (ISO), 1, ch. de la Voie-Creuse, Case postale 56, CH-1211 Geneva 20, Switzerland.
Rubric metrics and weights: Apply Mathematics & Basic Science (W=1); Apply General Engineering Knowledge (W=1); Apply IE Fundamental Concepts (W=2).
Performance levels: Unacceptable (Score, S=0); Marginal (S=1); Acceptable (S=2); Exceptional (S=3).
Needs assistance
realizing that systems &
processes to be designed
or improved require a
sound foundation in
mathematics, physics,
chemistry, and biology.
Excels in applying
mathematical and/or
scientific principles to
design or improve
systems and processes.
Mathematical and
scientific terms are not
interpreted.
Mathematical and
scientific terms are
interpreted incorrectly.
Mathematical and
scientific terms are
interpreted correctly.
Excellent interpretation
of mathematical,
statistical, and scientific
terms.
Excels in using
mathematical, statistical,
or scientific theories and
concepts to solve
problems.
Modeling and
calculations done
incorrectly.
Modeling and
calculations have 3 or
more errors.
Modeling and
calculations have very
few errors.
Modeling and
calculations are done
correctly.
Needs assistance to
translate theories and
make realistic
assumptions to develop
models of systems and
processes.
Makes unrealistic
assumptions to develop
models of systems and
processes.
Accepts limitations of IE
& mathematical models
of systems and processes
& establishes validity of
models before using them
to make decisions.
Excels in applying
statistical techniques to
model, study, analyze,
design, or improve
systems.
Points (P): P = W*S. Performance bands: Unacceptable 0 ≤ TP ≤ 3; Marginal 4 ≤ TP ≤ 6; Acceptable 7 ≤ TP ≤ 11; Exceptional TP = 12.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
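To make the scoring arithmetic concrete, here is a minimal Python sketch (not part of the original manual; the metric keys and example S values are hypothetical) that computes P = W*S for each metric of this rubric, totals them, and maps the total TP to the bands listed above:

    # Rubric scoring: per-metric points are P = W * S; the total TP sets the band.
    weights = {"math_science": 1, "general_engineering": 1, "ie_fundamentals": 2}
    scores = {"math_science": 2, "general_engineering": 3, "ie_fundamentals": 2}  # hypothetical S in 0..3

    tp = sum(w * scores[metric] for metric, w in weights.items())

    def band(tp):
        # Band cutoffs for this rubric: 0<=TP<=3, 4<=TP<=6, 7<=TP<=11, TP=12.
        if tp <= 3:
            return "Unacceptable"
        if tp <= 6:
            return "Marginal"
        if tp <= 11:
            return "Acceptable"
        return "Exceptional"

    print(tp, band(tp))  # 9 Acceptable

The same pattern applies to every rubric that follows; only the weights and band cutoffs change.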
Rubric metrics and weights: Problem Recognition and Statement (W=1); Selection of Response Variable(s), Factors, Levels, and Ranges (W=1); Perform Experiments (W=2).
Performance levels: Unacceptable (Score, S=0); Marginal (S=1); Acceptable (S=2); Exceptional (S=3).
Exceptional: Excellent knowledge of system, inputs, outputs, noise factors, etc.
Marginal: Can identify one response variable and factors. Has difficulty in selecting factor levels. Has difficulty setting inputs or method of measurement of response.
Acceptable: Can identify one response variable and factors. Can choose factor levels to use. Knows about setting inputs or method of measurement of response.
Exceptional: Can identify multiple responses. Understands the system & can rank factors. Understands sequential experimentation and screening experiments to set factor levels. Can define accuracy and precision for all inputs & outputs.
Needs assistance to
choose the model to use.
Needs assistance to
determine the need for
blocking.
Excellent knowledge of
repetition or replication.
Needs assistance to
determine sample size.
Needs assistance to plan
experiments & collect
data.
Additional metrics and weights: Check Validity of Model & Apply Statistical Tools to Analyze Data (W=2); Make Statistical Inferences about Product or Process Design or Improvement (W=2).
Exceptional: Excels in tests of hypothesis or confidence interval estimation to make good improvements in product or process.
Points (P): P = W*S. Performance bands: Unacceptable 0 ≤ TP ≤ 9; Marginal 10 ≤ TP ≤ 18; Acceptable 19 ≤ TP ≤ 26; Exceptional 27 ≤ TP ≤ 30.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
Rubric metrics and weights: Design Strategy (W=1); System, Components, and Processes (W=1); Creativity & Innovation (W=2); Documentation & Resources Used (W=2).
Performance levels: Unacceptable (Score, S=0); Marginal (S=1); Acceptable (S=2); Exceptional (S=3).
Unacceptable (Design Strategy): No design strategy. Haphazard approach.
Develops a design
strategy, including a plan
of attack, decomposition
of work into subtasks,
and development of a
timeline using Gantt
chart. No changes needed
to the developed strategy.
Requires help in
developing alternative
designs.
Generates alternative
designs, evaluates these,
and selects the optimal
design.
Marginal: Design procedure requires corrections and references are inadequate.
Acceptable: Design procedure requires additions and references are incomplete.
Exceptional: Supports design procedure with documentation and references.
Applying Engineering
and Science
Knowledge (W=2)
Poor in applying
engineering and/or
scientific principles
correctly to design
practical components,
processes, or systems.
Fair in applying
engineering and/or
scientific principles
correctly to design
practical components,
processes, or systems.
Good at applying
engineering and/or
scientific principles
correctly to design
practical components,
processes, or systems.
Excellent in applying
engineering and/or
scientific principles
correctly to design
practical components,
processes, or systems.
Constraints Identified
and taken into
Account (W=2)
Points (P): P = W*S. Performance bands: Unacceptable 0 ≤ TP ≤ 9; Marginal 10 ≤ TP ≤ 18; Acceptable 19 ≤ TP ≤ 26; Exceptional 27 ≤ TP ≤ 30.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
Rubric metrics and weights: Take Responsibility (W=1); Contribution to Team Effort & Work (W=1); Knowledge of Other Disciplines (W=1).
Performance levels: Unacceptable (Score, S=0); Marginal (S=1); Acceptable (S=2); Exceptional (S=3).
Organized.
Well organized.
Remembers work to be done and due dates, but may forget a few.
Provides positive feedback sometimes. Values others' viewpoints sometimes.
Courteous and nonjudgmental always. Provides positive feedback when necessary. Values others' viewpoints almost always.
Has elementary knowledge of technical skills, issues, and approaches germane to disciplines outside of IE, but will augment when needed.
Participates in discussions, respects colleagues, makes significant contributions while discussing others' work, values others' viewpoints, & functions effectively as a team member.
Has very good knowledge of technical skills, issues, and approaches germane to disciplines outside of IE.
Points (P): P = W*S. Performance bands: Unacceptable 0 ≤ TP ≤ 3; Marginal 4 ≤ TP ≤ 6; Acceptable 7 ≤ TP ≤ 11; Exceptional TP = 12.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
Performance levels: Unacceptable (Score, S=0); Marginal (S=1); Acceptable (S=2); Exceptional (S=3).
Metric: Strategy (W=3).
Good at locating
resources, integrating
knowledge and
experience, and
formulating a good
strategy to solve
engineering problems.
Excellent in locating
resources, integrating
knowledge and
experience, and
formulating a good
strategy to solve
engineering problems.
Good in applying
multiple tools to solve
problems related to
synthesis of new systems.
Excels at applying
multiple tools to solve
problems related to
synthesis of new systems.
Needs significant
assistance to apply theory
and identify multiple
constraints.
Needs significant
assistance in generating
alternative solutions and
comparing them.
Fair at stating
assumptions.
Good at stating
assumptions.
Excels in stating
assumptions.
Marginal (Documentation): References are incomplete.
Additional metrics and weights: Tools Used (W=2); Solution Approach (W=2); Documentation (W=3).
Points (P): P = W*S. Performance bands: Unacceptable 0 ≤ TP ≤ 9; Marginal 10 ≤ TP ≤ 18; Acceptable 19 ≤ TP ≤ 26; Exceptional 27 ≤ TP ≤ 30.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
Unacceptable
(Score, S=0)
Marginal
(Score, S=1)
Acceptable
(Score, S=2)
Exceptional
(Score, S=3)
Needs assistance to
locate the code of
ethics of a
professional society.
Knows where to
access code of
ethics of at least 1
professional society.
Knows where to
access code of
ethics of 2 or more
professional
societies.
Knowledge of Professional Code of Ethics (W=1):
Marginal: Has read, but does not remember professional code of ethics.
Knowledge of
Theories of
Ethics (W=1)
No evidence of
valuing ethical
theories.
Ability to Recognize Ethical Dilemmas (W=1):
Marginal: Needs assistance to identify ethical dilemmas and to apply the code of ethics from professional societies and/or ethical theories.
Needs assistance to
analyze ethical
problems in case
studies.
Analyze Ethical
Problems in IE
Work and Make
Decisions (W=1)
Excellent
knowledge of many
theories of ethics.
Has ability to
analyze ethical
problems in IE work
through case
studies, but is not
interested.
Has demonstrated
good ability to
analyze ethical
problems in IE work
through case
studies.
Has demonstrated
excellent ability to
analyze ethical
problems in IE work
through case
studies.
Has generated excellent solutions and made sound decisions in the IE field.
Points (P): P = W*S. Performance bands: Unacceptable 0 ≤ TP ≤ 3; Marginal 4 ≤ TP ≤ 6; Acceptable 7 ≤ TP ≤ 11; Exceptional TP = 12.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
Delivery
&
Speaking Skills
(W=2)
Personal
Appearance &
Rapport with
Audience (W=1)
Unacceptable
(Score, S=0)
Marginal
(Score, S=1)
Acceptable
(Score, S=2)
Exceptional
(Score, S=3)
Presentation lacks
structure.
Marginal: Difficult to follow presentation due to erratic topical shifts and jumps.
Acceptable: Most information is presented in logical order and is easy to follow.
All information is
presented in a logical,
interesting and novel
sequence and is easy to
follow.
No grasp of content.
Needs assistance
answering questions
about subject.
Uncomfortable with
content. Capable of only
answering rudimentary
questions.
Demonstration of full
knowledge of the subject
with explanations and
elaboration.
No visual aids or
inadequate slides.
Minor misspellings
and/or grammatical
errors.
No spelling or grammatical errors.
Mumbling or incorrect
pronunciation of terms.
Voice level too low or
too high. Does not use
appropriate vocabulary.
Occasional
mispronunciation of
terms. Uses appropriate
vocabulary.
Monotonous, no eye
contact, rate of speech
too fast or too slow.
Points (P)
P = W*S
Appearance is good.
Appearance is
professional.
Needs assistance to
respond to questions and
comments.
Responds to questions
and comments, but is not
at ease or confident.
Responds to questions
and comments well.
Responds to questions
and comments
confidently.
Length is inappropriate.
Length is adequate.
Length is acceptable.
Length is appropriate.
Performance bands: Unacceptable 0 ≤ TP ≤ 9; Marginal 10 ≤ TP ≤ 18; Acceptable 19 ≤ TP ≤ 26; Exceptional 27 ≤ TP ≤ 30.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
Unacceptable
(Score, S=0)
Marginal
(Score, S=1)
Acceptable
(Score, S=2)
Exceptional
(Score, S=3)
Unacceptable: Sequence of information is difficult to follow. No apparent structure or continuity. Purpose of work is not clearly stated.
Marginal: Work is hard to follow as there is very little continuity. Purpose of work is stated, but does not assist in following work.
Information is
presented in a logical
manner, which is
easily followed.
Information is
presented in a logical,
interesting way, which
is easy to follow.
Purpose of work is
clearly stated and
assists the structure of
work.
Purpose is clearly
stated and explains the
structure of work.
Uncomfortable with
content of report.
(W=3)
No questions are
answered. No
interpretation made.
Demonstration of full
knowledge of the
subject with
explanations and
elaboration.
Format
&
Aesthetics
Work is illegible,
format changes
throughout, e.g. font
type, size etc.
Mostly consistent
format.
Format is generally
consistent including
heading styles and
captions.
Format is consistent
throughout including
heading styles and
captions.
Spelling
&
Grammar
(W=1)
Numerous spelling
and grammatical
errors.
Minor misspellings
and/or grammatical
errors.
Negligible
misspellings and/or
grammatical errors.
References (W=2):
Unacceptable: No list of references. Material used in text is not referenced.
Marginal: Inadequate list of references or referencing in text.
Acceptable: Reference section is not in correct format, but is sufficient.
Exceptional: Reference section is in correct format and comprehensive.
Content & Knowledge (W=1); Format & Aesthetics (W=2).
Points (P)
P = W*S
Performance bands: Unacceptable 0 ≤ TP ≤ 8; Marginal 9 ≤ TP ≤ 13; Acceptable 14 ≤ TP ≤ 20; Exceptional 21 ≤ TP ≤ 27.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
Unacceptable
(Score, S=0)
Marginal
(Score, S=1)
Acceptable
(Score, S=2)
Exceptional
(Score, S=3)
Familiarity with
Applications of IE
Tools, Methods &
Techniques in
Global and
Societal Context
(W=2)
Needs assistance in
locating resources
(libraries, websites,
journals, magazines,
etc) on 2 or more
applications.
Good at locating
resources (libraries,
websites, journals,
magazines, etc) on 2
or more applications.
Excels in locating
resources (libraries,
websites, journals,
magazines, etc) on 2
or more applications.
No evidence of
reading papers on
these applications.
Needs significant
assistance in applying
IE methods to analyze
global and social
issues.
Needs assistance to
review and write a
report on specific IE
methods applied to
analyze global and
social issues.
Understanding of
Impact of
Engineering
Solutions in
Global and
Societal Context
(W=1)
Has excellent
knowledge of 3 or
more international
standards that can
alleviate the adverse
impact of engineering
solutions in global and
societal context.
Needs assistance to
use a strategy to
harmonize standards
and management
systems for quality,
environment, social
responsibility, etc.
Can implement a
strategy for
harmonizing standards
and management
systems for quality,
environment, social
responsibility, etc.
Points (P)
P = W*S
Performance bands: Unacceptable 0 ≤ TP ≤ 5; Marginal 6 ≤ TP ≤ 10; Acceptable 11 ≤ TP ≤ 15; Exceptional 16 ≤ TP ≤ 18.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
Unacceptable
(Score, S=0)
Marginal
(Score, S=1)
Acceptable
(Score, S=2)
Exceptional
(Score, S=3)
Ability to Locate
and Use
Resources on the
Web (W=1)
Ability to Use
Reference Books,
Books,
Periodicals, and
Archives, & Inter-Library Loans in
Libraries (W=1)
Has demonstrated
excellent ability to
acquire books and
journal articles,
understand, interpret,
and apply current,
new, or innovative
concepts in IE and
related fields.
Ability to Locate
& Learn from
Recent
Publications in IE
(W=1)
Familiarity with
Services Provided
by Professional
Societies (W=1)
No evidence that
membership in
professional societies
is valued.
Plans to be a member
in 1 professional
society.
Member of 1
professional society.
Member of 2 or more
professional societies.
Not interested in
courses or resources
available at the
website for the
society.
Aware of courses on
the current topics and
resources available at
the website for the
society.
Points (P)
P = W*S
Performance bands: Unacceptable 0 ≤ TP ≤ 3; Marginal 4 ≤ TP ≤ 6; Acceptable 7 ≤ TP ≤ 11; Exceptional TP = 12.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
Knowledge of SOX and Impact on IE Profession (W=1):
Unacceptable (Score, S=0): Poor knowledge of Sarbanes-Oxley Act.
Marginal (Score, S=1): Fair knowledge of Sarbanes-Oxley Act, but is not aware of its impact on the IE profession.
Acceptable
(Score, S=2)
Exceptional
(Score, S=3)
Excellent knowledge
of Sarbanes-Oxley Act
and is well aware of its
impact on the IE
profession.
Knowledge of Job Market (W=1):
Unacceptable: Relies primarily on the Placement Services.
Marginal: Poor knowledge of job market and relies on the Placement Services.
Acceptable: Good knowledge of job market and is building a network to seek information.
Excellent knowledge
of job market and has
an excellent network
to seek information.
Poor preparation.
Good preparation
through books in
library & IE Office on
interviews, talks with
alumni, etc.
Excellent preparation
through books in
library & IE Office on
interviews, talks with
alumni, etc.
Knowledge of
Graduate School
& Related Topics
(W=1)
No evidence of
knowledge about
graduate school
programs.
Aware of graduate
programs in IE and
related fields.
Excellent knowledge
of graduate programs
in IE and related
fields.
Services to
Profession and
Society (W=1)
No evidence that
service to the
profession is valuable.
Has demonstrated
dedicated leadership
roles on campus and
may continue in
future.
Ability to Engage
in Conversation
about Political,
Economic,
National,
Regional, and
international
Events or Issues
(W=1)
No evidence of
interest in newspaper
or magazines.
Plans to read
newspaper or
magazines in future.
Reads newspaper or
magazines randomly.
Makes minor
contributions to
discussions.
Does contribute to
discussions.
Performance bands: Unacceptable 0 ≤ TP ≤ 6; Marginal 7 ≤ TP ≤ 11; Acceptable 12 ≤ TP ≤ 17; Exceptional 18 ≤ TP ≤ 21.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Reviewer/Assessor:
Points (P)
P = W*S
Date:
Unacceptable
(Score, S=0)
Marginal
(Score, S=1)
Apply Human
Factors
Engineering Skills
(W=1)
Good at applying
deterministic models.
Excellent in applying
deterministic models.
Good at applying
Markov processes and
queuing models.
Excellent in applying
Markov processes and
queuing models.
Excellent in applying
work measurement
techniques.
Excels in applying SPC, design of experiments, acceptance sampling, and standards.
Acceptable
(Score, S=2)
Exceptional
(Score, S=3)
Quality
Engineering &
Management
(W=1)
Lean
Manufacturing
(W=1)
Supply Chain
Management
(W=1)
Fair knowledge of
supply chain
management.
Good knowledge of
supply chain
management.
Excellent knowledge
of supply chain
management.
Facilities Design
(W=1)
Fair knowledge of
facility design.
Good knowledge of
facility design.
Excellent knowledge
of facility design.
Problem Def., Soln. Strategy, & Research (W=1):
Marginal: Fair in problem definition, solution, strategy, and research.
Acceptable: Good in problem definition, solution, strategy, and research.
Exceptional: Excellent in problem definition, solution, strategy, and research.
Points (P)
P = W*S
Performance bands: Unacceptable 0 ≤ TP ≤ 6; Marginal 7 ≤ TP ≤ 12; Acceptable 13 ≤ TP ≤ 17; Exceptional 18 ≤ TP ≤ 24.
The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:
IE Faculty
Dr. S. Balachandran
Dr. Jill Clough
Dr. Patricia Jinkins
Dr. Justin Kile
TABLE OF CONTENTS
1. BACKGROUND ..........................................................................................................1
2. OVERVIEW ..............................................................................................................2
3. MISSION STATEMENTS .............................................................................................9
3.1 University Mission Statement........................................................................................9
3.2 Mission Statement - College of Engineering, Mathematics and Science (EMS) ........10
3.3 Mission Statement - IE Program .................................................................................10
4. IE PROGRAM OBJECTIVES ......................................................................................10
5. OUTCOMES ............................................................................................................11
5.1 IE Program Outcomes..................................................................................................11
5.2 ABET, Inc. /EAC Outcomes........................................................................................12
5.3 Matrix Relating IE Outcomes and ABET, Inc. /EAC Outcomes ................................13
5.4 Matrix Relating IE Outcomes and IE Courses.............................................................13
6. IE PROGRAM CONSTITUENCIES .................................................................................15
7. IE PROGRAM ASSESSMENT TOOLS.......................................................................15
7.1 Tools to Collect Assessment Data about IE Program Objectives ................................21
7.1.1 Alumni Survey ........................................................................................................21
7.1.2 Employer Survey ......................................................................................................22
7.2 Tools to Collect Assessment Data about IE Program Outcomes.................................23
7.2.1 IE Graduate Exit Questionnaire ................................................................................23
7.2.2 Statistical Data ........................................................................................................24
7.2.3 Industrial Project Sponsor Survey ............................................................................25
7.2.4 Employer Assessment of Academic Preparation......................................................26
7.2.5 Student Portfolio .......................................................................................................27
7.3 Tools used by College and University ........................................................................27
7.4 Rubrics/Performance Criteria for Assessing & Evaluating Outcomes ........................28
8. ASSESSMENT DATA COLLECTION TIMELINE AND ANALYSIS .....................30
8.1 Assessment Data from Alumni Questionnaire & Analysis ........................................31
8.2 Assessment Data from Employer Questionnaire & Analysis ......................................35
8.3 Assessment Data from Employer Assessment of Academic Preparation & Analysis.38
8.4 Assessment Data from Industrial Project Sponsor Survey & Analysis .......................41
8.5 Assessment Data from Graduate Exit Survey & Analysis...........................................44
8.6 Cause-Effect Diagram..................................................................................................49
8.7 FE Examination Results...............................................................................................49
8.8 Past Assessment Reports..............................................................................................49
8.9 Additional Assessment Evaluations that are informal .................................................49
9. EVALUATION PROCESS AND PROGRAM IMPROVEMENT ..............................49
REFERENCES ............................................................................................................54
APPENDICES
Appendix A through Appendix L
1. BACKGROUND
Even before its first ABET/EAC accreditation in 1987, the industrial engineering
(IE) program at UW-Platteville (UW-P) had collected assessment information and data
from alumni, the College Industrial Advisory Board (IAB) and employers. IE faculty
evaluated and analyzed that assessment data annually at faculty meetings to improve
individual courses and also the entire curriculum. Changes in course titles, catalog
descriptions of courses, laboratory projects in the courses, topics covered in courses, etc.,
were driven by data from the assessment process. The evaluation process interpreted
assessment data to determine the extent to which educational objectives of the IE
program were being achieved and resulted in faculty decisions and actions to improve IE
courses and curriculum.
When the ABET/EAC Engineering Criteria 2000 were published, IE faculty
created a continuous improvement plan (CIP) that was summarized in the May 2000 Self-Study Report of the IE program. The CIP [1], shown in Figure 1, is based on ISO 9000
principles, mandates and ensures systematic review and improvement of every aspect of
the undergraduate education program, and also ensures institutional memory of
improvement activities through documentation requirements. The program implemented
the CIP in 1998 and documented its plan in the 1998 Industrial Engineering Assessment
Manual which was revised in October 2000. This manual outlined both the assessment
and evaluation processes. It identified multiple assessment tools to collect and prepare
data to evaluate the achievement of program outcomes and educational objectives. It
established a formal assessment plan to collect assessment data from students, alumni and
employers. This manual listed the assessment data collection methods, their frequency,
methods of analysis and interpretation of collected data to determine the degree to which
objectives and outcomes are being achieved. The May 2000 and May 2006 Self-Study
Reports of the IE program summarize how these assessment and evaluation processes
resulted in course and curriculum improvements. Fall 2000 and 2006 ABET/EAC
reviews of the IE program found assessment and evaluation processes to be satisfactory.
IE faculty implemented a CIP for the curriculum two years before the fall 2000
ABET/EAC general review. This CIP was originally outlined in the May 2000 Self-Study
Report of the IE program. The current Self-Study Report also identifies the following
elements of the CIP: constituencies, program objectives, program outcomes, multiple
assessment measures/metrics, and frequency of assessment activities, expected
performance levels, evaluation process, and curriculum improvement. This document
embellishes the CIP by providing additional details.
The key concept in the IE CIP is the process approach which facilitates attainment
of desired objectives and outcomes by managing activities and related resources as a
process. The "process approach" is a generic management principle, which can enhance
an organizations effectiveness and efficiency in achieving defined objectives and
outcomes. The continuous improvement process implemented by IE faculty is
characterized via the PDCA (Plan-Do-Check-Act) cycle [2, 10, 11] for continuous assessment,
evaluation and improvement of the program.
The PDCA cycle is an established, logical method that can be used to improve a
process. This requires:
(P) planning (what to do and how to do it),
(D) executing the plan (do what was planned),
(C) checking the results (did things happen according to plan), and
(A) acting to improve the process (how to improve next time).
The PDCA cycle can be applied within an individual process, or across a group of
processes. This IE Assessment and Evaluation Manual provides details about assessment,
evaluation, and improvement of the IE program using this cycle. Specifically, this manual
shows how each and every one of the multiple assessment tools was designed (P-Plan),
and administered (D-Do). In addition, the manual shows how the collected assessment
data will be interpreted (C-Check) and may lead to decisions (A-Act) by IE faculty to
improve courses and curriculum.
The ABET/EAC EC 2000 and PDCA cycle may be viewed together as depicted
in Figure 2 on the next page. In every system and process there is sufficient inertia to
let a status quo prevail and complacency to creep in. However, the PDCA cycle forces IE
faculty to strive for continuous improvement of the curriculum instead of attempting to
ride on past successes. The ABET/EAC EC 2000 assist the faculty in holding on to the
gains in achieving the desired objectives and outcomes and also in attempting to make
significant gains in the future.
The CIP, as envisaged and implemented by IE faculty, applies the PDCA cycle to
the process of curriculum improvement. It emphasizes periodic assessment and
evaluation of IE program constituencies, objectives, and outcomes. It also encompasses,
as outlined in the IE Program Self-Study Report, planning, collecting data using multiple
assessment tools, and summarizing assessment data so that assessment measures may be
compared with the target performance levels predefined by IE faculty. This manual
presents more details about the above parts of the CIP and also outlines the
mechanism that determines whether program changes are needed, and the process to
implement the necessary improvements. See Table 1 for the current status of the IE CIP.
2. OVERVIEW
The industrial engineering program assessment plan and evaluation procedures
were developed by industrial engineering faculty with input from alumni, employers,
national and local conferences, colleagues at other universities, published literature,
program evaluators and commissioners of ABET, Inc., and other interested parties. It
provides a basis for obtaining feedback on the program, its outcomes, and its objectives,
and using that feedback for making improvements. Figure 3 presents these two feedback
loops and loop interconnections in the IE continuous improvement plan. This document
describes the program assessment plan. It includes the objectives and outcomes for the
program, specifies assessment tools, and gives a timeline for assessing the program
throughout the academic year. The plan is designed to obtain input from all constituents
and to provide a structure for continuous improvement of both the program and the plan.
[Figure 2: EC2000 and Deming's wheel (P.D.C.A.). Constituents' expectations/requirements flow into assessment data collection and analysis and curriculum design/revision/updates via faculty, department, College Exec. Council & UUCC; the outcomes (product/service) are evaluated by constituents and administration/faculty. The PLAN-DO-CHECK-ACT wheel rolls toward continuous improvement against resistance/inertia.]
[Figure 3: Two feedback loops for continuous improvement. The UW-P mission, college mission and strategic plan, and program mission inform the program educational objectives (EAC/ABET); constituents provide feedback; assessment (evidence collection & analysis; rubrics; educational strategies, timeline & C/E diagram) feeds evaluation (evidence interpretation), which returns program evaluation & feedback for continuous improvement.]
[Table 1: Current status of the IE CIP. Each category of the continuous improvement plan (Stakeholder Involvement, Performance Objectives, Learning Outcomes, Program and/or Institutional Assessment, Evaluation) is rated 5 on the following scale: 0 - not in place; 1 - beginning stage of development; 2 - beginning stage of implementation; 3 - in place and implemented; 4 - implemented and evaluated for effectiveness; 5 - implemented, evaluated and at least one cycle of improvement.]
During the past two decades IE faculty members have arrived at the consensus
that the primary constituents of the program are alumni, current students, potential future
students and industry. While the program is designed for students to transition smoothly
into service and manufacturing industries and "hit the job running," graduate schools
programs are also considered as constituents in the next lower tier. Minor constituents
include faculty members who teach courses related to the curriculum, families of
students, the college and the university, taxpayers of the State of Wisconsin, and the
Wisconsin State Legislature.
As outlined in the IE Program Self-Study Report, the program educational
objectives reflect the expected accomplishments of graduates during the first few years
after graduation from the program. These objectives are consistent with the mission of the
university, college, and the program. The next section lists these mission statements.
3. MISSION STATEMENTS
This section lists the mission statements for the University, the College, and the
Industrial Engineering Program.
4. IE PROGRAM OBJECTIVES
After discussion among the faculty and input from the College's Advisory Board
and students, the Industrial Engineering Program established its educational objectives.
The educational objectives of the Industrial Engineering Program are consistent with the
Mission of UW-Platteville, the College's Strategic Plan, and ABET's Engineering
Criteria 2000. IE program objectives are listed below. Matrices may be used to cross-link
the university's mission and college strategic plan to the IE program objectives.
1. To provide graduates with a strong foundation in engineering, mathematics, science,
and current industrial engineering practices, accompanied by experiences solving
structured and unstructured problems using conventional and innovative solutions.
2. To enhance graduates' communication and interpersonal skills through a variety of
individual and team-related activities, both multi-functional and intra-disciplinary.
3. To provide graduates with an understanding of the ethical and professional
responsibilities of an engineer and the impact of engineering solutions on society and
the global environment.
4. To prepare graduates to effectively describe the problem, analyze the data, develop
potential solutions, evaluate these solutions, and present the results, using their oral,
written, and electronic media skills.
5. To make graduates aware of the need for continued professional growth through the
understanding of contemporary developments in industrial engineering.
These objectives are published in several places including the University Catalog,
fact sheets, and the Industrial Engineering curriculum sheets. The objectives are included
on course syllabi for Industrial Engineering courses. Instructions provided to students on
preparation of a student portfolio also include the educational objectives. Most
importantly, these objectives are covered in the survey forms used to solicit feedback
from the program's various constituencies.
The educational objectives of the Industrial Engineering Program are reviewed
annually by the faculty. In addition, input is requested from the College Industrial
Advisory Board every three years. Additional input is gathered through comments made
on the Alumni Survey and Employer Survey which are sent to students two and five
years after graduation. Graduating seniors complete an Exit Questionnaire which
includes an opportunity for students to cite specific examples of activities and
experiences which demonstrate achievement of program outcomes that correlate well
with these objectives. Faculty members fully support the educational objectives and are
acutely aware of the importance of achieving the objectives. The curriculum is built
around achieving these objectives, and embellishing them when feedback from
constituents convinces faculty that objectives require revision. As an example, the list of
required courses was expanded to include a course on engineering management and a
course on engineering materials so that graduates will be well prepared to demonstrate
that these objectives are achieved within a few years after graduation from the program.
5. OUTCOMES
The outcomes are abilities, skills, awareness, knowledge, and understandings that
must be inculcated in students in various courses in the curriculum. IE faculty members
design the courses and course activities to foster the achievement of these outcomes so that
graduates of the program will be able to demonstrate the achievement of these via
accumulated course activities.
[Table: IE program outcomes - 1 Foundation, 2 Communication, 3 Responsibility, 4 Problem Solving, 5 Growth.]
importance, and motivating them to enhance the ability or understanding in the next
upper-level course. Emphasis on program outcomes in some courses is via open-ended
course activities, literature search, team projects, case studies, group discussions, etc.,
that are designed to provide opportunities for students to explore and enhance their
competencies. Reinforcement of program outcomes is mostly achieved in upper level IE
courses. Students are assumed to possess reasonable knowledge, understanding, skill, or
ability to apply their competency to analyze a problem, case study, situation, or industrial
project. Instructional activity continues to build upon previous competency and
reinforces content/skill competency. The following table is a dynamic entity and will be
revised by faculty assigned to teach specific courses each semester. The binders for
course materials and ABET outcomes contain additional matrices that link course
activities to ABET and program outcomes.
A matrix may be developed for each ABET outcome (a) through (k) to portray
courses and coursework that cover each outcome. In addition, course matrices may
illustrate the outcomes achieved by each coursework.
Table 3: Outcomes in IE courses: I-Introductory, E-Emphasis, and R-Reinforcement
(Columns: Course Number; 1 Foundation; 2 Communication; 3 Responsibility; 4 Design; 5 Growth)
Required courses: IE 2130, IE 3430, IE 3530, IE 3630, IE 4030, IE 4230, IE 4430, IE 4730, IE 4930, ME 3040
Elective courses: IE 4130, IE 4330, IE 4630, IE 4750, IE 4780, IE 4830, ME 4230
6. IE PROGRAM CONSTITUENCIES
During the past two decades IE faculty members have arrived at the consensus
that the primary constituents of the program are students, alumni, and employers. While
the program is designed for students to transition smoothly into service and
manufacturing industries and "hit the job running," graduate schools and programs are
also considered as constituents in the next lower tier. Minor constituents include faculty
members who teach courses related to the curriculum, families of students, the college
advisory board (AB), the college and the university, taxpayers of the State of Wisconsin,
Wisconsin State Legislature, and ABET/EAC.
Tools to collect assessment data about achieving the program objectives which
deal with the skills, knowledge, and performance of alumni a few years after
graduation.
Tools to collect assessment data about achieving the program outcomes and
evidence about the program effectiveness. These tools collect assessment data
about students' abilities and understanding before and at the time of graduation.
In the following table some tools are designed to collect assessment data about
both program objectives and outcomes. It should be noted that the IE program objectives
and outcomes stated and discussed in the May 2006 Self-Study Report have a one-to-one
correspondence. The skills, abilities, and understandings to be demonstrated are the same,
but the time at which these are demonstrated changes from the time of graduation to a
few years on a job.
Table: IE program assessment tools

ASSESSMENT TOOL | RESPONSIBILITY FOR ADMINISTRATION | SCHEDULE / FREQUENCY

1. Alumni Survey | Department Chair / IE Program Coordinator | January of each year. Poll two-year alumni and five-year alumni.

2. Employer Survey | Department Chair / IE Program Coordinator | Distributed with the Alumni Survey (see Section 7.1.2).

3. IE Graduate Exit Questionnaire | College of EMS / Department Chair / IE Program Coordinator; IE Faculty member teaching the senior design course | Each semester.

4. Statistical Data
(a) Fundamentals of Engineering (FE) Examination Results | College of EMS; IE faculty members | May and December of each year. Compute passing rate. Assess performance in each subject area. Present findings, identify potential causes, and suggest actions to faculty. Apply the cause-effect diagrams to discuss action plan.
(b) Career Planning and Placement Statistics | IE faculty members | September of each year. Analyze placement and starting salaries. See the folder containing placement data for past few years.
(c) Cooperative Education positions and summer internships held by graduating seniors | IE faculty members | January and September of each year. Compute percentage of graduates who had Cooperative Education positions and summer internships. Identify potential causes, and suggest actions to be taken by faculty in advising sessions.
(d) Alumni Survey | Department Chair / IE Program Coordinator | Each year.

5. Industrial Project Sponsor Survey | IE Faculty member teaching the senior design course and other IE courses where industry sponsored design projects were used to provide realistic hands-on design experience to students. | -

6. Employer Assessment of Academic Preparation | College of EMS, Director of Cooperative Education Program | Last month of cooperative education or summer internship position.

7. Student Portfolio | Department Chair / IE Program Coordinator; IE faculty member teaching the senior design course (IE 4930 Industrial Systems Design) and the IE 2130 Fundamentals of Industrial Engineering course. | -

8. Direct Measurement of degree to which ABET/EAC specified outcomes are attained in IE courses (course folders/binders) | IE faculty members | Frequency depends on the outcome and faculty member. Course materials may be collected once in a few years or each semester. IE faculty members present their data at faculty meetings in January and September for the previous semester or year.

9. UW-Platteville, Academic Planning Council, Five-year Self-Study and Review | Department Chair / IE Program Coordinator | http://www.uwplatt.edu/committees/apc/forms/index.html

10. ABET/EAC Review | Department Chair / IE Program Coordinator and IE faculty members | -

11. College of EMS Advisory Board & Alumni Board | Department Chair / IE Program Coordinator / IE Faculty Volunteer | -
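As a brief illustration of the analysis steps listed for tool 4(a), the Python sketch below computes the FE passing rate and per-subject averages; the records and subject names are hypothetical, not actual program data:

    # FE exam results: passing rate and average performance per subject area.
    results = [
        {"passed": True, "subjects": {"Mathematics": 0.72, "Engineering Economics": 0.65}},
        {"passed": True, "subjects": {"Mathematics": 0.80, "Engineering Economics": 0.58}},
        {"passed": False, "subjects": {"Mathematics": 0.45, "Engineering Economics": 0.40}},
    ]
    print(f"Passing rate: {sum(r['passed'] for r in results) / len(results):.0%}")  # 67%
    for subject in results[0]["subjects"]:
        avg = sum(r["subjects"][subject] for r in results) / len(results)
        print(f"{subject}: average fraction correct = {avg:.2f}")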
7.1.1 Alumni Survey
WHO? - The College of Engineering, Mathematics, and Science shall distribute surveys
to alumni who graduated two years and five years previously.
WHEN? - The surveys shall be distributed annually (usually in January).
WHY? - Collect assessment data for achieving program objectives. Offer suggestions for
areas of improvement within the Industrial Engineering Program. Indicate the career
progression of alumni.
DATA COLLECTION: Surveys shall be returned either by mail or fax.
INFORMATION OBTAINED: Basic statistical data regarding salary ranges, position,
registration status, and evaluation of the Industrial Engineering Program.
WHAT TO LOOK FOR: Registration status will show if the desire for continuous
growth has been instilled.
a. Salary
b. Current job title/responsibilities
c. Disagreement to being well prepared for any of the ABET criteria
d. Information regarding the weakness of engineering education
e. Improvements to the industrial and general engineering curriculum at UW-Platteville
f. Willingness to return to UW-Platteville to present to classes or professional
organizations
g. Accessibility to internet/e-mail
7.1.2 Employer Survey
WHO? - The Department of Mechanical Engineering and Industrial Engineering and the
IE Program Coordinator shall distribute employer surveys with the alumni survey to
alumni who graduated two years and five years previously.
Students' opinion of their preparedness for co-op experience and for the Fundamentals of
Engineering Exam
Quality of the Industrial Engineering Programs classes
Personal examples supporting the achievement of the IE Program Objectives
WHAT TO LOOK FOR: Within each category the rating should be above average or
excellent.
The number of co-ops and/or internships per student
Additional comments after each section
Comments about the quality of the engineering education, including:
Strengths
Weaknesses
What is missing
What could be eliminated
Why they would or would not recommend a friend to attend UW- Platteville for
Industrial Engineering
WHAT TO DO WITH THE INFORMATION: If any of the categories within the sections
receive a rating of average or below, the department has to give special attention to each section
individually. The department will have to develop a program to improve the categories which
received a low rating. Studies can be done to see if the same categories always receive a low
rating and why they receive that rating. With regards to the co-op and internship section of the
survey, the department can use the information to correlate a students previous experience to
receiving a job before graduation. The comments sections can be used to establish
improvements in the Industrial Engineering Program and the courses it provides. Many ideas
given at the end of the survey including the strengths, weaknesses, and what is missing can be
used as building blocks towards the improvement of the Industrial Engineering Program at the
University of Wisconsin - Platteville.
7.2.2 Statistical Data
WHO? - Industrial Engineering students taking the Fundamentals of Engineering (FE) exam:
Graduating Industrial Engineers
Industrial Engineering Alumni
Career Planning and Placement
WHEN? Fundamentals of Engineering (FE) Exam results every semester
Career Planning and Placement statistics once a year
Alumni surveys every semester
WHY? - To provide data about how well students are meeting the Industrial Engineering
Program Objectives and ABET's eleven graduate expectations.
INFORMATION OBTAINED:
Percentage of students passing the FE exam
Number of cooperative education (co-op) experiences and/or internships graduating
students have obtained
the Industrial Engineering Program Objectives and ABET's eleven graduate expectations.
DATA COLLECTION:
The surveys should be returned one week after their distribution. Surveys may be
returned at different times of the semester due to the variation in final presentation times.
Surveys may be returned via postage paid envelope or fax.
INFORMATION OBTAINED:
Level of skills and abilities students possess in accordance with the Industrial
Engineering Program Objectives
WHAT TO LOOK FOR:
Within the characteristics, all the categories should receive a rating of four or more
additional comments
WHAT TO DO WITH THE INFORMATION:
OVERALL BASIS:
If any of the items are insufficient, the faculty will have to review the project
administration, integration of the project into the course, and perhaps the course
content. Upon doing so, the group should identify areas where the deficient skills
could be developed further. In addition, the methods by which this material is
conveyed may have to be changed to make it more appealing and applicable to real
world situations.
INDIVIDUAL:
If an individual group receives a four or less, the professor should have a group
meeting with them as soon as the surveys are reviewed. Within this meeting, the
professor should provide methods by which these students can improve upon the
skills that were lacking according to the sponsor.
7.2.4 Employer Assessment of Academic Preparation
WHO? - Immediate supervisors of students participating in co-op or internship experiences
will complete an evaluation prior to the end of the students work experience.
WHEN? - The CO-OP office shall distribute evaluations to students during the last month of
the co-op or intern position.
WHY? - Assess the preparation of students compared to graduate expectations. Offer
suggestions for areas of improvement within Industrial Engineering Program
DATA COLLECTION: Evaluations shall be mailed or faxed back
INFORMATION OBTAINED:
Responsibilities of student worker
Ability of student to meet expectations
Information regarding the weakness of engineering education
Student preparedness based on graduate expectations
WHAT TO LOOK FOR:
Job responsibilities
Disagreement with respect to being well prepared for any of the expectations
Information regarding the weakness of engineering education
Inadequacies in meeting employer expectations based upon education
[Figures 4(a)-4(d): Planned use of assessment data from rubrics. Each chart plots % Acceptable & Exceptional (or average score) per metric (Metric 1, Metric 2, Metric 3) against an acceptable band of 60%-100% for IE 2130 Intro to IE Outcome (g) and IE 4930 Capstone Design Outcome (g). Figure 4(b): Planned use of assessment data from rubrics in 2007/2008 to track a target group. Figure 4(d): Planned use of assessment data from rubrics in 2007/2008 to track performance of a target group.]
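The quantity plotted in Figures 4(a)-(d) is the share of students scoring Acceptable or Exceptional on a metric. A minimal Python sketch (the score list is hypothetical) checked against the 60% target shown in the figures:

    # Percent Acceptable & Exceptional for one rubric metric (hypothetical S values, 0..3).
    scores = [0, 2, 3, 2, 1, 3, 2, 2]
    pct = 100 * sum(s >= 2 for s in scores) / len(scores)  # S=2 Acceptable, S=3 Exceptional
    print(f"% Acceptable & Exceptional: {pct:.0f}% (target band: 60% to 100%)")  # 75%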
[Table: Assessment data collection timeline by ABET/EAC outcome / graduate expectation, academic years 2007-2008 through 2011-2012.]
Table 6: Evaluation of Assessment Data from Alumni Questionnaire/Survey - Outlier Responses & Actions

YEAR: 1999-2000
FINDING: One alumnus out of 9 strongly disagreed with "I feel that my education at UW-Platteville enables me to understand the impact of engineering design on society and the environment." The null hypothesis of not achieving this outcome was rejected. The response may be disregarded as an outlier, but faculty decided to address it.
POTENTIAL CAUSES: IE curriculum deals with management and production. Product design is not covered explicitly. See the cause-effect diagram.
ACTION: Life cycle principles, life cycle costing, and life cycle management are being taught in IE courses. Relate these concepts to understanding the impact of engineering design on society and the environment. This concept is now introduced in IE 4730 and is emphasized, or reinforced, in IE 4830. Cumulative assessment data does not show that this continues to be a problem.

YEAR: 1999-2000
FINDING: Two out of 9 disagreed with "acquired effective oral and written communication skills." The null hypothesis of not achieving this outcome was rejected. The response may be disregarded as an outlier, but faculty decided to address it.
POTENTIAL CAUSES: Cause and effect possibilities reflect a number of possibilities, including changes in faculty at the time, student avoidance of opportunities in class, and greater emphasis needed in job than anticipated.
ACTION: Cumulative assessment data does not show that this continues to be a problem. Identify courses where oral and written communication skills will be introduced, emphasized, or reinforced. All IE courses require presentations and reports. Faculty will be more rigorous in grading the reports and presentations. Arrange to provide formal feedback to students so that faculty attempts to introduce, enhance, and reinforce this skill will be remembered.

YEAR: 2001
Table 7: Evaluation of Assessment Data from Employer Questionnaire/Survey - Outlier Responses & Actions

YEAR: 2000
FINDING: One employer disagreed that alumnus had ability to function on multidisciplinary teams. The null hypothesis of not achieving this outcome was rejected. The response may be disregarded as an outlier, but faculty decided to address it.
POTENTIAL CAUSES: See the cause-effect diagram.
ACTION: Use the rubrics to provide effective feedback to students. Use the capstone design course to emphasize this skill. Cumulative assessment data does not show that this continues to be a problem.

YEAR: 2000 & 2001
FINDING: One employer disagreed that alumnus had effective oral and written communication skills. The null hypothesis of not achieving this outcome was rejected. The response may be disregarded as an outlier, but faculty decided to address it.
POTENTIAL CAUSES: See the cause-effect diagram.
ACTION: Use the rubrics to provide effective feedback to students. Use the senior level and capstone design courses to emphasize this skill.
8.3 Assessment Data from Employer Assessment of Academic Preparation and Analysis
This survey or questionnaire is completed by the employer or supervisor when an IE
student completes cooperative education assignments or summer internships. During the
summer, the returned survey data are tabulated and a report is written by the program
coordinator. Sometimes a faculty member takes the primary responsibility for entering
summary of assessment data into spreadsheet, conducting analysis and arriving at
interpretations. The report is made available to the IE faculty at the start of the fall
semester. The report is presented at the fall meeting of Advisory Board for comments
and suggestions. Table 8 below presents the interpretations and actions taken by faculty.
Feedback from employers was positive and all the responses affirm that the program
objectives are attained. Further analysis will be carried out when additional data becomes
available.
In accumulating data over several years, care was exercised to total the responses for identical questions in each survey. Each survey item had six response levels: Excellent, Very Good, Average, Below Average, Very Poor, and N/A. To perform statistical analysis of the data, numerical weights were assigned to the responses: Excellent = 5, Very Good = 4, Average = 3, Below Average = 2, Very Poor = 1, and N/A = 0. A weighted average score of more than 3 is considered satisfactory attainment of the respective objective or outcome. The corresponding test of hypothesis is:

H0: Weighted average = 3 (outcome or objective is not achieved)
H1: Weighted average > 3 (outcome or objective is achieved)

Weighted average = Σ(# of responses in a category × respective weight) / (total # of responses)
Normality of the underlying distribution is assumed in this analysis, and this may be appropriate because of the Central Limit Theorem. This assumption is usually tested using normal probability plot analysis, and the following test of hypothesis is used only when the underlying distribution is normal. If the null hypothesis is rejected, it means that the respective outcome or objective is achieved. In this test of hypothesis, Z = (Weighted average - 3) / (Std. dev. / sqrt(n)). The test is carried out for each objective and outcome using a level of significance of α = 0.05. If the test of hypothesis is not applicable, a simple bar graph is used to check whether the weighted average score is 3 or more. These analyses of the assessment data for 1999 through 2004 establish that each and every one of the program objectives and outcomes is achieved. In arriving at this conclusion, it should be noted that students in cooperative education and summer internship positions may be sophomores, juniors, or seniors. Further, the N/A response count is quite large in many instances; because N/A carries a weight of 0, it pulls the weighted average down, leading to a failure to reject the null hypothesis and to the misleading conclusion that the respective objective or outcome is not attained.
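As a minimal illustration of this calculation, the following Python sketch computes the weighted average and the one-sided Z statistic for a single survey item. The response counts are hypothetical, and the second call, which drops the N/A responses, shows how the zero weight can mask attainment.

from math import sqrt
from statistics import NormalDist

# Hypothetical response counts for one survey item (illustration only).
# Weights follow the scheme above: Excellent=5, Very Good=4, Average=3,
# Below Average=2, Very Poor=1, N/A=0.
counts = {5: 6, 4: 4, 3: 2, 2: 0, 1: 0, 0: 3}

def weighted_average_z_test(counts, mu0=3.0, alpha=0.05):
    # One-sided test of H0: mean = mu0 versus H1: mean > mu0.
    scores = [w for w, c in counts.items() for _ in range(c)]
    n = len(scores)
    mean = sum(scores) / n
    s = sqrt(sum((x - mean) ** 2 for x in scores) / (n - 1))  # sample std. dev.
    z = (mean - mu0) / (s / sqrt(n))
    z_crit = NormalDist().inv_cdf(1 - alpha)  # about 1.645 for alpha = 0.05
    return round(mean, 2), round(z, 2), z > z_crit

# With N/A weighted 0, the weighted average is pulled toward "not attained".
print(weighted_average_z_test(counts))
# Dropping the N/A responses instead reverses the conclusion here.
print(weighted_average_z_test({w: c for w, c in counts.items() if w != 0}))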
Alternatively, the same data may be analyzed using a test of hypothesis about the attribute p, defined as the fraction of employers satisfied with an outcome or objective. The corresponding test of hypothesis is then carried out for each outcome and objective.
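A minimal sketch of one plausible form of this attribute test follows, under stated assumptions: a response of Average or better counts as satisfied, and the benchmark proportion p0 = 0.5 is an assumed placeholder rather than a value from this report.

from math import sqrt
from statistics import NormalDist

# Hypothetical data: k of n employers are satisfied with an item, i.e.,
# rate it Average or better. p0 = 0.5 is an assumed benchmark, not a
# value taken from this report.
n, k, p0, alpha = 12, 10, 0.5, 0.05

p_hat = k / n
# One-sided test of H0: p = p0 versus H1: p > p0 (normal approximation).
z = (p_hat - p0) / sqrt(p0 * (1 - p0) / n)
achieved = z > NormalDist().inv_cdf(1 - alpha)
print(f"p_hat = {p_hat:.2f}, Z = {z:.2f}, achieved: {achieved}")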
Table 8: Evaluation of Assessment Data from the Employer Assessment of Academic Preparation

YEAR: 1999-2004, Cumulative Data
YEAR: 1999-2004, Data for Individual Years
The findings in every case are due to the large number of N/A responses, out of a total of about 7 to 15 responses.
Table 9: Evaluation of Assessment Data from Industrial Project Sponsor Surveys, N/A Responses

YEAR: 2000-01
FINDING: Ability to verbally and visually communicate effectively is below average or is not applicable.
POTENTIAL CAUSES: One out of eight persons on the sponsoring team assigned N/A to this survey item. Some personnel on the sponsoring team attended only the final presentation and did not read the project report. The small sample size may have led to this conclusion.
ACTION: Create course assignments that allow students to improve verbal and visual communication skills. Use rubrics to provide effective feedback in all IE courses. See the binders for outcome (g) for evidence that this is not a current problem.

YEAR: 1998-2005, Cumulative Data from the Sponsor Survey
YEAR: 2002-2005
YEAR: Fall 2001, Spring 2002, F2004 & S2005
YEAR: Fall 2002, Fall 2003, Spring 2004, F2004, S2005
FINDING: Failure to use industrial-quality laboratory equipment and engineering software for analysis, testing, design, and communications.
POTENTIAL CAUSES: The finding is contradicted by the other test of hypothesis and may be a false negative; the small sample size may have distorted the results.
ACTION: Cumulative assessment data from 2000 to 2005 does not show that this continues to be a problem. Contact these graduates as alumni and see if the survey reveals different results. Investigate this issue further and develop action items to improve the curriculum.

YEAR: 2000-2005, Cumulative Data from the Sponsor Survey
IE 4130: Simulation
Use of papers on applications of simulation modeling in the military, urban planning, the spread of diseases, the spread of invasive vegetation, etc.
More detailed coverage of ethics and professional responsibility.
Use of full factorial design of experiments, collection of experimental data from simulation runs, analysis of the data, and interpretation of the data (see the sketch after this list).
Use of life-long learning activities in an assignment.
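As a minimal sketch of the kind of exercise this item implies, the Python fragment below runs a two-factor full factorial design over replicated runs of a toy simulation; the factor names, levels, and response function are illustrative assumptions, not material from the course.

import itertools
import random
import statistics

random.seed(1)

# Illustrative two-factor full factorial design. The factors, levels,
# and the toy queueing-style "simulation" are hypothetical stand-ins.
servers = [1, 2]           # factor A levels
arrival_rate = [0.5, 0.9]  # factor B levels (jobs per time unit)
replications = 5

def simulate(s, lam):
    # Toy response: noisy average delay that grows with utilization.
    util = lam / s
    return util / max(1e-6, 1 - util) + random.gauss(0, 0.05)

# Run every factor-level combination and summarize the replications.
for s, lam in itertools.product(servers, arrival_rate):
    y = [simulate(s, lam) for _ in range(replications)]
    print(f"servers={s}, rate={lam}: mean delay={statistics.mean(y):.2f}, "
          f"std dev={statistics.stdev(y):.2f}")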
IE 4430: Total Quality Management
Emphasis on the Six Sigma quality improvement approach.
Coverage of Black Belt skills.
Coverage of ISO 9000:2000, ISO 16949, cGMP, and related standards.
Coverage of global issues and environmental issues.
Assessment of knowledge of quality tools.
Case studies on ethics with an emphasis on quality management.
A common framework for social accountability, environmental, and quality standards.
Emphasis on the use of Minitab.
IE 4930: Industrial Systems Design
Emphasis on different industries in WI.
Coverage of many areas of industrial engineering.
Current projects that are useful to the sponsoring industry.
Emphasis on life-long learning.
Emphasis on graduate school information, investment planning, and retirement planning.
Emphasis on the Sarbanes-Oxley Act.
Emphasis on current topics relevant to the industrial projects.
REFERENCES
1. ISO 9000:2005, Quality management systems -- Fundamentals and vocabulary, International
Organization for Standardization (ISO), 1, ch. de la Voie-Creuse, Case postale 56, CH-1211
Geneva 20, Switzerland.
2. Montgomery, D., Introduction to Statistical Quality Control, John Wiley, 2005.
3. Rogers, G., Assessment Planning Flow Chart, 5233 Wagon Shed Circle, Owings Mills, MD
21117, email: grogers@abet.org
4. Felder, R. M. and Brent, R., The ABC's of Engineering Education: ABET, Bloom's Taxonomy, Cooperative Learning, and So On, Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition, Session 1375.
5. Assessment Mechanisms, http://www.unh.edu/ccec/assessment/assessment.html.
6. Bloom, B. S. (Ed.), Taxonomy of Educational Objectives: The Classification of Educational Goals, pp. 201-207, Susan Fauer Company, Inc., 1956.
7. Rubrics for Evaluating Student Work, http://www.engr.sjsu.edu/assessment/topic/t1.html.
8. Use of Bloom's Taxonomy to Enumerate Attributes of EC-2000 Outcomes,
http://www.engrng.pitt.edu/~ec2000/ec2000_attributes.html
9. Besterfield-Sacre, M., Shuman, L. J., Wolfe, H., Atman, C. J., McGourty, J., Miller, R. L., Olds, B. M., and Rogers, G. M., Defining the Outcomes: A Framework for EC-2000, IEEE Transactions on Education, Vol. 43, No. 2, May 2000.
10. Deming, W. E. (1986). Out of the Crisis. Cambridge, MA: Massachusetts Institute of
Technology, Center for Advanced Engineering Study.
11. The PDCA Cycle, http://www.dartmouth.edu/~ogehome/CQI/PDCA.html
12. Outcomes Assessment Rubrics, Department of Chemical Engineering, West Virginia University,
http://www.che.cemr.wvu.edu/ugrad/outcomes/
13. Assessment: Tutorial, Rubrics, etc., Electrical and Computer Engineering, Indiana University
Purdue University Indianapolis, http://www.engr.iupui.edu/ece/assessment/scoringRubrics.html
14. California State University Engineering Assessment Clearinghouse,
http://www.engr.sjsu.edu/assessment/
15. Assessment Rubrics, Department of Chemical Engineering, Auburn University,
http://eng.auburn.edu/programs/chen/programs/accreditation/assessment-rubrics.html
16. Authentic Assessment Toolbox, http://jonathan.mueller.faculty.noctrl.edu/toolbox/
17. Assessment Tools and Resources, University of Delaware, http://www.assessment.udel.edu/
18. Environmental Engineering Undergraduate Program - Scoring Rubrics for Our 12 Defined
Outcomes, Dept. of Civil and Environmental Engineering, University of Delaware,
http://www.ce.udel.edu/ABET/Current%20Documentation/ABET_scoring_rubrics_enveng.html
19. Criteria for Accrediting Engineering Programs, Effective for Evaluations During the 2007-2008 Accreditation Cycle, incorporating all changes approved by the ABET Board of Directors as of October 28, 2006, Engineering Accreditation Commission, ABET, Inc., 111 Market Place, Suite 1050, Baltimore, MD 21202.
20. Rogers, G., Rubrics: What Are They Good for Anyway? Part I, an Assessment 101 column in ABET's Community Matters newsletter, September 2006.
21. Rogers, G., Rubrics: What Are They Good for Anyway? Part II, an Assessment 101 column in ABET's Community Matters newsletter, October 2006.
22. Rogers, G., Rubrics: What Are They Good for Anyway? Part III, an Assessment 101 column in ABET's Community Matters newsletter, November 2006.
23. Rogers, G., Assessment Planning, http://www.abet.org/assessment.shtml#Assessment%20matrix