Chapter 4 MAAM ELLA Narrative Report FINAL

Republic of the Philippines

Eastern Visayas State University


College of Education
Tacloban City

CURRICULUM AND ASSESSMENT FOR PHYSICAL EDUCATION AND HEALTH EDUCATION

EVALUATING THE CURRICULUM

Submitted by:
BARDAJE, JANE
DURAN, ROBELYN
MALATE, RANIEL JAMES
REGALA, MARY NARMIE
TOBALLAS, JASON

Submitted to:
MS. SUSANA F. ESPINA
Chapter 4: Evaluating the Curriculum
Curriculum evaluation is a component of curriculum development that responds to public
accountability. It looks into educational reforms or innovations that happen in teachers'
classrooms, in the school, district, or division, or across the whole educational system. It
establishes the merit and worth of the curriculum.
Curriculum evaluation is premised on the concept of alignment among the planned, written,
and implemented curriculum. It attempts to answer two big questions:
1. Do planned courses, programs, and activities, as written and implemented, produce
the desired outcomes?
2. How can these school curricula be improved?

Curriculum Evaluation: A Process and a Tool


As a process, curriculum evaluation follows a procedure based on models and frameworks
to arrive at the desired results. As a tool, it helps teachers and program implementers
judge the worth and merit of a program, innovation, or curricular change. As both process
and tool, the results of evaluation become the basis to IMPROVE the curriculum.

Curricularists define curriculum evaluation as follows:

Ornstein, A. & Hunkins, F. (1998): Curriculum evaluation is a process done in order to
gather data that enables one to decide whether to accept, change, or eliminate the whole
curriculum or a textbook.

McNeil, J. (1977): Evaluation answers two questions: (1) Do planned learning
opportunities, programmes, courses, and activities, as developed and organized, actually
produce desired results? (2) How can a curriculum best be improved?

Gay, L. (1985): Evaluation is to identify the weaknesses and strengths, as well as the
problems encountered in implementation, in order to improve the curriculum development
process. It is to determine the effectiveness of, and the returns on, allocated finance.

Oliva, P. (1988): Evaluation is a process of delineating, obtaining, and providing useful
information for judging alternatives for purposes of modifying or eliminating the
curriculum.

REASONS FOR CURRICULUM EVALUATION:


 Curriculum evaluation identifies the strengths and weaknesses of an existing
curriculum, which become the basis of the intended plan, design, or implementation.
 When evaluation is done in the middle of curriculum development, it tells whether
the designed or implemented curriculum can produce, or is producing, the desired
results.
 Based on set standards, curriculum evaluation shows whether the results have
equaled or exceeded the standards and can thus be labelled a success.
 Curriculum evaluation provides teachers, school managers, and curriculum
specialists with the information necessary for policy recommendations that will
enhance achieved learning outcomes.

CURRICULUM EVALUATION MODELS


1. Bradley Effectiveness Model
In 1985, L. H. Bradley wrote a handbook on Curriculum Leadership and Development. This
book provides indicators that can help measure the effectiveness of a developed or written
curriculum.
Bradley’s Effectiveness Model for Curriculum Development Indicators
(Answer each descriptive question with Yes or No.)

Vertical Curriculum Continuity: Does the curriculum reflect the format (i.e., K to 12,
OBE, Inquiry, etc.) that enables teachers to quickly access what is being taught in the
grade/year levels below or above the current level? (Example: If you are looking at
Science 5, below means Science 4 and above means Science 6.)

Horizontal Curriculum Continuity: Does the curriculum provide content and objectives
that are common to all classes of the same grade level? (Example: English 101 for all
1st year college students.)

Instruction Based on Curriculum: Are lesson plans/syllabi/course designs derived from
the curriculum? Are the strategies and materials used correlated with the content,
objectives, and activities?

Broad Involvement: Is there evidence of involvement of the different curriculum
stakeholders in the planning, designing, implementation, and review of the curriculum?

Long Range Planning: Is a review cycle followed within the period of planning and
implementation of the curriculum?

Positive Human Relations: Did the initial thoughts about the curriculum come from
teachers, principals, curriculum leaders, and other stakeholders?

Theory-Into-Practice: Is there clarity of vision, mission, graduation outcomes, program
philosophy, and learning outcomes in the curriculum?

Planned Change: Is there tangible evidence to show that the internal and external
publics accept the developed program?

If any of the indicators is answered with a “No”, action should be taken to make it a
“Yes”.
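Bradley's checklist lends itself to a simple programmatic sketch. In the snippet below, the indicator names come from the table above, but the function name `flag_actions` and the dictionary layout are our own illustration, not part of Bradley's handbook; it simply reports which indicators are still answered "No" and therefore need action.

```python
# Minimal sketch of Bradley's yes/no indicator checklist.
# Indicator names are from Bradley's model; the helper itself is illustrative.

BRADLEY_INDICATORS = [
    "Vertical Curriculum Continuity",
    "Horizontal Curriculum Continuity",
    "Instruction Based on Curriculum",
    "Broad Involvement",
    "Long Range Planning",
    "Positive Human Relations",
    "Theory-Into-Practice",
    "Planned Change",
]

def flag_actions(answers):
    """Return the indicators answered 'No' (False), i.e. those needing action.

    `answers` maps indicator name -> True (Yes) or False (No); a missing
    indicator is treated as a "No".
    """
    return [name for name in BRADLEY_INDICATORS if not answers.get(name, False)]
```

For example, if every indicator is answered Yes except Planned Change, the helper returns only `["Planned Change"]`, pointing implementers to where action is needed.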

2. The Tyler Objectives-Centered Model

The Tyler Objectives-Centered Model is a curriculum evaluation model proposed by Ralph
Tyler in 1950. The model is still widely used today and has influenced many curriculum
assessment processes. Its purpose is to provide a systematic framework for evaluating the
effectiveness of a curriculum. It helps educators ensure that their curriculum is aligned
with clear learning objectives, implemented in an appropriate context, and assessed using
valid and reliable methods.

Each curriculum element is evaluated through the process described below, and the action
taken is recorded as Yes or No.

1. Objectives/Intended Learning Outcomes: Clearly define the intended learning
outcomes of the curriculum. These should be specific, measurable, achievable,
realistic, and time-bound (SMART).
2. Situation or Context: Identify the situation or context in which the curriculum
will be implemented. This includes factors such as the students' backgrounds, the
school's resources, and the community's expectations.
3. Evaluation Instruments/Tools: Once the objectives and context are defined, select
or develop evaluation instruments or tools. These should be valid, reliable, and
appropriate for measuring the intended learning outcomes.
4. Utilization of Tool: Use the evaluation instruments or tools to collect data on
the students' learning. This data can be used to assess the effectiveness of the
curriculum and to identify areas for improvement.
5. Analysis of Results: Analyze the data collected through the evaluation process to
determine the strengths and weaknesses of the curriculum. The analysis should
identify the reasons for any observed patterns in the data.
6. Utilization of Results: Use the results of the evaluation to make the necessary
modifications to the curriculum. This may involve revising the objectives, changing
the teaching methods, or adding new resources.
Using all the steps to evaluate the curriculum and obtaining all YES answers would
mean the curriculum has PASSED the standard. Tyler's model of evaluating the
curriculum is relatively easy to understand, which is why many teachers can follow it.
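The pass rule above, that every element must receive a YES before the curriculum is considered to have PASSED, amounts to an all-of check. The sketch below illustrates it; the element names come from the table above, while the function name `curriculum_passes` and the answer layout are our own illustration rather than part of Tyler's model.

```python
# Illustrative sketch: under Tyler's model as summarized above, the curriculum
# "passes" only when every curriculum element's check is answered Yes (True).

TYLER_ELEMENTS = [
    "Objectives/Intended Learning Outcomes",
    "Situation or Context",
    "Evaluation Instruments/Tools",
    "Utilization of Tool",
    "Analysis of Results",
    "Utilization of Results",
]

def curriculum_passes(answers):
    """True only when all six elements are answered Yes; missing answers count as No."""
    return all(answers.get(name, False) for name in TYLER_ELEMENTS)
```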

3. Daniel Stufflebeam’s Context, Input, Process, Product Model (CIPP)

The CIPP Model of Curriculum Evaluation was a product of the Phi Delta Kappa
committee chaired by Daniel Stufflebeam. The model emphasizes that the results of
evaluation should provide data for decision making. There are four stages of program
operation: (1) CONTEXT EVALUATION, (2) INPUT EVALUATION, (3) PROCESS EVALUATION, and
(4) PRODUCT EVALUATION. An evaluator may also take any one of the four stages as the
sole focus of evaluation.
FOUR STAGES OF PROGRAM OPERATION
 Context Evaluation: assesses needs and problems in the context so that decision
makers can determine the goals and objectives of the program/curriculum.
 Input Evaluation: assesses alternative means, based on the inputs, for achieving
the objectives, helping decision makers choose options for optimal means.
 Process Evaluation: monitors the processes, both to ensure that the means are
actually being implemented and to make the necessary modifications.
 Product Evaluation: compares actual ends with intended ends and leads to a series
of recycling decisions.

For all four stages of the CIPP Model (Context, Input, Process, and Product Evaluation),
the following six steps are suggested:
Step 1: Identify the kind of decision to be made.
Step 2: Identify the kinds of data needed to make that decision.
Step 3: Collect the data needed.
Step 4: Establish the criteria to determine the quality of the data.
Step 5: Analyze the data based on the criteria.
Step 6: Organize the information needed by decision makers.

4. Stake’s Responsive Model

The responsive model is oriented more directly to program activities than to program
intents. Evaluation focuses more on the activities than on the intent or purpose.
ROBERT E. STAKE
 Leader in the development of program evaluation methods
 Director of the Center for Instructional Research and Curriculum Evaluation (CIRCE)
 Professor Emeritus of Education at the University of Illinois, Urbana-Champaign
(appointed 1998)

Robert Stake (1975) recommends that the curriculum evaluator follow the steps below.
Step 1: Meets with stakeholders to identify their perspectives and intentions regarding
the curriculum evaluation.
Step 2: Draws from the Step 1 documents to determine the scope of the evaluation.
Step 3: Observes the curriculum closely to identify unintended effects of
implementation and any deviations from the announced intents.
Step 4: Identifies the stated and real purposes of the program and its various
audiences.
Step 5: Identifies the problems of the curriculum evaluation at hand and an evaluation
design with the needed data.
Step 6: Selects the means needed to collect the data or information.
Step 7: Implements the data collection procedure.
Step 8: Organizes the information into themes.
Step 9: Decides with stakeholders on the most appropriate formats for the report.
5. Scriven’s Model of Curriculum Evaluation
Michael Scriven, a prominent American evaluation theorist, developed his model of
curriculum evaluation in the mid-1960s. The model is designed to provide a systematic
and comprehensive evaluation of educational programs and curricula. It emphasizes the
importance of defining clear objectives and standards for evaluation, as well as
collecting and analyzing data to determine the curriculum’s effectiveness.

Scriven’s Model comprises five main stages:

1. Defining Objectives and Criteria: The first step in Scriven’s Model is to
define clear and specific objectives for the curriculum. These objectives
should be measurable and observable, allowing evaluators to assess the
extent to which they are achieved. Additionally, evaluators establish
evaluation criteria or standards that will be used to measure the curriculum’s
success.
2. Designing the Evaluation Plan: In this stage, evaluators develop a detailed
plan for conducting the evaluation. The plan outlines the methods, data
collection techniques, and instruments to be used in the evaluation process.
It also identifies the sources of data and the target population (e.g., students,
teachers, administrators) for data collection. The evaluation plan should align
with the defined objectives and criteria.
3. Data Collection: During this stage, evaluators collect data from various
sources to assess the curriculum’s effectiveness. Data collection methods may
include surveys, interviews, observations, tests, and academic performance
records. The data collected should be relevant to the objectives and criteria
defined earlier.
4. Data Analysis and Interpretation: After gathering the data, evaluators
analyze and interpret the findings to draw conclusions about the
curriculum’s performance. Data analysis techniques may involve statistical
methods, content analysis, and qualitative coding to make sense of the
information collected. The analysis should provide valuable insights into the
strengths and weaknesses of the curriculum.
5. Reporting and Recommendations: The final stage of Scriven’s Model
involves presenting the evaluation results in a comprehensive report. The
report should include a summary of the evaluation findings, an interpretation
of the data, and conclusions about the curriculum’s effectiveness in achieving
its objectives. Additionally, evaluators may make recommendations for
curriculum improvement based on the evaluation findings.
Conclusion:
Scriven’s Model of Curriculum Evaluation provides a structured and systematic
approach to assess the effectiveness of educational curricula. By focusing on clear
objectives, data collection, and analysis, the model helps evaluators make informed
decisions about curriculum improvement.

In summary, whatever model of curriculum evaluation is used, the ASCD (1983) suggests
the following steps.
Steps in Conducting a Curriculum Evaluation
1. Identifying primary audiences: curriculum program sponsors, managers and
administrators, school heads, participants (teachers and students), content
specialists, and other stakeholders.
2. Identifying critical issues/problems: outcomes (expected, desired, intended),
process (implementation), and resources (inputs).
3. Identifying data sources: people (teachers, students, parents, curriculum
developers), existing documents, available records, and evaluation studies.
4. Identifying techniques for collecting data: standardized tests, informal tests,
samples of students’ work, interviews, participant observations, checklists, and
anecdotal records.
5. Identifying established standards and criteria: standards previously set by an
agency (DepEd, CHED, professional organizations).
6. Identifying techniques in data analysis: content analysis, process analysis,
statistics, comparison, and evaluation process.
7. Preparing the evaluation report: written or oral; progress, final, or summary;
descriptive, graphic, evaluative and judgmental; list of recommendations.
8. Preparing modes of display: case studies, test score summaries, testimonies,
multimedia representations, product displays (exhibits), and technical reports.
