International Journal of Assessment and Evaluation in Education
Vol 6, Dec 2016 (1-8)  ISSN 2232-1926
Submission Date: 20th January 2016
Acceptance Date: 13th February 2016
Empirical Analysis of Item Difficulty and Discrimination Indices of Senior
School Certificate Multiple Choice Biology Tests
OLUTOLA, Adekunle Thomas
Department of Educational Foundations,
Faculty of Education, Federal University, Dutsin-Ma, Katsina State, Nigeria.
aolutola@fudutsinma.edu.ng, olutolaade@yahoo.com
Abstract
This study empirically analyzed the item difficulty and discrimination indices of the Senior School Certificate Examination (SSCE) multiple choice Biology tests used by the West African Examinations Council (WAEC) and the National Examinations Council (NECO) in Nigeria. The sample for the study consisted of 1450 Senior Secondary Three students drawn from 20 randomly selected secondary schools. Item-by-item analysis was used to obtain the difficulty and discrimination indices. Findings from the study showed that the 2008 WAEC SSCE Biology multiple choice test had a mean difficulty index of 0.42, slightly higher than the NECO Biology multiple choice test with a mean difficulty index of 0.40, and that the 2008 WAEC SSCE Biology test had a mean discriminating power of 0.43, higher than NECO's mean discriminating power of 0.39. It was recommended that four-option items in multiple choice Biology tests should be encouraged, but that where five-option items are used, more attention should be given to the psychometric properties of the tests.
Keywords: difficulty indices, discrimination indices, multiple choice Biology tests
INTRODUCTION
Evaluation plays an important role in the educational process and development. It is crucial for teachers to
make use of best evaluation practices in order to help the students to have better results in internal and
external examinations. Evaluation can be described as systematic processes of determining the extent to
which instructional objectives are achieved by students (Gronlund, 1981). Therefore, students’
achievement can be used to determine to a large extent the degree of success or failure of an educational
practice. Teachers carry out routine evaluation of school learning to achieve various objectives, but this
is essentially internal.
These internal evaluations go by such names as teacher-made tests, Continuous Assessment,
School-Based Assessment and local tests. For the conduct of external examinations, however, there are
recognized bodies that carry out this assignment for the whole country (Nigeria) and award certificates to
candidates at different levels. The National Examinations Council (NECO), the West African
Examinations Council (WAEC) and the National Business and Technical Examinations Board (NABTEB) are
the bodies authorized by Nigerian law to conduct the Senior School Certificate Examination (SSCE),
the General Certificate Examination (GCE) and other examinations. NECO, NABTEB and WAEC carry out
summative evaluation using criterion-referenced tests. Summative evaluation is the type of evaluation
which typically comes at the end of a course of instruction. It is used primarily for assigning course grades
based on the attainment of the intended learning outcomes. The purpose of summative evaluation is to assess the overall
effectiveness of a programme (Susan, 2003; Nuhfer, 1996).
Another form of evaluation, according to Tessmer (1993) and Scriven (1991), is the formative
type. This refers to a structured testing procedure carried out while teaching and learning are
ongoing, with a view to bringing about improvements (Susan, 2003; Nuhfer, 1996). Tests used in formative
evaluation are mostly teacher-made tests and thus internal to the school system (Alonge, 2003). The
school system adopts other forms of evaluation, such as placement and diagnostic evaluation, in a
complementary manner with formative and summative evaluation.
The common forms of test used for formative, summative and other forms of evaluation, whether by
teachers for internally conducted assessments or by statutory examining bodies for external
assessment, are the objective, essay and practical variants. Objective tests are not only popular in
internal and external examinations; they also play a crucial role in assessment processes in the school
system. The multiple choice test is regarded as the most applicable, flexible and useful type of objective
test item. Multiple choice tests are widely acclaimed as the most reliable because of consistency in
scoring as well as their fairness to all students (Osunde, 2009). Multiple choice tests discourage the
learner's tendency to anticipate likely questions and encourage them to cover, in their preparation, the
whole content taught. They are also useful in assessing learners' mastery of specific facts,
concepts, terms, laws and principles (Kolawole, 2005; Lawal, 2001).
According to Kolawole (2005), multiple choice tests require students to select the answer from a
number of possible alternatives. Multiple choice items give testees the fairest opportunity to prove their
competence and testers the opportunity to prove their integrity. They are objective both in development
and in scoring, and their items can cover wide curriculum content and instructional objectives. The format
is adjudged to have good validity since it tends to cover all aspects of the learning content (Alonge, 2003; Lawal, 2001).
The usefulness of multiple choice tests (MCTs) for achieving the objectives of testing depends on their
quality and properties. The importance of difficulty indices and discrimination power in multiple choice
items cannot be overemphasized. According to Schumacker (2005), Classical Test Theory (CTT) utilizes
traditional item- and sample-dependent statistics, namely item difficulty and item discrimination. In
classical test theory, these two statistics form the cornerstone of item analysis.
Adewumi and Oluokun (2001) described the difficulty index of an item as the extent to which the item
has been answered correctly by the testees, that is, the percentage of testees who select the right
option (Alonge, 2003). Going by this definition, the closer the difficulty index is to one, the easier the
item, and the closer it is to zero, the more difficult the item.
The difficulty index thus tells us how easy the item was for the students in that particular group:
the higher the difficulty index, the easier the question, and the lower the difficulty index, the more difficult
the question. The difficulty index, in fact, equates to an "easiness index" (Zafar, 2008). Abiri (2006) also
indicated that multiple choice tests with fewer options have better difficulty indices than those
with a larger number of options.
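For illustration, the short sketch below (Python, with hypothetical response data, not the examining bodies' own procedure) shows how a difficulty index is obtained as the proportion of testees choosing the keyed option:

```python
# Minimal sketch of the proportion-correct (difficulty/easiness) index for one
# multiple choice item. The responses and key below are hypothetical.

def difficulty_index(responses, key):
    """Proportion of testees who chose the keyed (correct) option."""
    correct = sum(1 for choice in responses if choice == key)
    return correct / len(responses)

# Hypothetical item: 29 of 40 testees choose the keyed option 'A'.
responses = ['A'] * 29 + ['B'] * 6 + ['C'] * 5
print(difficulty_index(responses, key='A'))  # 0.725 -> close to 1, a relatively easy item
```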
The discrimination power of a multiple choice item, on the other hand, is its ability to discriminate
between brilliant students and poor students (Alonge, 2003). Similarly, Oyejide (1991)
described discrimination power as the strength of each item to distinguish the higher achievers (those who
are more competent) from the lower achievers (those who are less competent). The discrimination power of an
item ranges from zero to one (0-1); the closer this value is to one, the better the item (Oyejide, 1991;
Kelly, 1989). The index of discrimination is also the extent to which an item is correctly responded to by
those examinees possessing more of the trait being measured (Alonge, 2003; Ebel, 1979).
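As a minimal illustration of this definition, using the upper/lower group formula described later in the Methodology and hypothetical counts:

```python
# Minimal sketch: discrimination index of one item from upper/lower group counts,
# D = (Ru - Rl) / (N / 2), where Ru and Rl are the numbers answering correctly in
# the upper and lower groups and N is the combined size of the two (equal) groups.
# All counts below are hypothetical.

def discrimination_index(right_upper, right_lower, group_size):
    """group_size is the number of testees in each of the equal-sized groups, i.e. N/2."""
    return (right_upper - right_lower) / group_size

print(discrimination_index(right_upper=35, right_lower=15, group_size=50))  # 0.4
```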
Assessment plays a vital role in evaluating students in the school setting. The West
African Examinations Council (WAEC) and the National Examinations Council (NECO) organize the Senior
School Certificate Examination (SSCE) in Nigeria, and the results are essentially used for certification. In
Nigeria, MCTs are used by examining bodies such as NECO, WAEC and other public examination bodies.
The National Examinations Council (NECO) was created by decree in April 1999 and took off on 26th April, 1999,
with its headquarters in Minna, Niger State. It was given the responsibility of conducting the Senior School Certificate
Examination (SSCE), hitherto conducted solely by the West African Examinations Council (WAEC)
(WAEC, 2007). NECO conducted its maiden June/July SSCE in the
year 2000 and has since continued to conduct the SSCE twice a
year (June/July and November/December) alongside the West African Examinations Council
(NECO, 2007).
The West African Examinations Council, one of the examining bodies in Nigeria, was established in
1952 following the acceptance of the Jeffery Report by the then colonial governments. It was established by
five West African governments, namely Ghana, Liberia, the Gambia, Nigeria and Sierra Leone, which passed
appropriate ordinances in their Legislative Assemblies in 1951, in collaboration with, and in succession to,
the Cambridge School Certificate Syndicate. It has its headquarters in Accra, Ghana, while the Nigerian
headquarters is in Yaba, Lagos (WAEC, 2007). It has the sole responsibility of organizing and conducting
secondary school and public examinations in West African countries such as the Gambia, Ghana,
Liberia, Nigeria and Sierra Leone.
One of the emphases of the Nigerian educational policy is that citizens must acquire scientific and
technological education. Biology is one of the science subjects and has links with other science subjects. It
is the general field of knowledge concerned with the study of all aspects of living organisms. According to
Parker (1992), Biology embraces those principles of widest application to the origin, growth and
development, structure, function, evolution and distribution of plants and animals. It is also the bedrock
from which some other science disciplines derive their origin. Biology, as the science of life, enables
individuals to understand themselves and the parts and functions of their bodies. Biology has been subdivided
into separate branches such as Botany, Zoology, Physiology, Genetics, Morphology, Anatomy and
Biochemistry. Indeed, no student intending to study these disciplines can do without
Biology.
Students' poor performance in Biology has drawn the attention of researchers and curriculum
planners to Biology as a subject in the school curriculum (Kareem, 2003). In spite of the importance
and popularity of Biology among Nigerian students, performance at the senior secondary school level has
been poor (Ahmed, 2008). The desire to know the causes of the poor performance in Biology has been the
focus of researchers for some time now. The Chief Examiners' reports indicate that students show similar
weaknesses in the Biology papers, which lead to poor performance in the West African Senior School
Certificate Examination (WASSCE) Biology papers.
The WASSCE Chief Examiners' Report for May/June 2004 shows that students' performance in
Biology paper 1 was slightly poorer than in the previous year, with a mean score of 20 and a standard
deviation of 9.60, while the mean score for paper 2 was 20 with a standard
deviation of 7.48. The WASSCE Chief Examiners' Report for Nov/Dec 2004 shows that the mean for
Biology paper 2 (Essay) was 22 while the standard deviation was 8.94; students' performance in this
paper was fair compared with previous years, but candidates' performance in paper 3 (alternative to
practical) was poorer than in the previous year.
WASSCE Chief Examiners’ Report Nov/Dec (2007) indicates that the performance of the
students in paper 2 (Essay) was slightly poorer than that of the previous year with a mean score of 17 and
a standard deviation of 8.77 compared to a mean of 18 and S.D of 8.94 for Nov/Dec 2006. In addition,
students’ performance in paper 3 (alternative to practical) was slightly poorer in 2007 with a mean score
of 24 and a standard deviation of 12.17 compared with a mean of 28 and standard deviation of 7.64 for
the Nov/Dec 2006 WASSCE. Researchers have shifted the blame for students' poor performance in Biology onto
teacher laxity, students' poor study habits, parents' poor attitude to their children's education and so on,
without considering the properties of the tests themselves, such as the difficulty indices and discrimination power
of the test items.
There is therefore a need to analyze the difficulty indices and discrimination power of Senior
School Certificate multiple choice items in Biology. Both WAEC and NECO make use of multiple
choice Biology items in their examinations. The researcher also compared the difficulty indices and
discrimination power of the WAEC and NECO Senior School Certificate multiple choice Biology items.
The purpose of the study was to empirically analyze the item difficulty and item discrimination
indices of Senior Secondary School Certificate multiple choice Biology tests. Specifically, the study
focused on:
1. the item difficulty index of each of the NECO and WAEC SSCE multiple choice Biology tests; and
2. the item discrimination power of each of the NECO and WAEC SSCE multiple choice Biology tests.
RESEARCH QUESTIONS
This research work investigated the item difficulty indices and item discrimination power of Senior
School Certificate multiple choice items in Biology with a view to finding answers to the
following research questions:
1. What is the difficulty index of each of the NECO and WAEC SSCE multiple choice Biology tests?
2. What is the discrimination power of each of the NECO and WAEC SSCE multiple choice Biology
tests?
METHODOLOGY
This study adopted a descriptive survey research design. The population for this study consists of all
senior secondary school students in Ekiti State. Ekiti State is one of the states in the South Western part of
Nigeria. The state has a largely agrarian economy and most communities engage in growing cash and
food crops. Survey research design was chosen for this study because the data were collected through the
tests.
The target population for this study consists of senior secondary school three (SS3) students in
Ekiti State. Stratified random sampling technique was adopted for the study. The schools were stratified
along the three Senatorial Districts in Ekiti State. In the Ekiti Central and Ekiti North Senatorial Districts, 7 senior
secondary schools each were selected, while in Ekiti South 6 senior secondary schools were selected.
Thus, a total of twenty (20) senior secondary schools were selected.
In this study, 576 (40%) respondents were selected from Ekiti Central, followed by Ekiti South,
with 466 (32%), while Ekiti North had 408 (28%) respondents. Thus, one thousand four hundred and fifty
(1450) students were randomly selected to take part in the study. The researcher adopted the 2008
National Examinations Council (NECO) and West African Examinations Council (WAEC) multiple
choice Biology question papers to collect the data. This is because multiple choice tests are the strongest
predictors of overall students’ performance compared with other forms of evaluation.
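A minimal sketch of this stratified random selection (with assumed placeholder student rosters; only the district quotas come from the figures reported above) is given below:

```python
# Sketch of stratified random sampling of respondents by senatorial district.
# The rosters are hypothetical placeholders; only the quotas come from the study.
import random

random.seed(1)
quotas = {"Ekiti Central": 576, "Ekiti South": 466, "Ekiti North": 408}

# Assumed placeholder rosters of SS3 students in the sampled schools of each district.
rosters = {d: [f"{d}-student-{i}" for i in range(3000)] for d in quotas}

sample = {d: random.sample(rosters[d], n) for d, n in quotas.items()}
print({d: len(s) for d, s in sample.items()})
# {'Ekiti Central': 576, 'Ekiti South': 466, 'Ekiti North': 408} -> total of 1450
```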
The 2008 WAEC and NECO SSCE multiple choice Biology papers contain a total of 120 items (60 four-option
WAEC items and 60 five-option NECO items) with 540 options in all (420 decoys and 120 correct responses).
All 120 items were analyzed.
The instruments were standardized tests used by these examination bodies in Nigeria and West
Africa. These instruments were considered to be accurate and reliable by the public examination bodies.
The data collected from this study were analyzed with respect to the two research questions
generated for this study. Item analysis was carried out for research questions 1 and 2 i.e. to obtain the
difficulty indices and discrimination power of the test items.
RESULTS
Students were asked to indicate whether they were male or female. Their responses are summarized in
Table 1 below.
Table 1 Distribution of respondents according to gender.
Gender    Frequency   Percentage
Male      758         52.3
Female    692         47.7
Total     1450        100.0
In Table 1, out of the one thousand four hundred and fifty (1450) students sampled, seven hundred and
fifty-eight (758, or 52.3%) were male while six hundred and ninety-two (692, or 47.7%) were
female.
Answers to Research Questions
Research Question One: What is the difficulty index of each of the NECO and WAEC SSCE multiple
choice Biology tests?
The item difficulty index of each of the NECO and WAEC SSCE multiple choice Biology test items was determined
by following the CTT principle of selecting the top and bottom 27% of the testees for obtaining this item
characteristic. The number of students in the upper and lower groups who got each item right was
obtained by frequency count, and the proportion getting each item right was calculated. The difficulty indices
obtained were then summarized and presented as a grouped frequency distribution in Table 2. The researcher
used the quality criteria recommended by Tarrant, Ware and Mohammed (2009) and Theodorsson,
Shafil, Wardy, Khan, Mahrezi and Shafaee (2010) to interpret the difficulty indices obtained.
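A computational sketch of this procedure (Python/NumPy, applied to a simulated 0/1 response matrix rather than the actual WAEC or NECO data) is shown below; the category labels follow the ranges reported in Table 2:

```python
# Sketch of the CTT difficulty-index procedure: rank testees by total score, keep the
# top and bottom 27%, and take each item's difficulty as the proportion of these
# extreme-group testees answering it correctly. The response matrix is simulated.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.integers(0, 2, size=(1450, 60))        # 1450 testees x 60 items, scored 0/1

totals = scores.sum(axis=1)
order = np.argsort(totals)                           # testees ranked by total score
k = int(round(0.27 * len(totals)))                   # size of each 27% group
extreme = np.concatenate([order[:k], order[-k:]])    # bottom 27% and top 27%

difficulty = scores[extreme].mean(axis=0)            # proportion correct per item

def describe(p):                                     # ranges as in Table 2
    if p < 0.15: return "Very difficult"
    if p < 0.30: return "Difficult"
    if p < 0.70: return "Moderately difficult"
    if p < 0.85: return "Easy"
    return "Very easy"

print([round(p, 2) for p in difficulty[:5]], [describe(p) for p in difficulty[:5]])
```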
Table 2 Item difficulty of 2008 NECO and WAEC SSCE multiple choice Biology test items.
Range of difficulty index   NECO frequency   %     WAEC frequency   %     Description
0% – 14%                    1                2     1                2     Very difficult
15% – 29%                   16               26    5                8     Difficult
30% – 69%                   40               67    53               88    Moderately difficult
70% – 84%                   1                2     -                -     Easy
85% – 100%                  2                3     1                2     Very easy
Total                       60               100   60               100
Table 2 reveals that, in the 2008 NECO SSCE multiple choice Biology test, 1 (2%) of the items was very
difficult, 16 (26%) were difficult, 40 (67%) were moderately difficult, 1 (2%) was easy and 2 (3%) were
very easy.
Also, Table 2 reveals that, in the 2008 WAEC SSCE multiple choice Biology test, 1 (2%) of the items
was very difficult, 5 (8%) were difficult, 53 (88%) were moderately difficult and 1 (2%) was very easy.
Table 3 below compares the mean item difficulty indices of the NECO and WAEC SSCE multiple
choice Biology test items.
Table 3 Comparison of the item difficulty in 2008 NECO and WAEC SSCE multiple choice Biology test items.
Variable   N    Mean difficulty index
NECO       60   0.40
WAEC       60   0.42
Table 3 shows the comparison of item difficulty in NECO and WAEC SSCE multiple choice Biology test
items. It can be observed that the WAEC SSCE Biology items had a higher mean
difficulty index of 0.42, while the NECO SSCE Biology items had a mean difficulty index of 0.40. This shows that the
WAEC SSCE multiple choice Biology test has more difficult items than the NECO SSCE multiple choice
Biology test.
Research Question Two: What is the discrimination power of each of the NECO and WAEC SSCE
multiple choice Biology tests?
The item discrimination power of each of the NECO and WAEC SSCE multiple choice Biology test items
was determined by subtracting the number of students in the lower group who got the item right from
the number of those in the upper group who got it right, and dividing this difference by half of the
total number of testees in the two groups combined. The discrimination indices obtained were then
summarized and presented as a grouped frequency distribution in Table 4. The researcher used the quality
criteria recommended by Tarrant, Ware and Mohammed (2009) and Theodorsson, Shafil, Wardy,
Khan, Mahrezi and Shafaee (2010) to interpret the discrimination power obtained.
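A corresponding sketch of this discrimination computation (again on a simulated response matrix, with the category labels of Table 4) is shown below:

```python
# Sketch of the discrimination-power computation: for each item,
# D = (correct in upper 27% group - correct in lower 27% group) / (half the combined
# group size). The response matrix is simulated, not the actual WAEC/NECO data.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.integers(0, 2, size=(1450, 60))         # 1450 testees x 60 items, scored 0/1

totals = scores.sum(axis=1)
order = np.argsort(totals)
k = int(round(0.27 * len(totals)))                    # each group holds 27% of testees
upper, lower = scores[order[-k:]], scores[order[:k]]

# N = 2k testees in the two groups combined, so N/2 = k.
discrimination = (upper.sum(axis=0) - lower.sum(axis=0)) / k

def describe(d):                                      # ranges as in Table 4
    if d <= 0.18: return "Poor"
    if d <= 0.29: return "Moderate"
    if d <= 0.39: return "Good"
    return "Very good"

print([round(d, 2) for d in discrimination[:5]], [describe(d) for d in discrimination[:5]])
```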
Table 4 Discrimination power of 2008 NECO and WAEC MCT items in Biology.
Range of discrimination index   NECO frequency   %      WAEC frequency   %     Description
0.00 – 0.18                     14               23.3   6                10    Poor
0.19 – 0.29                     8                13.3   10               17    Moderate
0.30 – 0.39                     9                15.0   4                7     Good
0.40 – 1.00                     29               48.3   40               66    Very good
Total                           60               100    60               100
Table 4 reveals that 14 (23.3%) of the items were poor, 8 (13.3%) were moderate, 9 (15%) were good and
29 (48.3%) were very good in the NECO SSCE multiple choice Biology test. In addition, Table 4 shows
that 6 (10%) of the items were poor, 10 (17%) were moderate, 4 (7%) were good and 40 (66%) were very
good in the WAEC SSCE multiple choice Biology test.
Table 5 below compares the mean item discrimination indices of the NECO and WAEC SSCE multiple
choice Biology test items.
Table 5 Comparison of the discrimination power of 2008 NECO and WAEC SSCE multiple choice Biology test items.
Variable   N    Mean discrimination index
NECO       60   0.39
WAEC       60   0.43
Table 5 shows the comparative analysis of the NECO and WAEC SSCE multiple choice Biology test items
according to their item discrimination indices. From Table 5, it can be observed that the WAEC
SSCE Biology items had a higher mean discrimination index of 0.43, while the NECO SSCE Biology items had a
mean discrimination index of 0.39. Therefore, the WAEC SSCE multiple choice Biology test has more
discriminating items than the NECO SSCE multiple choice Biology test.
DISCUSSION OF FINDINGS
It was found that the 2008 WAEC SSCE multiple choice Biology test has more difficult items than the 2008 NECO SSCE
multiple choice Biology test: the WAEC test has a mean difficulty index of 0.42
while the NECO test has a mean difficulty index of 0.40. This finding disagrees
with the studies of Thorndike and Hagen (1978) and Ramons and Stern (1993),
which reported that five-option formats have better difficulty indices. It supports the finding of
Abiri (2006) that the difficulty indices of multiple choice tests with fewer options, say four, are
better than those of tests with a larger number of options.
The higher mean difficulty index found for WAEC may be due to the number of options in the
WAEC SSCE multiple choice Biology test: the four-option WAEC format yielded a higher mean difficulty index than
the five-option NECO format. This finding contradicts that of Kolawole (2007),
who reported no significant difference between the difficulty levels of WAEC and NECO
multiple choice items in Mathematics, that is, that both WAEC and NECO multiple choice tests in
Mathematics have the same difficulty level.
It was also found that the 2008 WAEC SSCE multiple choice Biology test has more discriminating items
than the 2008 NECO SSCE multiple choice Biology test: the WAEC test has a mean discrimination index of 0.43
while the NECO test has a mean discrimination index of 0.39. This finding is
supported by that of Olatunji (2007), who found that the four-option format of the WAEC SSCE multiple
choice test in Economics has better discrimination indices than the NECO SSCE multiple choice test.
CONCLUSION
The 2008 WAEC SSCE multiple choice Biology test had more moderately difficult items than the 2008 NECO
SSCE multiple choice Biology test. Fewer options (four) appear better suited to multiple choice
Biology tests than a larger number of options, say five. The 2008 WAEC multiple choice Biology
test (with four options) also discriminated better than the 2008 NECO multiple choice test (with five options). The
results of this study should be seen as an indication to teachers, item writers, subject officers,
psychometricians and examination bodies that students' performance can be enhanced by
adopting fewer options (say four) for multiple choice Biology tests. The findings of this
study should help provide a way forward in addressing the problem of mass failure faced by the
various examination bodies.
RECOMMENDATIONS
Based on the findings and conclusions drawn in this study, the following recommendations are made to
relevant educational authorities, examination bodies and other stakeholders in education.
1. Four-option items, especially in multiple choice Biology tests, should be encouraged; where five-option
items are used, more attention should be given to the psychometric properties of the tests;
2. Teachers should pay particular attention to the principles of test construction and item writing to
reduce problems with the item difficulty indices and discrimination power of multiple choice
Biology tests;
3. Governments should organize in-service training programmes for teachers on a regular
basis to broaden their knowledge of test construction, test administration and interpretation in
order to improve students' performance in Biology; and
4. The two examining bodies should meet and agree on the same number of decoys (distractors) per item
for multiple choice Biology tests, to avoid problems of comparability of standards.
REFERENCES
Abiri, J. O. O. (2006). Elements of evaluation measurement and statistical techniques in education. Ilorin: Library
and Publication committee, University of Ilorin, Nigeria.
Adewumi, J. O., & Oluokun, O. (2001). Introduction to test and measurement in education. Oyo: Odumatt Press
Publishers.
Ahmed, M.A. (2002). Analysis of secondary school Biology teachers’ ratings of the difficulty level of concepts in
nutrition. Ilorin: Unpublished M. Ed Thesis, University of Ilorin.
Ahmed, M.A. (2008). Influence of personality factors on Biology lecturers assessment of difficulty levels of genetics
in Nigerian Colleges of Education. Unpublished Ph.D. Thesis, University of Ilorin, Ilorin.
Alonge, M.F. (2003). Assessment and examination: The pathways to educational development. Inaugural lecture.
University of Ado-Ekiti.
Ebel, R. L. (1979). Essentials of educational measurement. New Jersey: Prentice Hall.
Gronlund, N. E. (1981). Measurement and evaluation in teaching (4th ed.). New York: Macmillan Publishing Co.
Inc.
Kareem, O. L. (2003). Strategies for teaching Biology concepts: An educational technologists perspectives. Journal
of Postgraduate Students Association. (POGSASS). (4), 25-39.
Kareem, O. L. (2003). Effects of autographic self – instructional packages on senior secondary school students’
performance in Biology in Ilorin, Nigeria. Ilorin: Unpublished Ph.D. Thesis, University of Ilorin.
Kelly, T.L. (1989). The selection of upper and lower groups for the validation of test items. Journal of Educational
Psychology, 30.
Kolawole, E. B. (2005). Test and measurement. Lagos: Bolabay Publications.
Kolawole, E.B.(2007). A comparative analysis of psychometric properties of two Nigerian examining bodies for
senior secondary schools mathematics. Research Journal of Applied Sciences, 2(8): 913-915.
Lawal, A. (2001). Evaluation of students’ learning outcomes 1: types and uses of tests. In I.O. Abimbola (Ed.).
Fundamental Principles and Practice of Instruction. Ilorin: Belodan (Nig) Enterprises & Tunde Babs Printers.
National Examinations Council. (2004). Focus about NECO. Minna: Regent Ltd.
National Examinations Council (NECO). (2006 – 2009). Regulations and syllabuses for Senior School Certificate
Examination (SSCE) for candidates in Nigeria. Niger State: National Headquarters Minna.
National Examinations Council (NECO). (2007). National Examination Council, Retrieved 20/05/2011 at
info@neconigeria.com; service@neconigeria.org.
Nuhfer, F. B. (1996). The place of formative evaluations in assessment and ways to reap their benefits. Denver:
University of Colorado.
Olatunji, D. S. (2007). Effects of number of options on psychometric properties of multiple choice tests in Economics.
Unpublished M.Ed. thesis, University of Ilorin, Ilorin.
Osunde, A. (2009). Essay and multiple choice tests: Bridging the gap. Workshop papers on multiple choice test
item writing procedures for academic staff, University of Ilorin, Ilorin, Monday 4th, 2009, pp. 14-24.
Oyejide, A. P. (1991). Effects of confidence scoring procedures on the psychometric properties of three multiple
choice test formats.
Parker, S. P. (1992). Concise encyclopedia of science and technology (6th ed.). New York: McGraw-Hill, p. 252.
Ramons, R. L., & Stern, J. (1993). Item behaviour associated with changes in the number of alternatives in multiple
choice items. Journal of Educational Measurement, 10.
Schumacker, R. E. (2005). Class test analysis. London: Applied Measurement Associates.
Scriven, M. (1991). The methodology of evaluation. Chicago: Rand McNally.
Susan, A. N. (2003). Formative evaluation tools for campus conflict resolution and mediation programmes:
Overview. Denver: University of Colorado.
Tarrant, M., Ware, J., & Mohammed, A. M. (2009). An assessment of functioning and non-functioning distractors in
multiple choice questions: A descriptive analysis. Department of Nursing Studies, Faculty of Medicine,
Hong Kong.
Tessmer, M. (1993). Planning and conduct of formative evaluations. Philadelphia: Kogan.
Theodorsson, T., Shafil, K. E., Wardy, N. A., Khan, A., Mahrezi, A. A., & Shafaee, M. A. (2010). Assessments of
family doctors in Oman: Getting the questions right. Preliminary findings of a performance analysis of multiple
choice questions. Internet Journal of Medical Education, 1(1).
Thorndike, R. L., & Hagen, E. P. (1978). Measurement and evaluation in psychology and education (4th ed.).
New York: John Wiley and Sons.
WAEC. (2002). Performance of students’ in the May/June WAEC SSCE Examination, (1995 – 2001). Annual
Report, WAEC Press.
WAEC Diary. (2004). Brief history of the council. Lagos: Academy Press PLC.
WAEC. (2001 – 2007). The West African Senior School certificate examination. The Chief Examiners’ reports for
Nigeria.
West African Examinations Council (WAEC). (2007). History of WAEC. Retrieved 20/05/2011 from
http://www.waecnigeria.org/history.htm.
Zafar, M. (2008). Item analysis assumptions (difficulty & discrimination indexes). Assessment Unit Dept. of Medical
Education Ext. 47142.