
VALIDITY & RELIABILITY

BEHAVIOURAL SCIENCES
1. Psychology
2. Sociology
3. Anthropology
4. Speech and Hearing Sciences

Using a questionnaire, qualitative content can be transformed into quantitative content.
VALIDATION OF QUESTIONNAIRE
• Validity
• Reliability
VALIDITY & RELIABILITY
Validity tests:
• Face validation – expert validation
• Content validation – expert panel validation
• Construct validation – convergent validation & discriminant validation
• Criterion validation – concurrent, predictive & postdictive validation
FACE VALIDITY
• Qualitative: items are assessed for difficulty, appropriateness and clarity, then revised to a
simpler & clearer version
• Quantitative: item impact score = Frequency (%) × Importance
ITEM IMPACT SCORE
Each item is rated on a 5-point Likert scale:
• 5 – Very important
• 4 – Important
• 3 – Somewhat important
• 2 – Slightly important
• 1 – Not important
Frequency – proportion (%) of experts rating the item as 4 or 5
Importance – mean score of the item
Impact score ≥ 1.5 – item is considered valid
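As a worked illustration (not from the slides), here is a minimal Python sketch of the impact-score calculation; the function name and the example ratings are hypothetical, and frequency is taken as a proportion so that the ≥ 1.5 cut-off applies.

```python
# Minimal sketch: item impact score from 5-point Likert ratings (1-5).
# Assumes `ratings` holds one integer score per expert for a single item.

def impact_score(ratings):
    """Impact score = frequency (proportion rating 4 or 5) x mean importance."""
    frequency = sum(1 for r in ratings if r >= 4) / len(ratings)  # proportion of 4/5 ratings
    importance = sum(ratings) / len(ratings)                       # mean score of the item
    return frequency * importance

# Example: 10 experts rate one item
ratings = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]
score = impact_score(ratings)
print(f"Impact score = {score:.2f}, retained: {score >= 1.5}")
```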
LIMITATIONS OF FACE VALIDITY
• Face validity is a casual, "soft" form of assessment, and many researchers do not
consider it an active measure of validity.
• It is the most widely used form of validity in developing countries.
COHEN’S KAPPA INDEX
• Items are categorized as favorable or unfavorable by the raters.
• A kappa value above 0.6 is considered acceptable for inter-rater agreement on the
questionnaire.
• The formula
• k = (po – pe) / (1 – pe)
• po: Relative observed agreement among raters
• pe: Hypothetical probability of chance agreement
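A minimal Python sketch of this formula, assuming two raters have classified the same items as favorable or unfavorable; the rater data are hypothetical.

```python
# Minimal sketch: Cohen's kappa for two raters classifying items
# as "favorable"/"unfavorable", using k = (po - pe) / (1 - pe).
from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n     # observed agreement
    c1, c2 = Counter(rater1), Counter(rater2)
    labels = set(rater1) | set(rater2)
    pe = sum((c1[l] / n) * (c2[l] / n) for l in labels)      # chance agreement
    return (po - pe) / (1 - pe)

r1 = ["fav", "fav", "unfav", "fav", "unfav", "fav", "fav", "unfav"]
r2 = ["fav", "unfav", "unfav", "fav", "unfav", "fav", "fav", "fav"]
print(f"kappa = {cohens_kappa(r1, r2):.2f}")  # values above 0.6 are acceptable per the slide
```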
COHEN’S KAPPA - VALUE
CONTENT VALIDITY
• Qualitative: appropriate wording, grammar, understandability & relatedness to culture
• Quantitative: CVR (content validity ratio) & CVI (content validity index)
CONTENT VALIDITY RATIO (CVR)
• The content validity ratio (CVR) is calculated using Lawshe's method:
• CVR = (Ne - N/2) / (N/2)
• Ne – number of panel members rating the item as "essential"
• N – total number of panel members
• CVR is calculated separately for each item
• Value range: -1 to +1
• Minimum acceptable value: 0.62 (Lawshe's cut-off depends on panel size; 0.62 is the
commonly cited value for a 10-member panel)
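A minimal Python sketch of Lawshe's CVR for a single item; the function name and panel counts are hypothetical.

```python
# Minimal sketch: Lawshe's content validity ratio for one item.
# Assumes `n_essential` panel members marked the item "essential" out of `n_total`.

def content_validity_ratio(n_essential, n_total):
    """CVR = (Ne - N/2) / (N/2); ranges from -1 to +1."""
    return (n_essential - n_total / 2) / (n_total / 2)

# Example: 9 of 10 panellists rate the item as essential
print(f"CVR = {content_validity_ratio(9, 10):.2f}")  # 0.80, above the 0.62 cut-off cited for 10 panellists
```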


CVI – CONTENT VALIDITY INDEX
A. Relevance (CVI-R) – 4-point Likert scale:
• 1 – Not relevant
• 2 – Somewhat relevant
• 3 – Quite relevant
• 4 – Highly relevant
B. Clarity (CVI-C) – 4-point Likert scale:
• 1 – Not clear
• 2 – Somewhat clear
• 3 – Quite clear
• 4 – Highly clear
CVI – CONTENT VALIDITY INDEX
• Computed for relevance, CVI(R), and for clarity, CVI(C), each at the item level and the
scale level: I-CVI(R) & S-CVI(R); I-CVI(C) & S-CVI(C)
• Item level: I-CVI = No. of raters giving a score of 3 or 4 / Total no. of raters (Nr/N or Nc/N)
• Scale level: S-CVI = No. of ratings of 3 or 4 / Total no. of ratings
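A minimal Python sketch computing I-CVI and S-CVI from a hypothetical ratings matrix (rows = items, columns = raters), using the 3-or-4 rule above.

```python
# Minimal sketch: item-level and scale-level CVI from 4-point relevance ratings.
# Assumes `ratings[i][j]` is rater j's score (1-4) for item i.

def item_cvi(item_ratings):
    """I-CVI = raters scoring 3 or 4 / total raters."""
    return sum(1 for r in item_ratings if r >= 3) / len(item_ratings)

def scale_cvi(ratings):
    """S-CVI = ratings of 3 or 4 across all items / total ratings."""
    all_scores = [r for item in ratings for r in item]
    return sum(1 for r in all_scores if r >= 3) / len(all_scores)

ratings = [
    [4, 3, 4, 4, 2],   # item 1, five raters
    [3, 3, 4, 2, 4],   # item 2
    [4, 4, 4, 3, 3],   # item 3
]
for i, item in enumerate(ratings, 1):
    print(f"I-CVI item {i} = {item_cvi(item):.2f}")
print(f"S-CVI = {scale_cvi(ratings):.2f}")
```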
CVI - VALUES
CONTENT VALIDITY - DRAWBACKS
• Like face validity, content validity is judged to be highly subjective.
• Researchers can therefore combine more than one form of validity to increase the
strength of the questionnaire.
• For instance, face validity may be combined with content validity & criterion validity.
CONSTRUCT VALIDITY
• The most valuable form of validity, and also the most difficult to establish.
• A "construct" is a composite latent variable, i.e., an underlying attribute that cannot be
measured directly.
CONSTRUCT VALIDITY
• Convergent validity: new pain scale vs an already existing pain scale
• Discriminant validity: pain vs cognition
• Example of a construct: quality of life (social, physical, emotional & mental domains)
CONSTRUCT VALIDITY
• Based on correlation matrices
Correlation coefficient and construct validity:
• 0.1 – small
• 0.3 – moderate
• 0.5 – large

• Correlation coefficient values can be interpreted as follows:


• r = 1: Perfect positive correlation
• r = 0: No correlation
• r = −1: Perfect negative correlation
• An r value > 0.50 is generally considered sufficient to suggest convergent validity
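A minimal Python sketch of a convergent-validity check, correlating hypothetical scores on a new scale with an existing one; np.corrcoef gives the Pearson r.

```python
# Minimal sketch: convergent validity of a new pain scale against
# an existing scale via Pearson correlation (hypothetical scores).
import numpy as np

new_scale = np.array([2, 4, 6, 5, 8, 7, 3, 9, 6, 5])       # new pain scale scores
existing_scale = np.array([3, 4, 7, 5, 8, 6, 2, 9, 7, 5])  # established pain scale scores

r = np.corrcoef(new_scale, existing_scale)[0, 1]
print(f"r = {r:.2f}; convergent validity suggested if r > 0.50")
```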
CRITERION VALIDITY
1. Concurrent: a questionnaire on knowledge (awareness) of blood sugar control vs
present RBS levels
2. Postdictive: compared against an outcome that has already occurred in the past,
e.g., two groups, with depression & without depression (known-groups validity)
3. Predictive: present adherence to anti-hypertensive medication vs a future outcome
(impaired systolic BP control)
Some other types:
• Culture validity (between different cultures, as in a nutrition questionnaire)
• Discriminative validity (the tool should be able to discriminate between cohorts,
e.g., a pain scale among orthopaedic patients vs ENT patients)
*One crude method of checking validity is to construct an item vs total score correlation matrix.
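A minimal Python sketch of this crude item-total check on a hypothetical 4-item questionnaire; here each item is removed from its own total (corrected item-total correlation), which is one common variant.

```python
# Minimal sketch: correlate each item with the total score
# (hypothetical 4-item questionnaire, rows = respondents).
import numpy as np

scores = np.array([
    [3, 4, 2, 5],
    [1, 2, 1, 2],
    [4, 5, 4, 4],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
])

total = scores.sum(axis=1)
for i in range(scores.shape[1]):
    # corrected item-total correlation: exclude the item from its own total
    r = np.corrcoef(scores[:, i], total - scores[:, i])[0, 1]
    print(f"Item {i + 1}: item-total r = {r:.2f}")
```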
RELIABILITY
1. Test-Retest Reliability (Stability)
2. Inter-rater Reliability
3. Internal consistency
TEST-RETEST RELIABILITY
• Correlation coefficient – acceptable if r > 0.7
• Cohen's kappa can also be used
• The retest interval should be neither too long nor too short
• It is a good tool for stable attributes such as personality traits, but not for attributes
that keep changing, such as pain, behaviour and attitude
Limitations:
• Memory effect
• Testing effect
• Perceptions may change
• Behaviour may change
• Fatigue
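A minimal Python sketch of the test-retest correlation using SciPy, with hypothetical scores from two administrations of the same questionnaire.

```python
# Minimal sketch: test-retest reliability as the correlation between
# two administrations of the same questionnaire (hypothetical scores).
from scipy.stats import pearsonr

test = [12, 18, 25, 9, 30, 22, 15, 27]     # scores at time 1
retest = [14, 17, 24, 10, 29, 20, 16, 26]  # same respondents at time 2

r, p = pearsonr(test, retest)
print(f"test-retest r = {r:.2f}; acceptable if r > 0.7")
```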
INTER-RATER RELIABILITY
• Nominal – Cohen’s kappa
• Ordinal – 1. weighted kappa (i. linear weighted kappa; ii. quadratic weighted kappa)
            2. Kendall's tau
• >2 raters & Nominal – Fleiss Kappa
• >2 raters & ordinal – Kendall’s w
• >2 raters – Krippendorff's alpha – handles missing data & is more flexible (nominal, ordinal &
metric)
• Agreement between the raters
• Expert vs novice
• Guess component – kappa statistics correct for chance agreement, unlike conventional (raw) agreement
• Software – DataTab
• Cronbach's alpha can also be used for continuous data (Likert scales)
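A minimal Python sketch of weighted kappa for two raters (e.g., expert vs novice) using scikit-learn's cohen_kappa_score, which supports linear and quadratic weights; the ratings are hypothetical.

```python
# Minimal sketch: unweighted and weighted kappa for two raters scoring
# the same items on an ordinal scale (hypothetical data).
from sklearn.metrics import cohen_kappa_score

expert = [3, 4, 2, 5, 4, 3, 1, 4, 5, 2]   # ordinal ratings by the expert
novice = [3, 3, 2, 4, 4, 3, 2, 4, 5, 3]   # same items rated by the novice

print("unweighted kappa  :", round(cohen_kappa_score(expert, novice), 2))
print("linear weighted   :", round(cohen_kappa_score(expert, novice, weights="linear"), 2))
print("quadratic weighted:", round(cohen_kappa_score(expert, novice, weights="quadratic"), 2))
```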
INTERNAL CONSISTENCY
1. Cronbach’s alpha
2. Split-half consistency
3. Kuder-Richardson 20 (KR-20): categorical, dichotomous [Yes/No]
FORMULA:
i) Covariance-based (reliable variance) form:
α = (k / (k - 1)) × ((S² - Σσ²) / S²)
where k = number of items, S² = variance of the total score, Σσ² = sum of the item variances
ii) Non-standardised Cronbach's alpha:
α = (N × C) / (V + (N - 1) × C)
where N = number of items, C = average inter-item covariance, V = average item variance
INTERNAL CONSISTENCY
iii) Standardised Cronbach's alpha (based on the correlation matrix):
α = (k × r) / (1 + (k - 1) × r)
where r = mean inter-item correlation
• Cronbach's α is sensitive to the number of items
• It can be applied only to unidimensional data
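A minimal Python sketch of the covariance-based formula above, applied to a hypothetical respondents-by-items score matrix.

```python
# Minimal sketch: Cronbach's alpha from a respondents-by-items score matrix,
# using alpha = k/(k-1) * (1 - sum(item variances) / total-score variance).
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

scores = [
    [4, 3, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [4, 4, 5, 4],
]
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```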
CRONBACH'S ALPHA
