DATA DASHBOARDS, EARLY
WARNING SYSTEMS, AND LCAPS
David T. Conley, PhD
Professor, University of Oregon
President, EdImagine Strategy Group
Senior Fellow for Deeper Learning, Hewlett
Foundation
Orange County Department of Education
May 4, 2015
Overview of the Day
• Overview of data dashboards and Early Warning Systems
• Review the rationale
• Identify characteristics of effective dashboards and early warning
systems
• Issues in choosing measures
• Challenges in linking dashboards and EWS to LCAPs
• Potential accountability implications
• Process of constructing data dashboards and EWS
• Using the data and creating action steps tied to LCAPs
• Review examples
• Linkages to LCAPs
DATA DASHBOARDS AND
EARLY WARNING SYSTEMS
PDX Crime Rate Animation
Data Dashboards
“Like an automobile dashboard, a data dashboard provides an array
of information about school performance and practices, rather than
a single number like a test score, to show whether a school is
succeeding.
This information enables educators to focus resources and attention
on particular problems and, equally importantly, to monitor their
own performance and address all issues that affect performance.” –
Alliance for Excellent Education
“Getting the right information is less than half the battle. Acting on
it, once it’s in hand, is harder still.” – Bridgeland & Orszag, 2013
“‘Multiple measures’ cannot be handed down from on high. We need to trust each community to create the kinds of school programs it wants for its children, instead of the state or federal government making the rules.” – John Merrow
Theory of Action For a Data Dashboard
[Cycle diagram linking: actionable information, analysis and reflection, individual actions, systems improvements, and refinements in data sources]
Rationale for Data Dashboards
• Data dashboards can
• provide more information than a single test score or
school rating
• allow educators to continuously monitor performance
• guide educators on the actions needed to improve
performance
• provide an opportunity to focus on locally developed
indicators
• provide an opportunity to locally define what “success” is
• allow districts and schools to tell stories with multiple sources of data
• create a space to organize and bring importance to LCAPs
Characteristics of Effective Dashboards
• Dashboard indicators should:
• be highly valid, important, and actionable
• be capable of displaying data longitudinally
• be comprehensible to principals and other key users
• consist of existing and new measures
• be more than math & reading test scores
• contain a mix of convenient measures and those that
are more challenging to collect
• include measures of varying psychometric rigor
• align with state priorities and reflect local priorities and
areas of emphasis and strength
CHARACTERISTICS OF EARLY
WARNING SYSTEMS
Early Warning Systems
“Developing successful approaches to intervention requires dependable and accessible data, training on how to use those data, and regular information about how interventions are impacting students, both in terms of academic performance and high school completion.” – Kennelly & Monrad, 2007
“We should pay attention to more than just the lowest-achieving students when working to address issues of graduation and dropping out. In a school system where about half the students drop out, it is not just aberrant students who are at high risk of not graduating but average students as well.” – Allensworth & Easton, 2007
Early Warning Systems
• Early warning systems use predictive data to identify students at risk of dropping out of high school, as part of a larger prevention/intervention framework
• A collection of measures tracks which students are on track or off track to graduate on time from middle or high school
• Interventions are used to steer off-track students back on track
https://ccsr.uchicago.edu/sites/default/files/publications/07%20What%20Matters%20Final.pdf
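The on-track/off-track distinction above can be made concrete. A minimal sketch in Python of the ninth-grade "on-track" indicator popularized by the Consortium on Chicago School Research (Allensworth & Easton): a student is on track at the end of ninth grade if they have earned at least five full-year course credits and have no more than one semester F in a core course. The function and parameter names are illustrative, not from any real student information system.

```python
def is_on_track(credits_earned: float, core_semester_fs: int) -> bool:
    """Apply the CCSR freshman on-track definition: at least five
    full-year credits earned and no more than one semester F in a
    core course."""
    return credits_earned >= 5.0 and core_semester_fs <= 1

print(is_on_track(5.5, 1))  # enough credits, one core F: on track
print(is_on_track(4.0, 0))  # credit-deficient: off track
print(is_on_track(6.0, 2))  # two core course failures: off track
```

The value of a binary indicator like this is that it can be recomputed every grading period, which is what lets an EWS surface students while there is still time to intervene.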
Characteristics of Effective Early Warning
Systems
• Indicators derive from research and best
evidence on risk factors.
• Indicators connect to a larger prevention/intervention system.
• Indicators recognize social-emotional factors and
student characteristics.
• Users of the EWS need built-in time to develop and implement interventions.
CHOOSING MEASURES FOR
A DATA DASHBOARD
Data Dashboards and LCAPs
Required Indicators
• LCAPs present a natural starting point for creating data dashboards.
• Additional indicators not included in this list should also be considered.
• Input measures: teacher mis-assignment; access to materials; adequate facilities; course access in core academic areas
• Process measures: Common Core implementation; parental input/involvement efforts; parent participation surveys; student/parent/teacher climate surveys; student engagement surveys
• Outcome measures: test score gains; English proficiency; college/career readiness; attendance; dropout rates; graduation rates; completion of college/career pathway; completion of workplace or service experience; suspensions, expulsions
A Grab-bag of Potential Indicators
• Dual enrollment participation/completion
• % enrolled in postsecondary programs
• Industry certifications
• % taking higher-level courses
• College-going rate
• % needing college remediation
• % taking Algebra in Grade 9
• Opportunity to learn metrics
• Speaking and listening
• Goal orientation and aspirations
• Learning techniques
• Metacognitive skill development
• Creativeness and expressiveness
• Student engagement
• Expository writing
• Collaborative skills
Profile Approach
Accountability Implications
• Dashboards are indicators, not outcome measures; they
should not be tied directly to accountability.
• This means they should emphasize real-time, formative
information much more than summative data or scores.
• They should be used to show evidence of progress
towards LCAP goals, but not achievement of goals.
• They can also be forward-looking indicators that suggest
future trends.
• Early indicators that signal likelihood of future performance
• Data dashboards can be accountability tools for local
communities.
CHOOSING MEASURES FOR
EARLY WARNING SYSTEMS
Early Warning System Indicators
• Ninth-grade course failures in mathematics and English
• Ninth-grade credit deficiencies (not on track to
graduation)
• Chronic absenteeism (esp. multiple successive days and
multiple single absences)
• Ninth-grade GPA below 2.0 or in bottom 25% of all
students
• Signs of disengagement
• Poor behavior ratings from teachers
• Student self-reports showing low levels of academic
engagement and motivation
• Discipline referrals and out-of-school suspensions
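The indicators above amount to a rule set, which is how many districts implement an EWS in practice. A minimal sketch in Python, assuming hypothetical record fields; the thresholds follow this slide except the 10% chronic-absence cutoff, which is a commonly used definition and an assumption here:

```python
def ews_flags(student: dict) -> list:
    """Return the early-warning flags a ninth grader trips.
    Record keys are hypothetical; the 10% chronic-absence cutoff
    is an assumption, not stated in the source."""
    flags = []
    if student["math_fails"] > 0 or student["english_fails"] > 0:
        flags.append("course failure in math or English")
    if student["credits_earned"] < student["credits_needed_on_track"]:
        flags.append("credit deficiency")
    if student["days_absent"] / student["days_enrolled"] >= 0.10:
        flags.append("chronic absenteeism")
    if student["gpa"] < 2.0:
        flags.append("GPA below 2.0")
    if student["suspensions"] > 0:
        flags.append("out-of-school suspension")
    return flags

# Example record for one student at the end of a semester.
student = {"math_fails": 1, "english_fails": 0, "credits_earned": 2.5,
           "credits_needed_on_track": 3.0, "days_absent": 9,
           "days_enrolled": 85, "gpa": 1.8, "suspensions": 0}
print(ews_flags(student))
```

Because each flag is named rather than rolled into a single risk score, the output points staff toward a specific intervention (attendance outreach vs. credit recovery), which is the point of an EWS.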
Learning Process Measures
• Student time on task
• Student engagement
• Student effort
• Overall behavioral level in classrooms school-wide
• Homework completion
• Courtesy and civility
• Parent engagement
• This would be low-stakes information collected mostly
through teacher reports and student self-reports.
• Reporting these, in real time, could influence student
(and teacher and parent) behavior.
USING DATA DASHBOARDS
TO IMPROVE TEACHING AND
LEARNING
Using Data Dashboards to Improve
Schools
• Many school staffs (and administrators) are not accustomed to having access to actionable information.
• Many associate public performance information with API and NCLB, and see it as a way to shame schools.
• Dashboards need to lead to action if they are to be accepted by teachers and administrators.
• They must be compatible with accountability requirements but be much more than just reports on them.
• The content of the dashboard needs to be discussed publicly and frequently by key administrators.
• Principals need reinforcement and support to act independently on data dashboard information.
Using Early Warning Systems
• Early warning system data must be followed closely.
• Beginning of the school year is the most important time.
• Track individual ninth grade student attendance closely
during the first 30 days of the fall semester.
• Monitor on-track indicator data closely at the end of the first
quarter and the end of the first semester.
• Act as soon as possible when trends or individuals are
identified.
• Implement prevention/intervention strategies for students
off-track at end of each monitoring period (or sooner).
• Evaluate predictive power of indicators and effectiveness
of strategies, and revise as needed.
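The first-30-days attendance check described above can be sketched as a simple watchlist routine. In this illustration the data shape and the two-absence trigger are assumptions, chosen only to show the mechanic; a district would set its own trigger:

```python
def early_attendance_watchlist(absences_by_student: dict,
                               trigger: int = 2) -> list:
    """absences_by_student maps a ninth grader's ID to absences in
    the first 30 school days. Anyone at or above the trigger is
    surfaced for follow-up; the two-absence default is an
    illustrative assumption."""
    return sorted(sid for sid, n in absences_by_student.items()
                  if n >= trigger)

print(early_attendance_watchlist({"s01": 0, "s02": 3, "s03": 1, "s04": 5}))
# ['s02', 's04']
```

Running this weekly during the fall, rather than waiting for quarter grades, is what makes the monitoring "early" in the sense this slide describes.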
DISCUSSION
In which areas might data dashboards and EWS
have the greatest short-term impact? Long
term?
Do you have examples of ways data is (or is not)
shaping practice currently?
EXAMPLE DASHBOARDS
Spokane Public Schools
Plano Independent School District (TX)
http://pisd.edu/dashboard/index.dashboard.shtml
Plano CCR Measures
Plano Independent School District (TX):
Dashboard Indicators
• Accountability Standards and Distinctions
• State of Texas Assessments of Academic Readiness (STAAR)
• Percent of schools earning a distinction in mathematics
• Percent of schools earning a distinction in reading/ELA
• Percent of schools earning a distinction for top 25% in student growth
• Percent of schools earning all three distinctions above
• Student Achievement and Technology Readiness
• Texas Music Educators Association (TMEA) All-State honorees
• Level of technology access
• Texas recommended distinguished program
Plano Independent School District (TX):
Dashboard Indicators
• College and Career Readiness Indicators
• Percentage of students taking SAT/ACT
• SAT composite score average
• ACT composite score average
• Percentage of students meeting SAT/ACT CCR benchmarks
• Percentage of students taking AP/IB tests
• Advanced courses & dual enrollment
• Percentage of students scoring a 3 on an AP test or a 4 on an IB test
• Graduation rate
• Financial Indicators
• Instructional expenditure
• Teacher turnover
• Instructional % of budget
Arlington Public Schools (VA)
http://www.apsva.us/dashboard
Arlington Public Schools (VA):
Dashboard Indicators
• Student Performance Data: Program/Course Enrollment,
Assessments, and Graduation
• All students
• SOLs
• AP/IB Enrollment
• AP/IB Exam Performance
• SAT/ACT Participation
• SAT Performance
• ACT Performance
• On-time Graduation Rates
• Diploma Types
• Dual Enrollment
• Student subgroups
• Pre-K Enrollment
• Gifted Services Enrollment
• SOLs
• AP/IB Enrollment
• AP/IB Exam Performance
• SAT/ACT Participation
• SAT Performance
• ACT Performance
• On-time Graduation Rates
• Diploma Types
• Dual Enrollment
Arlington Public Schools (VA):
Dashboard Indicators
• Other student and family
experience data
• Student Developmental Assets
• Student Safety
• Family Involvement and
Communication
• Strategic Partnerships
• Culturally Competent Practices
• Positive Student Relationships
• Staff data
• Teacher Qualifications (IPAL)
• Staff Diversity Profile
• Staff Satisfaction
• School-based Positions
• Facilities, Finance, and
Technology Data
• Project Management
• Energy Efficiency
• Fiscal Responsibility
• School-based Positions
• Uptime for Core Services
• Student to Computer Ratios
Fresno Unified SD
Emphasis: Math (Accelerate Achievement)
• Goal link: All students will excel in reading, writing and math
• Performance measures: District CST proficiency; 1st passing rate on CAHSEE; 3rd Grade CST proficiency; 5th Grade CST proficiency; % of 8th Graders enrolled in Algebra I; 8th Grade Algebra proficiency
Emphasis: ELA (Accelerate Achievement)
• Goal link: All students will excel in reading, writing and math
• Performance measures: District CST proficiency; 1st passing rate on CAHSEE; 3rd Grade CST proficiency; 5th Grade CST proficiency; 8th Grade CST proficiency
Emphasis: Social-Emotional (Decrease behaviors that lead to suspension/expulsion)
• Goal link: All students will demonstrate the character and competencies for workplace success
• Performance measures: Student attendance rate; percent responding agree or strongly agree to “I feel like I am a part of this school” (California Healthy Kids Survey); percent responding agree or strongly agree to “At my school there is a teacher or adult who really cares about me” (California Healthy Kids Survey); suspensions per 100 students; expulsions per 100 students
LCAP Evaluation Rubric
• The Data Analysis evaluation rubric is intended to serve as a data dashboard of sorts.
• The data analysis component of the Data Analysis Rubrics will be online and allow for an at-a-glance view of data.
CONSTRUCTING A DATA
DASHBOARD AND EARLY
WARNING SYSTEM
DISCUSSION
What do you like about the examples you just saw?
What don’t you like?
Which elements might you incorporate in your district’s
data dashboard?
DISCUSSION
Think about the process you would use to develop a data
dashboard or to improve an existing one. List key first steps.
Does the district have the technical capacity to create and
administer a data dashboard? If not, how could it be done?
Which measures exist? Which would be important to add first?
Which would you add later?
WRAP-UP
Do you have any final thoughts on data dashboards and
early warning systems?
Data Dashboard Resources
• Alliance for Excellent Education. (2015). Data Dashboards: Accounting for What Matters.
• Conley, D. T. (2015). A New Era for Educational Assessment. Education Policy Analysis Archives, 23(8).
• Conley, D. T., Thier, M., Beach, P., Lench, S. C., & Chadwick, K. L. (2014). Measures for a College and Career Indicator: Multiple Measures. Eugene, OR: Educational Policy Improvement Center.
• Del Razo, J. L., Saunders, M., Renée, M., López, R. M., & Ullucci, K. (2014). Leveraging Time for School Equity: Indicators to Measure More and Better Learning Time. Annenberg Institute for School Reform at Brown University.
Early Warning System Resources
• Allensworth, E. M., & Easton, J. Q. (2007). What Matters for Staying On-Track and Graduating in Chicago Public High Schools. Chicago, IL: Consortium on Chicago School Research.
• Balfanz, R. (2009). Putting Middle Grades Students on the Graduation Path (Policy and Practice Brief). Baltimore: Johns Hopkins University, Everyone Graduates Center.
• Herzog, L., Davis, M., & Legters, N. (2012). Learning What It Takes: An Initial Look at How Schools Are Using Early Warning Indicator Data and Collaborative Response Teams to Keep All Students on Track to Success. Johns Hopkins University, School of Education, Everyone Graduates Center.
• Jobs for the Future. (2014). Early Warning Indicators and Segmentation Analysis: A Technical Guide on Data Studies that Inform Dropout Prevention and Recovery. U.S. Department of Education.
• Kennelly, L., & Monrad, M. (2007). Approaches to Dropout Prevention: Heeding Early Warning Signs with Appropriate Interventions. American Institutes for Research.
To download a copy of this presentation, visit edimagine.com