
College and Career Readiness Assessment: Validation of the Key Cognitive Strategies Framework

Allison R. Lombardi, PhD (University of Connecticut, Storrs, CT, USA), David T. Conley, PhD (University of Oregon, Eugene, OR, USA), Mary A. Seburn, PhD (Educational Policy Improvement Center, Eugene, OR, USA), and Andrew M. Downs, PhD (University of Portland, Portland, OR, USA)

Corresponding author: Allison R. Lombardi, Neag School of Education, University of Connecticut, 249 Glenbrook Road, Unit 2064, Storrs, CT 06269, USA. Email: allison.lombardi@uconn.edu

Assessment for Effective Intervention, published online June 28, 2012. DOI: 10.1177/1534508412448668. © 2012 Hammill Institute on Disabilities. Online version: http://aei.sagepub.com/content/early/2012/06/26/1534508412448668

Abstract

In this study, the authors examined the psychometric properties of the key cognitive strategies (KCS) within the CollegeCareerReady™ School Diagnostic, a self-report measure of critical thinking skills intended for high school students. Using a cross-validation approach, an exploratory factor analysis was conducted with a randomly selected portion of the sample (n = 516) and resulted in five reliable factors: (a) problem formulation, (b) research, (c) interpretation, (d) communication, and (e) precision/accuracy. A confirmatory factor analysis was conducted with the remaining sample (n = 808). Goodness-of-fit indices indicated acceptable model fit. The five-factor solution is consistent with earlier validity studies of the KCS framework. Implications for use by high school personnel in evaluation of instructional programs and as a value-added assessment are discussed.

Keywords

cognition, critical thinking, college readiness, factor analysis, validity

College and career readiness has emerged as a major focal point in educational accountability systems. Most recently, the knowledge and skills associated with college and career readiness have become the underlying goal of the Common Core State Standards (CCSS; National Governor's Association [NGA] & Council of Chief State School Officers [CCSSO], 2010) and a subsequent initiative led by the Race to the Top Assessment Program (U.S. Department of Education, 2010). These policy initiatives were designed not only to address the knowledge and skills students need to be successful in college and careers (NGA & CCSSO, 2010) but also to reduce the 30% to 60% of high school graduates who leave high school underprepared and in need of remedial coursework in higher education (National Center for Education Statistics, 2004).
Remediation needs are significantly higher among aspiring first-generation college students, suggesting that assessing college and career readiness in such students is particularly important (Chen, 2005; Choy, 2001; Venezia, Kirst, & Antonio, 2003). The current, widely accepted indicators of college and career readiness (e.g., grade point average, college admission exam scores) show some evidence of predicting college student grade point average (Camara & Echternacht, 2000; Cimetta, D'Agostino, & Levin, 2010; Coelen & Berger, 2006; McGee, 2003; Noble & Camara, 2003); however, other evidence suggests that these measures are misaligned with the knowledge and skills pertinent to success in college environments (Achieve, Inc., 2007; Brown & Conley, 2007; Brown & Niemi, 2007; Conley, 2003). Given the recent focus on college and career readiness highlighted by the CCSS and the continued demand for remedial higher education courses, it is especially crucial for high school personnel to assess their students on the knowledge and skills that are not measured by grade point average or college admission exams. Adequate assessment of such skills may help educators improve instructional programming so that college and career readiness is emphasized and, in turn, remedial higher education needs are reduced.

College and Career Readiness Definitions

College readiness differs from college eligibility; in addition to satisfying high school graduation requirements, college-ready students are able to succeed in a credit-bearing course at a postsecondary institution and, therefore, do not require any remediation (Conley, 2005, 2007a, 2010). Furthermore, career readiness pertains to the knowledge, skills, and learning strategies necessary to begin studies in a career pathway; it differs from being work ready (meeting basic expectations regarding workplace behavior) and job trained (possessing the specific knowledge necessary to begin an entry-level position) (Conley, 2011b). As such, and consistent with the overall goal of the CCSS, the present study emphasizes college and career readiness as the target for high school graduates, as opposed to college eligibility, work readiness, or job training.

Model

College and career readiness is a multidimensional construct that includes academic preparation and noncognitive factors previously shown to affect college outcomes, including, but not limited to, motivation, engagement, and self-efficacy (Allen, 1999; Gore, 2006; Kuh, 2005; Torres & Solberg, 2001; Zajacova, Lynch, & Espenshade, 2005).
To address the multidimensional nature of college and career readiness, Conley (2010) developed a comprehensive model with four keys: (a) key cognitive strategies (KCS), (b) key content knowledge, (c) key learning skills and techniques, and (d) key transition knowledge and skills (see Note 1). Thus, although other college-readiness models and standards exist (e.g., ACT, Inc., 2010; Tinto, 2007; Wiley, Wyatt, & Camara, 2010), Conley's model is unique in that it is multidimensional, comprehensive, and addresses both cognitive and noncognitive factors. Figure 1 shows the comprehensive model.

Figure 1. The four keys of college and career readiness. Source: Copyright 2011 by the Educational Policy Improvement Center.

KCS comprise internal, metacognitive thinking skills that are perhaps the least observable by educators. Key content knowledge encompasses the effort, attribution, and value put forth by students to understand the academic disciplines, including overarching reading and writing skills, the core academic subject areas (e.g., English/language arts, mathematics, science, and social sciences), and technology (e.g., familiarity with typical software programs, frequency of computer use to complete assignments). Key learning skills and techniques encompass self-monitoring and study skills (Lombardi, Seburn, & Conley, 2011a). Examples include the ability to manage time, take notes, set goals, persevere in the face of obstacles, collaborate, and self-advocate (Bransford, Brown, & Cocking, 2000; Conley, 2007). Key transition knowledge and skills encompass knowledge of college access (e.g., financial aid, college application and admission processes) and the nuances of college academic and social culture. Aspiring first-generation college students rely more heavily on their high schools for college access (Pascarella, Pierson, Wolniak, & Terenzini, 2004). Evidence shows that high school personnel can increase access to college for low-income and traditionally underrepresented students by providing emotional support, access to information, and assistance navigating the college admission process (Gandara & Bial, 2001; McDonough, 2004; Plank & Jordan, 2001; Stanton-Salazar, 2001; Venezia et al., 2003).

The CollegeCareerReady™ School Diagnostic (CCRSD) measures the four model keys. The items were written based on a previous study of more than 4,000 students in 38 high schools that demonstrated exemplary practices in the college and career readiness of aspiring first-generation and underrepresented students (Conley, 2010; Conley, McGaughy, Kirtner, Van Der Valk, & Martinez-Wenzl, 2010). These practices were coded, categorized, and operationalized into the four keys shown in Figure 1 (for a full study description, see Conley et al., 2010). Versions are available for students, teachers, administrators, and counselors to allow for schoolwide assessment of college and career readiness programs, practices, and instruction. All versions are self-report measures. Although the four keys are equally important in assessing college and career readiness, we focus on the KCS for the purpose of the present study.

The KCS

The KCS are a series of metacognitive strategies derived from the literature on cognition pertaining to college students (e.g., Boekaerts, 1999; Pintrich, 2004; Wolters, 1998) and linked to key attributes of college and career readiness (Conley, 2007).
Specifically, the KCS include the abilities to make inferences, interpret results, analyze conflicting source documents, support arguments with evidence, reach conclusions, communicate explanations based on synthesized sources, and think critically about what one is taught (Conley, 2003, 2005, 2007, 2010; National Research Council, 2002). Similarly, the CCSS specify that students should be able to hypothesize and strategize solutions to problems before beginning an assignment, search and organize information to make a case for a solution, consider varying opinions on the topic, compile and communicate their solution, and review their own work for precision and accuracy (Conley, 2011a; NGA & CCSSO, 2010). Based on these identified behaviors and skills, the KCS were defined as five sequential constructs: (a) problem formulation, (b) research, (c) interpretation, (d) communication, and (e) precision/accuracy. Together, they represent the thinking skills, or habits of mind, of successful college students (Conley, 2007; Costa & Kallick, 2000; Ritchhart, 2002), as well as the skills that college instructors expect students to have mastered on entrance to college across academic disciplines (Conley, 2003). Table 1 shows detailed definitions of the five KCS.

The KCS were developed from three theoretical frames: (a) a dispositional theory of intelligence, (b) cognitive learning theory, and (c) competency theory. A dispositional view is rooted in the belief that intelligence is malleable and, through increasing effort, can grow incrementally (Bransford et al., 2000; Costa & Kallick, 2000). The second conceptual frame derives from cognitive learning theory, which holds that people construct new knowledge based on what they already know and believe, and that retention is heightened by meaningful learning experiences (Perkins, 1992). Competency theory provides the final element of the conceptual frame. Guided by the expert–novice literature (Baxter & Glaser, 1997), this theory suggests that novices (students) benefit from models of how experts approach problem solving, especially if they receive coaching in using similar models (Bransford et al., 2000). Within competency theory research, developmental models of learning note the typical progression as a learner advances from novice to competent to expert, and describe the types of experiences that lead to change (Boston, 2003). These three theoretical frames underpin the five-part KCS model.

Table 1. The Five Key Cognitive Strategies and Operational Definitions

Problem formulation: The student demonstrates clarity about the nature of the problem and identifies potential outcomes. The student develops strategies for exploring all components of the problem. The student may revisit and revise the problem statement as a result of thinking about potential methods to solve the problem.

Research: The student explores a full range of available resources and collection techniques, or generates original data. The student makes judgments about the sources of information or quality of the data, and determines the usefulness of the information or data collected. The student may revisit and revise information collection methods as greater understanding of the problem is achieved throughout this process.

Interpretation: The student identifies and considers the most relevant information or findings, and develops insights. To make connections and draw conclusions, the student uses structures and strategies, which contribute to the framework for communicating a solution. The student reflects on the quality of the conclusions drawn and may revisit and revise previous steps in the process.

Communication: The student organizes information and insights into a structured line of reasoning and constructs a coherent and complete final version through a process that includes drafting, incorporating feedback, reflecting, and revising.

Precision/accuracy: The student is appropriately precise and accurate at all stages of the process by determining and using language, terms, expressions, rules, terminology, and conventions appropriate to the subject area and problem.

Source: Adapted from Conley (2007a). Copyright 2007 by the Educational Policy Improvement Center.

Measuring the KCS

Conley (2003) found that a nationwide sample of college faculty, regardless of selectivity of institution and across multiple disciplines, reached near universal agreement that most students arrive unprepared for the intellectual demands and expectations of postsecondary environments. Other researchers have analyzed high school transcripts and found that rigorous academic preparation, as represented by the titles of high school courses taken, is the most significant predictor of persistence to college graduation (Adelman, 1999; Bedsworth, Colby, & Doctor, 2006). A different approach is to analyze the content of college courses and then determine what should occur in high school courses to align with what students will encounter in college. This backward-mapping strategy informed the initial iteration of the KCS framework (Conley, 2003).
Purpose of the Present Study

The purpose of this study was to examine the reliability and internal validity of the KCS within the CCRSD, a self-report instrument intended to measure the degree to which schools provide college and career readiness opportunities for their students. To do this, we examined (a) the internal consistency of the measure and (b) the extent to which data from the measure fit the proposed KCS model. These study objectives informed conclusions on whether the KCS could be validated as a self-report measure within a larger measure of college and career readiness. The larger measure, the CCRSD, is a tool intended to help school personnel evaluate their instructional programs, ensure consistency with the criteria of the CCSS, and ultimately provide more postgraduation opportunities to the youth they serve. Because cognitive thinking skills are not readily observable in students, a self-report instrument may be a useful tool for educators in determining instructional supports and refinements that emphasize these skills. In particular, we were interested in validating the instrument for aspiring first-generation college students, a population that has shown a disproportionate need for remedial higher education (Chen, 2005; Venezia et al., 2003).

Method
Sample

Participants were students (N = 1,324) across 10 high schools in Illinois, Indiana, Michigan, and Wyoming that had agreed to pilot test the CCRSD in fall 2010. A purposive sample of high schools was selected because they had high enrollment rates of aspiring first-generation college students and the schools reported they were implementing college and career readiness programs. For the most part, the students were evenly distributed across grades: 27% were in 9th grade, 24% were in 10th grade, 26% were in 11th grade, and 23% were in 12th grade. Of the students, 53% reported neither parent had a college degree, 27% reported one parent had a college degree, and 20% reported both parents had college degrees. Race/ethnicity of the students was as follows: African American (48%), White (22%), Hispanic/Latino (20%), mixed race (6%), Asian American (<1%), American Indian/Alaskan Native (<1%), and unknown (2%). There were slightly more female (54%) than male (46%) students. A majority of the students qualified for free/reduced meal service (66%). Approximately 15% were students with disabilities with an individualized education program, and 10% were classified as English learners.

School personnel selected student participants so that there were approximately 100 students per grade. Selected students were offered the opportunity to take the CCRSD during a designated 50-min class period. School personnel were advised to select student participants from core academic courses (e.g., English/language arts, mathematics, science, social studies). Of the participating classes, 72% were core academic courses: English/language arts (25%), mathematics (18%), natural sciences (16%), and social sciences (13%). The remaining 28% of courses were "other," which included career/technical, arts, foreign languages, physical education, and health. In all schools, the resulting participant sample was compared with the overall school population, and no significant demographic-based differences were found.

Measure

The KCS dimension of the CCRSD contains 64 items with response options ranging from 1 = not at all like me to 5 = very much like me, with a "don't know" (DK) option. Students are asked, "Please indicate how much each statement describes you" and rate the items accordingly. Because the items are based on exemplary practices (see Conley et al., 2010), the intent is for students to self-rate their own behaviors using exemplary behaviors as a reference point. If students do not believe certain items describe their behaviors, or indicate they do not know, they are less aware of successful college and career readiness practices and behaviors. Items were written for five subscales that represent the constructs: (a) Problem Formulation, (b) Research, (c) Interpretation, (d) Communication, and (e) Precision/Accuracy. We hypothesized that the cognitive thinking skills associated with college and career readiness comprised these five subscales. Before administration, the items were pilot tested for readability on two samples during 2009 and 2010. At that time, participants were solicited for qualitative feedback on items as they responded to the survey. These data were analyzed and used to inform item revision.

Procedures

The CCRSD was administered online. Participants completed an online consent form prior to the start of the survey. If participants responded "no" to the consent form, they were unable to proceed with the survey and were redirected to the end page. Student participation was voluntary, and students received no compensation. School personnel received no compensation and were provided with aggregated data reports, which they could access and interact with online.

Analytic Approach

To meet our study objectives, we examined the psychometric properties of the instrument. For validity, we took a cross-validation approach by randomly splitting the sample so that student responses were subject to exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). For the EFA, 40% of the sample was randomly selected, and the remaining 60% of responses were saved for the CFA. The reason for splitting the sample this way (as opposed to 50% for each method) was to ensure the CFA sample was large enough to meet more stringent sample size requirements (Kline, 1998). The CFA was used to determine whether the factor structure obtained in the EFA could be confirmed on student responses from the remainder of the sample. Structural equation modeling methods (Kline, 1998) were used to estimate the CFA models. In addition to the EFA and CFA, we examined internal consistency (Cronbach's α) of the scores for the full instrument and within factors, and determined a priori that values of .70 or greater indicated acceptable reliability, whereas values of .80 or greater were preferable (Nunnally, 1975). Two types of software were used for analyses: PASW 18.0 (SPSS, Inc., 2010) and Mplus 6.1 (Muthen & Muthen, 2010).
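As a rough illustration of this analytic sequence, the sketch below shows the random 40/60 split and the a priori reliability check in Python. The reported analyses were run in PASW 18.0 and Mplus 6.1; the DataFrame `items`, its column names, and the simulated responses here are hypothetical stand-ins, not the study data.

```python
# Minimal sketch of the cross-validation split and reliability criterion,
# assuming item responses sit in a pandas DataFrame (one column per item).
import numpy as np
import pandas as pd

def cronbach_alpha(df: pd.DataFrame) -> float:
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of totals)."""
    k = df.shape[1]
    return (k / (k - 1)) * (1 - df.var(ddof=1).sum() / df.sum(axis=1).var(ddof=1))

# Simulated 0-5 responses from N = 1,324 students to 64 KCS items (hypothetical).
rng = np.random.default_rng(0)
items = pd.DataFrame(rng.integers(0, 6, size=(1324, 64)),
                     columns=[f"kcs_{i + 1:02d}" for i in range(64)])

# Random 40/60 split: 40% of respondents for the EFA, 60% held out for the CFA.
efa_sample = items.sample(frac=0.40, random_state=2010)
cfa_sample = items.drop(efa_sample.index)

alpha = cronbach_alpha(items)
print(f"EFA n = {len(efa_sample)}, CFA n = {len(cfa_sample)}")
print(f"alpha = {alpha:.2f} (a priori: >= .70 acceptable, >= .80 preferable)")
```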
Results

We used only completed surveys for all data analyses; therefore, no missing data treatment was necessary. However, there was a DK response option for all items. We coded these responses with "0" values, yielding a 6-point scale. This decision was based on a previous study of the key learning skills and techniques dimension of the CCRSD in which DK responses were treated with three different methods: (a) listwise deletion, (b) imputation with the expectation/maximization (E/M) algorithm, and (c) coding DK responses as 0, the lowest level on the scale (for a full study description, see Lombardi, Seburn, & Conley, 2011b). Findings showed that casewise deletion could distort results and that important group differences could go unrecognized. The researchers concluded that it was appropriate to categorize these responses at the lowest level on the scale because this lack of knowledge indicates that respondents are least aware of identified successful behaviors associated with college readiness. Although the present study focuses on the KCS, the rationale for DK responses indicating a lack of awareness applies equally to the key learning skills and techniques. As such, we determined it appropriate to code the DK values as 0, yielding a 6-point scale.

EFA

We conducted an EFA with 40% of the responses (n = 516) using maximum likelihood estimation and Geomin rotation in Mplus 6.1 (Muthen & Muthen, 2010). To determine the number of factors to retain, the following criteria were considered: (a) absolute and relative eigenvalues greater than 1, (b) examination of the scree plots, (c) proportion of variance accounted for by each factor, (d) interpretability of the rotated solution as compared with the KCS theoretical model, (e) a minimum of three items loading on each factor, and (f) the simple structure of factor loadings. Using these criteria, we examined one- through seven-factor models and determined that a five-factor solution was optimal. In this solution, a total of nine items were removed because of (a) cross-loadings of .35 or greater on two or more factors or (b) weak loadings across factors (no loadings of .35 or greater). The remaining items grouped into the five-factor solution consistent with the KCS model. Ultimately, we relied heavily on the interpretability criterion (d) in determining the optimal factor solution because there was a large break in the eigenvalues, scree plots, and variance accounted for after the first factor. The variance accounted for by individual factors 1 through 5 was 38%, 5%, 3%, 3%, and 2%, respectively.

Our first study objective was to examine the internal consistency of the KCS. Table 2 shows the descriptive statistics and α coefficient values for the full measure and by subscale. These subscales are based on the factors that emerged from the EFA, in which nine items were removed from the original version of the instrument.

Table 2. Descriptive Statistics and Reliability by Subscale and Dimension

Subscale                 | Item n | M    | SD   | α
Problem formulation      | 12     | 3.74 | 0.83 | .88
Research                 | 10     | 3.71 | 0.88 | .88
Interpretation           | 10     | 3.56 | 0.89 | .88
Communication            |  9     | 3.66 | 0.97 | .90
Precision/accuracy       | 14     | 3.81 | 0.89 | .93
Key cognitive strategies | 55     | 3.70 | 0.76 | .96
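A sketch of the DK recoding and the EFA item-screening rules follows, assuming the third-party factor_analyzer package (the study itself used Mplus). `efa_sample` is the 40% split from the earlier sketch, and the geomin rotation here mirrors, but does not reproduce, the Mplus settings.

```python
# Illustrative EFA step with factor_analyzer: ML extraction, oblique geomin
# rotation, and the item-removal rules described in the text.
import pandas as pd
from factor_analyzer import FactorAnalyzer, Rotator

def recode_dont_know(raw: pd.DataFrame) -> pd.DataFrame:
    """Recode 'don't know' (DK) responses to 0, the lowest level on the scale."""
    return raw.replace("DK", 0).astype(float)

efa_data = recode_dont_know(efa_sample)

# Maximum likelihood extraction (unrotated), then an oblique geomin rotation.
fa = FactorAnalyzer(n_factors=5, method="ml", rotation=None)
fa.fit(efa_data)
eigenvalues, _ = fa.get_eigenvalues()   # retention criterion (a): eigenvalues > 1
loadings = pd.DataFrame(Rotator(method="geomin_obl").fit_transform(fa.loadings_),
                        index=efa_data.columns)

# Removal rules: cross-loadings of .35 or greater on two or more factors,
# or no loading of .35 or greater on any factor.
strong = (loadings.abs() >= .35).sum(axis=1)
flagged = loadings.index[(strong >= 2) | (strong == 0)]
print(f"{len(flagged)} items flagged for removal")
```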
CFA

To test whether the five-factor solution obtained in the EFA could be replicated, we conducted a cross-validation study in which a randomly selected 60% of the responses (n = 808) were subject to a CFA using maximum likelihood estimation. Each item was associated with one of the five first-order latent variables that emerged in the EFA (problem formulation, research, interpretation, communication, and precision/accuracy) via a single path, and each first-order latent variable was associated with the second-order construct (KCS). We set the first measurement path for each latent variable to 1.0 so that a scale could be established for the remaining variables.

Model fit was evaluated using the minimum fit function χ2, the χ2/df ratio, and four goodness-of-fit indices: the root mean square error of approximation (RMSEA), the standardized root mean square residual (SRMR), the comparative fit index (CFI), and the Tucker-Lewis index (TLI). We determined a priori that a χ2/df ratio of less than 5 (MacCallum, Browne, & Sugawara, 1996) and RMSEA < .06, SRMR < .08, and CFI/TLI > .90 (Hu & Bentler, 1995) indicate good model fit.

The obtained χ2 value for the model was χ2(1425) = 4,926.38, p < .001, indicating a statistically significant difference between the five-factor model and the data. However, χ2 values are potentially inflated by large sample sizes, and the χ2/df ratio was 3.45, indicating acceptable model fit. The obtained values for the RMSEA and SRMR were .05 and .04, respectively, both indicative of good model fit. However, the obtained CFI value was .84 and the TLI was .83, neither of which met the criteria for good model fit according to Hu and Bentler (1995). These values fall within the acceptable range according to more liberal criteria (Browne & Cudeck, 1993), under which values between .80 and .90 are considered acceptable. Given the combination of the other fit indices, the model appears to fit the data acceptably. Figure 2 shows the path diagram for the five-factor solution with standardized parameter estimates.

Figure 2. Path diagram for key cognitive strategies. ***p < .001.

The standardized path coefficients from the higher order factor to each of the lower order factors ranged from .90 to .96. All standardized path coefficients were statistically significant at the p ≤ .05 level. The standardized parameter estimates from each of the latent variables to their respective indicators ranged from .39 to .71. All are positive and statistically significantly different from zero, indicating that each item is positively related to its latent construct. Within each latent construct, the standardized parameter estimates ranged as follows: problem formulation (12 items), .39 to .65; research (10 items), .60 to .71; interpretation (10 items), .55 to .70; communication (9 items), .57 to .70; and precision/accuracy (14 items), .55 to .71.
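Two of the reported indices can be checked directly from the χ2 statistic. The sketch below is a minimal illustration, not the authors' procedure: it recomputes the χ2/df ratio and RMSEA from the reported values, while CFI and TLI also require the baseline (independence) model χ2, which the article does not report, so those are left as optional arguments.

```python
# Minimal sketch: fit indices recomputed from the reported chi-square.
from math import sqrt

def fit_indices(chi2, df, n, chi2_base=None, df_base=None):
    indices = {
        "chi2/df": chi2 / df,                               # < 5 acceptable
        "RMSEA": sqrt(max(chi2 - df, 0) / (df * (n - 1))),  # < .06 good
    }
    if chi2_base is not None:
        d = max(chi2 - df, 0)
        d_base = max(chi2_base - df_base, 0)
        indices["CFI"] = 1 - d / max(d, d_base) if max(d, d_base) > 0 else 1.0
        indices["TLI"] = (chi2_base / df_base - chi2 / df) / (chi2_base / df_base - 1)
    return indices

# Reported CFA result: chi2(1425) = 4,926.38 with n = 808.
print(fit_indices(chi2=4926.38, df=1425, n=808))
# -> chi2/df ~ 3.46 and RMSEA ~ .055, consistent with the values in the text.
```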
Discussion

The purpose of this study was to examine the psychometric properties of the KCS dimension within a larger measure, the CCRSD. Findings show that the KCS dimension has preferable reliability and promising validity evidence. Reliability within factors showed α coefficients ranging from .88 to .93, and the full-scale value was .96, all of which met our criteria for preferable reliability (Nunnally, 1975). Results of the cross-validation study show evidence of structural validity, in which the five-factor solution that emerged from the EFA was confirmed in the CFA with overall acceptable model fit.

The EFA results showed that the first factor accounted for a large amount of variance (38%), and the descriptive statistics (as shown in Table 2) show somewhat limited variability in the KCS. In particular, student mean responses on the five subscales fell between somewhat like me and a lot like me. Potentially, responses were less variable because students were unaccustomed to self-rating cognitive thinking skills. Together, these findings may attest to the difficulty of precisely measuring cognitive thinking skills and suggest the need to further disentangle the KCS constructs. Future studies should focus on the development of more precise operational definitions of the five KCS and their subsequent items, as well as on explicating the relative importance of each of the five KCS as contributors to college and career readiness.

The KCS are also the basis for the College-Readiness Performance Assessment System (C-PAS), a formative, low-stakes performance assessment (Conley, Lombardi, Seburn, & McGaughy, 2009). C-PAS is designed to enable teachers to monitor the acquisition of the KCS through rich, content-specific performance tasks embedded in the English/language arts and mathematics curriculum spanning Grades 6 through 12. Postsecondary preparedness is the reference point for this criterion-based measurement system. Tasks vary in content area but are all scored with a common scoring guide by teachers and external reviewers, enabling rater reliability to be further examined. Previous studies show promising internal and external validity evidence for C-PAS (Baldwin, Seburn, & Conley, 2011; Conley et al., 2009). Unlike C-PAS, the CCRSD is a self-report measure. Although prior validity evidence has been established for the KCS framework on C-PAS, the purpose of the present study was to examine the psychometric properties of the KCS framework as a self-report measure.
These study findings are consistent with previous studies in regard to the five-part KCS model, which indicates that problem formulation, research, interpretation, communication, and precision/accuracy comprise the cognitive thinking skills associated with college and career readiness (Conley, 2003; Conley et al., 2009). Thus, the KCS dimension of the CCRSD may be a useful tool for school personnel to evaluate their instructional programs for college and career readiness opportunities. In particular, school personnel serving high numbers of aspiring first-generation students may assess students on the KCS to better understand how these cognitive thinking skills and strategies could be integrated in the classroom. Prior evidence shows these students are more dependent on educators for college and career preparation (Pascarella et al., 2004) and that programs targeted toward college access positively affect them (Gandara & Bial, 2001; McDonough, 2004; Plank & Jordan, 2001; Stanton-Salazar, 2001; Venezia et al., 2003). Assessing students on the KCS is the first step to integrating these skills into instruction. Potentially, if the KCS are integrated into instruction, remedial higher education needs may decrease.

Limitations

Although the present study shows promising validity evidence, there are several limitations to consider. Of primary concern is our sample, the majority (68%) of which comprised African American (48%) and Hispanic/Latino (20%) students. White students comprised 22%, and students of other races comprised the remaining 10% of the sample, suggesting an underrepresentation of Asian American, Pacific Islander, and American Indian students. Aspiring first-generation college students were of particular interest in this study, and more than half of the sample (53%) comprised this population. Because of these sample characteristics, the extent to which our findings generalize across high schools is somewhat limited. Moreover, there is a potential for respondent bias because this is a self-report instrument. Future research is needed to establish the predictive validity of the KCS dimension to determine whether students who exhibit high awareness and understanding of the KCS also have high achievement.

Implications for Practice

In light of the importance of college and career readiness as specified by the CCSS and the Race to the Top Assessment Program, it is increasingly crucial to measure the knowledge and skills associated with postsecondary success. School personnel (administrators, teachers, counselors, and other student support personnel) may assess their students with the KCS dimension to better understand how they can adjust instruction and programming within their classrooms and schools to encourage and teach the KCS. In addition to the student survey, teacher, counselor, and administrator versions are available so that student scores may be compared with those of school personnel to gain a greater sense of the perceptions of, and discrepancies in, college-readiness instruction and programs. Within the larger CCRSD online system, these instruments are tied to a resource database with actionable steps. The system is longitudinal, allowing students and school personnel to track their responses over time, monitor progress, and adjust instruction accordingly.

There is potential for the CCRSD to be used as a value-added assessment. With versions available for students, teachers, and other school personnel, and with the possibility of longitudinal tracking, school personnel can get a better sense of the value their programs add to student learning and achievement. In addition, use of the CCRSD coupled with a performance assessment (such as C-PAS) may allow for a comparison of student self-ratings and teacher scores on the KCS. This system is not meant to replace current and well-known academic performance measures; the KCS are meant to add meaning and clarity by integrating the instruction of thinking skills alongside the content that is taught and measured in high school courses. The CCRSD online system allows all participants to use a data-driven decision framework to better understand how they can optimally spend their high school years in preparation for the future.

Declaration of Conflicting Interests

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The authors received no financial support for the research, authorship, and/or publication of this article.

Note

1. The college-readiness model is described by Conley (2010, p. 31). Copyright 2010 by D. T. Conley. The model dimensions described in the book have been relabeled as model keys, and two keys have been renamed: academic behaviors are now key learning skills and techniques, and contextual awareness and skills are now key transition knowledge and skills.
References

Achieve, Inc. (2007). Aligned expectations? A closer look at college admissions and placement tests. Washington, DC: Author.
ACT, Inc. (2010). College readiness standards for EXPLORE, PLAN, and ACT. Iowa City, IA: Author.
Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns, and bachelor's degree attainment. Washington, DC: U.S. Department of Education.
Allen, D. (1999). Desire to finish college: An empirical link between motivation and persistence. Research in Higher Education, 40, 461–485.
Baldwin, M., Seburn, M., & Conley, D. T. (2011). External validity of the College-Readiness Performance Assessment System (C-PAS). Paper presented at the 2011 annual conference of the American Educational Research Association, New Orleans, LA.
Baxter, G. P., & Glaser, R. (1997). An approach to analyzing the cognitive complexity of science performance assessments. Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing and Center for the Study of Evaluation.
Bedsworth, W., Colby, S., & Doctor, J. (2006). Reclaiming the American dream. Boston, MA: Bridgespan.
Boekaerts, M. (1999). Self-regulated learning: Where we are today. International Journal of Educational Research, 31, 445–457.
Boston, C. (2003). Cognitive science and assessment. College Park, MD: Office of Educational Research and Improvement.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy of Sciences, U.S. Department of Education.
Brown, R. S., & Conley, D. T. (2007). Comparing state high school assessments to standards for success in entry-level university courses. Educational Assessment, 12, 137–160.
Brown, R. S., & Niemi, D. N. (2007). Investigating the alignment of high school and community college assessments in California (National Center Report No. 07-3). The National Center for Public Policy and Higher Education.
Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K. A. Bollen & J. S. Long (Eds.), Testing structural equation models (pp. 136–162). Newbury Park, CA: SAGE.
Camara, W. J., & Echternacht, G. (2000). The SAT and high school grades: Utility in predicting success in college (Report No. CB-RN-10). New York, NY: College Board, Office of Research and Development.
Chen, X. (2005). First generation students in postsecondary education: A look at their college transcripts (NCES 2005-171). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.
Choy, S. P. (2001). Students whose parents did not go to college: Postsecondary access, persistence, and attainment (NCES 2001-126). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.
Cimetta, A. D., D'Agostino, J. J., & Levin, J. R. (2010). Can high school achievement tests serve to select college students? Educational Measurement: Issues and Practice, 29, 3–12.
Coelen, S. P., & Berger, J. B. (2006). First steps: An evaluation of the success of Connecticut students beyond high school. Quincy, MA: Nellie Mae Education Foundation.
Conley, D. T. (2003). Understanding university success. Eugene, OR: Center for Educational Policy Research, University of Oregon.
Conley, D. T. (2005). College knowledge: What it really takes for students to succeed and what we can do to get them ready. San Francisco, CA: Jossey-Bass.
Conley, D. T. (2007a). Redefining college readiness (pp. 8–9). Eugene, OR: Center for Educational Policy Research.
Conley, D. T. (2007b). Toward a comprehensive conception of college readiness. Eugene, OR: Educational Policy Improvement Center.
Conley, D. T. (2010). College and career ready: Helping all students succeed beyond high school. San Francisco, CA: Jossey-Bass.
Conley, D. T. (2011a). Building on the common core. Educational Leadership, 68, 16–20.
Conley, D. T. (2011b). Pathways to postsecondary and career readiness. Invited address at the College and Career Readiness Regional Workshop, Wellington, NZ.
Conley, D. T., Lombardi, A., Seburn, M., & McGaughy, C. (2009). Formative assessment for college readiness on five key cognitive strategies associated with postsecondary success. Paper presented at the 2009 annual conference of the American Educational Research Association, San Diego, CA.
Conley, D. T., McGaughy, C., Kirtner, J., Van Der Valk, A., & Martinez-Wenzl, M. T. (2010). College readiness practices at 38 high schools and the development of the CollegeCareerReady School Diagnostic tool. Paper presented at the 2010 annual conference of the American Educational Research Association, Denver, CO.
Costa, A., & Kallick, B. (2000). Discovering & exploring habits of mind. A developmental series (Book 1). Alexandria, VA: Association for Supervision and Curriculum Development.
Gandara, P., & Bial, D. (2001). Paving the way to higher education: K-12 intervention programs for underrepresented youth. Washington, DC: U.S. Government Printing Office, U.S. Department of Education, National Center for Education Statistics.
Gore, P. (2006). Academic self-efficacy as a predictor of college outcomes: Two incremental validity studies. Journal of Career Assessment, 14, 92–115.
Hu, L. T., & Bentler, P. M. (1995). Evaluating model fit. In R. H. Hoyle (Ed.), Structural equation modeling: Concepts, issues, and applications (pp. 76–99). Thousand Oaks, CA: SAGE.
Kline, R. (1998). Principles and practice of structural equation modeling. New York, NY: Guilford.
Kuh, G. D. (2005). Student engagement in the first year of college. In L. M. Upcraft, J. N. Gardner, & B. O. Barefoot (Eds.), Challenging and supporting the first-year student (pp. 86–107). San Francisco, CA: Jossey-Bass.
Lombardi, A. R., Seburn, M., & Conley, D. T. (2011a). Development and initial validation of a measure of academic behaviors associated with college and career readiness. Journal of Career Assessment, 19, 375–391.
Lombardi, A. R., Seburn, M., & Conley, D. T. (2011b). Treatment of nonresponse items on scale validation: What "don't know" responses indicate about college readiness. Paper presented at the 2011 annual conference of the American Educational Research Association, New Orleans, LA.
MacCallum, R. C., Browne, M. W., & Sugawara, H. M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1, 130–149.
McDonough, P. M. (2004). Counseling matters: Knowledge, assistance, and organizational commitment to college preparation. In W. G. Tierney (Ed.), Nine propositions relating to the effectiveness of college preparation programs (pp. 69–88). New York, NY: State University of New York Press.
McGee, D. (2003). The relationship between WASL scores and performance in the first year of university. Seattle: Office of Educational Assessment, University of Washington.
Muthen, L. K., & Muthen, B. O. (2010). Mplus version 6.0 user's guide. Los Angeles, CA: Muthen & Muthen.
National Center for Education Statistics. (2004). The condition of education 2004: Remediation and degree completion. Washington, DC: U.S. Department of Education.
National Governor's Association & Council of Chief State School Officers. (2010). Common Core State Standards initiative. Retrieved from http://www.corestandards.org/
National Research Council. (2002). Learning and understanding: Improving advanced study of mathematics and science in U.S. high schools. Washington, DC: National Academy Press.
Noble, J. P., & Camara, W. J. (2003). Issues in college admissions testing. In J. E. Wall & G. R. Walz (Eds.), Measuring up: Assessment issues for teachers, counselors, and administrators (pp. 283–296). Greensboro, NC: ERIC Counseling and Student Services Clearinghouse.
Nunnally, J. C. (1975). Psychometric theory: 25 years ago and now. Educational Researcher, 4, 7–21.
Pascarella, E. T., Pierson, C. T., Wolniak, G. C., & Terenzini, P. T. (2004). First-generation college students: Additional evidence on college experiences and outcomes. Journal of Higher Education, 75, 249–284.
Perkins, D. (1992). Smart schools: Better thinking and learning for every child. New York, NY: Free Press.
Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review, 16, 385–407.
Plank, S. B., & Jordan, W. J. (2001). Effects of information, guidance, and actions on postsecondary destinations: A study of talent loss. American Educational Research Journal, 38, 947–979.
Ritchhart, R. (2002). Intellectual character: What it is, why it matters, and how to get it. San Francisco, CA: Jossey-Bass.
SPSS, Inc. (2010). PASW 18.0 for Windows. Chicago, IL: IBM.
Stanton-Salazar, R. D. (2001). Manufacturing hope and despair: The school and kin support networks of U.S.-Mexican youth. New York, NY: Teachers College Press.
Tinto, V. (2007). Research and practice of student retention: What's next? Journal of College Student Retention, 8, 1–19.
Torres, J. B., & Solberg, V. S. (2001). Role of self-efficacy, stress, social integration, and family support in Latino college student persistence and health. Journal of Vocational Behavior, 59, 3–63.
U.S. Department of Education. (2010). Race to the Top Assessment Program. Retrieved from http://www2.ed.gov/programs/racetothetop-assessment/index.html
Venezia, A., Kirst, M. W., & Antonio, A. L. (2003). Betraying the college dream: How disconnected K-12 and postsecondary education systems undermine student aspirations. Stanford, CA: Stanford Institute for Higher Education Research.
Wiley, A., Wyatt, J., & Camara, W. J. (2010). The development of a multidimensional college readiness index. New York, NY: College Board.
Wolters, C. A. (1998). Self-regulated learning and college students' regulation of motivation. Journal of Educational Psychology, 90, 224–235.
Zajacova, A., Lynch, S. M., & Espenshade, T. J. (2005). Self-efficacy, stress, and academic success in college. Research in Higher Education, 46, 677–706.