



EBP Briefs: A scholarly forum for guiding evidence-based practices in speech-language pathology

Volume 3, Issue 1, April 2008

Making Informed Decisions about Literacy Intervention in Schools: An Adolescent Literacy Example

Barbara J. Ehren, Ed.D., CCC-SLP
Professor and Director of the Doctoral Program
Department of Communication Sciences and Disorders
University of Central Florida, Orlando, FL

A publication of Pearson. ISSN 1941-7756

Editor: Chad Nye, University of Central Florida

Editorial Review Board: Frank Bender, Private Practice; Bertha Clark, Middle Tennessee State University; Gayle Daly, Longwood University; Donna Geffner, St. John's University; Joan Kaderavek, University of Toledo; Cheryl Lang, Detroit Public Schools; Anita McGinty, University of Virginia; Judy Montgomery, Chapman University; Barbara Moore-Brown, Anaheim Union High School District; Jamie Schwartz, University of Central Florida; Sheila Ward, Detroit Public Schools

Managing Director: Tina Eichstadt, Pearson, 5601 Green Valley Drive, Bloomington, MN 55437

Cite this document as: Ehren, Barbara J. (2008). Making informed decisions about literacy intervention in schools: An adolescent literacy example. EBP Briefs, 3(1), 1-11.

The role of the speech-language pathologist (SLP) in literacy has received considerable attention since the publication of the related American Speech-Language-Hearing Association (ASHA) guidelines in 2001 (e.g., Ehren, 2002; Ehren, 2006; Ehren & Ehren, 2001; Justice, Invernizzi, & Meier, 2002; Nelson & Van Meter, 2006; Roth & Ehren, 2001; Roth & Troia, 2006; Silliman & Wilkinson, 2004; Ukrainetz, 2006; Wallach, 2008).
However, SLPs in schools continue to struggle with issues related to the delivery of literacy-related services, including the type of intervention they should provide. With the concurrent movement toward evidence-based practice (EBP) in the field of speech-language pathology, an additional challenge for school SLPs is to integrate EBP into their work with literacy. This task involves adopting EBP as a decision-making process to guide the selection and evaluation of assessment and intervention approaches.

Problems with Implementing EBP

For many school SLPs, EBP is not standard operating procedure. Why is that? Certainly it is not because they refuse to base their practice on sound methods. However, there are several possible reasons: (1) EBP may look and feel to many practitioners like an academic exercise that should not be a priority for on-the-ground SLPs with everything else they have to do. Part of the problem may be related to the complexity of some of the steps in the EBP process. Several iterations of the evidence-based decision-making process have been proposed (e.g., ASHA, n.d.; Ehren, Fey & Gillam, 2005; Gillam & Gillam, 2006; Johnson, 2006; Nye, Schwartz & Turner, 2005), all of them requiring a literature search and an appraisal of the levels and quality of the evidence. Many school SLPs would not consider such activities part of their workload. The problem also may relate to the perceived dichotomy between research and practice, with scientific evidence the domain of academicians in universities and implementation methods the domain of practitioners in schools. This is unfortunate because, if given the chance, EBP can forge research-to-practice links. (2) Even if SLPs are inclined to locate and evaluate research, time to do so becomes an issue, especially for SLPs with a heavy workload. Gathering and appraising the research base is the most time-consuming part of the EBP decision-making process.
SLPs with many students to serve might wonder how they have time for these steps, running from therapy sessions with students to IEP meetings scheduled during the lunchtime they had to give up on Monday. However, Dollaghan's (2004a) suggestion to focus on Internet access to high-yield sources and sites may help with that roadblock, as will the growing number of systematic reviews of intervention research, discussed later in this brief. (3) Perhaps terminology is getting in the way. The No Child Left Behind Act of 2001 (NCLB) and the Individuals with Disabilities Education Act of 2004 (IDEA 04) require that programs, methods, and materials rely on "scientifically based research," defined as involving "the application of rigorous, systematic, and objective procedures to obtain reliable and valid knowledge…" (20 U.S.C. 6365, Sec. 1208[6]). Educators for the most part do not use the term "evidence-based practice," a term borrowed from the medical community (Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996). Early on, Whitehurst (2002) talked about "evidence-based education" (EBE), but this term is not heard often in schools. Selecting practices rooted in scientifically based research is often loosely interpreted and may not be congruent with the rigor associated with EBP as a decision-making process. For example, the principal of a school might have purchased a program claiming to have a scientific research base. That could mean a number of things, from (a) the authors read some studies in the area and developed their program based on their interpretation of research findings to (b) the authors have tested their program with large numbers of students and have data to support its effectiveness. The school staff may not have engaged in a decision-making process regarding the use of this particular program with their students.
An SLP working at such a school will come away with a very different idea of EBE/EBP than what has been discussed in the speech-language pathology literature. These problems cannot be trivialized with an admonition to school-based SLPs that to be professional, they must employ EBP. Rather, EBP must be interpreted in the context of life in the schools, consistent with the requirements and intent of federal mandates; moreover, it must be framed as a practical, doable companion to managing a workload, not an extra set of tasks on top of the SLPs' current workload. Presenting EBP to school-based SLPs in this light is the purpose of this brief.

Rationale for Employing EBP

If David Letterman were an SLP working in the schools, he might offer the following "Top 10 Reasons for SLPs to Use EBP" (Ehren et al., 2005):

10. "Evidence-based practice" are new buzzwords and SLPs just like to say them.
9. You need more three-letter acronyms to add to your meager jargon repertoire.
8. You keep hearing about EBP and your curiosity is getting the better of you.
7. You love Tom Cruise and want to borrow his movie line and say, "Show me the data."
6. You hope Oprah will do a show on EBP and you want to be prepared should you be invited to appear.
5. You don't want to say "Duh" when a parent asks you why you are using a particular technique.
4. Your principal asked you how you were complying with the NCLB requirement to use scientifically based practices.
3. You don't want to go to a due process hearing without a rationale for what you are doing.
2. You are a conscientious professional and want to make sound decisions about intervention.
1. You know the students you serve don't have time to waste with practices that may be ineffective.
While the reasons listed as ten through six are humorous, the top five reasons should be real motivators for practitioners to include EBP decision-making as part of their repertoire. Parents have the right to expect SLPs to have a cogent rationale for the intervention they are providing. School administrators expect all educators in their buildings to be accountable to legal mandates for using scientifically based practice. SLPs who have been part of a due process proceeding can attest to the value of sound decision-making when parents challenge practices or results. From job satisfaction and commitment-to-mission perspectives, SLPs want to be confident that they are doing the best job they can to help struggling students. And perhaps most important, students — especially adolescents with literacy problems — have limited time in school to resolve issues; therefore, efficient and effective use of their time is a priority. These are indeed cogent reasons. The question, however, remains: How can EBP be made a palatable, doable process that resonates with school SLPs?

The EBP Decision-Making Process in Schools

"The goal of EBP is the integration of (a) clinical expertise, (b) best current evidence, and (c) client values to provide high-quality services reflecting the interests, values, needs, and choices of the individuals we serve" (ASHA, 2004, p. 1). How does this goal fit within the culture of schools? It fits very well on several counts. EBP involves a way of doing business, a template for professional practice. It is far more than an academic exercise; it is an integral part of providing services. A popular approach in schools that is consistent with standards-based education is "backward design," which in simple terms means, "Start with the end in mind" (Wiggins & McTighe, 1998). The end for educators is directed toward the outcomes they want students to achieve, including state curriculum standards.
EBP fits nicely within this framework because the litmus test for an evidence-based decision is whether the desired results were obtained. A rationale for using EBP is to promote student success in language and literacy skills and strategies necessary for academic achievement. An intermediate target for SLPs is to document mastery of Individualized Education Program (IEP) goals that promote access to the curriculum, as required by IDEA 2004. For SLPs, accountability for student outcomes orients them to what they need to accomplish. Accountability requirements of NCLB focus on achievement of specific subgroups, including students with disabilities. SLPs in schools no doubt have heard school administrators voice concern about making AYP (adequate yearly progress); that is, meeting the achievement targets set by their states. SLPs participate in this important school mission by operating within EBP parameters. "Accountability emphasizes the need for school-based professionals to deliver instruction and interventions that have demonstrated efficiency (the time taken to reach a desired outcome) and effectiveness (the likelihood that the desired outcome will be achieved)" (Justice & Fey, 2004, p. 3). A term related to EBP that might be more familiar to SLPs within school culture is "data-based decision-making." This term is typically applied to schools analyzing their achievement results in order to design a plan for school improvement. Although EBP is not the same process, the common elements are (a) a reliance on hard data, (b) a thoughtful, analytical process, and (c) a focus on what works. Commitment to data-based decision-making involves looking at data before continuing, refining, or abandoning an approach.
It includes thoughtful selection of measurement tools and procedures that provide evidence of desired outcomes, both on IEP goals and on curriculum standards. This focus on data-based decision-making in schools provides a context for SLPs to employ EBP and perhaps a basis of support from administrators for engaging in EBP processes. Given that EBP clearly fits within school culture, how can the decision-making process be framed in a way that resonates with school SLPs? From a practical standpoint, SLPs might think of the EBP process as a series of seven questions whose answers will help them make informed decisions about assessment and intervention methods:

1. Have studies been done that address your area of concern? If so, what did they find?
2. How well do the studies relate to your specific question and student(s)?
3. How convincing were the findings of the studies?
4. What other factors should be considered in making a decision about what to do?
5. What is the best choice, considering the hard data and other factors?
6. Was the decision a good one? If so, how do you know?
7. How are you doing with the EBP process?

Let's see how this decision-making process plays out for Rosemary, the SLP at Sunny Shores Middle School (Grades 6-8).

An Adolescent Literacy Example

Scenario

Rosemary is planning for a new school year. She is scheduled to serve the middle school one day a week. At present, 15 students with learning disabilities (LD) are identified as having language impairment (LI) with reading comprehension problems. They are two or more grade levels below in reading. Last year, she pulled students out of their classrooms and provided intervention in her "speech" room for one 30-minute session per week, when she could find them and when they would come. Her focus was on helping them with reading comprehension tasks. Basically, she worked with the students to complete end-of-chapter questions in the social studies text.
She also provided consultative services to teachers in making classroom accommodations that were listed on students' IEPs. Rosemary is now wondering if she should do the same thing this coming year or whether there is another way to address students' reading comprehension problems. A point that should be made about Rosemary's situation is that in schools, the type of intervention SLPs are able to provide is shaped by the structure of services (e.g., how often an SLP is at a school, how many times the students are seen). Therefore, EBP decision-making with regard to intervention is intertwined with delivery of service issues. Rosemary's situation is set against the backdrop of what is going on in her school district. Due to the critical shortage of SLPs, Sunny Shores School District is minus an SLP and has determined that one day a week of service is all it can provide in middle schools.

Question 1: Have studies been done that address your area of concern? If so, what did they find?

Rosemary's concern is how to provide effective and efficient reading comprehension intervention for her middle school students with LI. It is significant that Rosemary has targeted this area in her work with adolescents. She is in the ballpark with regard to the type of intervention she might provide. She is aware of the ASHA guidelines on Roles and Responsibilities of Speech-Language Pathologists with Respect to Reading and Writing in Children and Adolescents (ASHA, 2001) and has been reading other material on SLPs' work with adolescents (Ehren, 2002; Ehren, 2006). Some of her readings have suggested that reading comprehension strategies be taught directly and explicitly to adolescents. She is wondering if she should go that route next year instead of just helping them answer comprehension questions. However, she doesn't have hours to conduct a literature review.
Most of her work is with younger students, and she has many questions regarding effective intervention in several areas. For example, she has elementary-age students with autism and students who use augmentative/alternative communication and present many challenges. If she needed to locate studies on every area in which she provides therapy, she wouldn't have time to do therapy. Rosemary remembers going to a workshop where the presenter reviewed studies on interventions with adolescents. She will start there. Her notes from the workshop include (1) an analysis and synthesis of existing literature on reading interventions for adolescents with LD by Mastropieri, Scruggs, and Graetz (2003), (2) a meta-analysis by Swanson and Hoskyn (2001) on interventions with adolescents, and (3) research from the University of Kansas Center for Research on Learning (KUCRL) on the use of strategies with adolescents (e.g., Bulgren, Hock, Schumaker, & Deshler, 1995; Clark, Deshler, Schumaker, Alley, & Warner, 1984; Schumaker, Deshler, Alley, Warner, & Denton, 1982; Schumaker et al., 2006; Faggella-Luby, Schumaker, & Deshler, 2007; Fritschmann, Deshler, & Schumaker, 2007). What Rosemary learned was that, taken as a whole, research on reading comprehension intervention with adolescents with LD indicates that instruction in self-questioning type strategies (e.g., activating prior knowledge, summarizing, finding main ideas, self-monitoring, text structure), along with direct instructional elements (e.g., instruction broken into individual steps, modeling by the teacher), produced large effects in reading comprehension (Mastropieri et al., 2003). For example, Mastropieri, Scruggs, Bakken, and Whedon (1996) in their research synthesis found an effect size of 1.33 for self-questioning types of strategies.
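An effect size such as the 1.33 just cited is a standardized mean difference. As a minimal sketch of the arithmetic behind this statistic, the following computes Cohen's d from two invented sets of post-test reading scores (the scores and group names are hypothetical illustrations, not data from the studies cited):

```python
import statistics

def cohens_d(treatment, comparison):
    """Cohen's d: difference between group means divided by the pooled
    standard deviation. By Cohen's (1969) benchmarks, d >= 0.8 is large."""
    n_t, n_c = len(treatment), len(comparison)
    # statistics.variance uses the sample (n - 1) denominator
    pooled_var = ((n_t - 1) * statistics.variance(treatment)
                  + (n_c - 1) * statistics.variance(comparison)) / (n_t + n_c - 2)
    return (statistics.mean(treatment) - statistics.mean(comparison)) / pooled_var ** 0.5

# Hypothetical post-test comprehension scores, for illustration only
strategy_group = [82, 86, 90]
comparison_group = [78, 82, 86]
print(cohens_d(strategy_group, comparison_group))  # prints 1.0
```

A d of 1.0 here would count as a large effect by the convention described below; real syntheses, of course, aggregate effect sizes across many studies rather than computing one from a single pair of groups.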
[An effect size is a way of quantifying the size of the difference between two groups (e.g., a treatment and a control group, or a treatment and a comparison group) to determine how well an intervention works (Coe, 2002). Anything over .8 is considered large (Cohen, 1969).] In this case the large effect size attests to the value of teaching reading comprehension strategies to adolescents with LD. Adding to the evidence composite, Swanson and Hoskyn (2001) conducted a meta-analysis on intervention with adolescents with LD. [A meta-analysis is "the accepted means for objectively synthesizing a body of research outcomes (i.e., a collection of primary studies) to determine the weight of scientific evidence…" (Robey & Dalebout, 1998, p. 1228).] A key finding was that studies that emphasized explicit instruction, including frequent practice, yielded larger effect sizes (d = 0.80). Swanson and Hoskyn noted that the prototypical intervention was 40 minutes of daily instruction four times a week over 20 sessions. Rosemary also learned that for 30 years KUCRL has been conducting programmatic research on the Strategic Instruction Model for adolescents. This research consists mostly of single-subject, multiple-baseline design studies measuring the effect of strategy instruction in several areas, including reading comprehension (Schumaker & Deshler, 1992). Researchers have demonstrated with more than 120 low-achieving students, including those with learning disabilities, that direct, explicit teaching of strategies incorporating a practice protocol results in student gains. For example, most recently Fritschmann, Deshler, and Schumaker (2007) reported a large effect size (r = .91) for gains evidenced by adolescents on a standardized reading test after intervention using The Inferencing Strategy (Fritschmann, Schumaker, & Deshler, 2007) for reading comprehension. The evidence from the Mastropieri et al.
(2003) review, the Swanson and Hoskyn (2001) meta-analysis, and the 30 years of research at KUCRL gives Rosemary a cogent rationale for considering direct, explicit teaching of reading comprehension strategies. This convergence of evidence attests to the strength of the approach (ASHA, 2004). If she looked further, she would find other scientific support as well (e.g., the reviews by Gersten, Fuchs, Williams, & Baker, 2001; Vaughn, Gersten, & Chard, 2000). Fortunately, Rosemary made good use of information she obtained in the workshop she attended. Answering the question about available studies can be the deal-breaker for school SLPs because of the time involved. Rather than conceptualizing this step as an extensive literature review done solo, a more realistic approach for school SLPs is to look to others for help with the legwork. Several sources are possible: (a) going to workshops, like Rosemary did, at which presenters provide information based on scientific evidence; (b) locating EBP guidelines or systematic reviews (e.g., the Compendium of EBP Guidelines and Systematic Reviews on the ASHA website [http://www.asha.org/members/ebp/compendium], and the Campbell Collaboration library of systematic reviews on intervention [http://www.campbellcollaboration.org]); (c) checking the comprehensive list of systematic reviews/meta-analyses and EBP guidelines provided by Johnson (2006); and (d) reading systematic or narrative reviews by individual authors that synthesize information (e.g., Cirrin & Gillam, 2008; Law, Garrett, & Nye, 2004). The downside of relying on EBP guidelines and systematic or narrative reviews currently is that not every area that SLPs need to know about has been addressed in these forums. Of course, SLPs cannot suspend therapy until such information is available.
Another option is for Rosemary's district to launch a professional development initiative through which SLPs can earn in-service points for conducting literature reviews on specific assessment or intervention questions, looking especially for compilations of research studies rather than single studies, as suggested by Dollaghan (2004a). In the context of her suggestion for collaborative workgroups, Johnson (2006) offers helpful resources for such a process. The methods for searching the literature proffered by Gillam and Gillam (2006) also would be useful. Having a university academician as a collaborator also may be beneficial if SLPs want a refresher on interpreting research studies. This option does involve work in addition to the regular workload at Rosemary's school, although the consolation is that by working with other SLPs the load will be lighter and participants can earn in-service points. However, a real service to practitioners would be for more academicians to take on the tasks of conducting and publishing systematic reviews. What if there are no studies on the SLP's specific question? This is indeed a possibility because not every question SLPs may ask has been addressed scientifically. If that's the case, SLPs should find studies as closely related as possible to their intervention question and then ask the rest of the questions listed above to guide decision-making. Resources other than research studies can be considered in the decision-making process (e.g., expert opinions found in non-empirical journal articles and textbooks). This approach, however, is a slippery slope that can lead SLPs back to the old ways of data-free thinking that EBP is meant to replace. In the absence of scientific studies, SLPs will need to collect data to confirm or reject the effectiveness of the chosen approach or technique (see discussion of Question 6, below).

Question 2: How well do the studies relate to your specific question and student(s)?
Now that Rosemary has gathered scientific evidence, she needs to think about how well that evidence relates to her specific situation; i.e., her intervention question and the needs of her students. The Mastropieri et al. (2003) review was directed at reading comprehension for adolescents with LD, specifically addressed comprehension strategies, and included middle and high school students. In the Swanson and Hoskyn (2001) meta-analysis, 90% of the studies that were analyzed focused on reading (comprehension and vocabulary); the average IQ score of students studied was 96, similar to the group she is concerned about; and the age range in the studies was from 11 to 17 years. The KUCRL studies included low-achieving students, some with learning disabilities, with IQ scores in the average range, and adolescents in middle and high school were included. Taken together, these sources are a close match to the information Rosemary is seeking.

Question 3: How convincing were the findings of the studies?

Not all evidence is equal; some is more convincing because of the type or quality of research employed. Many sources have addressed the comparative strengths of particular research designs and the concomitant factors that contribute to the overall strength of the evidence (Dollaghan, 2004b; Gillam & Gillam, 2006; Johnson, 2006; Nye et al., 2005). Rosemary does not consider herself an expert in research design, but she knows that the research synthesis by Mastropieri et al. (2003) was published in a well-respected, peer-reviewed journal and came under the scrutiny of experts. She can say the same about Swanson and Hoskyn's (2001) meta-analysis. Meta-analyses look at many studies and can give a larger picture of evidence in an area. Although there may be disagreement among meta-analysts about how one should conduct this procedure, it too stood the test of peer review.
Even though the KUCRL studies were single-subject designs, the number of students involved (more than 120) across many years with similar findings reveals a pattern of positive outcomes for the adolescents receiving reading comprehension intervention. Overall, the preponderance of evidence from the three sources Rosemary considered points favorably in the direction of explicitly teaching strategies to adolescents struggling with reading comprehension. When Rosemary works with the other SLPs in her district to gather and appraise scientific evidence in other targeted areas of inquiry, she will suggest they utilize a variety of resources, including the U.S. Department of Education (2003) publication, Identifying and Implementing Education Practices Supported by Rigorous Evidence: A User Friendly Guide, and the research quality indicators offered by Gersten et al. (2005). The SLPs will be on the lookout for studies that have control groups, statistically significant differences between treated and untreated groups, validity and reliability of outcome measures, effect sizes, and many other criteria. As with the task of conducting a literature review, it would be ideal for practitioners to have assistance with this evidence appraisal process from individual academicians and professional groups conducting systematic reviews.

Question 4: What other factors should be considered in making a decision about what to do?

Dollaghan (2004a) points out that EBP requires SLPs to identify and make use of the highest-quality scientific evidence as one component of our efforts to provide optimal services. The key word is "one"; many other factors are germane to selecting a literacy intervention approach. In making her decision, Rosemary also thinks about her own experiences; student-related issues; family culture, beliefs, and values; and school/district issues and constraints.
She notes the following:

• Students want to do work that will pay off; that helps them get better grades and do well on tests.
• Students are not actively engaged in the current intervention; they don't always attend.
• Parents and students are concerned about removing students from class for interventions.
• Students are doing poorly on district/state-administered tests.
• Rosemary does not see progress with the current approach.
• Rosemary does not know how to teach strategies.
• The district doesn't see a compelling need to provide more speech-language service to middle school students.
• The cost of additional services is an issue for the district.
• The critical shortage of SLPs is an issue in expanding services in secondary settings.

Rosemary should consider all of these factors in conjunction with the scientific evidence when she makes a decision.

Question 5: What is the best choice, considering the hard data and other factors?

Making a practice decision is not an exact science. Ehren et al. (2005) and Gillam and Gillam (2006) suggested ways to integrate information by rating the value of various components, using a grid to weigh the comparative importance of factors; i.e., study rankings, student/parent factors, and clinician/school factors (Tables 1 and 2). The rationale for trying to compare factors is to show that in most instances scientific evidence from research studies will trump other factors. Although a grid may oversimplify a complex decision-making process, the important concept is that high-quality, well-designed, and effectively implemented research should not be ignored even when other significant factors may pertain. It would be foolhardy for Rosemary to ignore the research about the beneficial effect of providing explicit strategy instruction even though she is unfamiliar with using this approach. The scientific evidence provides a cogent rationale for her to learn how to provide it.
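A weighting grid of the kind Ehren et al. (2005) and Gillam and Gillam (2006) describe could be operationalized along the following lines. All factor names, weights, and ratings below are hypothetical placeholders, not values from either source; the sketch only illustrates the principle that weighting research evidence most heavily lets strong studies trump other considerations:

```python
# Hypothetical weights: research evidence is deliberately weighted heaviest,
# reflecting the principle that sound studies should trump other factors.
WEIGHTS = {
    "research_evidence": 3.0,
    "student_parent_factors": 1.5,
    "clinician_school_factors": 1.0,
}

def weighted_score(ratings):
    """Sum each factor's 0-5 rating multiplied by its weight."""
    return sum(WEIGHTS[factor] * rating for factor, rating in ratings.items())

# Option A: explicit strategy instruction (strong evidence, some school constraints)
option_a = {"research_evidence": 5, "student_parent_factors": 4, "clinician_school_factors": 2}
# Option B: continue the current pull-out approach (weak evidence base)
option_b = {"research_evidence": 1, "student_parent_factors": 3, "clinician_school_factors": 4}

print(weighted_score(option_a))  # prints 23.0
print(weighted_score(option_b))  # prints 11.5
```

With these invented numbers, the option backed by strong research wins even though it scores lower on clinician/school convenience, which is the point of the grid.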
In addition, the strength of the evidence on strategy teaching for reading comprehension with adolescents may mean that the district policy regarding service to adolescents should be examined. On the other hand, student/parent and clinician/school factors may reinforce a scientifically based treatment decision. In this case example, students do not now attend therapy sessions and are not engaged when they do attend. Furthermore, Rosemary is unable to document student progress with the approach she is using now. Trying the strategy approach supported by the research evidence may turn things around. All things considered, Rosemary decided that she should teach reading comprehension strategies explicitly to her middle school students with reading comprehension problems, but that doing so for 30 minutes once a week will not be enough, based on the intervention intensity parameters identified by research. (Recall that the intervention prototype identified in the meta-analysis was 40 minutes of daily instruction four times a week over 20 sessions.) Therefore, she also decided that she needs to collaborate with other special education teachers to achieve the intervention intensity (and necessary practice) supported in the literature. She will negotiate to provide services within the language arts class taught by the LD teachers, instead of pulling students out. She also will coordinate her therapy with intervention provided by the teachers. In addition, she decided that she should work with school and district staff to reexamine workloads of SLPs in the district to use resources more effectively, because now she is convinced that they can be doing something more for adolescents.

Question 6: Was the decision a good one? If so, how do you know?

Before implementing the decision, Rosemary is wise to consider explicit strategy teaching as a promising practice for her students in her situation.
She won’t know if it was a good choice until she tries it and gathers data about how well it worked. This step is a key to EBP and at the heart of “progress monitoring,” another common term used in schools. “Progress monitoring is a scientifically based practice that is used to assess and evaluate the effectiveness of instruction. Progress monitoring can be implemented with individual students or an entire class” (National Center on Student Progress Monitoring, n.d., p. 1). Rosemary and her collaborating special education teachers will collect data on strategy acquisition as students are learning and also will collect outcome data on reading achievement. They will reach an agreement on assessment measures and a timetable relevant to literacy achievement. Question 7: How are you doing with the EBP process? In addition to evaluating the EBP decision, Rosemary also should evaluate the EBP decision-making process. She can do so by asking herself the following questions from Ehren et al. (2005): • Did you find and evaluate applicable research findings? • Did you analyze pertinent standard care, student/parent factors, and SLP/school factors? • Did you weigh the evidence and other factors in a logical, savvy manner; that is, did the most important data trump other factors? • What would you do differently next time? 7 • Did your decision result in measurable student progress? • Were there any unforeseen consequences as a result of the decision? • What would you do differently next time? Conclusion Perhaps the most effective way of promoting EBP in schools is to recast the conversation in school terms, thinking of EBP as a way to promote standards-based education (student achievement outcomes), to meet accountability requirements of legal mandates, and to conduct business consistent with data-based decision making and progress monitoring. In a very real sense, EBP is about SLPs in schools selecting and engaging in practices that have the best chance of working. 
EBP also positions SLPs to withstand challenges to their procedures and outcomes by parents, and it supports SLPs in advocating with their schools and districts for service delivery that facilitates implementation of effective intervention. As SLPs move toward new or expanded roles with literacy, it is essential that they couple their literacy intervention efforts with EBP.

References

American Speech-Language-Hearing Association. (n.d.). Key steps in the EBP process. Retrieved February 1, 2008, from http://www.asha.org/members/ebp/

American Speech-Language-Hearing Association. (2001). Roles and responsibilities of speech-language pathologists with respect to reading and writing in children and adolescents (position statement, guidelines, technical report and knowledge and skills required). Rockville, MD: Author.

American Speech-Language-Hearing Association. (2004). Report of the Joint Coordinating Committee on Evidence-Based Practice. Rockville, MD: Author.

Bulgren, J. A., Hock, M. F., Schumaker, J. B., & Deshler, D. D. (1995). The effects of instruction in a paired associates strategy on the information mastery performance of students with learning disabilities. Learning Disabilities Research & Practice, 10(1), 22-37.

Cirrin, F. M., & Gillam, R. B. (2008). Language intervention practices for school-age children with spoken language disorders: A systematic review. Language, Speech, and Hearing Services in Schools, 39, 110-137.

Clark, F. L., Deshler, D. D., Schumaker, J. B., Alley, G. R., & Warner, M. M. (1984). Visual imagery and self-questioning: Strategies to improve comprehension of written material. Journal of Learning Disabilities, 17(3), 145-149.

Coe, R. (2002, September). It’s the effect size, stupid: What effect size is and why it is important. Paper presented at the British Educational Research Association annual conference, Exeter, UK.

Cohen, J. (1969). Statistical power analysis for the behavioral sciences. New York: Academic Press.

Dollaghan, C. (2004a).
Evidence-based practice in communication disorders: What do we know, and when do we know it? Journal of Communication Disorders, 37, 391-400.

Dollaghan, C. (2004b, April). Evidence-based practice: Myths and realities. The ASHA Leader, 4-5, 12.

Ehren, B. J. (2002). Speech-language pathologists contributing significantly to the academic success of high school students: A vision for professional growth. Topics in Language Disorders, 22(2), 60-80.

Ehren, B. J. (2006). Partnerships to support reading comprehension for students with language impairment. Topics in Language Disorders, 26(1), 41-53.

Ehren, B. J., & Ehren, T. C. (2001). New or expanded literacy roles for speech-language pathologists: Making it happen in the schools. Seminars in Speech and Language, 22(3), 233-243.

Ehren, B. J., Fey, M. E., & Gillam, R. (2005, July). SLPs start your engines: Evidence-based practice in schools. Opening plenary session, American Speech-Language-Hearing Association Schools Conference, Indianapolis, IN.

Faggella-Luby, M., Schumaker, J. B., & Deshler, D. D. (2007). Embedded learning strategy instruction: Story-structure pedagogy in heterogeneous secondary literature classes. Learning Disability Quarterly, 30, 131-147.

Fritschmann, N. S., Deshler, D. D., & Schumaker, J. B. (2007). The effect of instruction in an inference strategy on the reading comprehension skills of adolescents with disabilities. Learning Disability Quarterly, 30, 245-262.

Fritschmann, N. S., Schumaker, J. B., & Deshler, D. D. (2007). INFER: The inferencing strategy instructor’s manual. Lawrence, KS: Edge Enterprises.

Gersten, R., Fuchs, L., Williams, J., & Baker, S. (2001). Teaching reading comprehension strategies to students with learning disabilities: A review of research. Review of Educational Research, 71(2), 279-320.

Gersten, R., Fuchs, L., Compton, D., Coyne, M., Greenwood, C., & Innocenti, M. (2005). Quality indicators for group experimental and quasi-experimental research in special education.
Exceptional Children, 71(2), 149-164.

Gillam, S. L., & Gillam, R. B. (2006). Making evidence-based decisions about child language intervention in schools. Language, Speech, and Hearing Services in Schools, 37, 304-315.

Individuals with Disabilities Education Improvement Act of 2004, Pub. L. No. 108-446, 20 U.S.C. §§ 1400 et seq.

Johnson, C. J. (2006). Getting started in evidence-based practice for childhood speech-language disorders. American Journal of Speech-Language Pathology, 15, 20-35.

Justice, L. M., & Fey, M. E. (2004, September). Evidence-based practice in schools: Integrating craft and theory with science and data. The ASHA Leader, 4-5, 30-32.

Justice, L. M., Invernizzi, M. A., & Meier, J. D. (2002). Designing and implementing an early literacy screening protocol: Suggestions for the speech-language pathologist. Language, Speech, and Hearing Services in Schools, 33(2), 84-101.

Law, J., Garrett, Z., & Nye, C. (2004). The efficacy of treatment for children with developmental speech and language delay/disorder: A meta-analysis. Journal of Speech, Language, and Hearing Research, 47, 924-943.

Mastropieri, M., Scruggs, T., Bakken, J. P., & Whedon, C. (1996). Reading comprehension: A synthesis of research in learning disabilities. In T. E. Scruggs & M. A. Mastropieri (Eds.), Advances in learning and behavioral disabilities: Intervention research (Vol. 10, Part B, pp. 201-227). Greenwich, CT: JAI Press.

Mastropieri, M., Scruggs, T., & Graetz, J. (2003). Reading comprehension instruction for secondary students: Challenges for struggling students and teachers. Learning Disability Quarterly, 26(2), 103-116.

National Center on Student Progress Monitoring. (n.d.). What is progress monitoring? Retrieved March 5, 2008, from www.studentprogress.org

Nelson, N. W., & Van Meter, A. M. (2006). Partnership for literacy in a writing lab approach. Topics in Language Disorders, 26(1), 55-69.

No Child Left Behind Act of 2001, Pub. L. No. 107-110, 20 U.S.C. §§ 6301 et seq.

Nye, C., Schwartz, J., & Turner, H. (2005, February/March). Evidence-based practice for treating communication disorders: What helps? What harms? Based on what evidence? CSHA Magazine, 34(3), 6-10, 12, 29.

Robey, R. R., & Dalebout, S. D. (1998). A tutorial on conducting meta-analysis of clinical outcomes research. Journal of Speech, Language, and Hearing Research, 41, 1227-1241.

Roth, F. P., & Ehren, B. J. (Eds.). (2001). Implementing intervention roles. Seminars in Speech and Language, 22(3).

Roth, F. P., & Troia, G. A. (2006). Collaborative efforts to promote emergent literacy and efficient word recognition skills. Topics in Language Disorders, 26(1), 24-41.

Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence-based medicine: What it is and what it isn’t. British Medical Journal, 312, 71-72.

Schumaker, J. B., & Deshler, D. D. (1992). Validation of learning strategy interventions for students with learning disabilities: Results of a programmatic research effort. In B. Y. L. Wong (Ed.), Contemporary intervention research in learning disabilities: An international perspective (pp. 22-46). New York: Springer-Verlag.

Schumaker, J. B., Deshler, D. D., Alley, G. R., Warner, M. M., & Denton, P. H. (1982). Multipass: A learning strategy for improving reading comprehension. Learning Disability Quarterly, 5(3), 295-304.

Schumaker, J. B., Deshler, D. D., Woodruff, S. K., Hock, M. F., Bulgren, J. A., & Lenz, B. K. (2006). Reading strategy interventions: Can literacy outcomes be enhanced for at-risk adolescents? Teaching Exceptional Children, 38(3), 64-68.

Silliman, E. R., & Wilkinson, L. C. (Eds.). (2004). Language and literacy learning in schools. New York: Guilford Press.

Swanson, H. L. (1999). Reading research for students with LD: A meta-analysis of intervention outcomes. Journal of Learning Disabilities, 32, 504-532.

Swanson, H. L.
& Hoskyn, M. (2001). A meta-analysis of intervention research for adolescent students with learning disabilities. Learning Disabilities Research & Practice, 16, 109-119.

Ukrainetz, T. A. (2006). Contextualized language intervention: Scaffolding preK-12 literacy achievement. Eau Claire, WI: Thinking Publications.

U.S. Department of Education, Institute of Education Sciences. (2003). Identifying and implementing educational practices supported by rigorous evidence: A user friendly guide. Washington, DC: Author.

Vaughn, S., Gersten, R., & Chard, D. (2000). The underlying message in LD intervention research: Findings from research syntheses. Exceptional Children, 67(1), 99-114.

Wallach, G. P. (2008). Language intervention for school-age students: Setting goals for academic success. St. Louis, MO: Mosby.

Whitehurst, G. J. (2002, November). Federal reading initiatives: Potential roles for SLPs. Paper presented at the annual meeting of the American Speech-Language-Hearing Association, Atlanta, GA.

Wiggins, G., & McTighe, J. (1998). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.

Table 1. Grid comparing factors in EBP decision-making along a continuum of importance

Each set of factors is arrayed along the continuum from high to low importance:

Study Ranking: Ia, Ib, IIa, IIb, III, IV
Student/Parent Factors: I, II, III, IV
Clinician/School Factors: I, II, III, IV

From Ehren et al., 2005

Table 2.
Values of specific study rankings, student/parent factors, and clinician/school factors

Study Ranking:
Ia - Meta-Analysis or Systematic Review
Ib - Randomized Controlled Study
IIa - Controlled Study without Randomization
IIb - Quasi-Experimental or Multiple Baseline
III - Nonexperimental (Correlational/Case)
IV - Committee Report, Consensus Conference

Student/Parent Factors:
I - Cultural Values or Beliefs
II - Level of Student/Parent Engagement
III - Financial Resources
IV - Student/Parent Opinions

Clinician/School Factors:
I - District-Wide Data Collected Systematically
II - Clinician-Collected Treatment Data
III - Personal Clinical Judgment Based on Theoretical Orientation and Training
IV - School Culture/Policy

From Ehren et al., 2005