Getting Below the Surface to Understand Disappointing Outcomes

Long ago, I toured West Germany, visiting some family friends near Hanover. They suggested I go see Duderstadt, a picturesque town nearby (see picture of it below).

My wife, Nancy, and I drove into Duderstadt and walked around. It was indeed gorgeous, but very strange. Not a person was in sight. Every shop was closed. In the center of the town was a beautiful church. We reasoned that churches are always open. We walked to the church door, and I stretched out my hand to open it, but when my hand was inches away, the door burst open. An entire wedding party streamed out into the street. The church was packed to the rafters with happy people, now following the bride and groom outside. Mystery solved.

If social scientists came to Duderstadt when we did but failed to see the wedding, they might draw all sorts of false conclusions. An economist might see the empty shops and conclude that the economy of rural Germany is doomed due to low productivity. A demographer might agree and blame this on the obviously declining workforce. But looking just the thickness of a church door beneath the surface, all could immediately understand what was happening.

My point here is a simple one. I am a quant. I believe in numbers and rigorous research designs. But at the same time, I also want to understand what is really going on, and the main numbers rarely tell the whole story.

I was thinking about this when I read the rather remarkable study by Carolyn Heinrich and her colleagues (2010), cited in my two previous blogs. Like many other researchers, she and her colleagues found near-zero impacts for Supplemental Educational Services. At the time this study took place, this was a surprise. How could all that additional instructional time after school not make a meaningful difference?

But instead of just presenting the overall (bad) findings, she poked around town, so to speak, to find out what was going on.

What she found was appalling, but also perfectly logical. Most eligible middle and high school students in Milwaukee who were offered after-school programs either failed to sign up, or signed up but never attended a single day, or attended a single day and then came only irregularly thereafter. And why did they not sign up or attend? Most programs offered attractive incentives, such as iPods (very popular at the time), so about half of the eligible students did at least sign up. But after the first day, when they received their incentives, students faced drudgery. Heinrich et al. cite evidence that most instruction was either teachers teaching immobile students, or students doing unsupervised worksheets. Heinrich et al.’s technical report had a sentence (dropped in the published report), which I quoted previously but will quote again here: “One might also speculate that parents and students are, in fact, choosing rationally in not registering for or attending SES.”

A study of summer school by Borman & Dowling (2006) made a similar observation. K-1 students in Baltimore were randomly assigned to the opportunity to attend three years of summer school. The summer sessions ran for seven weeks, six hours a day, and included 2½ hours of reading and writing instruction plus sports, art, and other enrichment activities. Most eligible students (79%) signed up and attended in the first summer, but fewer did so in the second summer (69%) and fewer still in the third (42%). The analyses focused on the students who were eligible for the first and second summers, and found no impact on reading achievement, although there was a positive effect for the students who did show up and attend for two summers.

Many studies of summer school, after-school, and SES programs (SES included both) have simply reported the disappointing outcomes without exploring why they occurred. Such reports are important, if well done, but they offer little understanding of why the programs failed. Could after-school or summer school programs work better if we took into account the evidence on why they usually fail? Perhaps. For example, in my previous blog I suggested that extended-time programs might do better if they provided one-to-one or small-group tutoring. However, there is only suggestive evidence that this is true, and there are good reasons it might not be: the same attendance and motivation problems may doom any program, no matter how good, when struggling students are asked to be in school while their friends are outside playing.

Econometric production function models predicting that more instruction leads to more learning are useless unless we take into account what students are actually being offered in extended-time programs and what their motivational state is likely to be. We have to look a bit below the surface to explain why disappointing outcomes occur so often, so that we can repeat successes instead of making the same mistakes over and over again.
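To make that point concrete, here is a minimal sketch of the kind of production function I have in mind (an illustrative formulation of my own, not one drawn from any particular study):

\[
\Delta A_i = \beta_0 + \beta_1 T_i + \varepsilon_i
\qquad \text{versus} \qquad
\Delta A_i = \beta_0 + \beta_1 \left( T_i \cdot Q_i \cdot M_i \right) + \varepsilon_i
\]

where \(\Delta A_i\) is student \(i\)'s achievement gain, \(T_i\) is the added instructional time, \(Q_i\) is the quality of the instruction actually delivered in that time, and \(M_i\) captures the student's attendance and motivation. The first model is what "more time means more learning" assumes; the second makes plain that if \(Q_i\) or \(M_i\) is near zero, as the SES and summer school studies suggest it often was, the added time \(T_i\) buys almost nothing.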

Correction

My recent blog, “Avoiding the Errors of Supplemental Educational Services,” started with a summary of the progress of the Learning Recovery Act.  It was brought to my attention that my summary was not correct.  In fact, the Learning Recovery Act has been introduced in Congress, but is not part of the current reconciliation proposal moving through Congress and has not become law. The Congressional action cited in my last blog was referring to a non-binding budget resolution, the recent passage of which facilitated the creation of the $1.9 trillion reconciliation bill that is currently moving through Congress. Finally, while there is expected to be some amount of funding within that current reconciliation bill to address the issues discussed within my blog, reconciliation rules will prevent the Learning Recovery Act from being included in the current legislation as introduced. I apologize for this error.

References

Borman, G. D., & Dowling, N. M. (2006). Longitudinal achievement effects of multiyear summer school: Evidence from the Teach Baltimore randomized field trial. Educational Evaluation and Policy Analysis, 28(1), 25–48. https://doi.org/10.3102/01623737028001025

Heinrich, C. J., Meyer, R. H., & Whitten, G. W. (2010). Supplemental Education Services under No Child Left Behind: Who signs up and what do they gain? Educational Evaluation and Policy Analysis, 32, 273-298.

Photo credit: Amrhingar, CC BY-SA 3.0, via Wikimedia Commons

This blog was developed with support from Arnold Ventures. The views expressed here do not necessarily reflect those of Arnold Ventures.

Note: If you would like to subscribe to Robert Slavin’s weekly blogs, just send your email address to thebee@bestevidence.org

Avoiding the Errors of Supplemental Educational Services (SES)

“The definition of insanity is doing the same thing over and over again, and expecting different results.” –Albert Einstein

Last Friday, the U.S. Senate and House of Representatives passed a $1.9 trillion recovery bill. Within it is the Learning Recovery Act (LRA). Both the overall bill and the Learning Recovery Act are timely and wonderful. In particular, the LRA emphasizes the importance of using research-based tutoring to help students who are struggling in reading or math. The linking of evidence to large-scale federal education funding began with the 2015 ESSA definition of proven educational programs, and the LRA would greatly increase the importance of evidence-based practices.

But if you sensed a “however” coming, you were right. The “however” is that the LRA requires investments of substantial funding in “school extension programs,” such as “summer school, extended day, or extended school year programs” for vulnerable students.

This is where the Einstein quote comes in. “School extension programs” sound a lot like Supplemental Educational Services (SES), part of No Child Left Behind that offered parents and children an array of services that had to be provided after school or in summer school.

The problem is that SES was a disaster. A meta-analysis of 28 studies of SES by Chappell et al. (2011) found a mean effect size of +0.04 for math and +0.02 for reading. A sophisticated study by Deke et al. (2014) found an effect size of +0.05 for math and -0.03 for reading. These effect sizes are just different flavors of zero. Zero was the outcome whichever way you looked at the evidence, with one awful exception: in the Deke et al. (2014) study, the lowest achievers and special education students actually performed significantly worse if they were in SES than if they qualified but did not sign up, with effect sizes around -0.20 for reading and math. Heinrich et al. (2009) also reported that the lowest achievers were least likely to sign up for SES, and least likely to attend regularly if they did. All three major studies found that outcomes did not vary much depending on which type of provider or program students received. Considering that the per-pupil cost was estimated at $1,725 in 2021 dollars, these outcomes are distressing. More important, despite the federal government’s willingness to spend quite a lot on SES, millions of struggling students in desperate need of effective assistance did not benefit.

Why did SES fail? I have two major explanations. Heinrich et al. (2009), who added questionnaires and observations to find out what was going on, discovered that at least in Milwaukee, attendance in SES after-school programs was appalling (as I reported in my previous blog). In the final year studied, only 16% of eligible students were attending (less than half signed up at all, and of those, average attendance in the remedial program was only 34%). Worse, the students in greatest need were least likely to attend.
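For readers who want to see how that 16% figure arises, it is simply the product of the registration and attendance rates (an illustrative calculation, using the roughly 48% registration rate the study reports for that final year):

\[
0.48 \times 0.34 \approx 0.16
\]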

From their data and other studies they cite, Heinrich et al. (2010) paint a picture of students doing boring, repetitive worksheets unrelated to what they were doing in their school-day classes. Students were enticed to sign up for SES services with incentives such as iPods, gift cards, or movie passes. Students often attended just enough to get their incentives, but then stopped coming. In 2006-2007, a new policy limited incentives to educationally related items, such as books and museum trips, and attendance dropped further. Restricting SES services to after school and the summer, when attendance is neither mandated nor close to universal, meant that the students who did attend were in school while their friends were out playing. This is hardly a way to engage students’ motivation to attend or to exert effort. Low-achieving students see after school and summertime as their free time, which they are unlikely to give up willingly.

Beyond the problems of attendance and motivation in extended time, there was another key problem with SES: none of the hundreds of programs offered to students had been proven effective beforehand (or ever) in rigorous evaluations. And there was no mechanism to find out which of them were working well until very late in the program’s history. As a result, neither schools nor parents had any particular basis for selecting programs according to their likely impact. Program providers probably did their best, but there was no pressure on them to make certain that students benefited from SES services.

As I noted in my previous blog, evaluations of SES do not provide the only evidence that after-school and summer school programs rarely work for struggling students. Reviews of summer school programs by Xie et al. (2020) and of after-school programs (Dynarski et al., 2003; Kidron & Lindsay, 2014) have found similar outcomes, always for the same reasons: poor attendance and poor motivation among students asked to be in school when they would otherwise have free time.

Designing an Effective System of Services for Struggling Students

Two policies are needed to provide a system of services capable of substantially improving student achievement. The first is to provide services during the ordinary school day and year, not after school or in summer school. The second is to strongly emphasize the use of programs proven to be highly effective in rigorous research.

Educational services provided during the school day are far more likely to be effective than those provided after school or in the summer. During the day, everyone expects students to be in school, including the students themselves. There are attendance problems during the regular school day, of course, especially in secondary schools, but these problems are much smaller than those in non-school time, and perhaps if students are receiving effective, personalized services in school, and therefore succeeding, they will attend more regularly. Further, services during the school day are far easier to integrate with other educational services. Principals, for example, are far more likely to observe tutoring or other services if they take place during the day, and to take ownership of ensuring their effectiveness. School-day services also entail far fewer non-educational costs, as they do not require changing bus schedules, cleaning and securing schools for more hours each day, and so on.

The problem with in-school services is that they can disrupt the basic schedule. However, this need not be an obstacle. Schools could designate service periods for each grade level, spread over the school day, so that tutors or other service providers can be continuously busy all day. Students should not be taken out of reading or math classes, but there is a strong argument that a student who is far below grade level in reading or math needs a reading or math tutor using a proven tutoring model more than he or she needs any other class, at least for a semester (the usual length of a tutoring sequence).

If schools are deeply reluctant to interrupt any of the ordinary curriculum, then they might extend their day to offer art, music, or other subjects during the after-school session. These popular subjects might attract students without incentives, especially if students have a choice of which to attend. This could create space for tutoring or other services during the school day. A schedule like this is virtually universal in Germany, which provides all sports, art, music, theater, and other activities after school, so all in-school time is available for academic instruction.

Use of proven programs makes sense throughout the school day. Tutoring should be the main focus of the Learning Recovery Act, because in this time of emergency need to help students recover from Covid school closures, nothing less will do. But in the longer term, adoption of proven classroom programs in reading, math, science, writing, and other subjects should provide a means of helping students succeed in all parts of the curriculum (see www.evidenceforessa.org).

In summer 2021, there may be a particularly strong rationale for summer school, assuming schools are otherwise able to open. The evidence is clear that doing ordinary instruction during the summer will not make much of a difference, but summer could be helpful if it is used as an opportunity to provide as many struggling students as possible with in-person, one-to-one or one-to-small group tutoring in reading or math. In the summer, students might receive tutoring more than once a day, every day, for as long as six weeks. This could make a particularly big difference for students who essentially missed in-person kindergarten, first, or second grade, a crucial time for learning to read. Tutoring is especially effective in reading in those grades, because phonics is relatively easy for tutors to teach, and there is a large number of effective tutoring programs for grades K-2. Early reading failure is very important to prevent, and can be prevented with tutoring, so the summer months may be just the right time to help these students get a leg up on reading.

The Learning Recovery Act can make life-changing differences for millions of children in serious difficulties. If the LRA changes its emphasis to the implementation of proven tutoring programs during ordinary school times, it is likely to accomplish its mission.

SES served a useful purpose in showing us what not to do. Let’s take advantage of these expensive lessons and avoid repeating the same errors. Einstein would be so proud if we heeded his advice.

Correction

My recent blog, “Avoiding the Errors of Supplemental Educational Services,” started with a summary of the progress of the Learning Recovery Act.  It was brought to my attention that my summary was not correct.  In fact, the Learning Recovery Act has been introduced in Congress, but is not part of the current reconciliation proposal moving through Congress and has not become law. The Congressional action cited in my last blog was referring to a non-binding budget resolution, the recent passage of which facilitated the creation of the $1.9 trillion reconciliation bill that is currently moving through Congress. Finally, while there is expected to be some amount of funding within that current reconciliation bill to address the issues discussed within my blog, reconciliation rules will prevent the Learning Recovery Act from being included in the current legislation as introduced.

References

Chappell, S., Nunnery, J., Pribesh, S., & Hager, J. (2011). A meta-analysis of Supplemental Education Services (SES) provider effects on student achievement. Journal of Education for Students Placed at Risk, 16 (1), 1-23.

Deke, J., Gill, B., Dragoset, L., & Bogen, K. (2014). Effectiveness of supplemental educational services. Journal of Research on Educational Effectiveness, 7, 137-165.

Dynarski, M., et al. (2003). When schools stay open late: The national evaluation of the 21st Century Community Learning Centers Program (first year findings). Washington, DC: U.S. Department of Education.

Heinrich, C. J., Meyer, R. H., & Whitten, G. W. (2010). Supplemental Education Services under No Child Left Behind: Who signs up and what do they gain? Educational Evaluation and Policy Analysis, 32, 273-298.

Kidron, Y., & Lindsay, J. (2014). The effects of increased learning time on student academic and nonacademic outcomes: Findings from a meta‑analytic review (REL 2014-015). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Appalachia.

Xie, C., Neitzel, A., Cheung, A., & Slavin, R. E. (2020). The effects of summer programs on K-12 students’ reading and mathematics achievement: A meta-analysis. Manuscript submitted for publication.

This blog was developed with support from Arnold Ventures. The views expressed here do not necessarily reflect those of Arnold Ventures.

Note: If you would like to subscribe to Robert Slavin’s weekly blogs, just send your email address to thebee@bestevidence.org

Why Isn’t Achievement Whirled Enough by Time? (Why Summer School, After School, and Extended Day Do Not Work Very Well)

“Had we but world enough and time…” wrote Andrew Marvell, an English poet of the 1600s. (He also had another job, highly relevant to this blog, which I will reveal at the end. No peeking!)

Marvell’s poem was about making the most of the limited time we have on Earth. In education, we understand this sentiment. Time is a key resource for teaching, not to be wasted under any circumstances.

In fact, educators have long tried to improve students’ achievement by increasing their time in school. In particular, struggling students have been invited or required to attend after school or summer school classes.

Many school reformers have advocated expanded opportunities for extra-time instruction as a solution to the learning losses due to Covid-19 school closures. In fact, the current draft of the Democrats’ relief bill emphasizes investments in after-school and summer school programs to help these students catch up. Yet these very expensive efforts had little impact on reading or math learning in studies done before Covid, and they are not likely to have much impact now (see my previous blog on this topic).

How can this be? Summer school, for example, offers several weeks of extra teaching in small classes tailored to the learning levels of the students. Yet summer school for reading has been completely ineffective, except for tutoring phonics in K-1. Math summer school studies involving disadvantaged and low-achieving students also found effect sizes near zero (Xie et al., 2020).

With respect to after-school programs, a review by Kidron & Lindsay (2014) found average effect sizes near zero.

A study in Milwaukee by Heinrich et al. (2009) of after-school programs provided under Supplemental Educational Services (SES) funding found effect sizes near zero for middle and high school students. The authors investigated the reasons for these disappointing findings. Among eligible students, 57% registered in the first year, dropping to 48% by the fourth year. Yet the bigger problem was attendance. As a percentage of registered students, attendance dropped from 90% in the first year to 34% in the fourth, meaning that among all eligible students, only 16% were attending in the final year. This abysmal attendance rate should not be surprising in light of the study’s observation that most of the after-school time was spent on worksheets, with little or no instruction. The Heinrich et al. (2009) paper contained the following depressing sentence:

“…one might also speculate that parents and students are, in fact, choosing rationally in not registering for or attending SES.” (p. 296).

Reviews of research on the impacts of all approaches to SES find appallingly small average effects (e.g., Chappell et al., 2011). I will write more about SES as a cautionary tale in a later blog, but one conclusion important to this blog is clear: providing educational programs to struggling students after school or in the summer is unlikely to improve their achievement.

The reason that additional time after school or in the summer does not enhance achievement is obvious if you’ve ever been a teacher or a student. No one wants to be sitting in school while their friends are out playing. Extra-time approaches that simply provide more of the same are probably boring, tedious, and soul-sapping. Imagine kids watching the clock, quietly cheering every tick. It is no wonder that students fail to register for or attend after-school or summer school sessions, and learn little in them if they do.

The poet Andrew Marvell had it right. What is important is to make effective use of the time we have, rather than adding time. And his profession, other than being a poet? He was a tutor.

References

Chappell, S., Nunnery, J., Pribesh, S., & Hager, J. (2011). A meta-analysis of Supplemental Education Services (SES) provider effects on student achievement. Journal of Education for Students Placed at Risk, 16 (1), 1-23.

Heinrich, C. J., Meyer, R. H., & Whitten, G. W. (2010). Supplemental Education Services under No Child Left Behind: Who signs up and what do they gain? Educational Evaluation and Policy Analysis, 32, 273-298.

Kidron, Y., & Lindsay, J. (2014). The effects of increased learning time on student academic and nonacademic outcomes: Findings from a meta‑analytic review (REL 2014-015). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Appalachia.

Xie, C., Neitzel, A., Cheung, A., & Slavin, R. E. (2020). The effects of summer programs on K-12 students’ reading and mathematics achievement: A meta-analysis. Manuscript submitted for publication.

This blog was developed with support from Arnold Ventures. The views expressed here do not necessarily reflect those of Arnold Ventures.

Note: If you would like to subscribe to Robert Slavin’s weekly blogs, just send your email address to thebee@bestevidence.org

Highlight Tutoring Among Post-Covid Solutions

I recently saw a summary of the education section of the giant, $1.9 trillion proposed relief bill now before Congress. Like all educators, I was delighted to see the plan to provide $130 billion to help schools re-open safely, and to fund efforts to remedy the learning losses so many students have experienced due to school closures.

However, I was disappointed to see that the draft bill suggests that educators can use whatever approaches they like, and it specifically mentioned summer school and after school programs as examples.

Clearly, the drafters of this legislation have not been reading my blogs! On September 10th I wrote a blog reviewing research on summer school and after school programs as well as tutoring and other approaches. More recently, I’ve been doing further research on these recommendations for schools to help struggling students. I put my latest findings into two tables, one for reading and one for math. These appear below.

As you can see, not all supplemental interventions for struggling students are created equal. Proven tutoring models (those successfully evaluated in rigorous experiments) are far more effective than other strategies. The other successful strategy is our own Success for All whole-school reform approach (Cheung et al., in press), but Success for All incorporates tutoring as a major component.

However, it is important to note that not all tutoring programs are proven to be effective. Programs that do not provide tutors with structured materials, extensive professional development, and in-class coaching, or that use unpaid tutors whose attendance may be sporadic, have not produced the remarkable outcomes typical of other tutoring programs.

Tutoring

As Tables 1 and 2 show, proven tutoring programs produce substantial positive effects on reading and math achievement, and nothing else comes close (see Gersten et al., 2020; Neitzel et al., in press; Nickow et al., 2020; Pellegrini et al., 2021; Wanzek et al., 2016).

Tables 1 and 2 only include results from programs that use teaching assistants, AmeriCorps members (who receive stipends), or unpaid volunteer tutors. I did not include programs that use teachers as tutors, because in the current post-Covid crisis there is a teacher shortage, and it is unlikely that many certified teachers will serve as tutors. Also, research in both reading and math finds little difference in student outcomes between teachers and teaching assistants or AmeriCorps members, so there is little need to hire certified teachers as tutors. Unpaid volunteer tutors, however, have not been as effective as paid tutors.

Both one-to-one and one-to-small group tutoring by teaching assistants can be effective. One-to-one is somewhat more effective in reading, on average (Neitzel et al., in press), but in math there is no difference in outcomes between one-to-one and one-to-small group (Pellegrini et al., 2021).

Success for All

Success for All is a whole-school reform approach. A recent review of 17 rigorous studies of Success for All found an effect size of +0.51 for students in the lowest 25% of their grades (Cheung et al., in press). However, such students typically receive one-to-one or one-to-small group tutoring for some time period during grades 1 to 3. Success for All also provides all teachers professional development and materials focusing on phonics in grades K-2 and comprehension in grades 2-6, as well as cooperative learning in all grades, parent support, social-emotional learning instruction, and many other elements. So Success for All is not just a tutoring approach, but tutoring plays a central role for the lowest-achieving students.

Summer School

A recent review of research on summer school by Xie et al. (2020) found few positive effects on reading or math achievement. In reading, there were two major exceptions, but in both cases the students were in grades K to 1, and the instruction involved one-to-small group tutoring in phonics. In math, none of the summer school studies involving low-achieving students found positive effects.

After School

A review of research on after-school instruction in reading and math found near-zero impacts in both subjects (Kidron & Lindsay, 2014).

Extended Day

A remarkable study of extended-day instruction was carried out by Figlio et al. (2018). In this regression discontinuity study, low-performing Florida schools provided one hour of additional reading instruction every day for a year, and their outcomes were compared to those of similar schools that did not. The outcomes were positive but quite small (ES = +0.09), considering the substantial expense.

Technology

Studies of computer-assisted instruction and other digital approaches have found minimal impacts for struggling students (Neitzel et al., in press; Pellegrini et al., 2021).

Policy Consequences

The evidence is clear that any effort intended to improve the achievement of students struggling in reading or mathematics should make extensive use of proven tutoring programs. Students who have fallen far behind in reading or math need programs known to make a great deal of difference in a modest time period, so struggling students can move toward grade level, where they can profit from ordinary teaching. In our current crisis, it is essential that we follow the evidence to give struggling students the best possible chance of success.

References

Cheung, A., Xie, C., Zhang, T., Neitzel, A., & Slavin, R. E. (in press). Success for All: A quantitative synthesis of evaluations. Journal of Research on Educational Effectiveness.

Figlio, D., Holden, K., & Ozek, U. (2018). Do students benefit from longer school days? Regression discontinuity evidence from Florida’s additional hour of literacy instruction. Economics of Education Review, 67, 171-183.

Gersten, R., Haymond, K., Newman-Gonchar, R., Dimino, J., & Jayanthi, M. (2020). Meta-analysis of the impact of reading interventions for students in the primary grades. Journal of Research on Educational Effectiveness, 13(2), 401–427.

Kidron, Y., & Lindsay, J. (2014). The effects of increased learning time on student academic and nonacademic outcomes: Findings from a meta‑analytic review (REL 2014-015). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Appalachia.

Neitzel, A., Lake, C., Pellegrini, M., & Slavin, R. (in press). A synthesis of quantitative research on programs for struggling readers in elementary schools. Reading Research Quarterly.

Pellegrini, M., Neitzel, A., Lake, C., & Slavin, R. (2021). Effective programs in elementary mathematics: A best-evidence synthesis. AERA Open, 7 (1), 1-29.

Wanzek, J., Vaughn, S., Scammacca, N., Gatlin, B., Walker, M. A., & Capin, P. (2016). Meta-analyses of the effects of tier 2 type reading interventions in grades K-3. Educational Psychology Review, 28(3), 551–576. doi:10.1007/s10648-015-9321-7

Xie, C., Neitzel, A., Cheung, A., & Slavin, R. E. (2020). The effects of summer programs on K-12 students’ reading and mathematics achievement: A meta-analysis. Manuscript submitted for publication.

This blog was developed with support from Arnold Ventures. The views expressed here do not necessarily reflect those of Arnold Ventures.

Note: If you would like to subscribe to Robert Slavin’s weekly blogs, just send your email address to thebee@bestevidence.org

Healing Covid-19’s Educational Losses: What is the Evidence?

I’ve written several blogs (here, here, here, here, here, and here) on what schools can do when they finally open permanently, to remedy what will surely be serious harm to the educational progress of millions of students. Without doubt, the students who are suffering the most from lengthy school closures are disadvantaged students, who are most likely to lack access to remote technology or regular support when their schools have been closed.

Recently, several articles in the education press (e.g., Sawchuk, 2020) and in newsletters have laid out options schools might consider to greatly improve the achievement of the students who lost the most and are performing far behind grade level.

The basic problem is that if schools simply start off with usual teaching for each grade level, this may be fine for students at or just below grade level, but for those who are far below level, this is likely to add catastrophe to catastrophe. Students who cannot read the material they are being taught, or who lack the prerequisite skills for their grade level, will experience failure and frustration. So the challenge is to provide students who are far behind with intensive, additional services likely to quickly accelerate their progress, so that they can then profit from ordinary, at-grade-level lessons.

In the publications I’ve seen, several solutions are frequently put forward. I thought this might be a good time to review the most common prescriptions in terms of their evidence base in rigorous experimental or quasi-experimental research.

Extra Time

One proposal is to extend the school day or school year to provide additional time for instruction. This sounds logical; if the problem is time out of school, let’s add time in school.

The effects of extra time depend, of course, on what schools provide during that additional time. Simply providing more clock hours of typical instruction makes little difference. For example, in a large Florida study (Figlio, Holden, & Ozek, 2018), high-poverty schools were given an additional hour of reading instruction every day for a year. This had a small impact on reading achievement (ES = +0.09) at a cost of about $800 per student, or $300,000-$400,000 per school. Also, in a review of research on secondary reading programs by Baye, Lake, Inns, & Slavin (2019), my colleagues and I examined whether remedial programs were more effective if they were provided during additional time (one class period a day more than the control group received, for one or more years) or during regular class time (the same amount of time the control group also received). The difference was essentially zero. The extra time did not matter. What did matter was what the schools provided (here and here).
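As a rough consistency check on those cost figures (illustrative arithmetic only; I am assuming the per-school range simply reflects enrollments of roughly 375 to 500 students):

\[
\$800 \times 375 = \$300{,}000 \qquad\qquad \$800 \times 500 = \$400{,}000
\]

In other words, each school spent on the order of a third of a million dollars for a gain of less than a tenth of a standard deviation.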

After-School Programs

Some sources suggest providing after-school programs for students experiencing difficulties. A review of research on this topic by Kidron & Lindsay (2014) examined effects of after-school programs on student achievement in reading and mathematics. The effects were essentially zero. One problem is that students often did not attend regularly, or were poorly motivated when they did attend.

Summer School

As noted in a recent blog, positive effects of summer school were found only when intensive phonics instruction was provided in grades K or 1, but even in these cases, positive effects did not last to the following spring. Summer school is also very expensive.

Tutoring

By far the most effective approach for students struggling in reading or mathematics is tutoring (see blogs here, here, and here). Outcomes for one-to-one or one-to-small group tutoring average +0.20 to +0.30 in both reading and mathematics, and there are several particular programs that routinely report outcomes of +0.40 or more. Using teaching assistants with college degrees as tutors can make tutoring very cost-effective, especially in small-group programs.

Whole-School Reforms

There are a few whole-school reforms that can have substantial impacts on reading and mathematics achievement. A recent review of our elementary school reform model, Success for All (Cheung et al., 2020), found an average effect size of +0.24 for all students across 17 studies, and an average of +0.54 for low achievers.

A secondary reform model called BARR has reported positive reading and mathematics outcomes for ninth graders (T. Borman et al., 2017).

Conclusion

Clearly, something needs to be done about students returning to in-person education who are behind grade level in reading and/or mathematics. But resources devoted to helping these students need to be focused on approaches proven to work. This is not the time to invest in plausible but unproven programs. Students need the best we have that has been repeatedly shown to work.

References

Baye, A., Lake, C., Inns, A., & Slavin, R. (2019). Effective reading programs for secondary students. Reading Research Quarterly, 54 (2), 133-166.

Borman, T., Bos, H., O’Brien, B. C., Park, S. J., & Liu, F. (2017). i3 BARR validation study impact findings: Cohorts 1 and 2. Washington, DC: American Institutes for Research.

Cheung, A., Xie, C., Zhang, T., Neitzel, A., & Slavin, R. E. (2020). Success for All: A quantitative synthesis of evaluations. Manuscript submitted for publication. (Contact us for a copy.)

Figlio, D. N., Holden, K. L., & Ozek, U. (2018). Do students benefit from longer school days? Regression discontinuity evidence from Florida’s additional hour of literacy instruction. Economics of Education Review, 67, 171-183. https://doi.org/10.1016/j.econedurev.2018.06.003

Kidron, Y., & Lindsay, J. (2014). The effects of increased learning time on student academic and nonacademic outcomes: Findings from a meta‑analytic review (REL 2014-015). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Appalachia.

Sawchuk, S. (2020, August 26). Overcoming Covid-19 learning loss. Education Week, 40 (2), 6.

This blog was developed with support from Arnold Ventures. The views expressed here do not necessarily reflect those of Arnold Ventures.

Note: If you would like to subscribe to Robert Slavin’s weekly blogs, just send your email address to thebee@bestevidence.org