The Summertime Blues

            A long-ago rock song said it first: “There ain’t no cure for the summertime blues.”

            In the 1970s, Barbara Heyns (1978) discovered that over the summer, disadvantaged students lost a lot more of what they had learned in school than did advantaged students. Ever since then, educators have been trying to figure out how they could use time during the summer to help disadvantaged students catch up academically. I got interested in this recently because I have been trying to learn what kinds of educational interventions might be most impactful for the millions of students who have missed many months of school due to Covid-19 school closures. Along with tutoring and after school programs, summer school is routinely mentioned as a likely solution.

            Along with colleagues Chen Xie, Alan Cheung, and Amanda Neitzel, I have been looking at the literature on summer programs for disadvantaged students.

            There are two basic approaches to summer programs intended to help at-risk students. One of these, summer book reading, gives students reading assignments over the summer (e.g., Kim & Guryan, 2010). These generally have very small impacts, but on the other hand, they are relatively inexpensive.

            Of greater interest to the quest for powerful interventions to overcome Covid-19 learning losses are summer school programs in reading and mathematics. Most studies of summer school programs found that they made little difference in outcomes. For example, an evaluation of a five-week, six-hour-a-day remedial program for middle school students found no significant differences in reading or math (Somers et al., 2015). However, one category of summer school programs showed at least a glimmer of promise: three studies of intensive, phonics-focused programs for students in kindergarten or first grade. Schacter & Jo (2005) reported substantial impacts of such a program, with a mean effect size of +1.16 on fall reading measures. However, by the following spring, a follow-up test showed a non-significant difference of +0.18. Zvoch & Stevens (2013), using similar approaches, found effect sizes of +0.60 for kindergarten and +0.78 for first grade, but reported no measure of maintenance. Borman & Dowling (2006) provided first graders with a seven-week reading-focused summer school. There were substantial positive effects by fall, but these disappeared by spring. The same students qualified for a second summer school experience after second grade, which once again showed positive effects that faded by the following spring. There was no cumulative effect.

Because these studies showed no lasting impact, one might consider them a failure. However, it is important to note the impressive initial impacts, which might suggest that intensive reading instruction could be a part of a comprehensive approach for struggling readers in the early grades, if these gains were followed up during the school year with effective interventions. What summertime offers is an opportunity to use time differently (i.e., intensive phonics for young students who need it). It would make more sense to build on the apparent potential of focused summer school, rather than abandoning it based on its lack of long-term impacts.

            All by themselves, summer programs, based on the evidence we have so far, “ain’t no cure for the summertime blues.” But in next week’s blog, I discuss some ideas about how short-term interventions with powerful impacts, such as tutoring, pre-kindergarten, and intensive phonics in summer school for students in grades K-1, might be followed up with school-year interventions to produce long-term positive impacts. Perhaps summer school could be part of a cure for the school year blues.

References

Borman, G. D., & Dowling, N. M. (2006). Longitudinal achievement effects of multiyear summer school: Evidence from the Teach Baltimore randomized field trial. Educational Evaluation and Policy Analysis, 28(1), 25-48. doi:10.3102/01623737028001025

Heyns, B. (1978). Summer learning and the effect of schooling. New York: Academic Press.

Kim, J. S., & Guryan, J. (2010). The efficacy of a voluntary summer book reading intervention for low-income Latino children from language minority families. Journal of Educational Psychology, 102, 20-31. doi:10.1037/a0017270

Schacter, J., & Jo, B. (2005). Learning when school is not in session: A reading summer day-camp intervention to improve the achievement of exiting first-grade students who are economically disadvantaged. Journal of Research in Reading, 28, 158-169. doi:10.1111/j.1467-9817.2005.00260.x

Somers, M. A., Welbeck, R., Grossman, J. B., & Gooden, S. (2015). An analysis of the effects of an academic summer program for middle school students. Retrieved from ERIC website: https://files.eric.ed.gov/fulltext/ED558507.pdf

Zvoch, K., & Stevens, J. J. (2013). Summer school effects in a randomized field trial. Early Childhood Research Quarterly, 28(1), 24-32. doi:10.1016/j.ecresq.2012.05.002

Photo credit: American Education: Images of Teachers and Students in Action (CC BY-NC 4.0)

This blog was developed with support from Arnold Ventures. The views expressed here do not necessarily reflect those of Arnold Ventures.

Note: If you would like to subscribe to Robert Slavin’s weekly blogs, just send your email address to thebee@bestevidence.org

The Summer Slide: Fact or Fiction?

One of the things that “everyone knows” from educational research is that while advantaged students gain in achievement over the summer, disadvantaged students decline. However, the rate of gain during school time, from fall to spring, is about the same for advantaged and disadvantaged students. This pattern has led researchers such as Alexander, Entwisle, and Olson (2007) and Allington & McGill-Franzen (2018) to conclude that differential gain/loss over the summer completely explains the gap in achievement between advantaged and disadvantaged students. Middle class students are reading, going to the zoo, and going to the library, while disadvantaged students are less likely to do these school-like things.

The “summer slide,” as it’s called, has come up a lot lately, because it is being used to predict the amount of loss disadvantaged students will experience as a result of Covid-19 school closures. If disadvantaged students lose so much ground over 2 ½ months of summer vacation, imagine how much they will lose after five or seven or nine months (to January, 2021)! Remarkably precise-looking projections of how far behind students will be when school finally re-opens for all are circulating widely. These projections are based on estimates of the losses due to “summer slide,” so they are naturally called “Covid slide.”

I am certain that most students, and especially disadvantaged students, are in fact losing substantial ground due to the long school closures. The months of school not attended, coupled with the apparent ineffectiveness of remote teaching for most students, do not bode well for a whole generation of children. But this is abnormal. Ordinary summer vacation is normal. Does ordinary summer vacation lead to enough “summer slide” to explain substantial gaps in achievement between advantaged and disadvantaged students?

 I’m pretty sure it does not. In fact, let me put this in caps:

SUMMER SLIDE IS PROBABLY A MYTH.

Recent studies of summer slide, mostly using NWEA MAP data from millions of children, find results that call summer slide into question (Kuhfeld, 2019; Quinn et al., 2016) or find that it happens, but that summer losses are similar for advantaged and disadvantaged students (Atteberry & McEachin, 2020). However, hiding in plain sight is the most conclusive evidence of all: NWEA’s table of norms for the MAP, a benchmark assessment widely used to monitor student achievement. The MAP is usually given three times a year. In the chart below, calculated from raw data on the NWEA website (teach.mapnwea.org), I show the gains from fall to winter, winter to spring, and spring to fall (the last being “summer”). These are for grades 1 to 5 reading.

Grade | Fall to winter | Winter to spring | Spring to fall (summer)
1     | 9.92           | 5.55             | 0.95
2     | 8.85           | 4.37             | 1.05
3     | 7.28           | 3.22             | -0.47
4     | 5.83           | 2.33             | -0.35
5     | 4.64           | 1.86             | -0.81
Mean  | 7.30           | 3.47             | 0.07
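The per-period means, and the striking ratio of fall-to-winter to winter-to-spring gains, can be recomputed directly from the table’s values (a minimal sketch using only the figures above; no data beyond the chart is assumed):

```python
# Per-grade seasonal reading gains (grades 1-5), from the NWEA MAP norms chart above.
fall_to_winter   = [9.92, 8.85, 7.28, 5.83, 4.64]
winter_to_spring = [5.55, 4.37, 3.22, 2.33, 1.86]
spring_to_fall   = [0.95, 1.05, -0.47, -0.35, -0.81]  # "summer"

def mean(xs):
    return sum(xs) / len(xs)

print(round(mean(fall_to_winter), 2))    # 7.3
print(round(mean(winter_to_spring), 2))  # 3.47
print(round(mean(spring_to_fall), 2))    # 0.07

# Fall-to-winter gains exceed winter-to-spring gains by more than 2 to 1:
print(mean(fall_to_winter) / mean(winter_to_spring) > 2)  # True
```

Running this reproduces the “Mean” row and confirms that the first-semester gain is more than double the second-semester gain in these norms.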

NWEA’s chart is probably accurate. But it suggests something that cannot possibly be true. No, it’s not that students gain less in reading each year; that is true. It is that students gain more than twice as much from fall to winter as they do from winter to spring. Why would students gain so much more in the first semester than in the second? One might argue that they are fresher in the fall, or something like that. But double the gain, in every elementary grade? That cannot be right.

 Here is my explanation. The fall score is depressed.

The only logical explanation for extraordinary fall-to-winter gain is that many students score poorly on the September test, but rapidly recover.

I think most elementary teachers already know this. Their experience is that students score very low when they return from summer vacation, but this is not their true reading level. For three decades, we have noticed this in our Success for All program, and we routinely recommend that teachers place students in our reading sequence not where they score in September, but no lower than they scored last spring. (If students score higher in September than they did on a spring test, we do use the September score).

What is happening, I believe, is that students do not forget how to read, they just momentarily forget how to take tests. Or perhaps teachers do not invest time in preparing students to take a pretest, which has few if any consequences, but they do prepare them for winter and spring tests. I do not know for sure how it happens, but I do know for sure, from experience, that fall scores tend to understate students’ capabilities, often by quite a lot. And if the fall score is artificially or temporarily low, then the whole summer loss story is wrong.

Another indicator that fall scores are, shall we say, a bit squirrely, is the finding by both Kuhfeld (2019) and Atteberry & McEachin (2020) that there is a consistent negative correlation between school year gain and summer loss. That is, the students who gain the most from fall to spring lose the most from spring to fall. How can that be? What must be going on is just that students who get fall scores far below their actual ability quickly recover, and then make what appear to be fabulous gains from fall to spring. But that same temporarily low fall score gives them a summer loss. So of course there is a negative correlation, but it does not have any practical meaning.

So far, I’ve only been talking about whether there is a summer slide at all, for all students taken together. It may still be true, as found in the Heyns (1978) and Alexander, Entwisle, and Olson (2007) studies, that disadvantaged students are not gaining as much as advantaged students do over the summer. Recent studies by Atteberry & McEachin (2020) and Kuhfeld (2019) do not find much differential summer gain/loss according to social class. On the other hand, it could be that disadvantaged students are more susceptible to forgetting how to take tests. Or perhaps disadvantaged students are more likely to attend schools that put little emphasis on doing well on a September test that has no consequences for the students or the school. But it is unlikely they are truly forgetting how to read. The key point is that if fall tests are unreliable indicators of students’ actual skills, if they are just temporary dips that do not indicate what students can do, then taking them seriously in determining whether or not “summer slide” exists is not sensible.

By the way, before you begin thinking that while summer slide may not happen in reading, it must surely exist in math or other subjects, prepare to be disappointed again. The NWEA MAP scores for math, science, and language usage follow very similar patterns to those in reading.

Perhaps I’m wrong, but if I am, then we’d better start finding out about the amazing fall-to-winter surge, and see how we can make winter-to-spring gains that large! But if you don’t have a powerful substantive explanation for the fall-to-winter surge, you’re going to have to accept that summer slide isn’t a major factor in student achievement.

References

Alexander, K. L., Entwisle, D. R., & Olson, L. S. (2007). Lasting consequences of the summer learning gap. American Sociological Review, 72(2), 167-180.  doi:10.1177/000312240707200202

Allington, R. L., & McGill-Franzen, A. (Eds.). (2018). Summer reading: Closing the rich/poor reading achievement gap. New York, NY: Teachers College Press.

Atteberry, A., & McEachin, A. (2020). School’s out: The role of summers in understanding achievement disparities. American Educational Research Journal. https://doi.org/10.3102/0002831220937285

Heyns, B. (1978). Summer learning and the effect of schooling. New York: Academic Press.

Kuhfeld, M. (2019). Surprising new evidence on summer learning loss. Phi Delta Kappan, 101(11), 25-29.

Quinn, D., Cook, N., McIntyre, J., & Gomez, C. J. (2016). Seasonal dynamics of academic achievement inequality by socioeconomic status and race/ethnicity: Updating and extending past research with new national data. Educational Researcher, 45(8), 443-453.

This blog was developed with support from Arnold Ventures. The views expressed here do not necessarily reflect those of Arnold Ventures.

 Note: If you would like to subscribe to Robert Slavin’s weekly blogs, just send your email address to thebee@bestevidence.org

Could Intensive Education Rescue Struggling Readers?

Long, long ago, I heard about a really crazy idea. Apparently, a few private high schools were trying a scheduling plan in which instead of having students take all of their subjects every day, they would take one subject at a time for a month or six weeks. The idea was that with a total concentration on one subject, with no time lost in changing classes, students could make astonishing progress. At the end of each week, they could see the progress they’d made, and really feel learning happening.

Algebra? Solved!

French? Accompli!

Of course, I could not talk anyone into trying this. I almost got a Catholic school to try it, but when they realized that kids would have to take religion all day, that was that.

However, in these awful days, with schools nationwide closing for months due to Covid, I was thinking about a way to use a similar concept with students who have fallen far behind, or actually with any students who are far behind grade level for any reason.

What happens now with students who are far behind in, say, reading, is that they get a daily period of remedial instruction, or special education. For most of them, despite the very best efforts of dedicated teachers, this is not very effective. Day after day after day, they get instruction that at best moves them forward at a slow, steady pace. But after a while, students lose any hope of truly catching up, and when you lose hope, you lose motivation, and no one learns without motivation.

So here is my proposal. What if students who were far behind could enroll in a six-week intensive service designed to teach them to read, no matter what? They would attend an intensive class, perhaps all day, in which they receive a promise: this time, you’ll make it. No excuses. This is the best chance you’ll ever have. Students would be carefully assessed, including their vision and hearing as well as their reading levels. They would be assigned to one-to-small group or, if necessary, one-to-one instruction for much of the day. There might be music or sports or other activities between sessions, but imagine that students got three 40-minute tutoring sessions a day, on content exactly appropriate to their needs. The idea, as in intensive education, would be to enable the students to feel the thrill of learning, to see unmistakable gains in a day, extraordinary gains in a week. The tutoring could be to groups of four for most students, but students with the most difficult, most unusual problems could receive one-to-one tutoring.

The ideal time to do this intensive tutoring would be summer school. Actually, this has been done in a few studies. Schacter & Jo (2005) provided intensive phonics instruction to students after first grade in three disadvantaged schools in Los Angeles. The seven-week experience increased their test scores by an effect size of +1.16, compared to similar students who did not have the opportunity to attend summer school. Zvoch & Stevens (2013) also provided intensive phonics instruction in small groups in a five-week reading summer school. The students were disadvantaged kindergartners and first graders in a medium-sized city in the Pacific Northwest. The effect sizes were +0.60 for kindergarten and +0.78 for first grade.

Summer is not the only good time for intensive reading instruction. Reading is so important that it would arguably be worthwhile to provide intensive six-week instruction during the school year, with time out for mathematics and breaks for, say, sports and music.

If intensive education were as effective as ordinary 40-minute daily tutoring, it might be no more expensive. A usual course of tutoring is 20 weeks, so triple tutoring sessions for six weeks would cost almost the same as 18 weeks of ordinary tutoring. In other words, if intensive tutoring is more effective than ordinary tutoring, then the additional benefits might cost little or nothing.
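The cost comparison above can be sketched as simple session counts (assuming five school days per week, an illustrative figure I am supplying; actual schedules vary):

```python
# Rough session-count comparison of ordinary vs. intensive tutoring.
# The 5-days-per-week figure is an assumption for illustration only.
SCHOOL_DAYS_PER_WEEK = 5

# Ordinary course: 20 weeks, one 40-minute session per day.
ordinary_sessions = 20 * SCHOOL_DAYS_PER_WEEK * 1

# Intensive course: 6 weeks, three 40-minute sessions per day.
intensive_sessions = 6 * SCHOOL_DAYS_PER_WEEK * 3

print(ordinary_sessions)   # 100
print(intensive_sessions)  # 90  (equivalent to 18 weeks of ordinary tutoring)
```

So the intensive plan delivers 90 sessions against the ordinary plan’s 100, which is why six weeks of triple sessions costs about the same as 18 weeks of ordinary tutoring.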

Intensive tutoring would make particular sense to try during summer, 2021, when millions of students will still be far behind in reading because of the lengthy school closures they will have experienced. I have no idea whether intensive tutoring will be more or less effective than ordinary one-to-small group tutoring (which is very, very effective; see here and here). Deliberately concentrating tutoring during an intensive period of time certainly seems worth a try!

References

Schacter, J., & Jo, B. (2005). Learning when school is not in session: A reading summer day-camp intervention to improve the achievement of exiting first-grade students who are economically disadvantaged. Journal of Research in Reading, 28, 158-169. doi:10.1111/j.1467-9817.2005.00260.x

Zvoch, K., & Stevens, J. J. (2013). Summer school effects in a randomized field trial. Early Childhood Research Quarterly, 28(1), 24-32. doi:10.1016/j.ecresq.2012.05.002

Photo credit: American Education: Images of Teachers and Students in Action, (CC BY-NC 4.0)

 This blog was developed with support from Arnold Ventures. The views expressed here do not necessarily reflect those of Arnold Ventures.

Note: If you would like to subscribe to Robert Slavin’s weekly blogs, just send your email address to thebee@bestevidence.org

 

“Am I Even Real Anymore?” The Truth About Virtual Learning

“My 8-year-old was sobbing last night because she misses playing with her friends at recess, she misses her teacher, and she is worried that everyone has forgotten her. At one point, she asked me if she was even real anymore.”

This appeared in a letter that ran in the July 25th Baltimore Sun. It was written by Jenny Elliott, a Catonsville mother of elementary students. In it, Ms. Elliott tells how she and her husband have been unable to get their kids to do the virtual learning assignments her kids’ school has assigned.

“(Virtual learning) just doesn’t work for them. I can’t physically force them to stare at their devices and absorb information. We’ve yelled, we’ve begged, we’ve made a game of it…we’ve tried everything we can think of. We failed.”

One of the most poignant parts of Ms. Elliott’s letter is her feeling that everyone knows that virtual learning isn’t working for most kids, but no one wants to say so.

“I am begging someone to speak honestly about virtual learning. I have only the perspective of an elementary school parent, but I have to imagine this negatively impacts children at all levels. The communication coming (from authorities) all over the country…it feels delusional.”

Ms. Elliott expresses enormous guilt. “As a parent, I’ve had to see this every day for the last five months, and every day I feel crushing guilt that I can’t make any of it better.” She expresses gratitude for the efforts of her kids’ teachers, and feels sympathy for them. (“I love you, teachers. I am so sorry this is your reality too.”)

Ms. Elliott notes that if anyone should be able to make virtual learning work, it should be her family: “…a secure living situation, two parents, food security, access to high-speed internet, access to an internet-powered device, etc.” I might add that Catonsville is in suburban Baltimore County, which has a national reputation for its substantial investments in technology over many years.

Since I have written many blogs about schools’ responses to the Covid school closures, I’ve been expecting a letter like this. Informally, my colleagues and I have been chatting with teachers and parents we know with kids in school. Almost every single one tells a story like Ms. Elliott’s. Highly educated parents, plenty of technology, tech-savvy kids, capable and hard-working teachers, all different ages: it does not seem to matter. Teachers and parents alike refer to motivated and successful students who log on and then pay no attention. The kids are communicating on a different device with their friends, playing games, reading, whatever. There are kids who are engaged with virtual learning, but very few that we’ve heard about.

Much of the reporting about virtual learning has emphasized the lack of access to the Internet, and districts are spending billions to provide devices and improve access. There is a lot of talk about how school closures are increasing learning gaps because disadvantaged students lack access to the Internet, as though school closures are only a problem for disadvantaged students. But if Ms. Elliott is representative of many parents, and I’m sure she is, the problem is far larger than that of students who lack access to technology.

Everyone involved with schools seems to know this, but they do not want to talk about it. There seems to be a giant, unspoken pall of guilt that keeps the reality of what is happening in virtual learning from being discussed openly. Parents feel guilty because they feel deficient if they are not able to get their kids to respond to virtual learning. Teachers feel guilty because they don’t want to admit that they are not able to get more of their students to pay attention. School administrators want to be perceived to be doing something, anything, to combat the educational effects of extended school closures, so while they do talk about the need to obtain more devices and offer teachers more professional development, they do not like to talk about the kids who do have devices, but don’t do much with them. They promise that things will soon be better, with more devices, more professional development, and better lessons turning the tide. Ms. Elliott is sympathetic, but doubtful. “I appreciate those efforts and wholeheartedly believe the educational system is doing the absolute best they can…but I can’t pretend that the virtual school plans will work for our kids.”

Ms. Elliott states at the beginning of her letter that she has no solution to suggest, but she just wants the truth to be known. I have no sure-fire solutions myself. But I do know one thing. Any workable solutions there may be will have to begin with a firm understanding of what is really happening in schools using virtual learning.

In most of the U.S., opening schools in August or September should be out of the question. The rates of new Covid cases remain far too high, and no amount of social distancing inside schools can be safe for students or staff when there are so many carriers of the disease outside of schools. The only true solution, until cures or vaccines are widely available, is to return to the one thing that has worked throughout the world: mandating universal use of masks, shutting down businesses that put people close to each other, and so on. This is the only thing that saved China and South Korea and Italy and Spain and New York City, and it is the only solution now. The faster we return to what works, the sooner we can fully open schools, and then start the long process of healing the terrible damage being done to our children’s learning.

I am not suggesting giving up on virtual learning. If schools will be closed for a long time, it is all we have. But I am pessimistic about trying to fix the current approach to virtual learning. I think we could use all those computers and educators online to much greater effect by providing online tutoring to individuals and small groups, for example, rather than trying to create a classroom community out of children working from home. Perhaps there are ways other than tutoring to use online instruction effectively, but I do not know them. In any case, we need immediate investment in development and evaluation to find the most effective and cost-effective solutions possible, while we wait for a safe time to open schools. We’ll all get through this, one way or the other, but in order to minimize the negative impact on student learning, let’s start with the truth, and then build and use the evidence of what works.

 This blog was developed with support from Arnold Ventures. The views expressed here do not necessarily reflect those of Arnold Ventures.

Note: If you would like to subscribe to Robert Slavin’s weekly blogs, just send your email address to thebee@bestevidence.org